NAM – Unsupervised Cross-Domain Image Mapping without Cycles or GANs

International Conference on Learning Representations (ICLR)


Abstract

Several methods have recently been proposed for Unsupervised Domain Mapping, the task of translating images between domains without prior knowledge of correspondences. Current approaches train unstably because they rely on GANs, which are powerful but highly sensitive to hyper-parameters and prone to mode collapse. In addition, most methods rely heavily on "cycle" relationships between the domains, which enforce a one-to-one mapping. In this work, we introduce an alternative method: NAM. NAM relies on a pre-trained generative model of the source domain, and aligns each target image with an image sampled from the source distribution while jointly optimizing the domain mapping function. Experiments are presented validating the effectiveness of our method.
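To make the idea concrete, here is a minimal toy sketch of the alignment scheme the abstract describes: a frozen "pretrained" source-domain generator, a per-target-image latent code, and a shared domain-mapping function, all optimized jointly so that the mapped generated image matches each target. Everything here is a hypothetical stand-in (linear generator `G_weights`, linear map `W`, synthetic `targets`), not NAM's actual architecture or losses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for the toy example
d_latent, d_img = 4, 6

# Frozen "pretrained generator" of the source domain: G(z) = G_weights @ z
G_weights = rng.normal(size=(d_img, d_latent))

def G(z):
    return G_weights @ z

# Synthetic target-domain images: an unknown linear map applied to source images
n_targets = 8
true_map = rng.normal(size=(d_img, d_img)) / np.sqrt(d_img)
targets = np.stack([true_map @ G(rng.normal(size=d_latent))
                    for _ in range(n_targets)])

# Jointly optimized variables: one latent code per target image (Z),
# plus a single shared mapping function (here just a matrix W)
Z = rng.normal(size=(n_targets, d_latent))
W = np.eye(d_img)

def loss():
    # Mean squared error between mapped generated images and targets
    return float(np.mean([(W @ G(Z[i]) - targets[i]) ** 2
                          for i in range(n_targets)]))

initial_loss = loss()
lr = 0.01
for _ in range(2000):
    for i in range(n_targets):
        x = G(Z[i])                  # source image implied by latent z_i
        r = W @ x - targets[i]       # residual: mapped source vs. target
        W = W - lr * np.outer(r, x)              # gradient step on shared mapping
        Z[i] -= lr * (G_weights.T @ (W.T @ r))   # gradient step on per-image latent

final_loss = loss()
```

Because both the latents and the mapping are free variables, gradient descent can align each target with a point on the (fixed) source-generator manifold, which is the role GAN discriminators and cycle losses play in prior work; no adversarial training is needed in this sketch.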

Related Publications

NeurIPS - December 6, 2020

High-Dimensional Contextual Policy Search with Unknown Context Rewards using Bayesian Optimization

Qing Feng, Benjamin Letham, Hongzi Mao, Eytan Bakshy

Innovative Technology at the Interface of Finance and Operations - March 31, 2021

Market Equilibrium Models in Large-Scale Internet Markets

Christian Kroer, Nicolas E. Stier-Moses

Human Interpretability Workshop at ICML - July 17, 2020

Investigating Effects of Saturation in Integrated Gradients

Vivek Miglani, Bilal Alsallakh, Narine Kokhlikyan, Orion Reblitz-Richardson

ICASSP - June 6, 2021

Multi-Channel Speech Enhancement Using Graph Neural Networks

Panagiotis Tzirakis, Anurag Kumar, Jacob Donley