
Inspirational Adversarial Image Generation

IEEE Transactions on Image Processing


Abstract

The task of image generation has started to receive attention from artists and designers as a source of inspiration for new creations. However, exploiting the output of deep generative models such as Generative Adversarial Networks can be long and tedious given the lack of existing tools. In this work, we propose a simple strategy to inspire creators with new generations learned from a dataset of their choice, while providing some control over the output. We design a simple optimization method to find the optimal latent parameters corresponding to the closest generation to any input inspirational image. Specifically, given an inspirational image of the user's choosing, we perform several optimization steps to recover the optimal parameters from the model's latent space. We tested several exploration methods, ranging from classical gradient descent to gradient-free optimizers. Many gradient-free optimizers require only comparisons (whether one image is better or worse than another), so they can even be used without a numerical criterion or an inspirational image, relying solely on human preferences. Thus, by iterating on the user's preferences, we can build robust facial composite or fashion generation algorithms. Our results on four datasets of faces, fashion images, and textures show that satisfactory images are effectively retrieved in most cases.
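To illustrate the latent-space search the abstract describes, the sketch below shows two minimal strategies in PyTorch: a gradient-based recovery of the latent vector whose generation is closest to a target image, and a comparison-only variant driven purely by pairwise preferences. The names `generator`, `target`, and `prefer`, along with the latent dimension and hyperparameters, are illustrative assumptions; this is not the paper's actual implementation, and the gradient-free optimizers evaluated in the paper are not reproduced here.

```python
# Minimal sketch, assuming a pretrained GAN generator `generator`
# (a torch.nn.Module mapping a latent vector to an image) and a target
# image tensor `target` of matching shape. All names are placeholders.
import torch
import torch.nn.functional as F

def recover_latent(generator, target, latent_dim=512, steps=500, lr=0.05):
    """Gradient-based search in latent space for the closest generation."""
    z = torch.randn(1, latent_dim, requires_grad=True)  # random starting point
    optimizer = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        image = generator(z)                 # candidate generation
        loss = F.mse_loss(image, target)     # pixel-wise criterion; a perceptual loss could be substituted
        loss.backward()
        optimizer.step()
    return z.detach()

def preference_search(generator, prefer, latent_dim=512, steps=200, sigma=0.5):
    """Gradient-free (1+1)-style search driven only by pairwise comparisons.

    `prefer(candidate_img, incumbent_img)` should return True when the
    candidate image is judged better, e.g. by a human in the loop.
    """
    z = torch.randn(1, latent_dim)
    for _ in range(steps):
        candidate = z + sigma * torch.randn_like(z)  # random mutation of the latent vector
        with torch.no_grad():
            if prefer(generator(candidate), generator(z)):
                z = candidate                        # keep the preferred latent
    return z
```

The second function mirrors the comparison-only setting mentioned above: no numerical criterion or inspirational image is needed, only a judgment of which of two generations is preferred.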

Related Publications


Interspeech - October 12, 2021

LiRA: Learning Visual Speech Representations from Audio through Self-supervision

Pingchuan Ma, Rodrigo Mira, Stavros Petridis, Björn W. Schuller, Maja Pantic

ICML - July 18, 2021

Latency-Aware Neural Architecture Search with Multi-Objective Bayesian Optimization

David Eriksson, Pierce I-Jen Chuang, Samuel Daulton, Peng Xia, Akshat Shrivastava, Arun Babu, Shicong Zhao, Ahmed Aly, Ganesh Venkatesh, Maximilian Balandat

ICML - July 12, 2020

Lookahead-Bounded Q-Learning

Ibrahim El Shar, Daniel Jiang

ICML - July 18, 2021

Variational Auto-Regressive Gaussian Processes for Continual Learning

Sanyam Kapoor, Theofanis Karaletsos, Thang D. Bui
