November 1, 2019
FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow
Conference on Empirical Methods in Natural Language Processing (EMNLP)
In this paper, we propose a simple, efficient, and effective model for non-autoregressive sequence generation using latent variable models. Specifically, we turn to generative flow, an elegant technique to model complex distributions using neural networks, and design several layers of flow tailored for modeling the conditional density of sequential latent variables.
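To make the idea concrete, below is a minimal sketch of a single affine coupling layer, the standard invertible building block in generative flows of this kind. The split sizes, linear scale/shift maps, and weights here are hypothetical stand-ins, not the flow layers FlowSeq actually proposes; the point is only that the transform is exactly invertible with a cheap log-determinant, which is what makes flow-based density modeling tractable.

```python
import numpy as np

# Illustrative affine coupling layer (hypothetical parameters, not the
# paper's architecture). The input is split in half; one half passes
# through unchanged and parameterizes a scale/shift of the other half.
rng = np.random.default_rng(0)
D = 4                                            # latent dimension (even)
W_s = rng.normal(size=(D // 2, D // 2)) * 0.1    # scale-network weights
W_t = rng.normal(size=(D // 2, D // 2)) * 0.1    # shift-network weights

def coupling_forward(x):
    """y1 = x1; y2 = x2 * exp(s(x1)) + t(x1); returns (y, log|det J|)."""
    x1, x2 = x[: D // 2], x[D // 2 :]
    s, t = W_s @ x1, W_t @ x1
    y2 = x2 * np.exp(s) + t
    # Jacobian is triangular, so log-det is just the sum of the scales.
    return np.concatenate([x1, y2]), s.sum()

def coupling_inverse(y):
    """Exact inverse: x2 = (y2 - t(y1)) * exp(-s(y1))."""
    y1, y2 = y[: D // 2], y[D // 2 :]
    s, t = W_s @ y1, W_t @ y1
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2])

x = rng.normal(size=D)
y, logdet = coupling_forward(x)
x_rec = coupling_inverse(y)      # recovers x exactly (up to float error)
```

Stacking several such layers (with the roles of the two halves alternating) yields the kind of expressive yet invertible map used to model complex latent distributions.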
By: Xuezhe Ma, Chunting Zhou, Xian Li, Graham Neubig, Eduard Hovy
Facebook AI Research
Natural Language Processing & Speech