July 29, 2019

Keeping Notes: Conditional Natural Language Generation with a Scratchpad Mechanism

Association for Computational Linguistics (ACL)

We introduce the Scratchpad Mechanism, a novel addition to the sequence-to-sequence (seq2seq) neural network architecture, and demonstrate its effectiveness in improving the overall fluency of seq2seq models for natural language generation tasks.

By: Ryan Y. Benmalek, Madian Khabsa, Suma Desu, Claire Cardie, Michele Banko

July 28, 2019

CoDraw: Collaborative Drawing as a Testbed for Grounded Goal-driven Communication

Association for Computational Linguistics (ACL)

In this work, we propose a goal-driven collaborative task that combines language, perception, and action. Specifically, we develop a Collaborative image-Drawing game between two agents, called CoDraw. Our game is grounded in a virtual world that contains movable clip art objects.

By: Jin-Hwa Kim, Nikita Kitaev, Xinlei Chen, Marcus Rohrbach, Byoung-Tak Zhang, Yuandong Tian, Dhruv Batra, Devi Parikh

July 28, 2019

ELI5: Long Form Question Answering

Association for Computational Linguistics (ACL)

We introduce the first large-scale corpus for long-form question answering, a task requiring elaborate and in-depth answers to open-ended questions. The dataset comprises 270K threads from the Reddit forum “Explain Like I’m Five” (ELI5), where an online community provides answers to questions that are comprehensible to five-year-olds.

By: Angela Fan, Yacine Jernite, Ethan Perez, David Grangier, Jason Weston, Michael Auli

July 28, 2019

Miss Tools and Mr Fruit: Emergent communication in agents learning about object affordances

Association for Computational Linguistics (ACL)

Recent research studies communication emergence in communities of deep network agents assigned a joint task, with the aim of gaining insight into human language evolution. Here we propose a new task that captures crucial aspects of the human environment, such as natural object affordances, and of human conversation, such as full symmetry among the participants.

By: Diane Bouchacourt, Marco Baroni

July 28, 2019

Inferring Concept Hierarchies from Text Corpora via Hyperbolic Embeddings

Association for Computational Linguistics (ACL)

We consider the task of inferring is-a relationships from large text corpora. For this purpose, we propose a new method combining hyperbolic embeddings and Hearst patterns. This approach allows us to set appropriate constraints for inferring concept hierarchies from distributional contexts while also being able to predict missing is-a relationships and to correct wrong extractions.

By: Matt Le, Stephen Roller, Laetitia Papaxanthos, Douwe Kiela, Maximilian Nickel
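
The hyperbolic side of this approach rests on the Poincaré-ball model, in which distances grow quickly near the boundary of the unit ball. Below is a minimal sketch of that distance function with hypothetical 2-d concept vectors; in the paper the embeddings are learned from Hearst-pattern extractions rather than hand-set as they are here.

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Distance between two points inside the unit ball (Poincare-ball model)."""
    sq_u = np.dot(u, u)             # ||u||^2, must be < 1
    sq_v = np.dot(v, v)             # ||v||^2, must be < 1
    sq_diff = np.dot(u - v, u - v)  # ||u - v||^2
    # d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))
    arg = 1.0 + 2.0 * sq_diff / ((1.0 - sq_u) * (1.0 - sq_v) + eps)
    return float(np.arccosh(arg))

# Hypothetical 2-d embeddings: broad concepts tend to sit near the origin and
# narrow ones near the boundary, so norms plus distances can encode is-a direction.
animal = np.array([0.05, 0.02])
dog = np.array([0.60, 0.35])
print(poincare_distance(animal, dog))
```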

July 28, 2019

Training Hybrid Language Models by Marginalizing over Segmentations

Association for Computational Linguistics (ACL)

In this paper, we study the problem of hybrid language modeling, that is, using models that can predict both characters and larger units such as character n-grams or words. With such models, multiple potential segmentations usually exist for a given string, for example one using words and one using characters only.

By: Edouard Grave, Sainbayar Sukhbaatar, Piotr Bojanowski, Armand Joulin
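
The marginalization in the title can be pictured as a forward-style dynamic program: sum, over every way of cutting the string into segments the model can emit, the product of the segment probabilities. The sketch below uses a hypothetical fixed table of segment log-probabilities purely for illustration; in the paper these scores come from the neural hybrid model itself.

```python
import math

def logsumexp(values):
    m = max(values)
    return m + math.log(sum(math.exp(v - m) for v in values))

def log_marginal(string, seg_logprob):
    """Forward DP: log of the sum over all segmentations of the product of segment probabilities."""
    n = len(string)
    alpha = [float("-inf")] * (n + 1)   # alpha[i] = log-marginal of the prefix string[:i]
    alpha[0] = 0.0
    for end in range(1, n + 1):
        cands = []
        for start in range(end):
            seg = string[start:end]
            if seg in seg_logprob and alpha[start] > float("-inf"):
                cands.append(alpha[start] + seg_logprob[seg])
        if cands:
            alpha[end] = logsumexp(cands)
    return alpha[n]

# Hypothetical segment probabilities for a toy vocabulary of words and characters.
seg_logprob = {"cat": math.log(0.10), "c": math.log(0.05), "at": math.log(0.02),
               "a": math.log(0.05), "t": math.log(0.05)}
print(log_marginal("cat", seg_logprob))  # marginalizes over "cat", "c|at", and "c|a|t"
```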

July 28, 2019

How to Get Past Sesame Street: Sentence-Level Pretraining Beyond Language Modeling

Association for Computational Linguistics (ACL)

Natural language understanding has recently seen a surge of progress with the use of sentence encoders like ELMo (Peters et al., 2018a) and BERT (Devlin et al., 2019) which are pretrained on variants of language modeling. We conduct the first large-scale systematic study of candidate pretraining tasks, comparing 19 different tasks both as alternatives and complements to language modeling.

By: Alex Wang, Jan Hula, Patrick Xia, Raghavendra Pappagari, R. Thomas McCoy, Roma Patel, Najoung Kim, Ian Tenney, Yinghui Huang, Katherin Yu, Shuning Jin, Berlin Chen, Benjamin Van Durme, Edouard Grave, Ellie Pavlick, Samuel R. Bowman

July 28, 2019

Translating Translationese: A Two-Step Approach to Unsupervised Machine Translation

Association for Computational Linguistics (ACL)

Given a rough, word-by-word gloss of a source language sentence, target language natives can uncover the latent, fully fluent rendering of the translation. In this work we explore this intuition by breaking translation into a two-step process: generating a rough gloss by means of a dictionary and then ‘translating’ the resulting pseudo-translation, or ‘Translationese’, into a fully fluent translation.

By: Nima Pourdamghani, Nada Aldarrab, Marjan Ghazvininejad, Kevin Knight, Jonathan May
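
The first of the two steps amounts to a dictionary lookup that preserves source word order. A toy sketch is below, with a hypothetical German-English dictionary; the paper induces its dictionaries without parallel supervision, and the second, fluency-restoring step is a separate target-side model that is not shown here.

```python
# Step 1 of the two-step pipeline: produce a rough word-by-word gloss ("Translationese").
# The dictionary below is a hypothetical stand-in for an unsupervised induced lexicon.
GLOSS_DICT = {"das": "the", "haus": "house", "ist": "is", "klein": "small"}

def gloss(source_sentence, dictionary, unk="<unk>"):
    """Replace each source token with a dictionary translation, keeping word order."""
    return " ".join(dictionary.get(tok.lower(), unk) for tok in source_sentence.split())

print(gloss("das Haus ist klein", GLOSS_DICT))  # -> "the house is small"
```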

July 28, 2019

Learning to Optimize Halide with Tree Search and Random Programs

ACM SIGGRAPH

We present a new algorithm to automatically schedule Halide programs for high-performance image processing and deep learning. We significantly improve upon the performance of previous methods, which considered a limited subset of schedules.

By: Andrew Adams, Karima Ma, Luke Anderson, Riyadh Baghdadi, Tzu-Mao Li, Michaël Gharbi, Benoit Steiner, Steven Johnson, Kayvon Fatahalian, Frédo Durand, Jonathan Ragan-Kelley

July 28, 2019

What makes a good conversation? How controllable attributes affect human judgments

North American Chapter of the Association for Computational Linguistics (NAACL)

In this work, we examine two controllable neural text generation methods, conditional training and weighted decoding, in order to control four important attributes for chitchat dialogue: repetition, specificity, response-relatedness and question-asking.

By: Abigail See, Stephen Roller, Douwe Kiela, Jason Weston
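
Of the two methods, weighted decoding is the easier to picture: at each decoding step, the model's token scores are adjusted by weighted feature functions before the next token is chosen, and the weights act as control knobs. The sketch below shows a single greedy step with one hypothetical feature, a penalty on tokens that have already been generated, to discourage repetition; it is an illustrative sketch, not the paper's full decoder.

```python
import numpy as np

def weighted_decode_step(logits, generated_ids, weight=-2.0):
    """One weighted-decoding step: add weight * feature(token) to each token's score.
    The only feature here is 'token already generated', so a negative weight
    penalizes repetition before the greedy choice is made."""
    scores = logits.copy()
    for tok in set(generated_ids):
        scores[tok] += weight
    return int(np.argmax(scores))

# Hypothetical 6-token vocabulary and logits; the weight controls how strongly
# repetition is discouraged at this step.
logits = np.array([1.0, 3.0, 2.5, 0.5, 0.1, 2.9])
print(weighted_decode_step(logits, generated_ids=[1, 5]))  # penalty shifts the choice away from 1 and 5
```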