Publication

Extending Neural Generative Conversational Model using External Knowledge Sources

Empirical Methods in Natural Language Processing (EMNLP)


Abstract

The use of connectionist approaches in conversational agents has progressed rapidly due to the availability of large corpora. However, current generative dialogue models often lack coherence and are content-poor. This work proposes an architecture that incorporates unstructured knowledge sources to enhance next-utterance prediction in chit-chat-style generative dialogue models. We focus on Sequence-to-Sequence (Seq2Seq) conversational agents trained on the Reddit News dataset, and consider incorporating external knowledge from Wikipedia summaries as well as from the NELL knowledge base. Our experiments show faster training time and improved perplexity when leveraging external knowledge.
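
The abstract does not specify how the encoded knowledge is fused into the Seq2Seq model. As a rough illustration only, the minimal PyTorch sketch below conditions a GRU-based encoder-decoder on an encoded knowledge passage by concatenating the dialogue-context and knowledge states into the decoder's initial hidden state; all class and parameter names, and the concatenation-based fusion itself, are assumptions made for this sketch, not the authors' implementation.

import torch
import torch.nn as nn

class KnowledgeSeq2Seq(nn.Module):
    """Next-utterance model: a Seq2Seq decoder conditioned on both the
    dialogue context and an encoded external knowledge passage
    (e.g. a retrieved Wikipedia summary). Hypothetical sketch."""

    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.context_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.knowledge_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Assumed fusion: project the concatenated context and knowledge
        # states down to the decoder's initial hidden state.
        self.fuse = nn.Linear(2 * hid_dim, hid_dim)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, context, knowledge, response):
        # All inputs are LongTensors of token ids, shape (batch, seq_len).
        _, h_ctx = self.context_enc(self.embed(context))
        _, h_kno = self.knowledge_enc(self.embed(knowledge))
        h0 = torch.tanh(self.fuse(torch.cat([h_ctx, h_kno], dim=-1)))
        dec_out, _ = self.decoder(self.embed(response), h0)
        return self.out(dec_out)  # per-step logits over the vocabulary

# Shape check with random token ids (vocabulary of 10k, batch of 2):
model = KnowledgeSeq2Seq(vocab_size=10_000)
logits = model(torch.randint(0, 10_000, (2, 12)),   # dialogue context
               torch.randint(0, 10_000, (2, 40)),   # knowledge passage
               torch.randint(0, 10_000, (2, 10)))   # gold response
print(logits.shape)  # torch.Size([2, 10, 10000])

At training time the per-step logits would be scored against the gold response with cross-entropy, matching the perplexity evaluation mentioned in the abstract.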

