December 12, 2019
Compositional generalization through meta sequence-to-sequence learning
Neural Information Processing Systems (NeurIPS)
People can learn a new concept and use it compositionally, understanding how to “blicket twice” after learning how to “blicket.” In contrast, powerful sequence-to-sequence (seq2seq) neural networks fail such tests of compositionality, especially when composing new concepts together with existing concepts. In this paper, I show how memory-augmented neural networks can be trained to generalize compositionally through meta seq2seq learning.
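The compositionality test described in the abstract can be sketched as follows. This is an illustrative example only, not the paper's code: the vocabulary (`blicket`, `dax`) and output symbols are hypothetical stand-ins for the kind of command-to-action mapping used in such benchmarks.

```python
# Hypothetical vocabulary mapping pseudo-words to output actions.
PRIMITIVES = {"blicket": "BLICKET", "dax": "DAX"}

def interpret(command):
    """Compositional ground truth: '<word> twice' repeats that word's action."""
    tokens = command.split()
    if len(tokens) == 2 and tokens[1] == "twice":
        return [PRIMITIVES[tokens[0]]] * 2
    return [PRIMITIVES[tokens[0]]]

# A person who learns "blicket" -> ["BLICKET"] and sees "dax twice" ->
# ["DAX", "DAX"] can infer the held-out composition:
print(interpret("blicket twice"))  # → ['BLICKET', 'BLICKET']
```

The point of the test is that standard seq2seq models, trained only on the primitive and on "twice" applied to other words, typically fail to produce this compositional target for the held-out combination.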
By: Brenden Lake
Facebook AI Research
Natural Language Processing & Speech