Dynamic Meta-Embeddings for Improved Sentence Representations

Empirical Methods in Natural Language Processing (EMNLP)

By: Douwe Kiela, Changhan Wang, Kyunghyun Cho

Abstract

While one of the first steps in many NLP systems is selecting which pre-trained word embeddings to use, we argue that such a step is better left for neural networks to figure out by themselves. To that end, we introduce dynamic meta-embeddings, a simple yet effective method for the supervised learning of embedding ensembles, which leads to state-of-the-art performance within the same model class on a variety of tasks. We subsequently show how the technique can be used to shed new light on the usage of word embeddings in NLP systems.
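To make the idea concrete, below is a minimal NumPy sketch of one way an embedding ensemble could be combined dynamically: each pre-trained embedding set is linearly projected into a shared space, a learned attention vector scores each projected embedding per word, and the meta-embedding is the attention-weighted sum. All names, dimensions, and the softmax gating choice here are illustrative assumptions, not the paper's exact formulation; in the actual method the projections and attention parameters are trained end-to-end with the downstream task.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Two hypothetical pre-trained embedding tables for the same 5-word
# vocabulary, with different dimensionalities (e.g. a 300-d and a 200-d set).
emb_a = rng.standard_normal((5, 300))
emb_b = rng.standard_normal((5, 200))

d = 256  # shared projection dimension (illustrative choice)

# Learned linear projections mapping each embedding set into the shared space
# (initialized randomly here; in practice these are trained with the task).
P_a, b_a = rng.standard_normal((300, d)) * 0.01, np.zeros(d)
P_b, b_b = rng.standard_normal((200, d)) * 0.01, np.zeros(d)

# Learned attention vector producing one scalar score per embedding type.
a_vec = rng.standard_normal(d) * 0.01

proj_a = emb_a @ P_a + b_a                # (5, d)
proj_b = emb_b @ P_b + b_b                # (5, d)
stacked = np.stack([proj_a, proj_b], axis=1)  # (5, 2, d)

scores = stacked @ a_vec                  # (5, 2): score per word, per type
alphas = softmax(scores, axis=1)          # normalize over embedding types
meta = (alphas[..., None] * stacked).sum(axis=1)  # (5, d) meta-embeddings

print(meta.shape)
```

Because the attention weights `alphas` are computed per word, the model can lean on different embedding sets for different words, which is what lets the learned weights be inspected afterwards to study how the embeddings are used.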