Facebook AI Research has two papers accepted at EMNLP this year, to be held in Qatar in October.

#TagSpace: Semantic Embeddings from Hashtags by Jason Weston, Sumit Chopra and Keith Adams

This paper describes a convolutional neural network that learns feature representations for short textual posts, using hashtags as a supervised signal. The proposed approach is trained on up to 5.5 billion words, predicting among 100,000 possible hashtags. In addition to strong performance on the hashtag prediction task itself, the learned text representation (ignoring the hashtag labels) proves useful for other tasks: on a document recommendation task, it also outperforms a number of baselines.
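The core idea, ranking hashtags by the inner product between a learned text embedding and learned hashtag embeddings, can be sketched in a few lines. This is a toy illustration only: the vocabulary, hashtag set, and embedding values below are made up, and a simple average of word vectors stands in for the paper's convolutional encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary and hashtag set, for illustration only.
vocab = {"soccer": 0, "goal": 1, "election": 2, "vote": 3}
hashtags = ["#sports", "#politics"]

d = 8  # embedding dimension (the real model uses a much larger one)
word_emb = rng.normal(size=(len(vocab), d))  # word lookup table
tag_emb = rng.normal(size=(len(hashtags), d))  # hashtag lookup table

def embed_text(words):
    """Average word embeddings; a stand-in for the paper's CNN encoder."""
    idx = [vocab[w] for w in words if w in vocab]
    return word_emb[idx].mean(axis=0)

def rank_hashtags(words):
    """Score every hashtag against the text by inner product, best first."""
    f = embed_text(words)
    scores = tag_emb @ f
    order = np.argsort(-scores)
    return [(hashtags[i], float(scores[i])) for i in order]

ranked = rank_hashtags(["soccer", "goal"])
```

With trained embeddings, the top-ranked hashtag serves as the prediction, and the intermediate text embedding `f` is the representation reused for downstream tasks such as document recommendation.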

Question Answering with Subgraph Embeddings by Antoine Bordes, Sumit Chopra and Jason Weston

This paper presents a system that learns to answer questions on a broad range of topics from a knowledge base, using few hand-crafted features. The model learns low-dimensional embeddings of words and knowledge base constituents; these representations are used to score natural language questions against candidate answers. Training the system on pairs of questions and structured representations of their answers, together with pairs of question paraphrases, yields competitive results on a recent benchmark from the literature.
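The scoring mechanism described above, embedding a question and a candidate answer's knowledge base subgraph into the same space and taking their inner product, can be sketched as follows. All names and values here are hypothetical placeholders; in the paper the embeddings are learned from question/answer pairs rather than drawn at random.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical question-word vocabulary and KB symbols, for illustration.
q_vocab = {"who": 0, "directed": 1, "avatar": 2}
kb_vocab = {"james_cameron": 0, "directed_by": 1, "avatar_film": 2,
            "titanic_film": 3, "steven_spielberg": 4}

d = 8
W_q = rng.normal(size=(len(q_vocab), d))    # question-word embeddings
W_kb = rng.normal(size=(len(kb_vocab), d))  # KB-constituent embeddings

def embed_question(words):
    """Bag-of-words embedding of the question."""
    return W_q[[q_vocab[w] for w in words if w in q_vocab]].sum(axis=0)

def embed_candidate(symbols):
    """A candidate answer is represented by the bag of KB symbols in its
    subgraph (answer entity, relation path, neighbouring entities)."""
    return W_kb[[kb_vocab[s] for s in symbols]].sum(axis=0)

def score(question_words, candidate_symbols):
    """Inner product between question and candidate-subgraph embeddings."""
    return float(embed_question(question_words) @ embed_candidate(candidate_symbols))

# At answer time, the highest-scoring candidate subgraph wins.
cand_a = ["james_cameron", "directed_by", "avatar_film"]
cand_b = ["steven_spielberg", "directed_by", "titanic_film"]
s_a = score(["who", "directed", "avatar"], cand_a)
s_b = score(["who", "directed", "avatar"], cand_b)
```

After training, the correct answer's subgraph is pushed to score higher than incorrect candidates; with the random weights above the scores are meaningful only as an interface illustration.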

Antoine Bordes and Jason Weston will also give a tutorial at the conference on "Embedding Methods for Natural Language Processing".
