Facebook AI Research Sequence-to-Sequence Toolkit

The FAIR Sequence-to-Sequence toolkit implements a fully convolutional model for text generation.


StarSpace is a general-purpose neural model for efficiently learning entity embeddings that can be applied to a wide variety of problems.

Target Propagation for Recurrent Neural Networks (TPRNN)

This code allows you to reproduce our results on two language modeling datasets, the Penn Treebank (character- and word-level) and WikiText, using various training methods.


SentEval is a library for evaluating the quality of sentence embeddings.


InferSent is a sentence embedding method that provides semantic sentence representations.


This is a PyTorch implementation of the DrQA system.


ParlAI is a unified platform, implemented in Python, for training and evaluating AI models on a variety of openly available dialog datasets using open-source learning agents.


The CommAI project aims to develop new datasets and algorithms for building and evaluating general-purpose artificial agents that rely on a linguistic interface and can quickly adapt to a never-ending stream of tasks.


Self-contained software accompanying the paper “Learning Longer Memory in Recurrent Neural Networks”.


Code to reproduce the results described in the paper “Sequence Level Training with RNNs” (ICLR 2016).


Torch is a scientific computing framework with wide support for machine learning algorithms. It is easy to use and efficient, thanks to an easy and fast scripting language, LuaJIT, and an underlying C/CUDA implementation.

Stack RNN

Stack RNN is a project containing the code from the paper “Inferring Algorithmic Patterns with Stack-Augmented Recurrent Nets” by Armand Joulin and Tomas Mikolov.
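The paper's key idea is a differentiable stack: instead of a hard push or pop, the network emits probabilities over the actions, and every stack cell becomes a probability-weighted mix of the three outcomes. A minimal stdlib-only sketch of that update (illustrative only, not the released code; the list-of-floats stack representation and function name are assumptions):

```python
def soft_stack_update(stack, action, new_val):
    """One step of a continuous ("soft") stack.

    stack:   fixed-size list of floats, top of stack at index 0.
    action:  (p_push, p_pop, p_noop), probabilities summing to 1.
    new_val: value the controller would push.
    """
    p_push, p_pop, p_noop = action
    pushed = [new_val] + stack[:-1]   # push shifts all cells down one slot
    popped = stack[1:] + [0.0]        # pop shifts all cells up one slot
    # Each cell is the action-probability mix of the three hard outcomes.
    return [p_push * pushed[i] + p_pop * popped[i] + p_noop * stack[i]
            for i in range(len(stack))]

# With one-hot actions the soft stack degenerates to an ordinary stack:
s = soft_stack_update([0.0, 0.0, 0.0], (1.0, 0.0, 0.0), 5.0)  # hard push
print(s)  # -> [5.0, 0.0, 0.0]
s = soft_stack_update(s, (1.0, 0.0, 0.0), 3.0)                # push again
s = soft_stack_update(s, (0.0, 1.0, 0.0), 0.0)                # hard pop
print(s)  # -> [5.0, 0.0, 0.0]
```

Because the update is a smooth function of the action probabilities, gradients flow through the stack and the push/pop policy can be trained end to end with the RNN.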


FastText is a library for text representation and classification.
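FastText's classification recipe, at its core, averages the vectors of a text's words (and character n-grams, omitted here) and feeds the result to a linear classifier. The sketch below illustrates that idea with the standard library only; the toy word vectors, classifier weights, and function names are all hand-set for illustration and do not reflect the library's actual API:

```python
# Toy 2-d word vectors (in fastText these are learned, not hand-set).
word_vecs = {
    "good": [1.0, 0.0],
    "great": [0.9, 0.1],
    "bad": [0.0, 1.0],
    "awful": [0.1, 0.9],
}

def sentence_vector(tokens):
    # Average the vectors of known words; unknown words are skipped here
    # (fastText would back off to character n-gram vectors instead).
    vecs = [word_vecs[t] for t in tokens if t in word_vecs]
    dim = len(next(iter(word_vecs.values())))
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def classify(tokens, weights=(1.0, -1.0)):
    # Linear score over the averaged representation; sign picks the label.
    v = sentence_vector(tokens)
    score = sum(w * x for w, x in zip(weights, v))
    return "pos" if score > 0 else "neg"

print(classify("good great".split()))  # -> pos
print(classify("bad awful".split()))   # -> neg
```

Keeping the model this simple (an average plus a linear layer) is what makes fastText fast to train while remaining competitive on many text classification tasks.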


The bAbI project is organized toward the goal of automatic text understanding and reasoning.