Edouard Grave

Research Scientist

I am a research scientist at Facebook Artificial Intelligence Research, working on machine learning and natural language processing. Before joining Facebook, I was a postdoctoral research scientist at Columbia University, working with Noémie Elhadad and Chris Wiggins, and at UC Berkeley, working with Laurent El Ghaoui. I obtained my PhD in computer science from Université Paris VI in 2014, under the supervision of Francis Bach and Guillaume Obozinski, and graduated from École Polytechnique with an MS in machine learning and computer vision in 2010.


Machine learning, natural language processing, and optimization

Latest Publications

ICLR - April 26, 2020

Depth-Adaptive Transformer

Maha Elbayad, Jiatao Gu, Edouard Grave, Michael Auli

ICLR - April 26, 2020

Reducing Transformer Depth on Demand with Structured Dropout

Angela Fan, Edouard Grave, Armand Joulin

EMNLP - November 3, 2019

Don’t Forget the Long Tail! A Comprehensive Analysis of Morphological Generalization in Bilingual Lexicon Induction

Paula Czarnowska, Sebastian Ruder, Edouard Grave, Ryan Cotterell, Ann Copestake

ACL - July 28, 2019

Training Hybrid Language Models by Marginalizing over Segmentations

Edouard Grave, Sainbayar Sukhbaatar, Piotr Bojanowski, Armand Joulin

ACL - July 28, 2019

How to Get Past Sesame Street: Sentence-Level Pretraining Beyond Language Modeling

Alex Wang, Jan Hula, Patrick Xia, Raghavendra Pappagari, R. Thomas McCoy, Roma Patel, Najoung Kim, Ian Tenney, Yinghui Huang, Katherin Yu, Shuning Jin, Berlin Chen, Benjamin Van Durme, Edouard Grave, Ellie Pavlick, Samuel R. Bowman

ACL - July 27, 2019

Adaptive Attention Span in Transformers

Sainbayar Sukhbaatar, Edouard Grave, Piotr Bojanowski, Armand Joulin

ICLR - May 17, 2019

Unsupervised Hyper-alignment for Multilingual Word Embeddings

Jean Alaux, Edouard Grave, Marco Cuturi, Armand Joulin

LREC - April 30, 2019

Learning Word Vectors for 157 Languages

Edouard Grave, Piotr Bojanowski, Prakhar Gupta, Armand Joulin, Tomas Mikolov

EMNLP - October 30, 2018

Loss in Translation: Learning Bilingual Word Mapping with a Retrieval Criterion

Armand Joulin, Piotr Bojanowski, Tomas Mikolov, Hervé Jégou, Edouard Grave

NAACL - June 1, 2018

Colorless Green Recurrent Networks Dream Hierarchically

Kristina Gulordava, Piotr Bojanowski, Edouard Grave, Tal Linzen, Marco Baroni

LREC - May 7, 2018

Advances in Pre-Training Distributed Word Representations

Tomas Mikolov, Edouard Grave, Piotr Bojanowski, Christian Puhrsch, Armand Joulin

AAAI - February 2, 2018

Efficient Large-Scale Multi-Modal Classification

Douwe Kiela, Edouard Grave, Armand Joulin, Tomas Mikolov

NIPS - December 4, 2017

Unbounded Cache Model for Online Language Modeling with Open Vocabulary

Edouard Grave, Moustapha Cisse, Armand Joulin

ICML - August 6, 2017

Efficient Softmax Approximation for GPUs

Edouard Grave, Armand Joulin, Moustapha Cisse, David Grangier, Hervé Jégou

TACL - July 31, 2017

Enriching Word Vectors with Subword Information

Piotr Bojanowski, Edouard Grave, Armand Joulin, Tomas Mikolov

ICLR - April 24, 2017

Improving Neural Language Models with a Continuous Cache

Edouard Grave, Armand Joulin, Nicolas Usunier

ICLR - April 24, 2017

Variable Computation in Recurrent Neural Networks

Yacine Jernite, Edouard Grave, Armand Joulin, Tomas Mikolov

EACL - April 3, 2017

Bag of Tricks for Efficient Text Classification

Armand Joulin, Edouard Grave, Piotr Bojanowski, Tomas Mikolov