Barlas Oğuz

Research Scientist

I’m currently a research scientist in the Language and Translation Technologies group at Facebook AI. Recently, I worked on unsupervised pre-training of universal representations and cross-lingual models for NLP. My current interests include question answering, language generation, and weakly supervised learning. Previously, I was an applied researcher on the language modeling team at Microsoft in Sunnyvale. During my PhD at UC Berkeley, I worked on information theory and networking, investigating the information-theoretic properties of heavy-tailed stochastic processes.


Research Interests

Natural language processing, language modeling, multilingual models, question answering, few-shot / zero-shot learning

Latest Publications

ICLR - May 3, 2021

Answering Complex Open-Domain Questions with Multi-Hop Dense Retrieval

Wenhan Xiong, Xiang Lorraine Li, Srinivasan Iyer, Jingfei Du, Patrick Lewis, William Wang, Yashar Mehdad, Wen-tau Yih, Sebastian Riedel, Douwe Kiela, Barlas Oğuz

EMNLP - November 16, 2020

Dense Passage Retrieval for Open-Domain Question Answering

Vladimir Karpukhin, Barlas Oğuz, Sewon Min, Patrick Lewis, Ledell Wu, Sergey Edunov, Danqi Chen, Wen-tau Yih

December 14, 2018

PyText: A seamless path from NLP research to production

Ahmed Aly, Kushal Lakhotia, Shicong Zhao, Mrinal Mohit, Barlas Oğuz, Abhinav Arora, Sonal Gupta, Christopher Dewan, Stef Nelson-Lindall, Rushin Shah

RepL4NLP Workshop at ACL 2018 - July 13, 2018

Multilingual seq2seq training with similarity loss for cross-lingual document classification

Katherin Yu, Haoran Li, Barlas Oğuz