Barlas Oğuz

Research Scientist

I’m currently a research scientist in the Language and Translation Technologies group at Facebook AI. Recently, I worked on unsupervised pre-training of universal representations and on cross-lingual models for NLP. My current interests include question answering, language generation, and weakly supervised learning. Previously, I was an applied researcher on the language modeling team at Microsoft in Sunnyvale. During my PhD at UC Berkeley, I worked on information theory and networking, investigating the information-theoretic properties of heavy-tailed stochastic processes.

Interests

Natural language processing, language modeling, multilingual models, question answering, few-shot / zero-shot learning

Latest Publications

PyText: A seamless path from NLP research to production

Ahmed Aly, Kushal Lakhotia, Shicong Zhao, Mrinal Mohit, Barlas Oğuz, Abhinav Arora, Sonal Gupta, Christopher Dewan, Stef Nelson-Lindall, Rushin Shah

December 14, 2018

Multilingual seq2seq training with similarity loss for cross-lingual document classification

Katherin Yu, Haoran Li, Barlas Oğuz

RepL4NLP Workshop at ACL 2018 — July 13, 2018