I’m currently a research scientist in the Language and Translation Technologies group at Facebook AI. Recently, I have worked on unsupervised pre-training of universal representations and cross-lingual models for NLP. My current interests include question answering, language generation, and weakly supervised learning. Previously, I was an applied researcher on the language modeling team at Microsoft in Sunnyvale. During my PhD at UC Berkeley, I worked in information theory and networking, investigating the information-theoretic properties of heavy-tailed stochastic processes.
Natural language processing, language modeling, multilingual models, question answering, few-shot / zero-shot learning
EMNLP - November 16, 2020
Vladimir Karpukhin, Barlas Oğuz, Sewon Min, Patrick Lewis, Ledell Wu, Sergey Edunov, Danqi Chen, Wen-tau Yih
December 14, 2018
Ahmed Aly, Kushal Lakhotia, Shicong Zhao, Mrinal Mohit, Barlas Oğuz, Abhinav Arora, Sonal Gupta, Christopher Dewan, Stef Nelson-Lindall, Rushin Shah
RepL4NLP Workshop at ACL 2018 - July 13, 2018
Katherin Yu, Haoran Li, Barlas Oğuz