Low-Resource NMT Awards

Facebook is pleased to announce the research award recipients for the Low-Resource Neural Machine Translation (NMT) call for proposals. This effort is expected to advance the field of NMT through research into novel models that perform strongly under low-resource training conditions, as well as into techniques for mining comparable corpora for low-resource language pairs.

Facebook selected the top five proposals: three focused on low-resource modeling and two on data mining approaches. The Principal Investigators are:

Trevor Cohn, University of Melbourne, Australia
Nearest neighbor search over vector space representations of massive corpora: An application to low-resource NMT

Victor O.K. Li, The University of Hong Kong, Hong Kong
Population-Based Meta-learning for Low-Resource Neural Machine Translation

David McAllester, Toyota Technological Institute at Chicago, USA
Phrase Based Unsupervised Machine Translation

Alexander Rush, Harvard University, USA
More Embeddings, Less Parameters: Unsupervised NMT by Learning to Reorder

William Wang, University of California, Santa Barbara, USA
Hierarchical Deep Reinforcement Learning for Semi-Supervised Low-Resource Comparable Corpora Mining
