
Learning Multilingual Joint Sentence Embeddings with Neural Machine Translation

ACL Workshop on Representation Learning for NLP


Abstract

In this paper, we use the framework of neural machine translation to learn joint sentence representations across six very different languages. Our aim is to obtain a representation that is independent of the language and therefore likely to capture the underlying semantics. We define a new cross-lingual similarity measure, compare up to 1.4M sentence representations, and study the characteristics of close sentences. We provide experimental evidence that sentences that are close in embedding space are indeed semantically highly related, but often have quite different structure and syntax. These relations also hold when comparing sentences in different languages.
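
As an illustration of the kind of cross-lingual comparison described in the abstract, the sketch below assumes fixed-size sentence embeddings that have already been produced by a shared encoder and retrieves nearest neighbors across languages with plain cosine similarity. The function names, the embedding dimensionality, the stand-in data, and the use of cosine similarity are illustrative assumptions, not the paper's specific architecture or its new similarity measure.

```python
# Minimal sketch (assumptions, not the paper's exact method): given fixed-size
# sentence embeddings from a shared multilingual encoder, find the closest
# sentences in another language by cosine similarity.
import numpy as np

def l2_normalize(x: np.ndarray) -> np.ndarray:
    """Normalize each row to unit length so dot products equal cosine similarity."""
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def cross_lingual_neighbors(src_emb: np.ndarray, tgt_emb: np.ndarray, k: int = 5) -> np.ndarray:
    """For each source-language embedding, return indices of the k most similar
    target-language embeddings (hypothetical helper for illustration)."""
    sims = l2_normalize(src_emb) @ l2_normalize(tgt_emb).T  # shape (n_src, n_tgt)
    return np.argsort(-sims, axis=1)[:, :k]

# Stand-in data: random 512-dimensional "embeddings" for two languages.
rng = np.random.default_rng(0)
en = rng.standard_normal((1000, 512)).astype(np.float32)
fr = rng.standard_normal((1000, 512)).astype(np.float32)
print(cross_lingual_neighbors(en, fr, k=3)[:2])  # top-3 neighbors for the first two sentences
```

With real embeddings, retrieving nearest neighbors in this way is what lets one check whether sentences that are close in the joint space are indeed paraphrases or translations of each other, as the paper investigates at the scale of up to 1.4M sentences.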

