Publication

Masked Language Modeling and the Distributional Hypothesis: Order Word Matters Pre-training for Little

Conference on Empirical Methods in Natural Language Processing (EMNLP)


Abstract

A possible explanation for the impressive performance of masked language model (MLM) pre-training is that such models have learned to represent the syntactic structures prevalent in classical NLP pipelines. In this paper, we propose a different explanation: MLMs succeed on downstream tasks mostly due to their ability to model higher-order word co-occurrence statistics. To demonstrate this, we pre-train MLMs on sentences with randomly shuffled word order, and show that these models still achieve high accuracy after fine-tuning on many downstream tasks – including tasks specifically designed to be challenging for models that ignore word order. Our models also perform surprisingly well according to some parametric syntactic probes, indicating possible deficiencies in how we test representations for syntactic information. Overall, our results show that purely distributional information largely explains the success of pre-training, and they underscore the importance of curating challenging evaluation datasets that require deeper linguistic knowledge.
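To make the setup concrete, the sketch below shows one way word-order shuffling could be applied to a pre-training corpus before standard MLM masking. It is a minimal illustration, not the paper's actual pipeline: the function names (`shuffle_sentence`, `prepare_corpus`) and the whitespace tokenization are assumptions, and the published experiments may shuffle at different granularities (e.g., subword or n-gram units).

```python
import random


def shuffle_sentence(sentence: str, rng: random.Random) -> str:
    """Randomly permute the word order within a single sentence.

    Whitespace tokenization is an illustrative simplification; a real
    pipeline might operate on subword tokens or n-gram units instead.
    """
    tokens = sentence.split()
    rng.shuffle(tokens)
    return " ".join(tokens)


def prepare_corpus(sentences: list[str], seed: int = 0) -> list[str]:
    """Shuffle every sentence in a corpus prior to MLM pre-training."""
    rng = random.Random(seed)
    return [shuffle_sentence(s, rng) for s in sentences]


if __name__ == "__main__":
    demo = ["the cat sat on the mat", "word order matters for syntax"]
    for original, shuffled in zip(demo, prepare_corpus(demo)):
        print(f"{original!r} -> {shuffled!r}")
```

The shuffled corpus would then be fed to an ordinary MLM pre-training recipe (random token masking and reconstruction), so that any downstream gains can only come from co-occurrence statistics rather than from natural word order.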
