Publication

AIPNet: Generative Adversarial Pre-Training of Accent-Invariant Network for End-to-End Speech Recognition

International Conference on Acoustics, Speech, and Signal Processing (ICASSP)


Abstract

As one of the major sources of speech variability, accents pose a significant challenge to the robustness of speech recognition systems. In this paper, our goal is to build a unified end-to-end speech recognition system that generalizes well across accents. To this end, we propose AIPNet (Accent-Invariant Pre-training Network), a novel pre-training framework based on generative adversarial networks (GANs) for accent-invariant representation learning. We pre-train AIPNet to disentangle accent-invariant and accent-specific characteristics from acoustic features through adversarial training on accented data for which transcriptions are not necessarily available. We then fine-tune AIPNet by connecting the accent-invariant module to an attention-based encoder-decoder model for multi-accent speech recognition. In the experiments, our approach is compared against four baselines, including both accent-dependent and accent-independent models. Experimental results on 9 English accents show that the proposed approach outperforms all baselines, with a 2.3∼4.5% relative reduction in average WER when transcriptions are available in all accents and a 1.6∼6.1% relative reduction when transcriptions are available only in the US accent.
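
The abstract describes adversarial pre-training that pushes accent information out of a shared representation, followed by fine-tuning with an attention-based encoder-decoder. Below is a minimal PyTorch-style sketch of the pre-training idea only; it uses a gradient-reversal layer as a common stand-in for GAN-style adversarial training, and all module names, dimensions, and losses are illustrative assumptions rather than the authors' implementation.

```python
# Illustrative sketch of adversarial pre-training for accent-invariant
# representations, in the spirit of AIPNet. Module names, sizes, and the
# gradient-reversal formulation are assumptions; the paper's actual GAN
# architecture and losses may differ.
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses (and scales) gradients on backward."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class AccentInvariantPretrainer(nn.Module):
    def __init__(self, feat_dim=80, hidden=512, num_accents=9):
        super().__init__()
        # Encoder whose output should carry no accent cues.
        self.invariant_enc = nn.LSTM(feat_dim, hidden, num_layers=2,
                                     batch_first=True, bidirectional=True)
        # Accent classifier acting as the adversary on the invariant branch.
        self.accent_clf = nn.Linear(2 * hidden, num_accents)

    def forward(self, feats, lambd=1.0):
        # feats: (batch, time, feat_dim) acoustic features, e.g. filterbanks.
        inv, _ = self.invariant_enc(feats)
        # Gradient reversal trains the encoder to fool the accent classifier,
        # pushing accent information out of the invariant representation.
        pooled = GradReverse.apply(inv.mean(dim=1), lambd)
        return inv, self.accent_clf(pooled)


# One pre-training step: the accent classification loss needs only accent
# labels, so untranscribed accented speech can be used at this stage.
model = AccentInvariantPretrainer()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
feats = torch.randn(4, 200, 80)          # dummy batch of acoustic features
accent_ids = torch.randint(0, 9, (4,))   # dummy accent labels
inv_repr, accent_logits = model(feats)
loss = nn.functional.cross_entropy(accent_logits, accent_ids)
opt.zero_grad()
loss.backward()
opt.step()
```

In the fine-tuning stage described in the abstract, the accent-invariant encoder output would feed an attention-based encoder-decoder ASR model trained on transcribed data; that stage is omitted from this sketch.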

Related Publications


ICLR - May 4, 2021

Combining Label Propagation and Simple Models Out-performs Graph Neural Networks

Qian Huang, Horace He, Abhay Singh, Ser-Nam Lim, Austin Benson

ICLR - May 3, 2021

Creative Sketch Generation

Songwei Ge, Vedanuj Goswami, Larry Zitnick, Devi Parikh

EMNLP - November 15, 2020

Intrinsic Probing through Dimension Selection

Lucas Torroba Hennigen, Adina Williams, Ryan Cotterell

ICLR - May 3, 2021

Learning advanced mathematical computations from examples

François Charton, Amaury Hayat, Guillaume Lample
