
Equi-normalization of Neural Networks

International Conference on Learning Representations (ICLR) - 2019


Abstract

Modern neural networks are over-parametrized. In particular, each rectified linear hidden unit can be rescaled by a multiplicative factor, by adjusting its input and output weights, without changing the function computed by the network. Inspired by the Sinkhorn-Knopp algorithm, we introduce a fast iterative method for minimizing the l2 norm of the weights, equivalently the weight-decay regularizer, which provably converges to a unique solution. Interleaving our algorithm with SGD during training improves the test accuracy. For small batches, our approach offers an alternative to batch- and group-normalization on CIFAR-10 and ImageNet with a ResNet-18.
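To make the rescaling invariance concrete, here is a minimal NumPy sketch of one balancing step for a single ReLU layer; it is not the paper's implementation, and the helper name enorm_step and the two-layer setup are illustrative. Multiplying the incoming weights of hidden unit i by d_i > 0 and dividing its outgoing weights by d_i leaves the network function unchanged (ReLU is positively homogeneous), and the closed-form d_i below minimizes that unit's contribution to the squared l2 norm. Equi-normalization cycles such steps over consecutive layers, analogously to the alternating row/column updates of Sinkhorn-Knopp.

```python
import numpy as np

def enorm_step(W1, W2, eps=1e-12):
    """One balancing pass for a ReLU layer between W1 (in) and W2 (out).

    Hidden unit i is rescaled by d_i: row i of W1 is multiplied by d_i
    and column i of W2 is divided by d_i. Since ReLU(d * a) = d * ReLU(a)
    for d > 0, the network function is unchanged. The closed-form
        d_i = sqrt(||W2[:, i]|| / ||W1[i, :]||)
    minimizes d_i^2 * ||W1[i, :]||^2 + ||W2[:, i]||^2 / d_i^2, i.e. the
    contribution of unit i to the squared l2 norm of the weights.
    """
    row = np.linalg.norm(W1, axis=1)   # incoming-weight norms, shape (h,)
    col = np.linalg.norm(W2, axis=0)   # outgoing-weight norms, shape (h,)
    d = np.sqrt((col + eps) / (row + eps))
    return W1 * d[:, None], W2 / d[None, :]

# Toy check: the forward pass is preserved while the l2 norm shrinks.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
W1, W2 = rng.normal(size=(8, 4)), rng.normal(size=(3, 8))
relu = lambda a: np.maximum(a, 0.0)

before = W2 @ relu(W1 @ x)
n_before = np.linalg.norm(W1) ** 2 + np.linalg.norm(W2) ** 2
W1b, W2b = enorm_step(W1, W2)
after = W2b @ relu(W1b @ x)
n_after = np.linalg.norm(W1b) ** 2 + np.linalg.norm(W2b) ** 2

assert np.allclose(before, after)        # same function
assert n_after <= n_before + 1e-9        # smaller weight-decay penalty
```

After one step each unit's incoming and outgoing norms are balanced at sqrt(||row|| * ||col||), which by the AM-GM inequality never increases the total squared l2 norm; iterating over all layers of a deep network gives the convergent fixed-point scheme described in the abstract.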

Related Publications


NeurIPS - December 5, 2021

Interpretable agent communication from scratch (with a generic visual processor emerging on the side)

Roberto Dessì, Eugene Kharitonov, Marco Baroni

Workshop on Online Abuse and Harms (WOAH) at ACL - November 30, 2021

Findings of the WOAH 5 Shared Task on Fine Grained Hateful Memes Detection

Lambert Mathias, Shaoliang Nie, Bertie Vidgen, Aida Davani, Zeerak Waseem, Douwe Kiela, Vinodkumar Prabhakaran

Journal of Big Data - November 6, 2021

A graphical method of cumulative differences between two subpopulations

Mark Tygert

NeurIPS - December 6, 2021

Parallel Bayesian Optimization of Multiple Noisy Objectives with Expected Hypervolume Improvement

Samuel Daulton, Maximilian Balandat, Eytan Bakshy
