Gradient Descent Learns One-hidden-layer CNN: Don’t be Afraid of Spurious Local Minima

International Conference on Machine Learning (ICML)


We consider the problem of learning a one-hidden-layer neural network with a non-overlapping convolutional layer and ReLU activation, i.e., f(Z; w, a) = Σ_j a_j σ(w^T Z_j), in which both the convolutional weights w and the output weights a are parameters to be learned. We prove that with Gaussian input Z, there is a spurious local minimum that is not a global minimum. Surprisingly, in the presence of this spurious local minimum, gradient descent with weight normalization starting from randomly initialized weights can still be proven to recover the true parameters with constant probability (which can be boosted to arbitrarily high probability with multiple restarts). We also show that, with constant probability, the same procedure can instead converge to the spurious local minimum, showing that this local minimum plays a non-trivial role in the dynamics of gradient descent. Furthermore, a quantitative analysis shows that the gradient descent dynamics have two phases: convergence starts off slowly but becomes much faster after several iterations.
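
The setting in the abstract can be sketched in a few lines of code. The following is a minimal illustrative sketch, not the authors' implementation: it builds the non-overlapping one-hidden-layer CNN f(Z; w, a) = Σ_j a_j σ(w^T Z_j), fixes a teacher (w*, a*), and runs gradient steps with weight normalization on fresh Gaussian samples as a stand-in for the population gradient. All dimensions, step sizes, and iteration counts are arbitrary assumptions.

```python
# Illustrative sketch only (not the paper's code). Dimensions, step size,
# and iteration count are assumptions chosen for readability.
import numpy as np

rng = np.random.default_rng(0)

k, p = 4, 8            # k non-overlapping patches Z_j, each of dimension p
n_steps, lr = 2000, 0.05

def relu(x):
    return np.maximum(x, 0.0)

def f(Z, w, a):
    # Z has shape (k, p): row j is the non-overlapping patch Z_j.
    return a @ relu(Z @ w)

# Fixed teacher parameters (the "true" w and a to be recovered).
w_star = rng.normal(size=p)
w_star /= np.linalg.norm(w_star)
a_star = rng.normal(size=k)

# Randomly initialized student; v stores the unnormalized direction of w.
v = rng.normal(size=p)
a = rng.normal(size=k)

for _ in range(n_steps):
    Z = rng.normal(size=(k, p))        # fresh Gaussian input each step
    w = v / np.linalg.norm(v)          # weight normalization: w = v / ||v||

    err = f(Z, w, a) - f(Z, w_star, a_star)   # residual of the squared loss 0.5*err^2

    h = relu(Z @ w)
    grad_a = err * h                               # d loss / d a
    grad_w = err * (Z.T @ (a * (Z @ w > 0)))       # d loss / d w
    nv = np.linalg.norm(v)
    grad_v = (grad_w - w * (w @ grad_w)) / nv      # chain rule through w = v/||v||

    a -= lr * grad_a
    v -= lr * grad_v

# Compare student and teacher on a fresh Gaussian input.
Z_test = rng.normal(size=(k, p))
w = v / np.linalg.norm(v)
print("squared test error:", (f(Z_test, w, a) - f(Z_test, w_star, a_star)) ** 2)
```

Depending on the random seed, such a run may end up near the true parameters or near the spurious local minimum, mirroring the two outcomes the paper analyzes.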

Related Publications


Federated Learning for User Privacy and Data Confidentiality Workshop at ICML - July 24, 2021

Federated Learning with Buffered Asynchronous Aggregation

John Nguyen, Kshitiz Malik, Hongyuan Zhan, Ashkan Yousefpour, Michael Rabbat, Mani Malek, Dzmitry Huba

UAI - July 28, 2021

A Nonmyopic Approach to Cost-Constrained Bayesian Optimization

Eric Hans Lee, David Eriksson, Valerio Perrone, Matthias Seeger

ACM MM - October 20, 2021

EVRNet: Efficient Video Restoration on Edge Devices

Sachin Mehta, Amit Kumar, Fitsum Reda, Varun Nasery, Vikram Mulukutla, Rakesh Ranjan, Vikas Chandra

ICCV - October 11, 2021

Egocentric Pose Estimation from Human Vision Span

Hao Jiang, Vamsi Krishna Ithapu
