Learning by Random Features: Sharp Asymptotics and Universality Laws

Dec 8, 2021, 4:30 pm


  • Center for Statistics and Machine Learning
  • Electrical and Computer Engineering
Event Description

Speaker: Yue M. Lu, Harvard University
Title: Learning by Random Features: Sharp Asymptotics and Universality Laws
Day: December 8, 2021
Time: 4:30 pm
Zoom Link: Please Register HERE
Host: Yuxin Chen and Jason Lee

Many new random matrix ensembles arise in learning and modern signal processing. As shown in recent studies, the spectral properties of these matrices help answer crucial questions regarding the training and generalization performance of neural networks, and the fundamental limits of high-dimensional signal recovery. As a result, there has been growing interest in precisely understanding the spectra and other asymptotic properties of these matrices. Unlike their classical counterparts, these new random matrices are often highly structured and are the result of nonlinear transformations. This combination of structure and nonlinearity leads to substantial technical challenges when applying existing tools from random matrix theory to these new random matrix ensembles.
In this talk, we will consider learning by random feature models and the related problem of kernel ridge regression. In each case, a nonlinear random matrix plays a prominent role. We provide an exact characterization of the asymptotic training and generalization errors of these models. These results reveal the important roles played by the regularization, the loss function, and the activation function in the mitigation of the "double descent phenomenon" in learning. The asymptotic analysis is made possible by a general universality theorem, which establishes the asymptotic equivalence between the nonlinear random matrices and a surrogate linear random matrix ensemble that is much easier to work with.
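To make the setup concrete, the following is a minimal sketch (not taken from the talk) of the random feature model described in the abstract: inputs are passed through a fixed random weight matrix and a nonlinearity, and only the top-layer coefficients are fit by ridge regression. All hyperparameters (dimension `d`, feature count `N`, sample size `n`, ridge strength `lam`) are illustrative choices, not values from the speaker's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
d, N, n, n_test, lam = 20, 200, 100, 1000, 1e-2  # illustrative sizes

# Teacher model: noisy linear target in d dimensions.
w_star = rng.standard_normal(d) / np.sqrt(d)
X_train = rng.standard_normal((n, d))
X_test = rng.standard_normal((n_test, d))
y_train = X_train @ w_star + 0.1 * rng.standard_normal(n)
y_test = X_test @ w_star

# Random feature map: a fixed random weight matrix W followed by ReLU.
# The resulting feature matrix Phi is the nonlinear random matrix whose
# spectral properties the talk's universality results characterize.
W = rng.standard_normal((d, N)) / np.sqrt(d)
relu = lambda z: np.maximum(z, 0.0)
Phi_train = relu(X_train @ W)
Phi_test = relu(X_test @ W)

# Ridge regression on the features: only the top layer is trained.
a = np.linalg.solve(Phi_train.T @ Phi_train + lam * np.eye(N),
                    Phi_train.T @ y_train)

train_err = np.mean((Phi_train @ a - y_train) ** 2)
test_err = np.mean((Phi_test @ a - y_test) ** 2)
print(f"train MSE: {train_err:.4f}, test MSE: {test_err:.4f}")
```

Sweeping the ratio N/n in a sketch like this (with small `lam`) is the standard way to visualize the double descent curve the abstract refers to: the test error peaks near the interpolation threshold N ≈ n and can decrease again in the overparameterized regime.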

Yue M. Lu attended the University of Illinois at Urbana-Champaign, where he received the M.Sc. degree in mathematics and the Ph.D. degree in electrical engineering, both in 2007. After his postdoctoral training at the Audiovisual Communications Laboratory at Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland, he joined Harvard University, where he is currently Gordon McKay Professor of Electrical Engineering and of Applied Mathematics at the John A. Paulson School of Engineering and Applied Sciences. He is also fortunate to have held visiting appointments at Duke University in 2016 and at the École Normale Supérieure (ENS) in 2019. His research interests include theoretical and algorithmic aspects of high-dimensional signal and information processing.
This seminar is supported by CSML and ECE Korhammer Lecture Series Funds.