Math & Stats Colloquium Series - Feb. 13, 2026 (Dr. Mufan Li)
Date and Time
February 13, 2026
Location
SSC 3317
Details
Speaker: Dr. Mufan Li
Affiliation: University of Waterloo
Title: The Proportional Scaling Limit of Neural Networks
Abstract: Recent advances in deep learning have relied on scaling up the number of parameters in neural networks, which makes asymptotic scaling limits a compelling approach to theoretical analysis. In this talk, we explore the proportional infinite-depth-and-width limit, where the role of depth can be adequately studied and the limit remains a faithful model of finite-size networks. At initialization, we characterize the limiting distribution of the network via a stochastic differential equation (SDE) for the feature covariance matrix. Furthermore, in the linear network setting, we characterize the spectrum of the covariance matrix in the large-data limit via a geometric variant of Dyson Brownian motion. Finally, we will briefly discuss ongoing work toward analyzing training dynamics.
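For orientation, the proportional regime mentioned in the abstract can be written schematically as follows. This is a sketch in assumed notation (width n, depth d, depth-to-width ratio T, feature covariance Sigma), not the speaker's exact formulation; the drift and diffusion coefficients are left abstract.

% Proportional limit: depth and width grow together at a fixed ratio.
\[
  n \to \infty, \qquad d \to \infty, \qquad \frac{d}{n} \to T \in (0,\infty).
\]
% Indexing layers by rescaled time t = (layer index)/n in [0, T], the feature
% covariance matrix is tracked across depth and, in the limit, is described by
% an Ito SDE of the generic form
\[
  \mathrm{d}\Sigma_t = b(\Sigma_t)\,\mathrm{d}t + \sigma(\Sigma_t)\,\mathrm{d}B_t ,
\]
% where b and sigma are drift and diffusion coefficients determined by the
% architecture and activation, and B_t is a matrix-valued Brownian motion.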