Random Matrix Theory for Sparse Graphs and Statistical Learning

Jiaoyang Huang – New York University


The success of random matrices in modeling lies in the universality phenomenon of their eigenvalue and eigenvector statistics. This phenomenon is well understood for Wigner matrices, which are dense and have independent entries. In applications to large real-world networks, it is important to study adjacency matrices of sparse random graphs with a prescribed degree for each node. In the first part of the talk, I will discuss universality results for eigenvalue statistics of sparse random matrices with dependent entries. These results can be used to characterize the geometric properties of large networks.
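As a rough numerical illustration (not the construction used in the papers), one can build a random regular multigraph by symmetrizing a few independent random permutation matrices and inspect its spectrum; the adjacency matrix is sparse, its entries are dependent, and the trivial top eigenvalue equals the degree while the bulk stays well inside the spectral band:

```python
import numpy as np

rng = np.random.default_rng(0)
n, half_d = 400, 3                 # d = 2 * half_d = 6 regular (multigraph model)
A = np.zeros((n, n))
for _ in range(half_d):
    perm = rng.permutation(n)
    P = np.eye(n)[perm]            # random permutation matrix
    A += P + P.T                   # symmetrize: each pass adds degree 2
d = 2 * half_d

eigs = np.linalg.eigvalsh(A)       # real spectrum of the symmetric adjacency matrix
# Every row sums to d, so the all-ones vector gives the trivial eigenvalue d;
# the remaining eigenvalues carry the nontrivial (universal) statistics.
print(eigs[-1])                    # = d up to rounding
```

This permutation model only approximates a uniform random regular graph (it allows loops and multi-edges), but it already exhibits the separation between the trivial eigenvalue at d and the bulk of the spectrum.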

Many statistical learning models, for example, spiked models and deep neural networks, are built out of random matrices. In the second part of the talk, I will discuss the spiked tensor model, where the goal is to extract a signal from a noisy high-dimensional data tensor. The tensor unfolding algorithm reshapes the tensor into a spiked random matrix whose number of columns grows polynomially in the number of rows. By analyzing the spectrum of this long random matrix, we obtain an exact threshold for the tensor unfolding algorithm, which is independent of the unfolding procedure. I will also briefly mention my work on analyzing the training dynamics of deep neural networks using tools from random matrix theory.
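A minimal sketch of the unfolding idea, with illustrative parameters chosen well above the recovery threshold: plant a rank-one signal v⊗v⊗v in an order-3 Gaussian noise tensor, reshape it into an n × n² matrix, and estimate v by the top left singular vector:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
v = rng.standard_normal(n)
v /= np.linalg.norm(v)                    # planted unit signal vector
lam = 100.0                               # signal strength, chosen well above threshold
signal = lam * np.einsum('i,j,k->ijk', v, v, v)
noise = rng.standard_normal((n, n, n))    # iid Gaussian noise tensor
T = signal + noise

M = T.reshape(n, n * n)                   # unfold: an n x n^2 "long" random matrix
U, s, Vt = np.linalg.svd(M, full_matrices=False)
u = U[:, 0]                               # top left singular vector estimates v
print(abs(u @ v))                         # overlap with the planted signal, close to 1
```

In this spiked long-matrix form, whether the overlap |⟨u, v⟩| is macroscopic or vanishing is governed by a sharp spectral transition in the signal strength, which is the threshold analyzed in the paper with Ben Arous and Huang.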

There are four relevant papers:

  1. Bulk eigenvalue statistics for random regular graphs, with Roland Bauerschmidt, Antti Knowles and Horng-Tzer Yau. https://arxiv.org/abs/1505.06700
  2. Edge rigidity and universality of random regular graphs of intermediate degree, with Roland Bauerschmidt, Antti Knowles and Horng-Tzer Yau. https://arxiv.org/abs/1910.10121
  3. Long random matrices and tensor unfolding, with Gérard Ben Arous and Daniel Zhengyu Huang. https://arxiv.org/abs/2110.10210
  4. Dynamics of deep neural networks and neural tangent hierarchy, with Horng-Tzer Yau. https://arxiv.org/abs/1909.08156