About
I am a third-year math PhD student at UCLA advised by Guido Montúfar. I’m interested in deep learning theory, particularly in determining when and why neural networks optimize and generalize well. My recent work has focused on characterizing benign overfitting under relaxed data assumptions and on understanding the loss landscape and initialization of neural networks. I am also interested in graph neural networks (GNNs); some of my past work used spectral properties of the underlying graph to alleviate oversquashing and oversmoothing in GNNs. Before coming to UCLA, I completed my undergraduate degree in mathematics at Penn State. My résumé is available here.
News
10/2024: Two of our papers, Benign overfitting in leaky ReLU networks with moderate input dimension and Bounds for the smallest eigenvalue of the NTK for arbitrary spherical data of arbitrary dimension, were accepted to NeurIPS 2024!
06/2024: This summer I will be a quantitative research intern at SIG.
06/2024: Our paper Mildly Overparameterized ReLU Networks Have a Favorable Loss Landscape was accepted to TMLR.
01/2024: I attended the Advanced Studies Institute in Mathematics of Data Science and Machine Learning in Uzbekistan.
06/2023: I will be spending the summer as a visiting researcher at the Max Planck Institute for Mathematics in the Sciences in Leipzig, Germany.
01/2023: Our paper FoSR: First-order spectral rewiring for addressing oversquashing in GNNs was accepted to ICLR 2023.
08/2022: Our paper Oversquashing in GNNs through the lens of information contraction and graph expansion was accepted to Allerton 2022.
09/2021: I passed my qualifying exams in Analysis and Algebra, as well as the Basic Exam.