Selected Invited Talks
  • "Optimization, Robustness and Attention in Deep Learning: Insights from Random and NTK Feature Models", Machine Learning for Theories and Theories of Machine Learning, Rovinj, October 2024.

  • —, Mathematics of Machine Learning, Cortona, September 2024.

  • —, ROccella Conference on INference and AI – ROCKIN’ AI, Roccella Jonica, September 2024.

  • —, The Mathematics of Machine Learning Workshop, ETH Zurich, June 2024.

  • "Two Vignettes on PDE Methods for Deep Learning", PDE Methods in Machine Learning: from Continuum Dynamics to Algorithms workshop, Banff International Research Station, Institute of Mathematics at the University of Granada (BIRS-IMAG), Granada, June 2024.

  • "Phase Transitions for Spectral Estimators in Generalized Linear Models via Approximate Message Passing", Italian Meeting on Probability and Mathematical Statistics, Rome, June 2024.

  • "Optimization, Robustness and Attention in Deep Neural Networks: Insights from Random and NTK Feature Models", ISL Colloquium, Stanford, April 2024.

  • "From Spectral Estimators to Approximate Message Passing... And Back", Algorithmic Structures for Uncoordinated Communications and Statistical Inference in Exceedingly Large Spaces workshop, Banff International Research Station, March 2024.

  • "Phase Transitions for Spectral Estimators in Generalized Linear Models via Approximate Message Passing", International Zurich Seminar on Information and Communication (IZS), Zurich, March 2024.

  • "Fundamental Limits of Autoencoders, and Achieving Them with Gradient Methods", Learning and Information Theory (LITH) Workshop, EPFL, March 2024.

  • "Towards Understanding the Word Sensitivity of Attention Layers: A Study via Random Features", Information Theory and Applications (ITA) Workshop, UCSD, San Diego, February 2024.

  • "From Spectral Estimators to Approximate Message Passing... And Back", The Mathematics of Data, National University of Singapore (NUS), January 2024.

  • "Optimization, Robustness and Privacy in Deep Neural Networks: Insights from the Neural Tangent Kernel", Joint TILOS and OPTML++ seminar, MIT, November 2023 (Online).

  • "From Spectral Estimators to Approximate Message Passing... And Back", Probabilitas seminar, Harvard, November 2023 (Online).

  • "Three Vignettes on the Mean-Field Analysis of Neural Networks", Workshop on Analytical Approaches for Neural Network Dynamics, Institut Henri Poincaré (IHP), Paris, October 2023.

  • "Optimization, Robustness and Privacy in Deep Neural Networks: Insights from the Neural Tangent Kernel", Mathematical Information Science Workshop, Lagrange Mathematics and Computing Research Center (LMCRC), Paris, October 2023.

  • "Mean-field Analysis of Piecewise Linear Solutions for Wide ReLU Networks", International Congress on Industrial and Applied Mathematics (ICIAM), Waseda University, August 2023.

  • "Precise Asymptotics for Spectral Methods in Generalized Linear Models with Correlated Gaussian Designs", Joint Statistical Meetings (JSM), Toronto, August 2023.

  • "Inference in High Dimensions", European School of Information Theory (ESIT), University of Bristol, July 2023.

  • "From Spectral Estimators to Approximate Message Passing... And Back", Workshop on Learning and Inference from Structured Data: Universality, Correlations and Beyond, International Centre for Theoretical Physics (ICTP), July 2023.

  • "Three Vignettes on the Mean-Field Analysis of Neural Networks", Workshop on Optimal Transport, Mean-Field Models, and Machine Learning, Institute for Advanced Study (IAS), Technical University of Munich (TUM), April 2023.

  • "Fundamental Limits of Two-layer Autoencoders, and Achieving Them with Gradient Methods", Information Theory and Applications (ITA) Workshop, UCSD, San Diego, February 2023.

  • "Inference in High Dimensions for (Mixed) Generalized Linear Models: the Linear, the Spectral and the Approximate", Information Theory and Data Science Workshop, National University of Singapore (NUS), January 2023.

  • "Understanding Gradient Descent for Over-parameterized Deep Neural Networks: Insights from Mean-field Theory and the Neural Tangent Kernel", Colloquium of the Department of Mathematics, Technical University of Munich (TUM), November 2022.

  • "Inference in High Dimensions for (Mixed) Generalized Linear Models: the Linear, the Spectral and the Approximate", Stochastics and Statistics Seminar, MIT, November 2022.

  • "Gradient Descent for Deep Neural Networks: New Perspectives from Mean-field and NTK", University of Pennsylvania, November 2022.

  • "Inference in High Dimensions for Generalized Linear Models: the Linear, the Spectral and the Approximate", Canadian Workshop on Information Theory (CWIT), Ottawa, June 2022.

  • "Mean-field Analysis of Piecewise Linear Solutions for Wide ReLU Networks", Information Theory and Applications (ITA) Workshop, UCSD, San Diego, May 2022.

  • "Understanding Gradient Descent for Over-parameterized Deep Neural Networks", EPFL, April 2022.

  • —, ASU LIONS Seminar Series, Arizona State University, March 2022 (Online).

  • "Gradient Descent for Deep Neural Networks: New Perspectives from Mean-field and NTK", Math ML Seminar (MPI MiS + UCLA), MPI Leipzig, March 2022.

  • "Landscape Connectivity in Deep Neural Networks: Mean-field and Beyond", Loss Landscape of Neural Networks: Theoretical Insights and Practical Implications workshop, EPFL, February 2022 (Online).

  • "Tight Bounds on the Smallest Eigenvalue of the Neural Tangent Kernel for Deep Neural Networks", DeepMind, December 2021 (Online).

  • "Analysis of a Two-Layer Neural Network via Displacement Convexity", Geometric Methods in Optimization and Sampling, Working Group: Mean Field NN, Simons Institute for the Theory of Computing, Berkeley, October 2021 (Online).

  • —, Theory of Neural Nets Seminar, EPFL, June 2021 (Online).

  • "Inference in High Dimensions for Generalized Linear Models: the Linear, the Spectral and the Approximate", ISOR Colloquium, University of Vienna, May 2021 (Online).

  • "Mode Connectivity and Convergence of Gradient Descent for (Not So) Over-parameterized Deep Neural Networks", International School for Advanced Studies (SISSA), March 2021 (Online).

  • "Understanding Gradient Descent for Over-parameterized Deep Neural Networks", Mathematics of Data Seminar, Max Planck Institute for Mathematics in the Sciences (MPI MiS), Leipzig, August 2020 (Online).

  • —, Youth in High Dimensions, International Centre for Theoretical Physics (ICTP), July 2020 (Online).

  • —, Technical University of Munich (TUM), June 2020 (Online).

  • "Landscape Connectivity and Dropout Stability of SGD Solutions for Over-parameterized Neural Networks", Information Theory and Applications (ITA) Workshop, UCSD, San Diego, February 2020.

  • "Analysis of a Two-Layer Neural Network via Displacement Convexity", Foundations of Data Science Reunion, Simons Institute for the Theory of Computing, Berkeley, December 2019.

  • —, Deep Learning Seminar, University of Vienna, October 2019.