The ANR project Mathematical Analysis of Pattern Learning and Extraction (MAPLE) aims to address one of the most pressing questions in modern machine learning: how neural networks learn features from data, and how this impacts their generalisation abilities. Despite their widespread success, the mathematical mechanisms that drive feature learning in deep neural networks remain poorly understood. This gap hinders our ability to design more efficient, scalable models, especially in an era where computational resources are becoming increasingly scarce. MAPLE seeks to establish a comprehensive mathematical framework that explains how neural networks adapt to data during training, with a particular focus on representation learning. The project addresses three core questions:
How do neural networks adapt to the structure of data?
How do architectural choices, such as convolutional networks or transformers, influence the representations learned?
How do the emergent capabilities of neural networks arise from feature learning?
@misc{defilippis2026optimalscalinglawslearning,
  title         = {Optimal scaling laws in learning hierarchical multi-index models},
  author        = {Defilippis, Leonardo and Krzakala, Florent and Loureiro, Bruno and Maillard, Antoine},
  year          = {2026},
  eprint        = {2602.05846},
  archiveprefix = {arXiv},
  primaryclass  = {stat.ML},
  url           = {https://arxiv.org/abs/2602.05846},
}
A Random Matrix Theory of Masked Self-Supervised Regression
Arie Wortsman Zurich, Federica Gerace, Bruno Loureiro, and Yue M. Lu
@misc{zurich2026randommatrixtheorymasked,
  title         = {A Random Matrix Theory of Masked Self-Supervised Regression},
  author        = {Zurich, Arie Wortsman and Gerace, Federica and Loureiro, Bruno and Lu, Yue M.},
  year          = {2026},
  eprint        = {2601.23208},
  archiveprefix = {arXiv},
  primaryclass  = {stat.ML},
  url           = {https://arxiv.org/abs/2601.23208},
}
ICLR
Scaling Laws and Spectra of Shallow Neural Networks in the Feature Learning Regime
Leonardo Defilippis, Yizhou Xu, Julius Girardin, Vittorio Erba, Emanuele Troiani, Lenka Zdeborová, Bruno Loureiro, and Florent Krzakala
In The Fourteenth International Conference on Learning Representations, 2026
@inproceedings{defilippis2025scalinglawsspectrashallow,
  title     = {Scaling Laws and Spectra of Shallow Neural Networks in the Feature Learning Regime},
  author    = {Defilippis, Leonardo and Xu, Yizhou and Girardin, Julius and Erba, Vittorio and Troiani, Emanuele and Zdeborov{\'a}, Lenka and Loureiro, Bruno and Krzakala, Florent},
  booktitle = {The Fourteenth International Conference on Learning Representations},
  year      = {2026},
  url       = {https://openreview.net/forum?id=Q3yLIIkt7z},
}
2025
Kernel ridge regression under power-law data: spectrum and generalization
@misc{wortsman2025kernelridge,
  title         = {Kernel ridge regression under power-law data: spectrum and generalization},
  author        = {Wortsman, Arie and Loureiro, Bruno},
  year          = {2025},
  eprint        = {2510.04780},
  archiveprefix = {arXiv},
  primaryclass  = {stat.ML},
  url           = {https://arxiv.org/abs/2510.04780},
}