
The ANR project Mathematical Analysis of Pattern Learning and Extraction (MAPLE) aims to address one of the most pressing questions in modern machine learning: how neural networks learn features from data, and how this impacts their generalisation abilities. Despite their widespread success, the mathematical mechanisms that drive feature learning in deep neural networks remain poorly understood. This gap hinders our ability to design more efficient, scalable models, especially in an era where computational resources are becoming increasingly scarce. MAPLE seeks to establish a comprehensive mathematical framework that explains how neural networks adapt to data during training, with a particular focus on representation learning. The project addresses three core questions:

  1. How do neural networks adapt to the structure of data?
  2. How do architectural choices, such as convolutional networks or transformers, influence the representations learned?
  3. How do the emergent capabilities of neural networks arise from feature learning?

Principal Investigator

Bruno Loureiro
Principal Investigator

Project members

Arie Wortsman
PhD student
Pierre Mergny
Postdoc

Project publications

2026

  1. Optimal scaling laws in learning hierarchical multi-index models
    Leonardo Defilippis, Florent Krzakala, Bruno Loureiro, and 1 more author
    2026
  2. A Random Matrix Theory of Masked Self-Supervised Regression
    Arie Wortsman, Federica Gerace, Bruno Loureiro, and 1 more author
    2026
  3. Scaling Laws and Spectra of Shallow Neural Networks in the Feature Learning Regime
    Leonardo Defilippis, Yizhou Xu, Julius Girardin, and 5 more authors
    In The Fourteenth International Conference on Learning Representations (ICLR), 2026

2025

  1. Kernel ridge regression under power-law data: spectrum and generalization
    Arie Wortsman and Bruno Loureiro
    2025

Funding

This project is funded by the ANR (Agence Nationale de la Recherche).