# Teaching

**Courses**

### Fall 2024 - Statistical Learning II

Third-year undergraduate (L3) machine learning course for students of the Double Licence Intelligence Artificielle et Sciences des organisations (IASO) at the Université Paris-Dauphine. The course material can be found on the dedicated course webpage.

### Statistical Physics For Optimization and Learning

Between 2020 and 2022 I was a teaching assistant (TA) for this advanced course on statistical physics methods for learning, constraint satisfaction and inference problems, taught by Florent Krzakala and Lenka Zdeborová at EPFL. Lecture notes, exercises and corrections can be found on the course webpage.

**Summer schools and tutorials**

### Bangalore School on Statistical Physics XV

Advanced course on the statistical physics of learning that I taught at the Bangalore School on Statistical Physics. The topics covered are a selection from my lecture notes.

### Princeton Machine Learning Theory Summer School 2024

Course I taught at the Princeton ML Theory Summer School on "Statistical physics for machine learning". You might be interested in the lecture notes.

### Archimedes Workshop on the Foundations of Modern AI

Three-lecture tutorial I gave at the Archimedes Workshop on the Foundations of Modern AI in Athens on 3-4 July 2024. The slides can be found here.

### PSL Week “Statistical Physics and Machine Learning”

Week-long advanced course on selected topics from machine learning theory and their relationship to statistical physics, co-organised with Francis Bach and Giulio Biroli. You might be interested in the lecture notes I prepared for the SGD part of the week.

### Wonders of high-dimensions: the maths and physics of Machine Learning

A 3-hour tutorial I gave at ACDL 2023. You can find the slides I prepared here: Lecture 1, Lecture 2, Lecture 3.

### Statistical Physics view of theory of Machine Learning

A short 1h30 tutorial I gave at ACDL 2022. I prepared some high-level slides and covered one detailed classic calculation in the field, namely the deterministic limit of one-pass SGD dynamics for two-layer neural networks, also highlighting some recent developments from this work that appeared at NeurIPS 2022. You can find this in the lecture notes.