Jul 4, 2023 |
I will present our paper Lazy vs hasty: linearization in deep networks impacts learning schedule based on example difficulty at the Conférence sur l’Apprentissage Automatique (CAp) in Strasbourg.
|
May 2, 2023 |
I am now a postdoctoral researcher at Orange Labs in Vincent Lemaire’s group.
|
Apr 20, 2023 |
I successfully defended my PhD thesis: Deep networks training and generalization: insights from linearization (manuscript, slides) 🥳
|
Dec 21, 2022 |
Our paper Lazy vs hasty: linearization in deep networks impacts learning schedule based on example difficulty was accepted to TMLR. (code)
|
Sep 23, 2022 |
New pre-print Lazy vs hasty: linearization in deep networks impacts learning schedule based on example difficulty available.
|
Jul 22, 2022 |
I will present our recent work Lazy vs hasty: linearization in deep networks impacts learning schedule based on example difficulty (paper, poster) at the SCIS workshop at ICML 2022.
|
Jul 23, 2021 |
Presentation of our workshop paper Continual learning and Deep Networks: an Analysis of the Last Layer with Timothée Lesort at the Theory of Continual Learning workshop at ICML.
|
Jun 14, 2021 |
Presentation of Implicit Regularization via Neural Feature Alignment at the Conférence sur l’Apprentissage Automatique 2021.
|
Apr 21, 2021 |
I will present NNGeometry at the PyTorch Ecosystem Day.
|
Feb 19, 2021 |
I will give a talk titled Optimization and generalization through the lens of the linearization of neural networks training dynamics to Roger Grosse’s group at the Vector Institute in Toronto.
|