Mathematics for AI 1 (2024) (Mathématiques pour l'Intelligence Artificielle 1)
Kernel and operator-theoretic methods in machine learning (2024)
- Lecture Notes (Work in progress; many typos and errors likely remain!)
- List of papers to be presented by students:
- Concentration Inequalities and Moment Bounds for Sample Covariance Operators.
Vladimir Koltchinskii, Karim Lounici
https://arxiv.org/abs/1405.2468
- Optimally tackling covariate shift in RKHS-based nonparametric regression.
Cong Ma, Reese Pathak, Martin J. Wainwright
https://arxiv.org/abs/2205.02986
- Statistical Learning Theory for Neural Operators.
Niklas Reinhardt, Sven Wang, Jakob Zech
https://arxiv.org/abs/2412.17582
- Physics-informed machine learning as a kernel method.
Nathan Doumèche, Francis Bach, Claire Boyer, Gérard Biau
https://arxiv.org/abs/2304.13202
- Optimal Convergence Rates for Neural Operators.
Mike Nguyen, Nicole Mücke
https://arxiv.org/abs/2412.17518
- Nyström Kernel Mean Embeddings.
Antoine Chatalic, Nicolas Schreuder, Lorenzo Rosasco, Alessandro Rudi
https://proceedings.mlr.press/v162/chatalic22a.html
and
Mean Nyström Embeddings for Adaptive Compressive Learning.
Antoine Chatalic, Luigi Carratino, Ernesto De Vito, Lorenzo Rosasco
https://proceedings.mlr.press/v151/chatalic22a.html
- A kernel-based analysis of Laplacian Eigenmaps.
Martin Wahl
https://arxiv.org/abs/2402.16481