Séminaire Probabilités et Statistiques
Robust estimation and regression with MMD
Date : 07 Dec. 2023
Speaker : Pierre Alquier
Institution : ESSEC Asia-Pacific
Time : 15:45 - 16:45
Location : 3L15
Maximum likelihood estimation (MLE) enjoys strong optimality properties for statistical estimation, under strong assumptions. However, when these assumptions are not satisfied, MLE can be extremely unreliable. In this talk, we will explore alternative estimators based on the minimization of well-chosen distances. In particular, we will see that the Maximum Mean Discrepancy (MMD, based on suitable kernels) leads to estimation procedures that are consistent without any assumption on the model or on the data-generating process. This leads to strong robustness properties in practice, and the method has already been used in complex models with promising results: estimation of SDE coefficients, copulas, data compression, generative models in AI...
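To make the robustness point concrete, here is a minimal sketch of a minimum-MMD estimator in a toy Gaussian location model. The Gaussian kernel, the bandwidth, the contamination scheme and the use of common random numbers are illustrative choices of mine, not the settings of the papers discussed in the talk.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def gaussian_kernel(x, y, bandwidth=1.0):
    # Gaussian (RBF) kernel matrix between two 1-D samples.
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    # Biased (V-statistic) estimate of the squared MMD between samples x and y.
    return (gaussian_kernel(x, x, bandwidth).mean()
            + gaussian_kernel(y, y, bandwidth).mean()
            - 2.0 * gaussian_kernel(x, y, bandwidth).mean())

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=500)
data[:25] = 50.0  # contaminate 5% of the sample with gross outliers

# Fixed base noise reused for every candidate theta (common random numbers),
# so the objective below is a deterministic, smooth function of theta.
base = rng.normal(size=500)

def objective(theta):
    # Squared MMD between the contaminated data and a sample from N(theta, 1).
    return mmd2(data, theta + base)

result = minimize_scalar(objective, bounds=(-10.0, 10.0), method="bounded")
print("MLE (sample mean):   ", data.mean())  # dragged toward the outliers (about 4.4)
print("minimum-MMD estimate:", result.x)     # stays close to the true mean 2
```

The sample mean (which is the MLE in this model) is pulled far from the true location by a handful of outliers, while the minimum-MMD fit is barely affected, illustrating the robustness behaviour described above.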
In the second part of this talk, I will discuss the extension of this method to the estimation of conditional distributions, which makes it possible to use MMD estimators in various regression models. In contrast to mean embeddings, the existence of a conditional mean embedding that allows an estimator to be defined requires very technical conditions. These conditions are often assumed in the literature, but rarely checked. We proved that, in most generalized linear regression models, these conditions can be met, at the cost of additional restrictions on the choice of kernel.
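To fix ideas, here is a hedged sketch of what such a regression criterion can look like; the notation is mine and not necessarily the exact objective of the papers. Writing \(\mu_k(Q) = \int k(\cdot, y)\, \mathrm{d}Q(y)\) for the kernel mean embedding of a distribution \(Q\) on the response space, one can fit the parameter by matching the embedded conditional laws of the model with an estimate of the conditional mean embedding of the data-generating process:

```latex
\hat\theta \in \operatorname*{arg\,min}_{\theta \in \Theta} \;
  \frac{1}{n} \sum_{i=1}^{n}
  \bigl\| \mu_k\bigl(P_\theta(\cdot \mid x_i)\bigr)
        - \widehat{\mu}_k\bigl(P_0(\cdot \mid x_i)\bigr) \bigr\|_{\mathcal{H}_k}^{2}
```

Here \(\mathcal{H}_k\) is the RKHS of the kernel \(k\) and \(P_0(\cdot \mid x)\) denotes the true conditional distribution. The technical conditions mentioned above are what is needed for the conditional mean embedding \(\widehat{\mu}_k(P_0(\cdot \mid x))\) to exist and be estimable, so that a criterion of this kind makes sense.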
This is based on joint work with: Badr-Eddine Chérief-Abdellatif (CNRS, Paris), Mathieu Gerber (University of Bristol), Daniele Durante (Bocconi University), Sirio Legramanti (University of Bergamo), Jean-David Fermanian (ENSAE Paris), Alexis Derumigny (TU Delft), and Geoffrey Wolfer (RIKEN-AIP, Tokyo).