[Last Update 10/2024] [Home]

Gilles Blanchard's publications

Other, possibly more up-to-date sources include Google Scholar, arXiv and HAL.
Copyright notice: The material made available below is presented to ensure timely dissemination of scholarly and technical work and may differ to some extent from the final published versions. Copyright and all rights therein are retained by authors or by other copyright holders.
Preprints
  • S. Gaucher, G. Blanchard, F. Chazal. Supervised contamination detection, with flow cytometry application. [HAL]
  • G. Blanchard, G. Durand, A. Marandon-Carlhian, R. Périer. FDR control and FDP bounds for conformal link prediction. [HAL]
  • G. Blanchard, J-B. Fermanian, H. Marienwald. Estimation of multiple mean vectors in high dimension. [HAL]
  • O. Hacquard, G. Blanchard, C. Levrard. Statistical learning on measures: an application to persistence diagrams. [arXiv]
  • O. Zadorozhnyi, G. Blanchard, A. Carpentier. Restless dependent bandits with fading memory. [arXiv]
  • F. Göbel, G. Blanchard. Volume doubling condition and a local Poincaré inequality on unweighted random geometric graphs. [arXiv]
  • G. Blanchard, P. Mathé, N. Mücke. Lepskii principle in supervised learning. [arXiv]
  • E. Saad, G. Blanchard, S. Arlot. Online orthogonal matching pursuit. [arXiv]
Journal papers, Book sections (Published/To appear)
  • I. Meah, G. Blanchard, E. Roquain. False discovery proportion envelopes with m-consistency. JMLR 25 (270), 1-52, 2024. [JMLR]
  • G. Blanchard, A. Carpentier, O. Zadorozhnyi. Moment inequalities for sums of weakly dependent random fields. Bernoulli 30(3): 2501-2520, 2024. [arXiv]
  • M. Perrot-Dockès, G. Blanchard, P. Neuvial, E. Roquain. Post hoc false discovery proportion inference under a hidden Markov model. TEST 32: 1365-1391, 2023. [HAL]
  • G. Blanchard, J-B. Fermanian. Nonasymptotic one- and two-sample tests in high dimension with unknown covariance structure. Foundations of Modern Statistics (Festschrift in honor of V. Spokoiny), D. Belomestny, C. Butucea, E. Mammen, E. Moulines editors, 121-162, Springer, 2023. [HAL]
  • G. Blanchard, P. Neuvial, E. Roquain. On agnostic post hoc approaches to false positive control. In Handbook of Multiple Comparisons, X. Cui, T. Dickhaus, Y. Ding, J. Hsu editors, 211-232, Chapman & Hall/CRC, 2022. [arXiv]
  • O. Hacquard, K. Balasubramanian, G. Blanchard, C. Levrard, W. Polonik. Topologically penalized regression on manifolds. Journal of Machine Learning Research, 23(161):1-39, 2022. [JMLR]
  • T. Mary-Huard, V. Perduca, M-L. Martin-Magniette, G. Blanchard. Error rate control for classification rules in multiclass mixture models. The International Journal of Biostatistics, 2021. [HAL]
  • R. Gribonval, G. Blanchard, N. Keriven, Y. Traonmilin. Compressive statistical learning with random feature moments. Mathematical Statistics and Learning, 113-164, 2021. [arXiv]
  • R. Gribonval, G. Blanchard, N. Keriven, Y. Traonmilin. Statistical learning guarantees for compressive clustering and compressive mixture modeling. Mathematical Statistics and Learning, 165-257, 2021. [arXiv]
  • G. Blanchard, A. Deshmukh, U. Dogan, G. Lee, C. Scott. Domain generalization by marginal transfer learning. Journal of Machine Learning Research 22 (2), 1-55, 2021. [JMLR]
  • G. Blanchard, P. Neuvial, E. Roquain. Post hoc confidence bounds on false positives using reference families. Annals of Statistics 48 (3), 1281-1303, 2020. [arXiv]
  • G. Durand, G. Blanchard, P. Neuvial, E. Roquain. Post hoc false positive control for structured hypotheses. Scandinavian Journal of Statistics 47: 1114-1148, 2020. [arXiv]
  • G. Blanchard, N. Mücke. Kernel regression, minimax rates and effective dimensionality: beyond the regular case. Analysis and Applications 18 (4): 683-693, 2020. [arXiv]
  • A. Rastogi, G. Blanchard, P. Mathé. Convergence analysis of Tikhonov regularization for non-linear statistical inverse learning problems. Electronic Journal of Statistics 14 (2): 2798-2841, 2020. [arXiv]
  • G. Blanchard, O. Zadorozhnyi. Concentration of weakly dependent Banach-valued sums and applications to kernel learning methods. Bernoulli, 25(4B): 3421-3458, 2019. [arXiv]
  • J. Katz-Samuels, G. Blanchard, C. Scott. Decontamination of mutual contamination models. Journal of Machine Learning Research 20 (41):1-57, 2019. [JMLR]
  • G. Blanchard, N. Mücke. Parallelizing spectral algorithms for kernel learning. Journal of Machine Learning Research 19 (30):1-29, 2018. [JMLR]
  • G. Blanchard, A. Carpentier, M. Gutzeit. Minimax Euclidean separation rates for testing convex hypotheses in R^d. Electron. J. Statist. 12 (2): 3713-3735, 2018. [Project Euclid Open Access]
  • F. Bachoc, G. Blanchard, P. Neuvial. On the post selection inference constant under restricted isometry properties. Electron. J. Statist. 12(2): 3736-3757, 2018. [DOI, Project Euclid Open Access]
  • G. Blanchard, M. Hoffmann, M. Reiß. Early stopping for statistical inverse problems via truncated SVD estimation. Electron. J. Statist. 12 (2):3204-3231, 2018. [DOI, Project Euclid Open Access]
  • G. Blanchard, M. Hoffmann, M. Reiß. Optimal adaptation for early stopping in statistical inverse problems. SIAM/ASA Journal on Uncertainty Quantification 6(3): 1043-1075, 2018. [DOI] [arXiv]
  • G. Blanchard, F. Göbel, U. von Luxburg. Construction of tight frames on graphs and application to denoising. In Handbook of Big Data Analytics, Härdle, W., Lu, H.-S. and Shen, X. editors, Chapter 20, pp. 503-522, Springer, 2018. [DOI] [arXiv]
  • G. Blanchard, N. Mücke. Optimal rates for regularization of statistical inverse learning problems. Foundations of Computational Mathematics 18 (4): 971-1013, 2018 (first online: 2017). [DOI] [arXiv]
  • G. Blanchard, N. Krämer. Convergence rates of Kernel Conjugate Gradient for random design regression. Analysis and Applications 14 (6): 763-794, 2016. [DOI] [arXiv]
  • B. Mieth, M. Kloft, J. A. Rodríguez, S. Sonnenburg, R. Vobruba, C. Morcillo-Suárez, X. Farré, U.M. Marigorta, E. Fehr, T. Dickhaus, G. Blanchard, D. Schunk, A. Navarro & K.-R. Müller. Combining Multiple Hypothesis Testing with Machine Learning Increases the Statistical Power of Genome-wide Association Studies. Scientific Reports 6: 36671, 2016. [DOI, Nature Open Access]
  • G. Blanchard, M. Flaska, G. Handy, S. Pozzi, C. Scott. Classification with asymmetric label noise: Consistency and maximal denoising. Electronic Journal of Statistics 10 (2): 2780-2824, 2016. [Project Euclid Open access]
    With corrigendum: Electronic Journal of Statistics 12 (1): 1779-1781, 2018. [Project Euclid Open access]
  • A. Beinrucker, Ü. Dogan, G. Blanchard. Extensions of stability selection using subsamples of observations and covariates. Statistics and Computing 26: 1059-1077, 2016 (First online 2015). [DOI] [arXiv]
  • G. Blanchard, S. Delattre, E. Roquain. Testing over a continuum of null hypotheses with false discovery rate control. Bernoulli 20(1): 304-333, 2014. [DOI, Open Access]
  • G. Blanchard, T. Dickhaus, E. Roquain, F. Villers. On least favorable configurations for step-up-down tests. Statistica Sinica 24(1): 1-23, 2014. [DOI] [Supplement] [arXiv]
  • G. Blanchard, P. Mathé. Discrepancy principle for statistical inverse problems with application to conjugate gradient iteration. Inverse Problems 28 (11): 115011, 2012. [DOI] [UPotsdam preprint]
  • M. Kloft, G. Blanchard. The local Rademacher complexity of lp-norm multiple kernel learning. Journal of Machine Learning Research 13: 2465-2502, 2012. [JMLR] [arXiv]
  • G. Blanchard, P. Mathé. Conjugate gradient regularization under general smoothness and noise assumptions. Journal of Inverse and Ill-posed Problems 18(6): 701-726, 2010. [DOI]
  • G. Blanchard, G. Lee, C. Scott. Semi-supervised novelty detection. Journal of Machine Learning Research 11(Nov): 2973-3009, 2010. [JMLR]
  • S. Arlot, G. Blanchard, E. Roquain. Some non-asymptotic results on resampling in high dimension, I: Confidence regions. Annals of Statistics 38(1): 51-82, 2010. [arXiv]
  • S. Arlot, G. Blanchard, E. Roquain. Some non-asymptotic results on resampling in high dimension, II: Multiple tests. Annals of Statistics 38(1): 83-99, 2010. [arXiv]
  • G. Blanchard, E. Roquain. Adaptive FDR control under independence and dependence. Journal of Machine Learning Research 10:2837-2871, 2009. [JMLR]
  • A. Schwaighofer, T. Schröter, S. Mika, G. Blanchard. How wrong can we get? A review of machine learning approaches and error bars. Combinatorial Chemistry & High Throughput Screening, 12(5): 453-468, 2009. [DOI]
  • G. Blanchard, E. Roquain. Two simple sufficient conditions for FDR control. Electronic Journal of Statistics, 2: 963-992, 2008. [Project Euclid Open Access]
  • G. Blanchard, L. Zwald. Finite dimensional projection for classification and statistical learning. IEEE transactions on Information Theory, 54 (9): 4169-4182, 2008. [DOI]
  • G. Blanchard, O. Bousquet, P. Massart. Statistical performance of support vector machines. Annals of Statistics, 36 (2): 489-531, 2008. [arXiv]
  • M. Sugiyama, M. Kawanabe, G. Blanchard, K.-R. Müller. Approximating the best linear unbiased estimator of non-Gaussian signals with Gaussian noise. IEICE Transactions on Information and Systems, E91-D (5): 1577-1580, 2008.
  • G. Blanchard, C. Schäfer, Y. Rozenholc, K-R. Müller. Optimal dyadic decision trees. Machine Learning, 66(2-3): 209-242, 2007.
  • M. Kawanabe, M. Sugiyama, G. Blanchard, K-R. Müller. A new algorithm of non-Gaussian component analysis with radial kernel functions. Annals of the Institute of Statistical Mathematics, 59(1):57-75, 2007.
  • G. Blanchard, O. Bousquet, L. Zwald. Statistical properties of kernel principal component analysis. Machine Learning, 66(2-3): 259-294, 2007.
  • G. Blanchard, P. Massart. Discussion of V. Koltchinskii's 2004 IMS Medallion Lecture paper, "Local Rademacher complexities and oracle inequalities in risk minimization". Annals of Statistics, 34(6), 2006. [arXiv]
  • G. Blanchard, M. Kawanabe, M. Sugiyama, V. Spokoiny, K.-R. Müller. In search of non-Gaussian components of a high-dimensional distribution. Journal of Machine Learning Research, 7:247-282, 2006. [JMLR]
  • G. Blanchard, D. Geman. Hierarchical testing designs for pattern recognition. Annals of Statistics, 33(3):1155-1202, 2005. (This is a shortened and revised version of the technical report below). [arXiv]
  • G. Blanchard. Different paradigms for choosing sequential reweighting algorithms. Neural Computation, 16:811-836, 2004.
  • G. Blanchard, B. Blankertz. BCI competition 2003 - data set IIa: Spatial patterns of self-controlled brain rhythm modulations. IEEE Trans. Biomed. Eng., 51(6):1062-1066, 2004.
  • G. Blanchard. Un algorithme accéléré d'échantillonnage Bayésien pour le modèle CART. Revue d'Intelligence artificielle, 18(3):383-410, 2004.
    English version (somewhat outdated): A new algorithm for MCMC Bayesian CART sampling. [gzipped ps]
  • G. Blanchard, G. Lugosi, N. Vayatis. On the rate of convergence of regularized boosting classifiers. Journal of Machine Learning Research (Special issue on learning theory), 4:861-894, 2003. [JMLR]
  • G. Blanchard. Generalization error bounds for aggregate classifiers. In Nonlinear Estimation and Classification, Denison, D. D., Hansen, M., Holmes, C. C., Mallick, B. and Yu, B. editors, Lecture Notes in Statistics (171), Springer, 357-368, 2003.
  • G. Blanchard, M. Olsen. Le système des renvois dans l'Encyclopédie : une cartographie des structures de connaissance au XVIIIème siècle. Recherches sur Diderot et l'Encyclopédie, 45-70, 2002. [Site of RDE, full PDF]
  • G. Blanchard. The "progressive mixture" estimator for regression trees. Annales de l'I.H.P., 35(6):793-820, 1999. [NUMDAM link]
  • G. Blanchard. L'estimateur de «mélange progressif» appliqué aux arbres de décision. C.R.A.S., 328, Série I: 925-928, 1999.
Conference Proceedings (peer reviewed)
  • U. Gazin, G. Blanchard, E. Roquain. Transductive conformal inference with adaptive scores. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics (AISTATS 2024), PMLR 238:1504-1512, 2024. [PMLR] [arXiv]
  • E.M. Saad, G. Blanchard, N. Verzelen. Covariance-adaptive best arm identification. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), p. 73287-73298, Curran Associates, 2023. [NeurIPS] [arXiv]
  • B. Dussap, G. Blanchard, B.-E. Chérief-Abdellatif. Label shift quantification with robustness guarantees via distribution feature matching. Machine Learning and Knowledge Discovery in Databases: Research Track (ECML/PKDD 2023), Part V, (LNCS, volume 14173), p. 69-85, Springer, 2023. [arXiv]
  • E. Saad, G. Blanchard. Constant regret for sequence prediction with limited advice. Algorithmic Learning Theory (ALT 2023). [HAL]
  • E. Saad, G. Blanchard. Fast rates for prediction with limited expert advice. Advances in Neural Information Processing Systems (NeurIPS 2021). [HAL]
  • H. Marienwald, J-B. Fermanian, G. Blanchard. High-Dimensional Multi-Task Averaging and Application to Kernel Mean Embedding. Artificial Intelligence and Statistics (AISTATS 2021). [arXiv]
  • J. Achddou, J. Lam, A. Carpentier, G. Blanchard. A minimax near-optimal algorithm for adaptive rejection sampling. Algorithmic Learning Theory (ALT 2019). [arXiv]
  • I. Tolstikhin, N. Zhivotovskiy, G. Blanchard. Permutational Rademacher complexity. Algorithmic Learning Theory (Proc. ALT 2015), Springer Lecture Notes in Artificial Intelligence (9355), 209-223, 2015. [arXiv]
  • S. Kurras, U. von Luxburg, G. Blanchard. The f-adjusted graph Laplacian: a diagonal modification with a geometric interpretation. Proc. ICML 2014, JMLR Workshop and Conference Proceedings 32:1530-1538, 2014. [JMLR]
  • I. Tolstikhin, G. Blanchard, M. Kloft. Localized complexities for transductive learning. Proc. COLT 2014, JMLR Workshop and Conference Proceedings 35: 857-884, 2014. [JMLR]
  • G. Blanchard, C. Scott. Decontamination of mutually contaminated models. Proc. AISTATS 2014, JMLR Workshop and Conference Proceedings 33:1-9, 2014. [JMLR]
  • C. Scott, G. Blanchard, G. Handy. Classification with asymmetric label noise: consistency and maximal denoising. Proc. Conf. on Learning Theory (COLT 2013), JMLR Workshop and Conference Proceedings 30:489-511, 2013. [JMLR]
  • R. Martinez-Noriega, A. Roumy, G. Blanchard. Exemplar-based image inpainting: Fast priority and coherent nearest neighbor search. IEEE International Workshop on Machine Learning for Signal Processing (MLSP 2012), 2012.
  • A. Beinrucker, U. Dogan, G. Blanchard. Early stopping for mutual information based feature selection. International Conference on Pattern Recognition (ICPR 2012) , 975-978, 2012.
  • A. Beinrucker, U. Dogan, G. Blanchard. A simple extension of stability feature selection. Pattern Recognition (Proceedings of the joint 34th DAGM and 36th OAGM Symposium), 256-265, 2012.
  • G. Blanchard, G. Lee, C. Scott. Generalizing from several related classification tasks to a new unlabeled sample. Advances in Neural Inf. Proc. Systems (NIPS 2011), 2178-2186, 2011. [NIPS proceedings]
  • M. Kloft, G. Blanchard. The local Rademacher complexity of lp-norm multiple kernel learning. Advances in Neural Inf. Proc. Systems 24 (NIPS 2011), 2438-2446, 2011. [NIPS proceedings]
  • G. Blanchard, N. Krämer. Optimal learning rates for Kernel Conjugate Gradient regression. Advances in Neural Inf. Proc. Systems (NIPS 2010), 226-234, 2011. [NIPS proceedings]
  • G. Blanchard, T. Dickhaus, N. Hack, F. Konietschke, K. Rohmeyer, J. Rosenblatt, M. Scheer, W. Werft. μTOSS - Multiple hypothesis testing in an open software system. Proceedings of the First Workshop on Applications of Pattern Analysis, JMLR Workshop and Conference Proceedings 11:12-19, 2010. [JMLR]
  • G. Blanchard, N. Krämer. Kernel partial least squares is universally consistent. AISTATS 2010, JMLR Workshop and Conference Proceedings 9:57-64, 2010. [JMLR]
  • C. Scott, G. Blanchard. Novelty detection: unlabeled data definitely help. AISTATS 2009, JMLR Workshop and Conference Proceedings 5:464-471, 2009. [JMLR]
  • G. Blanchard, F. Fleuret. Occam's Hammer: a link between randomized learning and multiple testing FDR control. Proceedings of the 20th Conference on Learning Theory (COLT 2007), Springer Lecture Notes in Computer Science (4539), 112-126, 2007. [HAL]
  • S. Arlot, G. Blanchard, E. Roquain. Resampling-based confidence regions and multiple tests for a correlated random vector. Proceedings of the 20th Conference on Learning Theory (COLT 2007), Springer Lecture Notes in Computer Science (4539), 127-141, 2007.
  • M. Kawanabe, G. Blanchard, M. Sugiyama, V. Spokoiny, K.-R. Müller. A novel dimension reduction procedure for searching non-Gaussian subspaces. Independent Component Analysis and Blind Signal Separation (ICA 06), Springer Lecture Notes in Computer Science (3889), 149-156, 2006.
  • M. Sugiyama, M. Kawanabe, G. Blanchard, V. Spokoiny, K.-R. Müller. Obtaining the best linear unbiased estimator of noisy signals by non-Gaussian component analysis. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 06), volume 3, pp. 608-611, 2006.
  • L. Zwald, G. Blanchard. On the convergence of eigenspaces in kernel principal components analysis. In Advances in Neural Inf. Proc. Systems (NIPS 05), volume 18, 1649-1656, MIT Press, 2006. [NIPS Proceedings]
  • F. Fleuret, G. Blanchard. Pattern recognition from one example via chopping. In Advances in Neural Inf. Proc. Systems (NIPS 05), volume 18, 371-378, MIT Press, 2006. [NIPS Proceedings]
  • G. Blanchard, M. Kawanabe, M. Sugiyama, V. Spokoiny, K.-R. Müller. Non-Gaussian component analysis: a semi-parametric framework for linear dimension reduction. In Advances in Neural Inf. Proc. Systems (NIPS 05), volume 18, 131-138, MIT Press, 2006. [NIPS Proceedings]
  • G. Blanchard, P. Massart, R. Vert, L. Zwald. Kernel Projection Machine: a new tool for pattern recognition. In Advances in Neural Inf. Proc. Systems (NIPS 2004), volume 17, 1649-1656, MIT Press, 2005. [NIPS Proceedings]
  • G. Blanchard, C. Schäfer, Y. Rozenholc. Oracle bounds and exact algorithm for dyadic classification trees. In Proceedings of the 17th Conference on Learning Theory (COLT 04), Springer Lecture Notes in Artificial Intelligence (3120), 378-392, 2004.
  • O. Bousquet, L. Zwald, G. Blanchard. Statistical properties of kernel principal component analysis. In Proceedings of the 17th Conference on Learning Theory (COLT 2004), Springer Lecture Notes in Artificial Intelligence (3120), 594-608, 2004.
Technical reports and unpublished
  • G. Blanchard, D. Geman. Hierarchical testing designs for pattern recognition. Technical report 2003-07, Université Paris-Sud, 2003. [gzipped ps]
  • Y. Amit, G. Blanchard. Multiple Randomized Classifiers. Technical report, University of Chicago, 2001. [gzipped ps]