Full Name
Alexandre Tsybakov
Job Title
Professor and Head of the Statistics Department
Company
CREST-ENSAE Paris
Speaker Bio
Alexandre B. Tsybakov is a Professor at ENSAE Paris and at Sorbonne University, Paris. From 1993 to 2017 he was a Professor at the University Pierre and Marie Curie (Paris 6), and from 2009 to 2015 a Professor at Ecole Polytechnique. He was a member of the Institute for Information Transmission Problems, Moscow, from 1980 to 2007. His research interests include high-dimensional statistics, nonparametric function estimation, statistical machine learning, stochastic optimization, and statistical inverse problems.
Prof. Tsybakov is the author of three books and more than 150 journal papers. He is a Fellow of the Institute of Mathematical Statistics, and his distinctions include the Le Cam Lecture of the French Statistical Society, a Miller Professorship at the University of California, Berkeley, a Medallion Lecture of the Institute of Mathematical Statistics, the Gay-Lussac-Humboldt Prize, and an Invited Lecture at the International Congress of Mathematicians. He serves on the editorial boards of several journals.
Speaking At
Abstract
This talk considers the problem of estimating discrete and continuous probability densities under low-rank constraints.
For discrete distributions, we assume that the two-dimensional array to estimate is a rank K probability matrix.
For the continuous case, we assume that the density with respect to the Lebesgue measure satisfies a multi-view model, meaning that it is Hölder smooth and can be decomposed as a sum of K components, each of which is a product of one-dimensional functions.
We propose estimators that achieve, up to logarithmic factors, the optimal convergence rates under such low-rank constraints.
In the discrete case, the proposed estimator is adaptive to the rank K. In the continuous case, our estimator is adaptive to the unknown support, to the smoothness, and to the unknown number of separable components K.
We also establish lower bounds for the discrete and continuous problems showing that the convergence rates of the proposed estimators are minimax optimal to within logarithmic factors. Joint work with Julien Chhor and Olga Klopp.
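To make the discrete model concrete, here is a minimal sketch (not the authors' estimator) of a rank-K probability matrix: a mixture of K product distributions P = sum_k w_k u_k v_k^T, which is entrywise nonnegative, sums to one, and has rank at most K. All dimensions and the random seed below are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d1, d2, K = 20, 30, 3  # grid sizes and rank (hypothetical values)

# Mixture weights and K pairs of one-dimensional probability vectors.
w = rng.dirichlet(np.ones(K))           # w_k >= 0, sum_k w_k = 1
U = rng.dirichlet(np.ones(d1), size=K)  # each row a distribution on {1,...,d1}
V = rng.dirichlet(np.ones(d2), size=K)  # each row a distribution on {1,...,d2}

# P is a sum of K rank-one terms, hence rank(P) <= K.
P = sum(w[k] * np.outer(U[k], V[k]) for k in range(K))

# Sanity checks: P is a valid probability matrix of rank at most K.
print(np.isclose(P.sum(), 1.0))
print(np.linalg.matrix_rank(P) <= K)
```

The continuous multi-view model in the abstract has the same structure, with the probability vectors u_k, v_k replaced by Hölder-smooth one-dimensional density components.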