Optimal rates for regularization of statistical inverse learning problems
We consider a statistical inverse learning (also called inverse regression) problem in which we observe the image of a function f under a linear operator A at i.i.d. random design points X_i, corrupted by additive noise. The distribution of the design points is unknown and can be very general. We analyze the direct (estimation of Af) and the inverse (estimation of f) learning problems simultaneously. In this general framework, we obtain strong and weak minimax optimal rates of convergence (as the number of observations n grows large) for a large class of spectral regularization methods over regularity classes defined through appropriate source conditions. This improves on, or completes, previous results obtained in related settings. The optimality of the obtained rates is shown not only in the exponent of n but also in the explicit dependence of the constant factor on the noise variance and on the radius of the source-condition set.
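The observation model summarized in the abstract can be sketched as follows; the notation below is illustrative and not taken verbatim from the paper:

```latex
% Sketch of the statistical inverse learning model (notation assumed):
% X_1, ..., X_n are i.i.d. design points drawn from an unknown distribution,
% and each observation is the image of f under the linear operator A,
% superposed with additive noise \varepsilon_i:
\[
  Y_i = (A f)(X_i) + \varepsilon_i, \qquad i = 1, \dots, n.
\]
% Direct problem: estimate Af.   Inverse problem: estimate f itself.
```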
Author details: | Gilles Blanchard, Nicole Mücke |
DOI: | https://doi.org/10.1007/s10208-017-9359-7 |
ISSN (print): | 1615-3375 |
ISSN (electronic): | 1615-3383 |
Title of parent work (English): | Foundations of Computational Mathematics |
Publisher: | Springer |
Place of publishing: | New York |
Publication type: | Article |
Language: | English |
Date of first publication: | 2018/06/20 |
Publication year: | 2018 |
Release date: | 2021/10/27 |
Tag: | Inverse problem; Minimax convergence rates; Reproducing kernel Hilbert space; Spectral regularization; Statistical learning |
Volume: | 18 |
Issue: | 4 |
Number of pages: | 43 |
First page: | 971 |
Last Page: | 1013 |
Funding institution: | DFG via Research Unit [1735] |
Organizational units: | Mathematisch-Naturwissenschaftliche Fakultät / Institut für Mathematik |
DDC classification: | 5 Natural sciences and mathematics / 51 Mathematics / 510 Mathematics |
Peer review: | Refereed |
Publishing method: | Open Access / Green Open Access |