TY - INPR
A1 - Blanchard, Gilles
A1 - Krämer, Nicole
T1 - Convergence rates of kernel conjugate gradient for random design regression
N2 - We prove statistical rates of convergence for kernel-based least squares regression from i.i.d. data using a conjugate gradient algorithm, where regularization against overfitting is obtained by early stopping. This method is related to Kernel Partial Least Squares, a regression method that combines supervised dimensionality reduction with least squares projection. Following the setting introduced in earlier related literature, we study so-called "fast convergence rates" depending on the regularity of the target regression function (measured by a source condition in terms of the kernel integral operator) and on the effective dimensionality of the data mapped into the kernel space. We obtain upper bounds, essentially matching known minimax lower bounds, for the L^2 (prediction) norm as well as for the stronger Hilbert norm, if the true regression function belongs to the reproducing kernel Hilbert space. If the latter assumption is not fulfilled, we obtain similar convergence rates for appropriate norms, provided additional unlabeled data are available.
T3 - Preprints des Instituts für Mathematik der Universität Potsdam - 5 (2016) 8
KW - nonparametric regression
KW - reproducing kernel Hilbert space
KW - conjugate gradient
KW - partial least squares
KW - minimax convergence rates
Y1 - 2016
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-94195
SN - 2193-6943
VL - 5
IS - 8
PB - Universitätsverlag Potsdam
CY - Potsdam
ER -

TY - JOUR
A1 - Blanchard, Gilles
A1 - Krämer, Nicole
T1 - Convergence rates of kernel conjugate gradient for random design regression
JF - Analysis and Applications
N2 - We prove statistical rates of convergence for kernel-based least squares regression from i.i.d. data using a conjugate gradient (CG) algorithm, where regularization against overfitting is obtained by early stopping. This method is related to Kernel Partial Least Squares, a regression method that combines supervised dimensionality reduction with least squares projection. Following the setting introduced in earlier related literature, we study so-called "fast convergence rates" depending on the regularity of the target regression function (measured by a source condition in terms of the kernel integral operator) and on the effective dimensionality of the data mapped into the kernel space. We obtain upper bounds, essentially matching known minimax lower bounds, for the L^2 (prediction) norm as well as for the stronger Hilbert norm, if the true regression function belongs to the reproducing kernel Hilbert space. If the latter assumption is not fulfilled, we obtain similar convergence rates for appropriate norms, provided additional unlabeled data are available.
KW - nonparametric regression
KW - reproducing kernel Hilbert space
KW - conjugate gradient
KW - partial least squares
KW - minimax convergence rates
Y1 - 2016
U6 - https://doi.org/10.1142/S0219530516400017
SN - 0219-5305
SN - 1793-6861
VL - 14
SP - 763
EP - 794
PB - World Scientific
CY - Singapore
ER -

TY - JOUR
A1 - Rastogi, Abhishake
T1 - Tikhonov regularization with oversmoothing penalty for nonlinear statistical inverse problems
JF - Communications on Pure and Applied Analysis
N2 - In this paper, we consider the nonlinear ill-posed inverse problem with noisy data in the statistical learning setting. We consider the Tikhonov regularization scheme in Hilbert scales to reconstruct the estimator from the random noisy data. In this setting, we derive rates of convergence for the regularized solution under certain assumptions on the nonlinear forward operator and suitable prior assumptions. We discuss estimates of the reconstruction error using the approach of reproducing kernel Hilbert spaces.
KW - statistical inverse problem
KW - Tikhonov regularization
KW - Hilbert scales
KW - reproducing kernel Hilbert space
KW - minimax convergence rates
Y1 - 2020
U6 - https://doi.org/10.3934/cpaa.2020183
SN - 1534-0392
SN - 1553-5258
VL - 19
IS - 8
SP - 4111
EP - 4126
PB - American Institute of Mathematical Sciences
CY - Springfield
ER -