9419
2016
2016
eng
31
8
5
preprint
Universitätsverlag Potsdam
Potsdam
1
--
--
--
Convergence rates of kernel conjugate gradient for random design regression
We prove statistical rates of convergence for kernel-based least squares regression from i.i.d. data using a conjugate gradient algorithm, where regularization against overfitting is obtained by early stopping. This method is related to Kernel Partial Least Squares, a regression method that combines supervised dimensionality reduction with least squares projection. Following the setting introduced in earlier related literature, we study so-called "fast convergence rates" depending on the regularity of the target regression function (measured by a source condition in terms of the kernel integral operator) and on the effective dimensionality of the data mapped into the kernel space. We obtain upper bounds, essentially matching known minimax lower bounds, for the L^2 (prediction) norm as well as for the stronger Hilbert norm, if the true regression function belongs to the reproducing kernel Hilbert space. If the latter assumption is not fulfilled, we obtain similar convergence rates for appropriate norms, provided additional unlabeled data are available.
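The abstract describes kernel least-squares regression where the coefficient system is solved by conjugate gradient iterations and regularization comes from stopping early. As a minimal illustrative sketch (not the paper's exact algorithm or analysis), the idea can be shown with a Gaussian kernel and plain CG on the kernel system K a = y; the kernel choice, bandwidth, and stopping iteration below are arbitrary assumptions for the toy example:

```python
import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between row sets X and Z.
    d = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def kernel_cg(K, y, n_iter=5):
    """Run n_iter conjugate gradient steps on K a = y, starting from a = 0.
    Stopping after few iterations (early stopping) acts as regularization
    against overfitting, in place of an explicit penalty term."""
    a = np.zeros_like(y)
    r = y - K @ a          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(n_iter):
        Kp = K @ p
        alpha = rs / (p @ Kp)
        a += alpha * p
        r -= alpha * Kp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return a

# Toy random-design regression problem.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (50, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(50)

K = gaussian_kernel(X, X, gamma=5.0)
a = kernel_cg(K, y, n_iter=5)
yhat = K @ a               # fitted values (in-sample predictions)
```

Running the loop to convergence would interpolate the noisy labels; the stopping index plays the role of the regularization parameter studied in the paper's rate analysis.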
urn:nbn:de:kobv:517-opus4-94195
2193-6943 (online)
No usage license granted - German copyright law (Urheberrecht) applies
Gilles Blanchard
Nicole Krämer
Preprints des Instituts für Mathematik der Universität Potsdam
5 (2016) 8
eng
uncontrolled
nonparametric regression
eng
uncontrolled
reproducing kernel Hilbert space
eng
uncontrolled
conjugate gradient
eng
uncontrolled
partial least squares
eng
uncontrolled
minimax convergence rates
Mathematik
Nonparametric regression
Asymptotic properties
Optimal stopping [See also 60G40, 91A60]
open_access
Institut für Mathematik
Universitätsverlag Potsdam
2016
Universität Potsdam
Universitätsverlag Potsdam
https://publishup.uni-potsdam.de/files/9419/premath08.pdf