
Parallelizing spectrally regularized kernel algorithms

  • We consider a distributed learning approach to supervised learning for a large class of spectral regularization methods in a reproducing kernel Hilbert space (RKHS) framework. The data set of size n is partitioned into m = O(n^α), α < 1/2, disjoint subsamples. On each subsample, some spectral regularization method (belonging to a large class that includes, in particular, Kernel Ridge Regression, L2-boosting, and spectral cut-off) is applied. The regression function f is then estimated via simple averaging of the local estimators, leading to a substantial reduction in computation time. We show that minimax optimal rates of convergence are preserved if m grows sufficiently slowly (corresponding to an upper bound on α) as n → ∞, depending on the smoothness assumptions on f and the intrinsic dimensionality. In spirit, the analysis relies on a classical bias/stochastic error decomposition.
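Below is a minimal sketch of the divide-and-average scheme described in the abstract, using Kernel Ridge Regression as the spectral regularization method on each subsample. This is not the authors' code; the function names and parameter values (rbf_kernel, krr_fit, lam, gamma, the choice alpha = 0.4) are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def krr_fit(X, y, lam):
    # Local estimator on one subsample: solve (K + n*lam*I) alpha = y.
    n = X.shape[0]
    K = rbf_kernel(X, X)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return X, alpha

def distributed_krr(X, y, m, lam):
    # Partition the data into m disjoint subsamples, fit KRR on each,
    # and estimate f by the simple average of the m local predictors.
    idx = np.random.permutation(X.shape[0])
    models = [krr_fit(X[part], y[part], lam) for part in np.array_split(idx, m)]
    def f_bar(X_new):
        preds = [rbf_kernel(X_new, Xj) @ aj for Xj, aj in models]
        return np.mean(preds, axis=0)
    return f_bar

# Toy usage: n = 2000 points, m = floor(n^alpha) subsamples with alpha = 0.4 < 1/2.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(2000, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(2000)
m = int(X.shape[0] ** 0.4)            # m = O(n^alpha), alpha < 1/2
f_hat = distributed_krr(X, y, m, lam=1e-3)
print(f_hat(np.array([[0.5]])))       # averaged prediction at x = 0.5
```

Each subsample's KRR solve costs O((n/m)^3) instead of O(n^3) for the full data set, which is the source of the computational savings; the averaging step is what the paper's analysis shows preserves minimax optimality when m grows slowly enough.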

Metadata
Author details: Nicole Mücke, Gilles Blanchard
ISSN: 1532-4435
Title of parent work (English): Journal of Machine Learning Research
Publisher: Microtome Publishing
Place of publishing: Cambridge, Mass.
Publication type: Article
Language: English
Year of first publication: 2018
Publication year: 2018
Release date: 2022/02/28
Tag: Distributed Learning; Minimax Optimality; Spectral Regularization
Volume: 19
Number of pages: 29
Funding institution: German Research Foundation (DFG) [1735, SFB 1294]
Organizational units: Mathematisch-Naturwissenschaftliche Fakultät / Institut für Mathematik
DDC classification: 5 Science and mathematics / 51 Mathematics / 510 Mathematics
Peer review: Refereed
License: CC-BY 4.0 International (Attribution)