On the Convergence Rate of ℓp-Norm Multiple Kernel Learning
- We derive an upper bound on the local Rademacher complexity of ℓp-norm multiple kernel learning, which yields a tighter excess risk bound than global approaches. Previous local analyses covered only the case p = 1, while our analysis covers all cases 1 ≤ p ≤ ∞, assuming the feature mappings corresponding to the different kernels to be uncorrelated. We also prove a matching lower bound, showing that the upper bound is tight, and derive consequences for the excess loss, namely fast convergence rates of order O(n^(−α/(1+α))), where α is the minimum eigenvalue decay rate of the individual kernels.
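As an illustrative aside (not part of this record), ℓp-norm multiple kernel learning combines several base kernels with mixing weights constrained to the unit ℓp ball. A minimal sketch of such a combination, with all kernel choices and data entirely hypothetical:

```python
import numpy as np

def lp_normalize(theta, p):
    """Scale nonnegative weights so that ||theta||_p = 1."""
    theta = np.asarray(theta, dtype=float)
    return theta / np.linalg.norm(theta, ord=p)

def combined_kernel(kernels, theta):
    """Weighted sum K = sum_m theta_m * K_m of base Gram matrices."""
    return sum(t * K for t, K in zip(theta, kernels))

# Hypothetical example: three toy Gram matrices on 4 points
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 2))
K_lin = X @ X.T                                                 # linear kernel
K_rbf = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1))   # RBF, gamma = 1
K_poly = (1.0 + X @ X.T) ** 2                                   # degree-2 polynomial

theta = lp_normalize([1.0, 1.0, 1.0], p=2)    # l_2-normalized uniform weights
K = combined_kernel([K_lin, K_rbf, K_poly], theta)
```

The resulting `K` is again a symmetric positive semidefinite Gram matrix, since it is a nonnegative combination of kernels; the value of p controls how sparse the learned mixing weights tend to be.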
Author details: | Marius Kloft, Gilles Blanchard |
---|---|
ISSN: | 1532-4435 |
Title of parent work (English): | JOURNAL OF MACHINE LEARNING RESEARCH |
Publisher: | MICROTOME PUBL |
Place of publishing: | BROOKLINE |
Publication type: | Article |
Language: | English |
Year of first publication: | 2012 |
Publication year: | 2012 |
Release date: | 2017/03/26 |
Tag: | generalization bounds; learning kernels; local Rademacher complexity; multiple kernel learning |
Volume: | 13 |
Number of pages: | 38 |
First page: | 2465 |
Last page: | 2502 |
Funding institution: | German Science Foundation [DFG MU 987/6-1, RA 1894/1-1]; World Class University Program through the National Research Foundation of Korea; Korean Ministry of Education, Science, and Technology [R31-10008]; European Community [247022]; European Community’s 7th Framework Programme under the PASCAL2 Network of Excellence [ICT-216886] |