
On the Convergence Rate of ℓp-Norm Multiple Kernel Learning

  • We derive an upper bound on the local Rademacher complexity of ℓp-norm multiple kernel learning, which yields a tighter excess risk bound than global approaches. Previous local analyses covered only the case p = 1, while our analysis covers all cases 1 <= p <= infinity, assuming the feature mappings corresponding to the different kernels are uncorrelated. We also give a lower bound showing that the upper bound is tight, and derive consequences for the excess loss, namely fast convergence rates of the order O(n^(-alpha/(1+alpha))), where alpha is the minimum eigenvalue decay rate of the individual kernels.
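The exponent in the stated rate can be motivated by a standard localization argument. The sketch below is not the paper's exact derivation; the polynomial eigenvalue-decay assumption and the constants are supplied here purely for illustration:

```latex
% Minimal sketch, assuming each kernel's eigenvalues decay polynomially:
%   \lambda_j \le D\, j^{-\alpha}, \qquad \alpha > 1.
% The local Rademacher complexity at radius r of a kernel class satisfies
\[
  \mathfrak{R}(r)
  \;\lesssim\; \sqrt{\frac{1}{n}\sum_{j\ge 1}\min\{r,\lambda_j\}}
  \;\lesssim\; \sqrt{\frac{C\, r^{1-1/\alpha}}{n}},
\]
% since the sum splits at j^* \asymp r^{-1/\alpha} into j^* terms of size r
% plus a convergent tail. Solving the fixed-point equation
% r^* \asymp \mathfrak{R}(r^*) then gives
\[
  (r^*)^{2} \;\asymp\; \frac{C\,(r^*)^{1-1/\alpha}}{n}
  \;\Longrightarrow\;
  r^* \;\asymp\; n^{-\alpha/(1+\alpha)},
\]
% which matches the fast excess-risk rate quoted in the abstract.
```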

Metadata
Author details:Marius Kloft, Gilles Blanchard
ISSN:1532-4435
Title of parent work (English):JOURNAL OF MACHINE LEARNING RESEARCH
Publisher:MICROTOME PUBL
Place of publishing:BROOKLINE
Publication type:Article
Language:English
Year of first publication:2012
Publication year:2012
Release date:2017/03/26
Tag:generalization bounds; learning kernels; local Rademacher complexity; multiple kernel learning
Volume:13
Number of pages:38
First page:2465
Last Page:2502
Funding institution:German Science Foundation [DFG MU 987/6-1, RA 1894/1-1]; World Class University Program through the National Research Foundation of Korea; Korean Ministry of Education, Science, and Technology [R31-10008]; European Community [247022]; European Community's 7th Framework Programme under the PASCAL2 Network of Excellence [ICT-216886]