
Trading variance reduction with unbiasedness: the regularized subspace information criterion for robust model selection in kernel regression

A well-known result by Stein (1956) shows that in particular situations, biased estimators can yield better parameter estimates than their generally preferred unbiased counterparts. This letter follows the same spirit: we stabilize unbiased generalization error estimates by regularization and thereby obtain more robust model selection criteria for learning. We trade a small bias against a larger variance reduction, which has the beneficial effect of being more precise on a single training set. We focus on the subspace information criterion (SIC), which is an unbiased estimator of the expected generalization error measured by the reproducing kernel Hilbert space norm. SIC can be applied to kernel regression, and earlier experiments showed that a small regularization of SIC has a stabilizing effect. However, it remained open how to appropriately determine the degree of regularization in SIC. In this article, we derive an unbiased estimator of the expected squared error between SIC and the expected generalization error, and we propose determining the degree of regularization of SIC such that this estimator is minimized. Computer simulations with artificial and real data sets illustrate that the proposed method effectively improves the precision of SIC, especially in high-noise cases. We furthermore compare the proposed method to the original SIC, cross-validation, and an empirical Bayesian method in ridge parameter selection, with good results.
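To make the setting concrete, the sketch below illustrates ridge-parameter selection for kernel regression using leave-one-out cross-validation, one of the baselines the abstract names; it is not the authors' regularized SIC, whose derivation is given in the paper. The Gaussian kernel, its width, the candidate grid, and the toy data are illustrative assumptions.

    import numpy as np

    def gaussian_kernel(X, Z, width=1.0):
        # Pairwise squared distances, then a Gaussian kernel (width is an assumption)
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * width ** 2))

    def loocv_score(K, y, lam):
        # Kernel ridge regression is a linear smoother: f_hat = H y
        # with H = K (K + lam * n * I)^{-1} (the lam * n scaling is one common convention).
        n = len(y)
        H = K @ np.linalg.solve(K + lam * n * np.eye(n), np.eye(n))
        # Closed-form leave-one-out residuals for linear smoothers:
        # y_i - f_hat^{(-i)}(x_i) = (y_i - f_hat(x_i)) / (1 - H_ii)
        resid = (y - H @ y) / (1.0 - np.diag(H))
        return np.mean(resid ** 2)

    # Toy regression data with a fairly high noise level, as in the abstract's focus
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(50, 1))
    y = np.sin(X[:, 0]) + 0.3 * rng.standard_normal(50)

    K = gaussian_kernel(X, X)
    candidates = np.logspace(-4, 1, 20)          # illustrative grid of ridge parameters
    scores = [loocv_score(K, y, lam) for lam in candidates]
    print("selected ridge parameter:", candidates[int(np.argmin(scores))])

The paper's point is that criteria of this kind, including SIC, are themselves noisy estimates on a single training set; the regularized SIC tunes its own regularization degree by minimizing an unbiased estimate of the criterion's squared error.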

Metadata
Author details:Masashi Sugiyama, Motoaki Kawanabe, Klaus-Robert Müller
ISSN:0899-7667
Publication type:Article
Language:English
Year of first publication:2004
Publication year:2004
Release date:2017/03/24
Source:Neural Computation. - ISSN 0899-7667. - 16 (2004), 5, pp. 1077-1104
Organizational units:Mathematisch-Naturwissenschaftliche Fakultät / Institut für Informatik und Computational Science
Peer review:Peer-reviewed
Institution name at the time of the publication:Mathematisch-Naturwissenschaftliche Fakultät / Institut für Informatik