Discriminative learning under covariate shift
We address classification problems in which the training instances are governed by an input distribution that may differ arbitrarily from the test distribution, a setting also referred to as classification under covariate shift. We derive a solution that is purely discriminative: neither the training nor the test distribution is modeled explicitly. Learning under covariate shift can be written as an integrated optimization problem. Instantiating the general optimization problem leads to a kernel logistic regression and an exponential model classifier for covariate shift. The optimization problem is convex under certain conditions; our findings also clarify the relationship to the known kernel mean matching procedure. We report on experiments on problems of spam filtering, text classification, and landmine detection.
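The discriminative idea the abstract alludes to can be illustrated with a toy sketch: instead of modeling the two input densities, train a classifier to distinguish training from test inputs and use its odds as example weights for the final model. This is a minimal, hedged illustration of that general strategy, not the paper's exact integrated optimization; all variable names and data below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg(X, y, w=None, lr=0.1, steps=500):
    """Logistic regression by gradient descent; w are per-example weights."""
    if w is None:
        w = np.ones(len(y))
    theta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ theta)
        theta -= lr * (X.T @ (w * (p - y))) / len(y)
    return theta

def add_bias(X):
    return np.hstack([X, np.ones((len(X), 1))])

# Toy covariate shift: training inputs centered at -1, test inputs at +1.
X_tr = rng.normal(-1.0, 1.0, size=(200, 1))
y_tr = (X_tr[:, 0] > 0).astype(float)
X_te = rng.normal(1.0, 1.0, size=(200, 1))

# Discriminate train (label 0) vs. test (label 1); the odds s/(1-s)
# approximate the density ratio p_test(x)/p_train(x) without modeling
# either density explicitly.
X_all = np.vstack([X_tr, X_te])
d_all = np.hstack([np.zeros(len(X_tr)), np.ones(len(X_te))])
theta_d = fit_logreg(add_bias(X_all), d_all)
s = sigmoid(add_bias(X_tr) @ theta_d)
weights = s / (1.0 - s)

# Final classifier: logistic regression on the training labels, reweighted
# toward the test input distribution.
theta = fit_logreg(add_bias(X_tr), y_tr, w=weights)
```

Training points lying closer to the test region receive larger weights, so the final classifier is tuned to the region where it will actually be evaluated.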
| Author details: | Steffen Bickel, Michael Brueckner, Tobias Scheffer |
|---|---|
| URL: | http://jmlr.csail.mit.edu/ |
| ISSN: | 1532-4435 |
| Publication type: | Scientific article |
| Language: | English |
| Year of first publication: | 2009 |
| Year of publication: | 2009 |
| Release date: | 25.03.2017 |
| Source: | Journal of Machine Learning Research. ISSN 1532-4435. 10 (2009), pp. 2137-2155 |
| Organizational units: | Mathematisch-Naturwissenschaftliche Fakultät / Institut für Informatik und Computational Science |
| Peer review: | Refereed |
| Publication route: | Open Access |
| Name of institution at time of publication: | Mathematisch-Naturwissenschaftliche Fakultät / Institut für Informatik |