
Discriminative learning under covariate shift

  • We address classification problems in which the training instances are governed by an input distribution that may differ arbitrarily from the test distribution, a setting also referred to as classification under covariate shift. We derive a solution that is purely discriminative: neither the training nor the test distribution is modeled explicitly. The problem of learning under covariate shift can be written as an integrated optimization problem. Instantiating the general optimization problem leads to a kernel logistic regression and an exponential model classifier for covariate shift. The optimization problem is convex under certain conditions; our findings also clarify the relationship to the known kernel mean matching procedure. We report on experiments on spam filtering, text classification, and landmine detection problems.

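The following is a minimal sketch of the general idea of discriminative covariate-shift correction, not the paper's integrated joint optimization: a probabilistic classifier that separates training inputs from test inputs yields importance weights proportional to the density ratio p_test(x)/p_train(x) without modeling either density, and a weighted logistic regression is then trained on the labeled data. It assumes scikit-learn and synthetic data purely for illustration.

    # Minimal sketch: two-step discriminative covariate-shift correction.
    # Not the authors' integrated optimization; assumes scikit-learn.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Synthetic covariate shift: training inputs centered at -1, test inputs at +1;
    # the labeling rule p(y | x) is the same in both domains.
    X_train = rng.normal(loc=-1.0, scale=1.5, size=(500, 1))
    y_train = (X_train[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)
    X_test = rng.normal(loc=+1.0, scale=1.5, size=(500, 1))

    # Step 1: discriminate training from test inputs; the predicted odds
    # p(test | x) / p(train | x) are proportional to p_test(x) / p_train(x),
    # so neither distribution is modeled explicitly.
    domain = np.concatenate([np.zeros(len(X_train)), np.ones(len(X_test))])
    domain_clf = LogisticRegression().fit(np.vstack([X_train, X_test]), domain)
    p_test = domain_clf.predict_proba(X_train)[:, 1]
    weights = p_test / (1.0 - p_test)          # importance weights for training points
    weights *= len(weights) / weights.sum()    # normalize to mean 1

    # Step 2: importance-weighted logistic regression on the labeled training data.
    clf = LogisticRegression().fit(X_train, y_train, sample_weight=weights)
    print(clf.predict(X_test[:5]))

The paper's contribution is to fold both steps into one convex (under certain conditions) optimization; the two-step version above only illustrates why a purely discriminative treatment of the shift is possible.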
Metadata
Author details:Steffen Bickel, Michael Brückner, Tobias Scheffer
URL:http://jmlr.csail.mit.edu/
ISSN:1532-4435
Publication type:Article
Language:English
Year of first publication:2009
Publication year:2009
Release date:2017/03/25
Source:Journal of Machine Learning Research. - ISSN 1532-4435. - 10 (2009), pp. 2137 - 2155
Organizational units:Mathematisch-Naturwissenschaftliche Fakultät / Institut für Informatik und Computational Science
Peer review:Refereed
Publishing method:Open Access
Institution name at the time of the publication:Mathematisch-Naturwissenschaftliche Fakultät / Institut für Informatik