
Discriminative learning under covariate shift

  • We address classification problems in which the training instances are governed by an input distribution that is allowed to differ arbitrarily from the test distribution; such problems are also referred to as classification under covariate shift. We derive a solution that is purely discriminative: neither the training nor the test distribution is modeled explicitly. The problem of learning under covariate shift can be written as an integrated optimization problem. Instantiating the general optimization problem leads to a kernel logistic regression and an exponential model classifier for covariate shift. The optimization problem is convex under certain conditions; our findings also clarify the relationship to the known kernel mean matching procedure. We report on experiments on problems of spam filtering, text classification, and landmine detection.
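The sketch below is not the paper's integrated optimization; it is a simplified two-stage variant of the same discriminative idea: rather than estimating the training and test densities separately, a classifier discriminates training from test inputs, and its odds (corrected by the sample-size ratio) serve as importance weights for the final model. Function names such as `covariate_shift_weights` are illustrative placeholders, not part of the paper.

```python
# A minimal two-stage sketch of discriminative covariate-shift correction.
# Assumption: scikit-learn's LogisticRegression as the base learner; the
# paper instead solves a single integrated optimization problem.
import numpy as np
from sklearn.linear_model import LogisticRegression


def covariate_shift_weights(X_train, X_test):
    """Estimate importance weights p_test(x) / p_train(x) discriminatively."""
    X = np.vstack([X_train, X_test])
    # s = 0 marks training inputs, s = 1 marks test inputs.
    s = np.concatenate([np.zeros(len(X_train)), np.ones(len(X_test))])
    selector = LogisticRegression(max_iter=1000).fit(X, s)
    p_test_given_x = selector.predict_proba(X_train)[:, 1]
    # Odds ratio times the sample-size correction approximates the density ratio.
    odds = p_test_given_x / np.clip(1.0 - p_test_given_x, 1e-12, None)
    return odds * (len(X_train) / len(X_test))


def train_shift_aware_classifier(X_train, y_train, X_test):
    """Fit a classifier on labeled training data, reweighted toward the test distribution."""
    weights = covariate_shift_weights(X_train, X_test)
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X_train, y_train, sample_weight=weights)
    return clf
```

In this simplified form the density ratio is never estimated via explicit density models, which mirrors the discriminative stance of the paper; the integrated formulation additionally couples the weight estimation and the final classifier in one convex objective.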

Metadata
Author: Steffen Bickel, Michael Brückner, Tobias Scheffer
URL: http://jmlr.csail.mit.edu/
ISSN: 1532-4435
Document Type: Article
Language: English
Year of First Publication: 2009
Year of Completion: 2009
Release Date: 2017/03/25
Source: Journal of Machine Learning Research. ISSN 1532-4435. 10 (2009), pp. 2137-2155
Organizational Units: Mathematisch-Naturwissenschaftliche Fakultät / Institut für Informatik und Computational Science
Peer Review: Refereed
Publication Type: Open Access
Institution Name at the Time of Publication: Mathematisch-Naturwissenschaftliche Fakultät / Institut für Informatik