TY - GEN
A1 - Blanchard, Gilles
A1 - Scott, Clayton
T1 - Corrigendum to: Classification with asymmetric label noise
BT - Consistency and maximal denoising
T2 - Electronic journal of statistics
N2 - We point out a flaw in Lemma 15 of [1]. We also indicate how the main results of that section are still valid using a modified argument.
Y1 - 2018
U6 - https://doi.org/10.1214/18-EJS1422
SN - 1935-7524
VL - 12
IS - 1
SP - 1779
EP - 1781
PB - Institute of Mathematical Statistics
CY - Cleveland
ER -

TY - JOUR
A1 - Katz-Samuels, Julian
A1 - Blanchard, Gilles
A1 - Scott, Clayton
T1 - Decontamination of Mutual Contamination Models
JF - Journal of machine learning research
N2 - Many machine learning problems can be characterized by mutual contamination models. In these problems, one observes several random samples from different convex combinations of a set of unknown base distributions and the goal is to infer these base distributions. This paper considers the general setting where the base distributions are defined on arbitrary probability spaces. We examine three popular machine learning problems that arise in this general setting: multiclass classification with label noise, demixing of mixed membership models, and classification with partial labels. In each case, we give sufficient conditions for identifiability and present algorithms for the infinite and finite sample settings, with associated performance guarantees.
KW - multiclass classification with label noise
KW - classification with partial labels
KW - mixed membership models
KW - topic modeling
KW - mutual contamination models
Y1 - 2019
UR - http://arxiv.org/abs/1710.01167
SN - 1532-4435
VL - 20
PB - Microtome Publishing
CY - Cambridge, Mass.
ER -

TY - JOUR
A1 - Blanchard, Gilles
A1 - Flaska, Marek
A1 - Handy, Gregory
A1 - Pozzi, Sara
A1 - Scott, Clayton
T1 - Classification with asymmetric label noise: Consistency and maximal denoising
JF - Electronic journal of statistics
N2 - In many real-world classification problems, the labels of training examples are randomly corrupted. Most previous theoretical work on classification with label noise assumes that the two classes are separable, that the label noise is independent of the true class label, or that the noise proportions for each class are known. In this work, we give conditions that are necessary and sufficient for the true class-conditional distributions to be identifiable. These conditions are weaker than those analyzed previously, and allow for the classes to be nonseparable and the noise levels to be asymmetric and unknown. The conditions essentially state that a majority of the observed labels are correct and that the true class-conditional distributions are "mutually irreducible," a concept we introduce that limits the similarity of the two distributions. For any label noise problem, there is a unique pair of true class-conditional distributions satisfying the proposed conditions, and we argue that this pair corresponds in a certain sense to maximal denoising of the observed distributions. Our results are facilitated by a connection to "mixture proportion estimation," which is the problem of estimating the maximal proportion of one distribution that is present in another. We establish a novel rate of convergence result for mixture proportion estimation, and apply this to obtain consistency of a discrimination rule based on surrogate loss minimization. Experimental results on benchmark data and a nuclear particle classification problem demonstrate the efficacy of our approach.
KW - Classification
KW - label noise
KW - mixture proportion estimation
KW - surrogate loss
KW - consistency
Y1 - 2016
U6 - https://doi.org/10.1214/16-EJS1193
SN - 1935-7524
VL - 10
SP - 2780
EP - 2824
PB - Institute of Mathematical Statistics
CY - Cleveland
ER -