A flexible procedure for mixture proportion estimation in positive-unlabeled learning

Zhenfeng Lin, James P. Long

Research output: Contribution to journal › Article › peer-review


Abstract

Positive-unlabeled (PU) learning considers two samples: a positive set P with observations from only one class and an unlabeled set U with observations from two classes. The goal is to classify the observations in U. Class mixture proportion estimation (MPE) in U is a key step in PU learning. Blanchard et al. showed that MPE in PU learning is a generalization of estimating the proportion of true null hypotheses in multiple testing problems. Motivated by this connection, we propose reducing the problem to one dimension by constructing a probabilistic classifier trained on the P and U data sets and then applying a one-dimensional mixture proportion method from the multiple testing literature to the resulting class probabilities. The flexibility of this framework lies in the freedom to choose the classifier and the one-dimensional MPE method. We prove consistency of two mixture proportion estimators using bounds from empirical process theory, develop tuning-parameter-free implementations, and demonstrate that they have competitive performance on simulated waveform data and a protein signaling problem.
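The sketch below is one concrete instance of the two-step procedure the abstract describes: a probabilistic classifier reduces the data to one-dimensional scores, and a one-dimensional estimator from the multiple testing literature is applied to those scores. The classifier choice (logistic regression), the empirical p-value construction, the Storey-style threshold rule with its parameter `lam`, and the function name `estimate_mixture_proportion` are all illustrative assumptions; they are not the paper's exact estimators, which are tuning-parameter free.

```python
# Minimal sketch of the two-step MPE procedure, assuming scikit-learn.
# The classifier and the one-dimensional estimator are illustrative
# stand-ins, not the authors' implementations.
import numpy as np
from sklearn.linear_model import LogisticRegression

def estimate_mixture_proportion(X_pos, X_unl, lam=0.5):
    # Step 1: reduce to one dimension. Train a probabilistic classifier
    # to separate P (label 1) from U (label 0) and record its scores.
    X = np.vstack([X_pos, X_unl])
    y = np.concatenate([np.ones(len(X_pos)), np.zeros(len(X_unl))])
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    s_pos = clf.predict_proba(X_pos)[:, 1]  # scores on the positive set
    s_unl = clf.predict_proba(X_unl)[:, 1]  # scores on the unlabeled set

    # Step 2: apply a one-dimensional MPE method from multiple testing.
    # Converting each unlabeled score to an empirical "p-value" against
    # the positive-score distribution makes the positive component of U
    # approximately Uniform(0, 1), mirroring true null hypotheses.
    p_vals = np.array([(s_pos <= s).mean() for s in s_unl])

    # Storey-style estimate of the uniform (positive) component: negative
    # observations in U tend to receive small p-values, so the region
    # p > lam is dominated by the uniform component.
    alpha_hat = (p_vals > lam).mean() / (1.0 - lam)
    return min(alpha_hat, 1.0)

# Usage on synthetic Gaussian data where U contains 30% positives.
rng = np.random.default_rng(0)
X_pos = rng.normal(2.0, 1.0, size=(500, 1))
X_unl = np.vstack([rng.normal(2.0, 1.0, size=(150, 1)),    # positives
                   rng.normal(-2.0, 1.0, size=(350, 1))])  # negatives
print(estimate_mixture_proportion(X_pos, X_unl))  # roughly 0.3
```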

Original language: English (US)
Pages (from-to): 178-187
Number of pages: 10
Journal: Statistical Analysis and Data Mining
Volume: 13
Issue number: 2
DOIs
State: Published - Apr 1 2020

Keywords

  • PU learning
  • classification
  • empirical processes
  • local false discovery rate
  • mixture proportion estimation
  • multiple testing

ASJC Scopus subject areas

  • Analysis
  • Information Systems
  • Computer Science Applications
