Variable selection in regression via repeated data splitting

Peter F. Thall, Kathy E. Russell, Richard M. Simon

Research output: Contribution to journal › Article › peer-review

12 Scopus citations

Abstract

A new algorithm, backward elimination via repeated data splitting (BERDS), is proposed for variable selection in regression. Initially, the data are partitioned into two sets (E, V), and an exhaustive backward elimination (BE) is performed in E. For each p-value cutoff α used in BE, the corresponding fitted model from E is validated in V by computing the sum of squared deviations of observed from predicted values. This is repeated m times, and the α minimizing the sum of the m sums of squares is used as the cutoff in a final BE on the entire data set. BERDS is a modification of the algorithm BECV proposed by Thall, Simon, and Grier (1992). An extensive simulation study shows that, compared to BECV, BERDS has a smaller model error and higher probabilities of excluding noise variables, of selecting each of several uncorrelated true predictors, and of selecting exactly one of two or three highly correlated true predictors. BERDS is also superior to standard BE with cutoffs .05 or .10, and this superiority increases with the number of noise variables in the data and the degree of correlation among true predictors. An application is provided for illustration.
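The abstract's description of BERDS can be sketched in code. The following is a minimal illustration, not the authors' implementation: it evaluates a user-supplied grid of candidate cutoffs α rather than the exhaustive sequence of cutoffs produced by BE in E, and all function names, the split fraction, and the choice of m are assumptions made here for concreteness.

```python
import numpy as np
from scipy import stats

def ols_pvalues(X, y):
    """Fit OLS with an intercept and return two-sided t-test p-values
    for the predictor columns of X (intercept excluded)."""
    n = X.shape[0]
    Xd = np.column_stack([np.ones(n), X])
    beta, _, _, _ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    df = n - Xd.shape[1]
    sigma2 = resid @ resid / df
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(Xd.T @ Xd)))
    pvals = 2 * stats.t.sf(np.abs(beta / se), df)
    return pvals[1:]  # drop the intercept's p-value

def backward_eliminate(X, y, alpha):
    """Standard BE: repeatedly drop the predictor with the largest
    p-value exceeding alpha. Returns indices of retained columns."""
    keep = list(range(X.shape[1]))
    while keep:
        p = ols_pvalues(X[:, keep], y)
        worst = int(np.argmax(p))
        if p[worst] <= alpha:
            break
        keep.pop(worst)
    return keep

def berds(X, y, alphas, m=10, split=0.5, seed=0):
    """BERDS sketch: for m random (E, V) splits, run BE in E at each
    candidate cutoff and score the fitted model by its sum of squared
    prediction errors in V; the cutoff with the smallest total SSE
    over the m splits is then used for a final BE on the full data."""
    rng = np.random.default_rng(seed)
    n = len(y)
    n_e = int(round(split * n))
    total_sse = np.zeros(len(alphas))
    for _ in range(m):
        perm = rng.permutation(n)
        E, V = perm[:n_e], perm[n_e:]
        for j, a in enumerate(alphas):
            keep = backward_eliminate(X[E], y[E], a)
            # Refit the selected model in E, then predict in V.
            Xe = np.column_stack([np.ones(len(E)), X[np.ix_(E, keep)]])
            beta, _, _, _ = np.linalg.lstsq(Xe, y[E], rcond=None)
            Xv = np.column_stack([np.ones(len(V)), X[np.ix_(V, keep)]])
            total_sse[j] += np.sum((y[V] - Xv @ beta) ** 2)
    best_alpha = alphas[int(np.argmin(total_sse))]
    return backward_eliminate(X, y, best_alpha), best_alpha
```

On synthetic data with a few strong true predictors plus noise columns, `berds` typically retains the true predictors at whichever cutoff wins the validation step; the cutoff grid, m, and split fraction would all need tuning in practice.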

Original language: English (US)
Pages (from-to): 416-434
Number of pages: 19
Journal: Journal of Computational and Graphical Statistics
Volume: 6
Issue number: 4
DOIs
State: Published - Dec 1997

Keywords

  • Cross validation
  • Data splitting
  • Monte Carlo simulation
  • Regression
  • Variable selection

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
  • Discrete Mathematics and Combinatorics
