Spatially Weighted Principal Component Regression for high-dimensional prediction

Dan Shen, Hongtu Zhu

Research output: Contribution to journal › Conference article › peer-review

4 Scopus citations

Abstract

We consider the problem of using high-dimensional data residing on graphs to predict a low-dimensional outcome variable, such as disease status. Examples include time series and genetic data measured on linear graphs, and imaging data measured on triangulated graphs (or lattices), among many others. Many of these data share two key features: spatial smoothness and an intrinsically low-dimensional structure. We propose a simple solution based on a general statistical framework, called spatially weighted principal component regression (SWPCR). In SWPCR, we introduce two sets of weights: importance score weights for the selection of individual features at each node, and spatial weights for incorporating the neighboring pattern on the graph. We integrate the importance score weights with the spatial weights in order to recover the low-dimensional structure of the high-dimensional data. We demonstrate the utility of our methods through extensive simulations and a real data analysis based on Alzheimer's Disease Neuroimaging Initiative data.
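The abstract's core idea — combining per-feature importance scores with graph-based spatial weights before a principal component regression — can be illustrated with a minimal sketch. This is not the paper's algorithm: the choice of correlation-based importance scores, the adjacency-smoothing of weights, the mixing parameter `alpha`, and the function name `swpcr_sketch` are all assumptions made for illustration.

```python
import numpy as np

def swpcr_sketch(X, y, adjacency, n_components=2, alpha=0.5):
    """Illustrative sketch of spatially weighted PCR (not the paper's exact method).

    X          : (n_samples, n_features) data, one feature per graph node
    y          : (n_samples,) outcome variable
    adjacency  : (n_features, n_features) row-normalized graph adjacency (assumed input)
    alpha      : assumed mixing parameter between raw and spatially smoothed weights
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # Importance score per feature: absolute correlation with the outcome
    denom = Xc.std(axis=0) * yc.std() + 1e-12
    importance = np.abs((Xc * yc[:, None]).mean(axis=0) / denom)
    # Spatial weights: smooth the importance scores over the graph neighborhood
    spatial = adjacency @ importance
    w = (1 - alpha) * importance + alpha * spatial
    # Weighted PCA: rescale features by the combined weights, then take the SVD
    Xw = Xc * w
    U, S, _ = np.linalg.svd(Xw, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    # Regress the outcome on the leading weighted components (least squares)
    A = np.column_stack([np.ones(len(y)), scores])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta, scores, w
```

A usage example on a linear (chain) graph: build a tridiagonal adjacency matrix, row-normalize it, and fit the sketch; the spatial smoothing spreads a feature's importance to its graph neighbors before the PCA step.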

Original language: English (US)
Pages (from-to): 758-769
Number of pages: 12
Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 9123
DOIs
State: Published - 2015
Event: 24th International Conference on Information Processing in Medical Imaging, IPMI 2015 - Isle of Skye, United Kingdom
Duration: Jun 28, 2015 - Jul 3, 2015

Keywords

  • Graph
  • Principal component analysis
  • Regression
  • Spatial
  • Supervise
  • Weight

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
