Dimensionality reduction based on non-parametric mutual information

Lev Faivishevsky, Jacob Goldberger

Research output: Contribution to journal › Article › peer-review


Abstract

In this paper we introduce a supervised linear dimensionality reduction algorithm that finds a projected input space maximizing the mutual information between input and output values. The algorithm utilizes the recently introduced MeanNN estimator for differential entropy, and we show that this estimator is an appropriate tool for the dimensionality reduction task. We then provide a nonlinear regression algorithm based on the proposed dimensionality reduction approach. The regression algorithm achieves performance comparable to the state of the art on standard data sets while being three orders of magnitude faster. In addition, we describe applications of the proposed dimensionality reduction algorithm to reduced-complexity supervised and semisupervised classification tasks.
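As a rough illustration only (not the authors' implementation), the sketch below shows how a MeanNN-style entropy estimate could drive a mutual-information objective for a linear projection in the classification setting, where I(Ax; y) = H(Ax) − Σ_c p(c) H(Ax | y = c). The estimator form, d/(n(n−1)) Σ_{i≠j} log‖x_i − x_j‖ up to additive constants that do not depend on the projection, follows the MeanNN estimator the abstract refers to; the function names, the use of SciPy's L-BFGS-B optimizer, and the synthetic data are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def meannn_entropy(X, eps=1e-10):
    """MeanNN differential-entropy estimate of an (n, d) sample,
    up to additive constants that do not depend on the data scale or projection."""
    n, d = X.shape
    # pairwise Euclidean distances over all i < j
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    iu = np.triu_indices(n, k=1)
    dists = np.sqrt(sq[iu]) + eps
    # d / (n(n-1)) * sum_{i != j} log ||x_i - x_j|| ; the factor 2 restores i != j from i < j
    return 2.0 * d / (n * (n - 1)) * np.sum(np.log(dists))

def neg_mutual_information(a_flat, X, y, q):
    """Negative of I(Ax; y) = H(Ax) - sum_c p(c) H(Ax | y = c) for discrete labels y.
    Assumes every class has at least two samples."""
    d = X.shape[1]
    A = a_flat.reshape(q, d)
    Z = X @ A.T                      # projected data, shape (n, q)
    mi = meannn_entropy(Z)
    for c in np.unique(y):
        Zc = Z[y == c]
        mi -= (len(Zc) / len(Z)) * meannn_entropy(Zc)
    return -mi

# Usage sketch on synthetic data (hypothetical example, not from the paper)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = (X[:, 0] + 0.1 * rng.normal(size=100) > 0).astype(int)
q = 2                                # target dimension
A0 = rng.normal(size=(q, X.shape[1])).ravel()
res = minimize(neg_mutual_information, A0, args=(X, y, q), method="L-BFGS-B")
A = res.x.reshape(q, X.shape[1])
Z = X @ A.T                          # reduced-dimension representation
```

Note that the mutual-information objective is invariant to rescaling of the projection matrix, since a common scale shifts H(Ax) and each H(Ax | y = c) by the same additive amount; the paper's actual optimization procedure may differ from the generic quasi-Newton solver used here.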

Original language: English
Pages (from-to): 31-37
Number of pages: 7
Journal: Neurocomputing
Volume: 80
DOIs
State: Published - 15 Mar 2012

Keywords

  • Classification
  • Dimensionality reduction
  • Regression
  • Semisupervised learning

