Mutual information based dimensionality reduction with application to non-linear regression

Lev Faivishevsky, Jacob Goldberger

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

In this paper we introduce a supervised linear dimensionality reduction algorithm that finds a projection of the input space maximizing the mutual information between the projected input and the output values. The algorithm utilizes the recently introduced MeanNN estimator for differential entropy. We show that this estimator is an appropriate tool for the dimensionality reduction task. We then present a nonlinear regression algorithm based on the proposed dimensionality reduction approach. The regression algorithm achieves performance comparable to the state of the art on standard datasets while being three orders of magnitude faster. In addition, we demonstrate an application of the proposed dimensionality reduction algorithm to reduced-complexity classification.
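
The following is a minimal, hypothetical sketch (written for this record, not the authors' code) of the idea the abstract describes: learn a linear projection A that maximizes the mutual information I(XA; y) = H(XA) + H(y) - H(XA, y), with each differential entropy term estimated by the smooth MeanNN estimator, H(Z) ≈ d/(N(N-1)) · Σ_{i≠j} log ||z_i − z_j|| up to an additive constant. The function names, the plain numerical-gradient ascent loop, and all parameter choices below are illustrative assumptions, not the paper's actual procedure.

# Hypothetical sketch of MI-based linear dimensionality reduction with the
# MeanNN entropy estimator; not the authors' implementation.
import numpy as np

def mean_nn_entropy(Z, eps=1e-10):
    """MeanNN differential entropy estimate of the rows of Z (up to a constant)."""
    n, d = Z.shape
    diff = Z[:, None, :] - Z[None, :, :]        # pairwise differences
    dist = np.sqrt((diff ** 2).sum(-1) + eps)   # pairwise Euclidean distances
    mask = ~np.eye(n, dtype=bool)               # exclude the i == j terms
    return d / (n * (n - 1)) * np.log(dist[mask]).sum()

def mi_objective(A, X, y):
    """I(XA; y) up to constants: H(XA) + H(y) - H(XA, y)."""
    Z = X @ A
    Y = y.reshape(-1, 1)
    return mean_nn_entropy(Z) + mean_nn_entropy(Y) - mean_nn_entropy(np.hstack([Z, Y]))

def fit_projection(X, y, k=2, steps=200, lr=1e-2, h=1e-5, seed=0):
    """Crude central-difference gradient ascent on the MI objective; a real
    implementation would use the analytic gradient of the smooth MeanNN estimator."""
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(X.shape[1], k))
    for _ in range(steps):
        G = np.zeros_like(A)
        for i in range(A.shape[0]):
            for j in range(A.shape[1]):
                E = np.zeros_like(A)
                E[i, j] = h
                G[i, j] = (mi_objective(A + E, X, y) - mi_objective(A - E, X, y)) / (2 * h)
        A += lr * G
    return A

The MeanNN estimator is a smooth function of all pairwise distances, which is what makes gradient-based optimization of the projection matrix feasible; nonlinear regression can then be performed in the reduced space, for example with a nearest-neighbor or kernel regressor, though the specific regressor used in the paper is not stated in this record.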

Original language: English
Title of host publication: Proceedings of the 2010 IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2010
Pages: 1-6
Number of pages: 6
DOIs
State: Published - 2010
Event: 2010 IEEE 20th International Workshop on Machine Learning for Signal Processing, MLSP 2010 - Kittila, Finland
Duration: 29 Aug 2010 – 1 Sep 2010

Publication series

Name: Proceedings of the 2010 IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2010

Conference

Conference: 2010 IEEE 20th International Workshop on Machine Learning for Signal Processing, MLSP 2010
Country/Territory: Finland
City: Kittila
Period: 29/08/10 – 1/09/10
