Neighbourhood components analysis

Jacob Goldberger, Sam Roweis, Geoff Hinton, Ruslan Salakhutdinov

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

800 Scopus citations

Abstract

In this paper we propose a novel method for learning a Mahalanobis distance measure to be used in the KNN classification algorithm. The algorithm directly maximizes a stochastic variant of the leave-one-out KNN score on the training set. It can also learn a low-dimensional linear embedding of labeled data that can be used for data visualization and fast classification. Unlike other methods, our classification model is non-parametric, making no assumptions about the shape of the class distributions or the boundaries between them. The performance of the method is demonstrated on several data sets, both for metric learning and linear dimensionality reduction.
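The stochastic leave-one-out KNN score described in the abstract can be sketched as follows: each point i softly selects a neighbour j with probability proportional to exp(-||Axᵢ - Axⱼ||²), and the objective is the expected number of points whose selected neighbour shares their label. This NumPy sketch is illustrative only; the function name `nca_objective` and the toy data are my own, not from the paper.

```python
import numpy as np

def nca_objective(A, X, y):
    """Stochastic leave-one-out KNN score for a linear transform A (sketch).

    A: (k, d) projection matrix, X: (n, d) data, y: (n,) labels.
    Returns the expected number of correctly classified points.
    """
    Z = X @ A.T                                           # project the data
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)   # squared distances
    np.fill_diagonal(d2, np.inf)                          # leave-one-out: i never picks itself
    expd = np.exp(-d2)
    p = expd / expd.sum(axis=1, keepdims=True)            # p_ij: prob. that i selects j
    same_class = (y[:, None] == y[None, :])
    return (p * same_class).sum()                         # sum_i p_i

# Toy example: two tight, well-separated classes.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
y = np.array([0, 0, 1, 1])
score = nca_objective(np.eye(2), X, y)                    # close to n = 4
```

Maximizing this score with respect to A (e.g. by gradient ascent) yields the learned Mahalanobis metric AᵀA; choosing A with fewer rows than columns gives the low-dimensional embedding mentioned above.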

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 17 - Proceedings of the 2004 Conference, NIPS 2004
Publisher: Neural Information Processing Systems Foundation
ISBN (Print): 0262195348, 9780262195348
State: Published - 2005
Externally published: Yes
Event: 18th Annual Conference on Neural Information Processing Systems, NIPS 2004 - Vancouver, BC, Canada
Duration: 13 Dec 2004 → 16 Dec 2004

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Conference

Conference: 18th Annual Conference on Neural Information Processing Systems, NIPS 2004
Country/Territory: Canada
City: Vancouver, BC
Period: 13/12/04 → 16/12/04
