Large margin hierarchical classification

Ofer Dekel, Joseph Keshet, Yoram Singer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

152 Scopus citations

Abstract

We present an algorithmic framework for supervised classification learning where the set of labels is organized in a predefined hierarchical structure. This structure is encoded by a rooted tree which induces a metric over the label set. Our approach combines ideas from large margin kernel methods and Bayesian analysis. Following the large margin principle, we associate a prototype with each label in the tree and formulate the learning task as an optimization problem with varying margin constraints. In the spirit of Bayesian methods, we impose similarity requirements between the prototypes corresponding to adjacent labels in the hierarchy. We describe new online and batch algorithms for solving the constrained optimization problem. We derive a worst-case loss bound for the online algorithm and provide a generalization analysis for its batch counterpart. We demonstrate the merits of our approach with a series of experiments on synthetic, text, and speech data.
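The abstract describes the label hierarchy only informally. The Python sketch below is a hedged illustration of the two ingredients it mentions: a tree-induced metric over the labels (number of edges between two labels) and prototypes that are tied together along the tree, realized here by summing a per-node vector over each label's root-to-label path. This is one plausible reading of the similarity requirement between adjacent labels, not the paper's reference implementation; all names (LabelTree, node_vectors, predict) are hypothetical.

```python
import numpy as np


class LabelTree:
    """Toy label hierarchy: each node carries its own vector, and a label's
    prototype is the sum of the vectors on its root-to-label path, so
    prototypes of adjacent labels differ by a single per-node vector."""

    def __init__(self, parents, dim, seed=0):
        # parents[v] is the parent of node v; the root has parent -1.
        self.parents = parents
        rng = np.random.default_rng(seed)
        self.node_vectors = rng.normal(size=(len(parents), dim))

    def path_to_root(self, v):
        # Nodes on the path from v up to (and including) the root.
        path = []
        while v != -1:
            path.append(v)
            v = self.parents[v]
        return path

    def prototype(self, v):
        # Prototype of label v: sum of the node vectors along its root path.
        return self.node_vectors[self.path_to_root(v)].sum(axis=0)

    def tree_distance(self, u, v):
        # Tree-induced metric: number of edges on the path between u and v,
        # via the symmetric difference of the two root paths.
        return len(set(self.path_to_root(u)) ^ set(self.path_to_root(v)))

    def predict(self, x):
        # Predict the label whose prototype has the largest inner product with x.
        scores = [self.prototype(v) @ x for v in range(len(self.parents))]
        return int(np.argmax(scores))


# Hypothetical 5-label hierarchy: 0 is the root, 1 and 2 are its children,
# and 3 and 4 are children of 1.
tree = LabelTree(parents=[-1, 0, 0, 1, 1], dim=5)
x = np.ones(5)
print(tree.predict(x))           # highest-scoring label for input x
print(tree.tree_distance(3, 2))  # 3 edges: 3 -> 1 -> 0 -> 2
```

In such a construction, the tree distance could supply the varying margin required between a correct label and an incorrect one, while the shared path vectors keep prototypes of nearby labels similar.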

Original language: English
Title of host publication: Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004
Editors: R. Greiner, D. Schuurmans
Pages: 209-216
Number of pages: 8
State: Published - 2004
Externally published: Yes
Event: Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004 - Banff, Alta, Canada
Duration: 4 Jul 2004 - 8 Jul 2004

Publication series

Name: Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004

Conference

Conference: Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004
Country/Territory: Canada
City: Banff, Alta
Period: 4/07/04 - 8/07/04
