Abstract
This paper addresses the problem of learning kernel operators from empirical data. We cast kernel design as the construction of an accurate kernel from simple (and less accurate) base kernels, and we use the boosting paradigm to carry out this construction. To do so, we modify the booster to accommodate kernel operators, and we devise an efficient weak learner for simple kernels based on generalized eigenvector decomposition. We demonstrate the effectiveness of our approach on synthetic data and on the USPS dataset, where the performance of the Perceptron algorithm with learned kernels is systematically better than with a fixed RBF kernel.
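The abstract describes the approach only at a high level. The following is a minimal, hypothetical sketch of what such a boosting-style kernel construction could look like in Python with NumPy; it is not the paper's actual algorithm. The kernel-alignment selection criterion, the RBF base-kernel family, the uniform weight updates, and all function names below are assumptions made purely for illustration.

```python
# Illustrative sketch only: greedily combine simple base kernels into a
# stronger kernel, then train a dual (kernel) Perceptron on the result.
# The alignment criterion and weighting scheme are assumptions, not the
# paper's method.
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Base kernel: Gaussian RBF with width parameter gamma."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_alignment(K, y):
    """Cosine similarity between a kernel matrix and the label outer product."""
    Y = np.outer(y, y)
    return (K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y))

def boost_kernel(X, y, gammas, n_rounds=10):
    """At each round, add the base kernel that most improves alignment."""
    n = len(X)
    K = np.zeros((n, n))
    weights = np.zeros(len(gammas))
    base = [rbf_kernel(X, X, g) for g in gammas]
    for _ in range(n_rounds):
        scores = [kernel_alignment(K + Kb, y) for Kb in base]
        j = int(np.argmax(scores))
        K += base[j]
        weights[j] += 1.0
    return weights / weights.sum()

def kernel_perceptron(K, y, epochs=5):
    """Dual Perceptron trained on a precomputed kernel matrix."""
    alpha = np.zeros(len(y))
    for _ in range(epochs):
        for i in range(len(y)):
            # Prediction for example i: sum_j alpha_j * y_j * K(x_j, x_i)
            if np.sign(alpha @ (K[:, i] * y) + 1e-12) != y[i]:
                alpha[i] += 1.0
    return alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] * X[:, 1] > 0, 1.0, -1.0)   # synthetic labels
    gammas = [0.1, 1.0, 10.0]                        # candidate base-kernel widths
    w = boost_kernel(X, y, gammas)
    K = sum(wi * rbf_kernel(X, X, g) for wi, g in zip(w, gammas))
    alpha = kernel_perceptron(K, y)
    preds = np.sign((alpha * y) @ K)
    print("learned weights:", w, "train accuracy:", (preds == y).mean())
```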
| Original language | English |
| --- | --- |
| Title of host publication | NIPS 2002 |
| Subtitle of host publication | Proceedings of the 15th International Conference on Neural Information Processing Systems |
| Editors | Suzanna Becker, Sebastian Thrun, Klaus Obermayer |
| Publisher | MIT Press Journals |
| Pages | 537-544 |
| Number of pages | 8 |
| ISBN (Electronic) | 0262025507, 9780262025508 |
| State | Published - 2002 |
| Externally published | Yes |
| Event | 15th International Conference on Neural Information Processing Systems, NIPS 2002 - Vancouver, Canada. Duration: 9 Dec 2002 → 14 Dec 2002 |
Publication series
| Name | NIPS 2002: Proceedings of the 15th International Conference on Neural Information Processing Systems |
| --- | --- |
Conference
| Conference | 15th International Conference on Neural Information Processing Systems, NIPS 2002 |
| --- | --- |
| Country/Territory | Canada |
| City | Vancouver |
| Period | 9/12/02 → 14/12/02 |
Bibliographical note
Publisher Copyright: © NIPS 2002: Proceedings of the 15th International Conference on Neural Information Processing Systems. All rights reserved.
Funding
Special thanks to Cyril Goutte and to John Shawe-Taylor for pointing out the connection to the generalized eigenvector problem. Thanks also to the anonymous reviewers for their constructive comments.