TY - GEN
T1 - Boosting classifiers with tightened L0-relaxation penalties
AU - Goldberg, Noam
AU - Eckstein, Jonathan
PY - 2010
Y1 - 2010
N2 - We propose a novel boosting algorithm that improves on current algorithms for weighted voting classification by striking a better balance between classification accuracy and the sparsity of the weight vector. To justify our optimization formulations, we first consider a novel integer linear program as a model for sparse classifier selection, generalizing the minimum disagreement half-space problem, whose complexity has been investigated in computational learning theory. Specifically, our mixed-integer problem is that of finding a separating hyperplane with minimum empirical error subject to an L0-norm penalty. We note that common "soft margin" linear programming formulations for robust classification are equivalent to the continuous relaxation of our formulation. Because this initial continuous relaxation is weak, we suggest a tighter relaxation, using novel cutting planes, to better approximate the integer solution. To solve the tightened relaxation, we develop a boosting algorithm based on linear programming with dynamic generation of variables and constraints. We demonstrate the classification performance of the proposed algorithm with experimental results, and justify our selection of parameters using a minimum description length (compression) interpretation of learning.
UR - http://www.scopus.com/inward/record.url?scp=77956537433&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:77956537433
SN - 9781605589077
T3 - ICML 2010 - Proceedings, 27th International Conference on Machine Learning
SP - 383
EP - 390
BT - ICML 2010 - Proceedings, 27th International Conference on Machine Learning
T2 - 27th International Conference on Machine Learning, ICML 2010
Y2 - 21 June 2010 through 25 June 2010
ER -
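
For orientation, a minimal sketch of the kind of L0-penalized minimum-empirical-error mixed-integer program the abstract describes (this is an illustrative reconstruction, not the authors' exact model; the symbols m, n, C, M, h_j, xi, and v are all assumptions introduced here). Base classifier outputs h_j(x_i) and labels y_i take values in {-1, +1}, C > 0 weights the sparsity penalty, and M is a suitably large constant:

% Hedged sketch: error indicators xi_i, support indicators v_j
% (so that sum_j v_j plays the role of the L0 norm of w).
\begin{aligned}
\min_{w,\,\xi,\,v}\quad & \sum_{i=1}^{m} \xi_i \;+\; C \sum_{j=1}^{n} v_j \\
\text{s.t.}\quad & y_i \sum_{j=1}^{n} w_j\, h_j(x_i) \;\ge\; 1 - M \xi_i, \qquad i = 1,\dots,m,\\
& 0 \le w_j \le v_j, \qquad j = 1,\dots,n,\\
& \xi_i \in \{0,1\},\quad v_j \in \{0,1\}.
\end{aligned}

Relaxing xi and v to [0,1] in a formulation of this shape yields a soft-margin-style linear program, consistent with the equivalence the abstract notes, and generating the columns j dynamically corresponds to the boosting interpretation of the LP.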