TY - GEN
T1 - Boosting Classifiers with Tightened L0-Relaxation Penalties
AU - Goldberg, Noam
AU - Eckstein, Jonathan
N1 - Place of conference: Haifa, Israel
PY - 2010
Y1 - 2010
AB - We propose a novel boosting algorithm which improves on current algorithms for weighted voting classification by striking a better balance between classification accuracy and the sparsity of the weight vector. In order to justify our optimization formulations, we first consider a novel integer linear program as a model for sparse classifier selection, generalizing the minimum disagreement halfspace problem whose complexity has been investigated in computational learning theory. Specifically, our mixed integer problem is that of finding a separating hyperplane with minimum empirical error subject to an L0-norm penalty. We note that common “soft margin” linear programming formulations for robust classification are equivalent to the continuous relaxation of our formulation. Since the initial continuous relaxation is weak, we suggest a tighter relaxation, using novel cutting planes, to better approximate the integer solution. To solve this relaxation, we propose a new boosting algorithm based on linear programming with dynamic generation of variables and constraints. We demonstrate the classification performance of our proposed algorithm with experimental results, and justify our selection of parameters using a minimum description length, compression interpretation of learning.
UR - https://scholar.google.co.il/scholar?q=Boosting+Classifiers+with+Tightened+L0-Relaxation+Penalties&btnG=&hl=en&as_sdt=0%2C5
M3 - Conference contribution
BT - Proceedings of the 27th International Conference on Machine Learning (ICML-10)
ER -