Abstract:
We study strategies for feature selection with the sparse support vector machine (SVM). Recently, the so-called $L_p$-SVM ($0<p<1$) has attracted much attention because it can induce better sparsity than the widely used $L_1$-SVM. However, the $L_p$-SVM is a non-convex and non-Lipschitz optimization problem, and solving it numerically is challenging. In this paper, we reformulate the $L_p$-SVM as an optimization model with a linear objective function and smooth constraints (LOSC-SVM), so that it can be solved by numerical methods for smooth constrained optimization. Our experiments on artificial datasets show that the LOSC-SVM ($0<p<1$) can improve both feature selection and classification performance when the parameter $p$ is chosen suitably. We also apply it to several real-life datasets; the experimental results show that it outperforms the $L_1$-SVM.
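The abstract does not spell out the reformulation itself; the sketch below illustrates one plausible linear-objective, smooth-constraint rewriting consistent with its description, not necessarily the authors' exact LOSC-SVM model. It assumes the $L_p$ penalty $\sum_i |w_i|^p$ is replaced by auxiliary variables $s_i \ge |w_i|^p$, written as the smooth constraints $w_i^2 \le s_i^{2/p}$ (smooth for $0<p<1$ since $2/p>2$), and uses SciPy's SLSQP solver; the function name losc_svm_fit and all parameter defaults are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def losc_svm_fit(X, y, p=0.5, C=1.0):
    """Sketch of a smooth-constrained reformulation of the L_p-SVM.

    Minimize  sum(s) + C * sum(xi)          (linear objective)
    s.t.      y_i (w . x_i + b) >= 1 - xi_i (margin constraints)
              w_i^2 <= s_i^(2/p)            (equivalent to s_i >= |w_i|^p)
              s >= 0, xi >= 0.
    """
    n, d = X.shape

    # Variable vector z = [w (d), b (1), s (d), xi (n)].
    def unpack(z):
        return z[:d], z[d], z[d + 1:2 * d + 1], z[2 * d + 1:]

    def obj(z):
        _, _, s, xi = unpack(z)
        return s.sum() + C * xi.sum()

    cons = [
        # Margin constraints: y_i (w . x_i + b) - 1 + xi_i >= 0.
        {"type": "ineq",
         "fun": lambda z: y * (X @ unpack(z)[0] + unpack(z)[1]) - 1 + unpack(z)[3]},
        # Sparsity constraints: s_i^(2/p) - w_i^2 >= 0 (smooth since 2/p > 2).
        {"type": "ineq",
         "fun": lambda z: unpack(z)[2] ** (2.0 / p) - unpack(z)[0] ** 2},
    ]
    bounds = ([(None, None)] * (d + 1)   # w, b free
              + [(0, None)] * d          # s >= 0
              + [(0, None)] * n)         # xi >= 0

    z0 = np.full(2 * d + 1 + n, 0.1)     # simple feasible-ish start
    res = minimize(obj, z0, method="SLSQP", constraints=cons, bounds=bounds)
    w, b, _, _ = unpack(res.x)
    return w, b
```

At the optimum each $s_i$ is driven down to $|w_i|^p$, so the linear objective reproduces the $L_p$ penalty; a general-purpose SQP solver only ever evaluates smooth constraint functions, which is the point of the reformulation.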