
Sparse Support Vector Machine with Lp Penalty for Feature Selection

Abstract: We study feature selection with the sparse support vector machine (SVM). Recently, the so-called Lp-SVM (0 < p < 1) has attracted much attention because it yields sparser solutions than the widely used L1-SVM. However, Lp-SVM is a non-convex and non-Lipschitz optimization problem, and designing algorithms for it is challenging. In this paper, we reformulate Lp-SVM (0 < p < 1) as an equivalent model with a linear objective function and smooth constraints (LOSC-SVM), so that it can be solved by mature methods for smooth constrained optimization. Numerical experiments on artificial datasets validate the model and show that both feature selection and classification performance can be improved by choosing a suitable regularization order p. Experiments on real-life datasets show that LOSC-SVM outperforms L1-SVM in both feature selection and classification.

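To make the reformulation idea concrete, the sketch below first states the standard Lp-regularized soft-margin SVM primal and then one possible epigraph-style rewriting with a linear objective and smooth constraints. The notation (regularization weight λ, slacks ξ_i, auxiliary variables t_j) is assumed here for illustration; the exact LOSC-SVM model is the one defined in the paper and may differ in detail.

\[
\min_{w,\,b,\,\xi}\ \sum_{i=1}^{n}\xi_i \;+\; \lambda\sum_{j=1}^{d}|w_j|^{p}
\qquad\text{s.t.}\quad y_i\bigl(w^{\top}x_i + b\bigr)\ \ge\ 1-\xi_i,\quad \xi_i\ \ge\ 0.
\]

Replacing each penalty term |w_j|^p by an auxiliary variable t_j with |w_j|^p ≤ t_j (equivalently |w_j| ≤ t_j^{1/p}, which is continuously differentiable in t_j for 0 < p < 1) gives a linear objective with smooth constraints:

\[
\min_{w,\,b,\,\xi,\,t}\ \sum_{i=1}^{n}\xi_i \;+\; \lambda\sum_{j=1}^{d}t_j
\qquad\text{s.t.}\quad y_i\bigl(w^{\top}x_i + b\bigr)\ \ge\ 1-\xi_i,\quad \xi_i\ \ge\ 0,\quad
-\,t_j^{1/p}\ \le\ w_j\ \le\ t_j^{1/p},\quad t_j\ \ge\ 0.
\]

At an optimum the auxiliary variables satisfy t_j = |w_j|^p, so the two problems share the same optimal (w, b).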
