Journal of Computer Science and Technology 2017, Vol. 32, Issue 1: 68-77    DOI: 10.1007/s11390-017-1706-2
Data Management and Data Mining
Sparse Support Vector Machine with Lp Penalty for Feature Selection
Lan Yao1, Feng Zeng2,*, Member, CCF, Dong-Hui Li3, and Zhi-Gang Chen2, Senior Member, CCF
1 College of Mathematics and Econometrics, Hunan University, Changsha 410082, China;
2 School of Software, Central South University, Changsha 410083, China;
3 School of Mathematical Sciences, South China Normal University, Guangzhou 510631, China

Abstract: We study strategies for feature selection with the sparse support vector machine (SVM). Recently, the so-called Lp-SVM (0 < p < 1) has attracted much attention because it can encourage better sparsity than the widely used L1-SVM. However, Lp-SVM is a non-convex and non-Lipschitz optimization problem, and solving it numerically is challenging. In this paper, we reformulate the Lp-SVM as an optimization model with a linear objective function and smooth constraints (LOSC-SVM), so that it can be solved by numerical methods for smooth constrained optimization. Our experiments on artificial datasets show that LOSC-SVM (0 < p < 1) can improve both feature selection and classification performance by choosing a suitable regularization order p. We also apply it to several real-life datasets, and the experimental results show that it is superior to L1-SVM.
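The abstract does not state the optimization models explicitly, so the following is a hedged sketch for orientation rather than the paper's exact formulation. The Lp-penalized soft-margin SVM is conventionally written as

\[
\begin{aligned}
\min_{w,b,\xi}\quad & \|w\|_p^p + C\sum_{i=1}^{m}\xi_i \;=\; \sum_{j=1}^{n}|w_j|^p + C\sum_{i=1}^{m}\xi_i\\
\text{s.t.}\quad & y_i\,(w^{\top}x_i + b) \ge 1-\xi_i,\qquad \xi_i \ge 0,\quad i=1,\dots,m,
\end{aligned}
\]

and one way to obtain a linear objective with smooth constraints, in the spirit of LOSC-SVM but not quoted from the paper, is to introduce auxiliary variables s_j with s_j >= |w_j|^p, expressed through the smooth inequality s_j^{2/p} >= w_j^2:

\[
\begin{aligned}
\min_{w,b,\xi,s}\quad & \sum_{j=1}^{n}s_j + C\sum_{i=1}^{m}\xi_i\\
\text{s.t.}\quad & y_i\,(w^{\top}x_i + b) \ge 1-\xi_i,\qquad \xi_i \ge 0,\quad i=1,\dots,m,\\
& s_j^{2/p} \ge w_j^{2},\qquad s_j \ge 0,\quad j=1,\dots,n.
\end{aligned}
\]

Here the objective is linear and the constraint functions are smooth for 0 < p < 1 (since 2/p > 2), and at any minimizer s_j = |w_j|^p, so the Lp penalty is recovered.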
Keywords: machine learning; feature selection; support vector machine; Lp-regularization
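As a self-contained illustration of how such a smooth-constraint reformulation can be handed to an off-the-shelf smooth constrained optimizer (this is not the authors' solver; the variable split s_j^{2/p} >= w_j^2 is the same assumption as in the sketch above), a minimal Python example using scipy.optimize.minimize with SLSQP might look like this:

# Illustrative sketch only: an Lp-SVM (0 < p < 1) solved via a smooth-constraint
# reformulation with a general-purpose optimizer. The exact LOSC-SVM model and
# solver used in the paper are not given in this abstract; the split
# s_j >= |w_j|**p (written smoothly as s_j**(2/p) >= w_j**2) is an assumption.
import numpy as np
from scipy.optimize import minimize

def lp_svm_smooth(X, y, p=0.5, C=1.0):
    """Fit a linear SVM with an Lp (0 < p < 1) penalty via a smooth reformulation.

    Variables are packed as z = [w (n), b (1), xi (m), s (n)].
    Objective (linear):   sum(s) + C * sum(xi)
    Constraints (smooth): y_i (w.x_i + b) >= 1 - xi_i
                          s_j**(2/p) >= w_j**2   (i.e. s_j >= |w_j|**p)
                          xi >= 0,  s >= 0
    """
    m, n = X.shape

    def unpack(z):
        return z[:n], z[n], z[n + 1:n + 1 + m], z[n + 1 + m:]

    def objective(z):
        _, _, xi, s = unpack(z)
        return np.sum(s) + C * np.sum(xi)

    def margin_con(z):   # y_i (w.x_i + b) - 1 + xi_i >= 0
        w, b, xi, _ = unpack(z)
        return y * (X @ w + b) - 1.0 + xi

    def lp_con(z):       # s_j**(2/p) - w_j**2 >= 0
        w, _, _, s = unpack(z)
        s = np.clip(s, 0.0, None)   # guard against tiny negative iterates
        return s ** (2.0 / p) - w ** 2

    z0 = np.concatenate([np.zeros(n), [0.0], np.ones(m), np.ones(n)])
    bounds = ([(None, None)] * (n + 1) + [(0.0, None)] * m + [(0.0, None)] * n)
    cons = [{"type": "ineq", "fun": margin_con},
            {"type": "ineq", "fun": lp_con}]
    # The problem is non-convex, so SLSQP returns a local solution.
    res = minimize(objective, z0, method="SLSQP",
                   bounds=bounds, constraints=cons,
                   options={"maxiter": 500, "ftol": 1e-8})
    w, b, _, _ = unpack(res.x)
    return w, b

if __name__ == "__main__":
    # Tiny synthetic example: only the first two features are informative.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(80, 6))
    y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=80))
    w, b = lp_svm_smooth(X, y, p=0.5, C=10.0)
    print("weights:", np.round(w, 3))   # near-zero weights indicate dropped features

On this toy data, the weights of the uninformative features are driven toward zero, which is the feature-selection effect the Lp penalty is intended to encourage.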
Received 2016-02-28;
Funding:

This work is supported in part by the National Natural Science Foundation of China under Grant Nos. 61502159, 61379057, 11101081, and 11271069, and the Research Foundation of Central South University of China under Grant No. 2014JSJJ019.

Corresponding author: Feng Zeng     Email: fengzeng@csu.edu.cn
About the author: Lan Yao is an assistant professor in the College of Mathematics and Econometrics, Hunan University, Changsha. She received her B.S. degree in computer science and her M.S. and Ph.D. degrees in applied mathematics from Hunan University, Changsha, in 2000, 2006, and 2014, respectively. Her research interests include data mining, numerical methods in optimization, and network optimization.
Cite this article:
Lan Yao, Feng Zeng, Dong-Hui Li, Zhi-Gang Chen. Sparse Support Vector Machine with Lp Penalty for Feature Selection[J]. Journal of Computer Science and Technology, 2017, 32(1): 68-77.
Link to this article:
http://jcst.ict.ac.cn:8080/jcst/CN/10.1007/s11390-017-1706-2