Computer Science ›› 2015, Vol. 42 ›› Issue (10): 235-238.

• Artificial Intelligence •

Robust Smooth Support Vector Machine

HU Jin-kou and XING Hong-jie

  1. Key Laboratory of Machine Learning and Computational Intelligence of Hebei Province, College of Mathematics and Computer Science, Hebei University, Baoding 071002, China
  • Online: 2018-11-14  Published: 2018-11-14
  • Supported by:
    This work was supported by the National Natural Science Foundation of China (60903089, 61473111), the Natural Science Foundation of Hebei Province (F2013201060), and the Hebei University Foundation (3504020).



Abstract: Smooth support vector machine (SSVM) is an improved model of the traditional support vector machine. SSVM uses a smoothing technique to reformulate the quadratic programming problem of the traditional support vector machine as an unconstrained optimization problem, which is then solved with the Newton-Armijo algorithm. On the basis of SSVM, this paper proposes the robust smooth support vector machine (RSSVM), which replaces the L2-norm-based regularization term of SSVM with an M-estimator and solves the corresponding optimization problem by half-quadratic minimization. Experimental results demonstrate that the proposed method effectively enhances the anti-noise capability of SSVM.

Key words: Smooth support vector machine, Half-quadratic minimization, Kernel function
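
For illustration, below is a minimal Python sketch of the kind of half-quadratic (iteratively reweighted) scheme the abstract describes. It assumes a Welsch M-estimator applied componentwise to the weight vector in place of the L2-norm regularization term, and an SSVM-style smoothed squared hinge loss as the data term; the function names (smooth_plus, welsch_weights, rssvm_sketch), the L-BFGS inner solver, and all parameter defaults are illustrative assumptions, not the formulation or implementation used in the paper.

import numpy as np
from scipy.optimize import minimize


def smooth_plus(x, alpha=5.0):
    # SSVM-style smooth approximation of the plus function max(x, 0):
    # p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x)).
    # np.logaddexp(0, -alpha*x) computes log(1 + exp(-alpha*x)) stably.
    return x + np.logaddexp(0.0, -alpha * x) / alpha


def welsch_weights(w, sigma):
    # Half-quadratic auxiliary weights for the Welsch M-estimator
    # phi(t) = 1 - exp(-t^2 / (2*sigma^2)); in the multiplicative
    # half-quadratic form the optimal auxiliary variable is exp(-t^2 / (2*sigma^2)).
    return np.exp(-(w ** 2) / (2.0 * sigma ** 2))


def rssvm_sketch(X, y, C=1.0, sigma=1.0, alpha=5.0, n_outer=10):
    # Alternate between (1) updating the half-quadratic weights and
    # (2) minimizing the resulting weighted, smooth, unconstrained
    # surrogate objective (here with L-BFGS as a stand-in inner solver).
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_outer):
        p = welsch_weights(w, sigma)  # weights stay fixed during the inner solve

        def obj(theta):
            w_, b_ = theta[:d], theta[d]
            margin = 1.0 - y * (X @ w_ + b_)
            loss = 0.5 * C * np.sum(smooth_plus(margin, alpha) ** 2)
            # Weighted quadratic surrogate of the Welsch penalty on w
            reg = np.sum(p * w_ ** 2) / (2.0 * sigma ** 2)
            return reg + loss

        res = minimize(obj, np.r_[w, b], method="L-BFGS-B")
        w, b = res.x[:d], res.x[d]
    return w, b


# Toy usage: two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(1.0, 1.0, (50, 2))])
y = np.r_[-np.ones(50), np.ones(50)]
w, b = rssvm_sketch(X, y)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))

Each outer iteration fixes the half-quadratic auxiliary weights and then minimizes a smooth, unconstrained surrogate, mirroring the alternating structure that half-quadratic minimization imposes on a non-convex M-estimator objective.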

