Computer Science ›› 2019, Vol. 46 ›› Issue (11): 193-201.doi: 10.11896/jsjkx.181001840

• Artificial Intelligence •

Robust SVM Based on Zeroth Order Variance Reduction

LU Shu-xia1,2, CAI Lian-xiang1, ZHANG Luo-huan1   

  1. College of Mathematics and Information Science, Hebei University, Baoding, Hebei 071002, China
  2. Hebei Province Key Laboratory of Machine Learning and Computational Intelligence, Baoding, Hebei 071002, China
  • Received: 2018-10-04  Online: 2019-11-15  Published: 2019-11-14

Abstract: When traditional SVM methods are applied to classification problems with noisy data, large losses are incurred on the noisy samples, causing the classification hyperplane to deviate seriously from the optimal hyperplane and degrading classification performance. To solve this problem, this paper proposes a robust support vector machine (RSVM) with a loss function in sinusoidal-square form. Owing to the properties of the sinusoidal function, the loss value is bounded to the range [0,1] even for noisy data, which improves the noise resistance of the SVM. In addition, when the traditional stochastic gradient descent method is used to solve the SVM, the gradient of a single sample approximates the full gradient in each iteration, which inevitably produces variance; as the number of iterations increases, this variance accumulates and seriously degrades the classification performance of the algorithm. To reduce the influence of variance, this paper introduces a zeroth-order stochastic variance reduced gradient (ZO-SVRG) algorithm. The algorithm approximates the gradient by coordinate-wise gradient estimation and reduces the variance by adding a gradient correction term in each iteration. Moreover, the outputs of the inner and outer loops take a weighted-average form, which accelerates the convergence of the optimization problem. Experimental results show that the robust support vector machine based on the zeroth-order variance reduction algorithm is more robust to noisy data and effectively reduces the influence of variance. To further improve performance, the influence of the main parameters λ and k on the accuracy of the algorithm is analyzed: in the linear and nonlinear cases, the highest accuracy is achieved with the parameter pairs (λ=1, k=5) and (λ=10, k=3), respectively.
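The mechanics described in the abstract can be sketched in a few lines. The exact loss formula and step sizes are not given in the abstract, so the sinusoidal-square loss below is one plausible instantiation (a clipped hinge term passed through sin², which is what bounds the loss to [0,1]); the coordinate-wise zeroth-order estimate and the SVRG-style correction term follow the standard forms the abstract refers to. All function names and hyperparameters here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sin_square_loss(margin):
    """Illustrative bounded loss in sinusoidal-square form (hypothetical
    instantiation; the paper's exact formula is not given in the abstract).
    The hinge-style term is clipped so the loss stays in [0, 1] even for
    badly misclassified (noisy) samples."""
    u = np.clip(1.0 - margin, 0.0, 1.0)
    return np.sin(0.5 * np.pi * u) ** 2

def coord_grad_estimate(f, x, mu=1e-4):
    """Coordinate-wise zeroth-order gradient estimate:
    g_j = (f(x + mu*e_j) - f(x - mu*e_j)) / (2*mu)."""
    g = np.zeros_like(x)
    for j in range(x.size):
        e = np.zeros_like(x)
        e[j] = mu
        g[j] = (f(x + e) - f(x - e)) / (2.0 * mu)
    return g

def zo_svrg(f_i, n, x0, eta=0.1, epochs=5, inner=20, rng=None):
    """ZO-SVRG sketch for a finite sum (1/n) * sum_i f_i(x).
    Each inner step uses the variance-reduction correction
    g = ghat_i(x) - ghat_i(x_tilde) + full_ghat(x_tilde),
    and each epoch outputs an average of the inner iterates
    (a plain average here, standing in for the weighted average)."""
    rng = rng or np.random.default_rng(0)
    x_tilde = x0.astype(float).copy()
    for _ in range(epochs):
        # full zeroth-order gradient at the snapshot point x_tilde
        full_g = np.mean([coord_grad_estimate(lambda z, i=i: f_i(z, i), x_tilde)
                          for i in range(n)], axis=0)
        x = x_tilde.copy()
        iterates = []
        for _ in range(inner):
            i = int(rng.integers(n))
            fi = lambda z: f_i(z, i)
            g = (coord_grad_estimate(fi, x)
                 - coord_grad_estimate(fi, x_tilde) + full_g)
            x = x - eta * g
            iterates.append(x.copy())
        x_tilde = np.mean(iterates, axis=0)
    return x_tilde
```

On a toy finite-sum quadratic, e.g. `f_i(z, i) = 0.5 * sum((z - a[i])**2)`, the correction term cancels the per-sample noise almost exactly, so the iterates contract to the minimizer (the mean of the `a[i]`) without the step size having to decay; that cancellation is the variance-reduction effect the abstract describes.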

Key words: Loss function, Noise, Support vector machine, Variance reduction, Zeroth order optimization

CLC Number: TP181