J4  2012, Vol. 46 Issue (7): 1327-1332    DOI: 10.3785/j.issn.1008-973X.2012.07.027
Fast incremental learning method for one-class support vector machine
WANG Hong-bo, ZHAO Guang-zhou, QI Dong-lian, LU Da
College of Electrical Engineering, Zhejiang University, Hangzhou 310027, China

Abstract  

A fast incremental learning method for the one-class support vector machine (OCSVM) was proposed. A new OCSVM decision function was constructed by adding a delta function to the initial classifier, which realizes the incremental learning. By analyzing the geometric properties of the delta function, an objective function with the same form as that of OCSVM was constructed to solve for the parameters of the delta function. This optimization problem can be converted into a standard quadratic programming (QP) problem, but the Karush-Kuhn-Tucker (KKT) conditions change greatly. An improved sequential minimal optimization (SMO) method was proposed according to the new KKT conditions. The method operates directly on the initial classifier and, under its influence, trains only the new data, avoiding retraining on the initial samples and thereby saving much learning time and storage space. Experimental results show that the fast incremental learning method outperforms other incremental methods in both time and accuracy.
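The combined decision function described above (initial OCSVM classifier plus a delta term) can be illustrated with a toy sketch. This is not the paper's algorithm: the RBF kernel, the support vectors, and the expansion coefficients below are hypothetical placeholders standing in for values that the paper's QP/SMO solve would produce.

```python
import math

def rbf(x, z, gamma=0.5):
    """Gaussian RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def make_decision(support_vectors, alphas, rho):
    """Kernel-expansion decision function f(x) = sum_i alpha_i k(s_i, x) - rho.
    The coefficients would come from solving the OCSVM (or delta-function) QP;
    here they are illustrative placeholders."""
    def f(x):
        return sum(a * rbf(s, x) for s, a in zip(support_vectors, alphas)) - rho
    return f

# Initial classifier, assumed already trained on the old data.
f_init = make_decision([[0.0, 0.0], [1.0, 0.0]], [0.6, 0.4], 0.3)
# Delta function, assumed fitted on the NEW samples only.
delta = make_decision([[0.9, 0.9]], [0.5], 0.1)

def f_new(x):
    # Incremental decision function: initial classifier plus delta term.
    return f_init(x) + delta(x)

print(f_new([0.1, 0.1]) > 0)  # near the data: accepted
print(f_new([5.0, 5.0]) > 0)  # far away: rejected
```

Only the delta term involves the new samples, which is what lets an incremental scheme of this shape avoid touching the old training set.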



Published: 01 July 2012
CLC:  TP 181  
Cite this article:

WANG Hong-bo, ZHAO Guang-zhou, QI Dong-lian, LU Da. Fast incremental learning method for one-class support vector machine. J4, 2012, 46(7): 1327-1332.

URL:

http://www.zjujournals.com/eng/10.3785/j.issn.1008-973X.2012.07.027     OR     http://www.zjujournals.com/eng/Y2012/V46/I7/1327
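The abstract mentions an improved SMO solver adapted to the changed KKT conditions. The paper's modified version is not reproduced here, but the standard SMO idea it builds on, analytically optimizing one pair of dual coefficients while the equality constraint stays satisfied, can be sketched for the plain OCSVM dual (min ½ αᵀKα, Σα = 1, 0 ≤ αᵢ ≤ C). The kernel matrix `K`, bound `C`, and the chosen pair are assumptions; a real solver would select the pair by KKT violation.

```python
def smo_pair_update(alpha, K, i, j, C):
    """One SMO-style step on the pair (i, j): keep alpha_i + alpha_j fixed,
    minimize the quadratic objective 0.5 * a'Ka along that pair, then clip
    to the box constraints. Returns the updated coefficient list."""
    s = alpha[i] + alpha[j]                      # preserved by the equality constraint
    eta = K[i][i] + K[j][j] - 2 * K[i][j]        # curvature along the pair direction
    if eta <= 0:
        return list(alpha)                       # flat direction: skip this pair
    # Gradient components (K alpha)_i and (K alpha)_j of the objective.
    grad_i = sum(alpha[k] * K[i][k] for k in range(len(alpha)))
    grad_j = sum(alpha[k] * K[j][k] for k in range(len(alpha)))
    # Newton step along alpha_i (alpha_j moves oppositely).
    new_i = alpha[i] + (grad_j - grad_i) / eta
    # Clip so that both alpha_i and alpha_j = s - alpha_i stay in [0, C].
    new_i = min(min(C, s), max(max(0.0, s - C), new_i))
    out = list(alpha)
    out[i], out[j] = new_i, s - new_i
    return out
```

For example, with K the 2x2 identity and alpha = [0.8, 0.2], one update moves the pair to the true minimizer [0.5, 0.5] while keeping the coefficient sum unchanged.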



[1] VAPNIK V N. The nature of statistical learning theory [M]. New York: Springer, 1995.
[2] PAN Zhisong, CHEN Bin, MIAO Zhimin, et al. Overview of study on one-class classifiers [J]. Acta Electronica Sinica, 2009, 37(11): 2496-2503. (in Chinese)
[3] WANG Yu, ZHOU Zhihua, ZHOU Aoying. Machine learning and its applications [M]. Beijing: Tsinghua University Press, 2006: 32-56. (in Chinese)
[4] SCHLKOPF B, WILLIANMSON R, SMOLA A, et al. Support vector method for novelty detection [J]. Advances in Neural Information Processing Systems, 2000, 12(3): 582-588.
[5] TAX D M J, DUIN R P W. Support vector data description [J]. Machine Learning, 2004, 54(1): 45-66.
[6] XU Lei, ZHAO Guangzhou, GU Hong. Recursive training algorithm for one-class support vector machine based on active set method [J]. Journal of Zhejiang University: Engineering Science, 2009, 43(1): 42-46. (in Chinese)
[7] MUOZMAR J, BOVOLO F, GMEZCHOVA L, et al. Semisupervised oneclass support vector machines for classification of remote sensing data [J]. IEEE Transactions on Geoscience and Remote Sensing, 2010, 48(8): 3188-3197.
[8] SYED N A, LIU H, SUNG K K. Incremental learning with support vector machines [C]∥ Proceedings of the Workshop on Support Vector Machines at the International Joint Conference on Artificial Intelligence (IJCAI99). Stockholm, Sweden: IEEE, 1999.
[9] CAUWENBERGHS G, POGGIO T. Incremental and decremental support vector machine learning [J]. Machine Learning, 2001, 44(13): 409-415.
[10] KONG Rui, ZHANG Bing. A fast incremental learning algorithm for support vector machine [J]. Control and Decision, 2005, 20(10): 1129-1132. (in Chinese)
[11] TAX D M J, LASKOV P. Online SVM learning: from classification to data description and back [C]∥ Proceedings of 13th Workshop on Neural Networks for Signal Processing. Molina: IEEE, 2003: 499-508.
[12] KIM P J, CHANG H J, CHOI J Y. Fast incremental learning for one-class support vector classifier using sample margin information [C]∥ Proceedings of 19th International Conference on Pattern Recognition (ICPR). Tampa: IEEE, 2008: 1-4.
[13] BERTSEKAS D P, HAGER W W, MANGASARIAN O L. Nonlinear programming [M]. Belmont, MA: Athena Scientific, 1999.
[14] SCHLKOPF B, BURGES C J C, SMOLA A J. Advances in kernel methods: support vector learning [M]. [S. l.]: MIT, 1998: 185-208.
[15] FLAKE G W, LAWRENCE S. Efficient SVM regression training with SMO [J]. Machine Learning, 2002, 46(1): 271-290.
[16] SCHLKOPF B, PLATT J C, JOHN S T, et al. Estimating the support of a highdimensional distribution [J]. Neural Computation, 2001, 13(7): 1443-1471.
[17] FRANK A, ASUNCION A. UCI machine learning repository [DB/OL]. [2010-03-01]. http://archive.ics.uci.edu/ml.
