J4  2012, Vol. 46 Issue (7): 1327-1332    DOI: 10.3785/j.issn.1008-973X.2012.07.027
Electrical Engineering
Fast incremental learning method for one-class support vector machine
WANG Hong-bo, ZHAO Guang-zhou, QI Dong-lian, LU Da
College of Electrical Engineering, Zhejiang University, Hangzhou 310027, China

Abstract:

A fast incremental learning method for the one-class support vector machine (OCSVM) was proposed. A new decision function was formed by adding a delta function to the initial OCSVM classifier, realizing the incremental learning process. By analyzing the geometric properties of the delta function, an optimization objective function similar in form to that of OCSVM was constructed to solve for the parameters of the delta function. The optimization problem can be further converted into a standard quadratic programming (QP) problem, but the Karush-Kuhn-Tucker (KKT) conditions change greatly during the optimization. According to the new KKT conditions, a modified sequential minimal optimization (SMO) method was proposed to solve the QP problem. The whole learning process operates directly on the initial classifier and trains only the newly added samples, avoiding retraining on the original samples, and therefore saves a large amount of learning time and storage space. Experimental results show that the proposed fast incremental learning method outperforms other incremental learning methods in both time and accuracy.
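For reference, the standard ν-formulation of OCSVM that the abstract's objective is said to resemble (due to Schölkopf et al.) can be written as follows; the paper's modified objective for the delta function and its altered KKT conditions are not given on this page, so only the baseline formulation is shown:

\min_{\mathbf{w},\,\boldsymbol{\xi},\,\rho}\ \frac{1}{2}\lVert\mathbf{w}\rVert^{2}+\frac{1}{\nu n}\sum_{i=1}^{n}\xi_{i}-\rho
\quad\text{s.t.}\quad \langle\mathbf{w},\phi(\mathbf{x}_{i})\rangle\ge\rho-\xi_{i},\ \ \xi_{i}\ge 0,\ \ i=1,\dots,n,

with the kernelized decision function

f(\mathbf{x})=\operatorname{sgn}\Big(\sum_{i}\alpha_{i}k(\mathbf{x}_{i},\mathbf{x})-\rho\Big),\qquad 0\le\alpha_{i}\le\frac{1}{\nu n},\quad \sum_{i}\alpha_{i}=1.

According to the abstract, the incremental decision function augments this initial classifier with an additional delta term trained on the new samples only; its exact parametric form is not reproduced here.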

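The following Python sketch is not the paper's algorithm; it only illustrates the structure described in the abstract: an initial OCSVM is trained once on the original samples, and when a new batch arrives only the new samples are trained, their contribution being added to the initial decision function as a delta term. A second scikit-learn OneClassSVM is used here as a stand-in for the paper's delta-function QP, and the modified SMO solver and new KKT conditions are not reproduced; the function names and the weight lam are illustrative assumptions.

# Minimal sketch only: a second OneClassSVM stands in for the paper's
# delta-function QP; the modified SMO/KKT machinery is not reproduced.
import numpy as np
from sklearn.svm import OneClassSVM

def fit_initial(X_init, nu=0.1, gamma=0.5):
    """Train the initial OCSVM classifier on the original samples (done once)."""
    return OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X_init)

def fit_delta(X_new, nu=0.1, gamma=0.5):
    """Stand-in for the delta-function fit: only the newly added samples are trained."""
    return OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X_new)

def incremental_decision(clf_init, clf_delta, X, lam=1.0):
    """Combined score f_init(x) + lam * delta(x), mirroring the abstract's idea
    of adding a delta function to the initial decision function."""
    return clf_init.decision_function(X) + lam * clf_delta.decision_function(X)

# Toy usage: the initial classifier is never retrained when new data arrives.
rng = np.random.default_rng(0)
X_init = rng.normal(loc=0.0, scale=1.0, size=(200, 2))   # original training samples
X_new = rng.normal(loc=2.0, scale=0.5, size=(50, 2))     # newly arrived batch

clf_init = fit_initial(X_init)     # trained once
clf_delta = fit_delta(X_new)       # only the new batch is trained

X_test = rng.normal(loc=1.0, scale=1.0, size=(10, 2))
scores = incremental_decision(clf_init, clf_delta, X_test)
print("combined decision scores:", np.round(scores, 3))
print("accepted as target class:", scores >= 0)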
Published: 2012-07-01
CLC number: TP 181
Supported by:

National Natural Science Foundation of China (No. 60872070); Science and Technology Plan Project of Zhejiang Province (No. 2008C21141); Science and Technology Plan Project of Zhejiang Province (No. 2010C33044); Major Science and Technology Project of Zhejiang Province (No. 2010C11069).

Corresponding author: ZHAO Guang-zhou, male, professor, doctoral supervisor. E-mail: zhaogz@zju.edu.cn
About the author: WANG Hong-bo (1982-), male, Ph.D. candidate, engaged in research on pattern recognition and support vector machines. E-mail: whongbo@zju.edu.cn

Cite this article:


WANG Hong-bo, ZHAO Guang-zhou, QI Dong-lian, LU Da. Fast incremental learning method for one-class support vector machine. J4, 2012, 46(7): 1327-1332.

Link to this article:

http://www.zjujournals.com/eng/CN/10.3785/j.issn.1008-973X.2012.07.027        http://www.zjujournals.com/eng/CN/Y2012/V46/I7/1327

