Journal of Zhejiang University (Engineering Science)  2020, Vol. 54 Issue (2): 331-339    DOI: 10.3785/j.issn.1008-973X.2020.02.014
Computer Technology and Information Engineering
Semi-supervised patent text classification method based on improved Tri-training algorithm
Yun-qing HU(),Qing-ying QIU*(),Xiu YU,Jian-wei WU
College of Mechanical Engineering, Zhejiang University, Hangzhou 310027, China
Abstract:

An improved information gain (IG) algorithm was proposed to address the problem that the standard IG algorithm only measures a feature's contribution to the whole system and ignores its contribution to individual categories. A weight coefficient was introduced to adjust the IG values of features that are important for classification, so that the uneven distribution of a word across categories is better accounted for. A semi-supervised classification method based on an improved Tri-training algorithm was proposed to address the training-set labeling bottleneck in traditional automatic patent classification. The thresholds on the three classifiers' predicted probabilities for the category of the same unlabeled sample are dynamically adjusted by tracking the class distribution of the training set after each update, which reduces the influence of noisy data while making full use of the unlabeled training samples. Experimental results show that the proposed method achieves good automatic classification performance when only a few labeled training samples are available, and that appropriately enlarging the unlabeled sample set can improve the classifier's generalization ability.

Key words: patent text classification    feature selection    information gain    semi-supervised    Tri-training algorithm
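The abstract describes weighting a feature's information gain by how unevenly the feature is distributed across categories. The paper's exact weight formula is not given in this excerpt; the sketch below pairs classic IG with a simple max-share skew factor as a plausible stand-in (the `skew` weight is an assumption, not the authors' formula):

```python
import math
from collections import Counter

def entropy(probs):
    """Shannon entropy of a probability distribution (0*log0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def information_gain(docs, labels, term):
    """Classic IG(term) = H(C) - [P(t)H(C|t) + P(~t)H(C|~t)].
    docs: list of token sets; labels: parallel list of class labels."""
    n = len(docs)
    h_c = entropy([c / n for c in Counter(labels).values()])

    with_t = [y for d, y in zip(docs, labels) if term in d]
    without_t = [y for d, y in zip(docs, labels) if term not in d]

    def cond_entropy(subset):
        if not subset:
            return 0.0
        return entropy([c / len(subset) for c in Counter(subset).values()])

    p_t = len(with_t) / n
    return h_c - (p_t * cond_entropy(with_t) + (1 - p_t) * cond_entropy(without_t))

def weighted_information_gain(docs, labels, term):
    """IG scaled by an inter-class skew weight: a term concentrated in one
    class is boosted relative to a term spread evenly across classes."""
    classes = sorted(set(labels))
    # per-class document frequency of the term
    df = [sum(1 for d, y in zip(docs, labels) if y == c and term in d)
          for c in classes]
    total = sum(df)
    if total == 0:
        return 0.0
    skew = max(df) / total  # in (0, 1]; equals 1 if the term occurs in one class only
    return skew * information_gain(docs, labels, term)
```

A term that appears only in documents of one class keeps its full IG, while a term spread evenly over all classes is down-weighted, which is the behavior the abstract attributes to the improved algorithm.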
Received: 2018-12-27    Published: 2020-03-10
CLC:  TP 391  
Funding: National Natural Science Foundation of China (51075356)
Corresponding author: Qing-ying QIU     E-mail: huyunqing616@163.com; medesign@zju.edu.cn
First author: Yun-qing HU (1994—), male, master's student, engaged in research on patent knowledge mining and innovative design. orcid.org/0000-0003-1710-4423. E-mail: huyunqing616@163.com

Cite this article:

Yun-qing HU, Qing-ying QIU, Xiu YU, Jian-wei WU. Semi-supervised patent text classification method based on improved Tri-training algorithm. Journal of Zhejiang University (Engineering Science), 2020, 54(2): 331-339.

Link this article:

http://www.zjujournals.com/eng/CN/10.3785/j.issn.1008-973X.2020.02.014        http://www.zjujournals.com/eng/CN/Y2020/V54/I2/331

Classifier  Method          F1
                            Dim=150  Dim=250  Dim=350  Dim=450  Dim=550  Dim=650  Dim=750  Dim=850  Dim=950
Xgboost     IG_New&Xgboost  0.515    0.516    0.516    0.519    0.516    0.518    0.518    0.518    0.518
            IG&Xgboost      0.469    0.471    0.471    0.480    0.473    0.474    0.475    0.474    0.475
SVM         IG_New&SVM      0.474    0.470    0.475    0.502    0.475    0.471    0.470    0.474    0.474
            IG&SVM          0.430    0.432    0.432    0.450    0.441    0.439    0.430    0.432    0.432
NB          IG_New&NB       0.420    0.412    0.425    0.431    0.430    0.420    0.424    0.425    0.429
            IG&NB           0.362    0.375    0.367    0.370    0.355    0.383    0.352    0.360    0.354
Table 1  Feature selection comparison results on the patent dataset (Experiment 1)
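The tables report F1 scores. This excerpt does not state which averaging the authors used for the multi-class patent task; macro-averaging is a common choice, sketched below for reference:

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: compute precision, recall, and F1 per class,
    then average the per-class F1 scores with equal weight."""
    classes = sorted(set(y_true) | set(y_pred))
    scores = []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * precision * recall / (precision + recall)
                      if precision + recall else 0.0)
    return sum(scores) / len(scores)
```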
n    F1
     Num=160  Num=200  Num=240
6    0.667    0.675    0.684
7    0.679    0.688    0.698
8    0.684    0.690    0.698
9    0.683    0.690    0.698
10   0.683    0.690    0.699
Table 2  Comparison results for the number of unlabeled training subsets on the patent dataset (Experiment 2)
|DU|     F1
         Num=160  Num=200  Num=240
4 000    0.684    0.690    0.698
5 500    0.688    0.694    0.705
7 000    0.692    0.699    0.706
8 500    0.687    0.698    0.701
10 000   0.681    0.693    0.698
Table 3  Comparison results for the number of samples in the unlabeled training set on the patent dataset (Experiment 3)
Classification method      F1
                           Num=160  Num=200  Num=240  Num=360
IG_New&Tri-training_New    0.684    0.690    0.698    0.711
IG_New&Tri-training        0.583    0.597    0.603    0.620
IG_New&Xgboost             0.519    0.527    0.538    0.562
IG_New&SVM                 0.502    0.510    0.518    0.539
IG_New&NB                  0.431    0.442    0.453    0.478
Table 4  Classification method comparison results on the patent dataset (Experiment 4)
Classifier  Method          F1
                            Dim=150  Dim=250  Dim=350  Dim=450  Dim=550  Dim=650  Dim=750  Dim=850  Dim=950
Xgboost     IG_New&Xgboost  0.665    0.670    0.670    0.675    0.708    0.704    0.700    0.700    0.700
            IG&Xgboost      0.648    0.660    0.660    0.662    0.690    0.690    0.674    0.670    0.670
SVM         IG_New&SVM      0.664    0.670    0.665    0.673    0.671    0.674    0.674    0.674    0.635
            IG&SVM          0.595    0.652    0.584    0.594    0.635    0.604    0.585    0.592    0.592
NB          IG_New&NB       0.660    0.660    0.660    0.667    0.667    0.654    0.663    0.660    0.650
            IG&NB           0.625    0.594    0.610    0.653    0.660    0.622    0.600    0.615    0.602
Table 5  Feature selection comparison results on the aclImdb dataset (Experiment 5)
n    F1
     Num=160  Num=200  Num=240
6    0.710    0.715    0.729
7    0.714    0.715    0.731
8    0.726    0.730    0.741
9    0.717    0.730    0.745
10   0.719    0.728    0.741
Table 6  Comparison results for the number of unlabeled training subsets on the aclImdb dataset (Experiment 6)
|DU|     F1
         Num=160  Num=200  Num=240
4 000    0.726    0.730    0.741
5 500    0.730    0.736    0.745
7 000    0.725    0.741    0.741
8 500    0.725    0.741    0.741
10 000   0.732    0.745    0.645
Table 7  Comparison results for the number of samples in the unlabeled training set on the aclImdb dataset (Experiment 7)
Classification method      F1
                           Num=160  Num=200  Num=240  Num=360
IG_New&Tri-training_New    0.726    0.730    0.741    0.759
IG_New&Tri-training        0.710    0.724    0.735    0.738
IG_New&Xgboost             0.675    0.681    0.685    0.692
IG_New&SVM                 0.673    0.676    0.679    0.684
IG_New&NB                  0.667    0.675    0.680    0.689
Table 8  Classification method comparison results on the aclImdb dataset (Experiment 8)
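The improved Tri-training method compared in Tables 4 and 8 labels an unlabeled sample for one classifier only when the other two agree on its category with sufficiently high predicted probability, with per-class thresholds that move with the class distribution of the evolving training set. The following is a simplified, self-contained sketch of that idea: the `CentroidClassifier` stand-in, the `base`/`alpha` threshold rule, and the omission of bootstrap sampling and the original error-rate checks are all assumptions made for illustration, not the authors' exact method.

```python
import math
from collections import Counter

class CentroidClassifier:
    """Tiny nearest-centroid classifier so the sketch runs without
    external libraries; any fit/predict_proba model could be used."""
    def fit(self, X, y):
        sums, counts = {}, Counter(y)
        for x, c in zip(X, y):
            s = sums.setdefault(c, [0.0] * len(x))
            for i, v in enumerate(x):
                s[i] += v
        self.centroids = {c: [v / counts[c] for v in s] for c, s in sums.items()}
        return self
    def predict_proba(self, x):
        # softmax over negative centroid distances -> pseudo-probabilities
        d = {c: -math.dist(x, m) for c, m in self.centroids.items()}
        z = sum(math.exp(v) for v in d.values())
        return {c: math.exp(v) / z for c, v in d.items()}
    def predict(self, x):
        p = self.predict_proba(x)
        return max(p, key=p.get)

def dynamic_thresholds(labels, base=0.6, alpha=0.3):
    """Per-class probability thresholds: classes already over-represented
    in the training set get a higher bar, curbing noisy additions."""
    counts = Counter(labels)
    total, k = sum(counts.values()), len(counts)
    return {c: base + alpha * (counts[c] / total - 1 / k) for c in counts}

def tri_training(X_l, y_l, X_u, rounds=3):
    """Simplified Tri-training loop: classifier i receives an unlabeled
    sample when the other two agree on its label with confidence above
    that label's dynamic threshold. (For brevity, samples may be added
    repeatedly across rounds.)"""
    train = [(list(X_l), list(y_l)) for _ in range(3)]
    clfs = [CentroidClassifier().fit(X, y) for X, y in train]
    for _ in range(rounds):
        for i in range(3):
            j, k = (i + 1) % 3, (i + 2) % 3
            th = dynamic_thresholds(train[i][1])
            for x in X_u:
                pj, pk = clfs[j].predict_proba(x), clfs[k].predict_proba(x)
                cj = max(pj, key=pj.get)
                if cj == max(pk, key=pk.get) and min(pj[cj], pk[cj]) >= th.get(cj, 0.6):
                    train[i][0].append(x)
                    train[i][1].append(cj)
            clfs[i] = CentroidClassifier().fit(*train[i])
    def majority_vote(x):
        votes = Counter(c.predict(x) for c in clfs)
        return votes.most_common(1)[0][0]
    return majority_vote
```

With `alpha = 0`, the thresholds collapse to a fixed value, recovering the usual fixed-confidence co-labeling; the dynamic term is what tracks the class distribution after each update, as the abstract describes.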