Journal of ZheJiang University (Engineering Science)  2020, Vol. 54 Issue (2): 331-339    DOI: 10.3785/j.issn.1008-973X.2020.02.014
Computer Technology, Information Engineering     
Semi-supervised patent text classification method based on improved Tri-training algorithm
Yun-qing HU, Qing-ying QIU*, Xiu YU, Jian-wei WU
College of Mechanical Engineering, Zhejiang University, Hangzhou 310027, China

Abstract  

An improved information gain (IG) algorithm was proposed to solve the problem that the classic IG algorithm can only measure the contribution of a feature to the whole system, not to a single category. A weight coefficient is introduced to adjust the information gain values of features that are important for classification, so that the uneven distribution of a word among categories is better accounted for. A semi-supervised classification method based on an improved Tri-training algorithm was proposed to address the bottleneck of training-set labeling in traditional automatic patent classification. The prediction probability thresholds that the three classifiers apply to the category of the same unlabeled sample are changed dynamically by tracking the distribution of sample categories in the training sets after each iteration, which reduces the influence of noisy data while making full use of the unlabeled training samples. Results indicate that the proposed method achieves good automatic classification performance when labeled training samples are scarce, and that appropriately increasing the amount of unlabeled sample data improves the generalization ability of the classifier.
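The weight-coefficient idea can be illustrated with a minimal sketch. The exact weighting formula is not given in this abstract, so the weight below (a term's maximum class-conditional document frequency divided by its mean across classes) is an assumption, chosen only to reward terms that are distributed unevenly among categories:

```python
import math

def entropy(probs):
    # Shannon entropy; 0 * log(0) is treated as 0
    return -sum(p * math.log2(p) for p in probs if p > 0)

def information_gain(docs_with_term, docs_per_class, term_class_counts):
    """Classic IG: contribution of a term to the whole system.
    docs_per_class: {class: #docs};
    term_class_counts: {class: #docs of that class containing the term}."""
    n = sum(docs_per_class.values())
    n_t = docs_with_term
    n_not = n - n_t
    h_c = entropy([cnt / n for cnt in docs_per_class.values()])
    h_c_t = entropy([term_class_counts.get(c, 0) / n_t
                     for c in docs_per_class]) if n_t else 0.0
    h_c_not = entropy([(docs_per_class[c] - term_class_counts.get(c, 0)) / n_not
                       for c in docs_per_class]) if n_not else 0.0
    return h_c - (n_t / n) * h_c_t - (n_not / n) * h_c_not

def weighted_ig(docs_with_term, docs_per_class, term_class_counts):
    # Hypothetical weight coefficient: a term concentrated in a single class
    # (high max class-conditional frequency relative to its mean) is boosted.
    freqs = [term_class_counts.get(c, 0) / docs_per_class[c]
             for c in docs_per_class]
    mean = sum(freqs) / len(freqs)
    w = max(freqs) / mean if mean > 0 else 1.0
    return w * information_gain(docs_with_term, docs_per_class, term_class_counts)
```

For a term that appears in every document of one class and nowhere else, the classic IG already peaks, and the weight coefficient further amplifies it relative to terms spread evenly across classes.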



Key words: patent text classification; feature selection; information gain; semi-supervised; Tri-training algorithm
Received: 27 December 2018      Published: 10 March 2020
CLC:  TP 391  
  TH 122  
Corresponding Authors: Qing-ying QIU     E-mail: huyunqing616@163.com;medesign@zju.edu.cn
Cite this article:

Yun-qing HU,Qing-ying QIU,Xiu YU,Jian-wei WU. Semi-supervised patent text classification method based on improved Tri-training algorithm. Journal of ZheJiang University (Engineering Science), 2020, 54(2): 331-339.

URL:

http://www.zjujournals.com/eng/10.3785/j.issn.1008-973X.2020.02.014     OR     http://www.zjujournals.com/eng/Y2020/V54/I2/331


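The dynamic-threshold iteration described in the abstract can be sketched as follows. The base learner, the agreement criterion, and the threshold-update rule here are all simplified assumptions (the original Tri-training of Zhou and Li additionally bootstraps the three initial training sets), kept only to show how tracking the class distribution after each round can raise the acceptance threshold for over-represented classes:

```python
from collections import Counter

class CentroidClassifier:
    """Tiny stand-in base learner: nearest class centroid, with a normalized
    inverse-squared-distance score used as a prediction probability."""
    def fit(self, X, y):
        self.centroids = {}
        for label in set(y):
            pts = [x for x, lab in zip(X, y) if lab == label]
            self.centroids[label] = tuple(sum(v) / len(pts) for v in zip(*pts))
        return self

    def predict_proba(self, x):
        inv = {lab: 1.0 / (1e-9 + sum((a - b) ** 2 for a, b in zip(x, c)))
               for lab, c in self.centroids.items()}
        s = sum(inv.values())
        return {lab: v / s for lab, v in inv.items()}

def tri_train(labeled, unlabeled, rounds=3, base_thr=0.6, step=0.05):
    """A pseudo-label is accepted only if all three learners agree and every
    confidence clears a per-class threshold; after each round, the thresholds
    of classes growing faster than their share in the original labeled set are
    raised (this update rule is an assumption, not the paper's exact formula)."""
    target = Counter(lab for _, lab in labeled)
    thr = {lab: base_thr for lab in target}
    sets = [list(labeled) for _ in range(3)]  # original algorithm bootstraps here
    pool = list(unlabeled)
    for _ in range(rounds):
        clfs = [CentroidClassifier().fit(*zip(*s)) for s in sets]
        added = Counter()
        for x in list(pool):
            probs = [c.predict_proba(x) for c in clfs]
            labels = [max(p, key=p.get) for p in probs]
            if len(set(labels)) == 1 and all(p[labels[0]] >= thr[labels[0]]
                                             for p in probs):
                for s in sets:
                    s.append((x, labels[0]))
                added[labels[0]] += 1
                pool.remove(x)
        # track the class distribution of the augmented training set and
        # tighten thresholds for classes that are being added too eagerly
        total = sum(added.values()) or 1
        for lab in thr:
            if added[lab] / total > target[lab] / len(labeled):
                thr[lab] = min(0.95, thr[lab] + step)
            else:
                thr[lab] = max(base_thr, thr[lab] - step)
    return clfs, thr
```

With a real feature pipeline the three base learners would be the paper's classifiers (e.g. Xgboost, SVM, NB) rather than this toy centroid model; the control flow of threshold tracking is the point of the sketch.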
Classifier   Method            Dim=150  Dim=250  Dim=350  Dim=450  Dim=550  Dim=650  Dim=750  Dim=850  Dim=950
Xgboost      IG_New&Xgboost    0.515    0.516    0.516    0.519    0.516    0.518    0.518    0.518    0.518
             IG&Xgboost        0.469    0.471    0.471    0.480    0.473    0.474    0.475    0.474    0.475
SVM          IG_New&SVM        0.474    0.470    0.475    0.502    0.475    0.471    0.470    0.474    0.474
             IG&SVM            0.430    0.432    0.432    0.450    0.441    0.439    0.430    0.432    0.432
NB           IG_New&NB         0.420    0.412    0.425    0.431    0.430    0.420    0.424    0.425    0.429
             IG&NB             0.362    0.375    0.367    0.370    0.355    0.383    0.352    0.360    0.354
Tab.1 Comparison results (F1) of feature selection on patent dataset (Test 1)
n    Num=160  Num=200  Num=240
6    0.667    0.675    0.684
7    0.679    0.688    0.698
8    0.684    0.690    0.698
9    0.683    0.690    0.698
10   0.683    0.690    0.699
Tab.2 Comparison results (F1) of number of unlabeled training subsets on patent dataset (Test 2)
|DU|     Num=160  Num=200  Num=240
4 000    0.684    0.690    0.698
5 500    0.688    0.694    0.705
7 000    0.692    0.699    0.706
8 500    0.687    0.698    0.701
10 000   0.681    0.693    0.698
Tab.3 Comparison results (F1) of sample size of unlabeled training sets on patent dataset (Test 3)
Classification method      Num=160  Num=200  Num=240  Num=360
IG_New&Tri-training_New    0.684    0.690    0.698    0.711
IG_New&Tri-training        0.583    0.597    0.603    0.620
IG_New&Xgboost             0.519    0.527    0.538    0.562
IG_New&SVM                 0.502    0.510    0.518    0.539
IG_New&NB                  0.431    0.442    0.453    0.478
Tab.4 Comparison results (F1) of classification method selection on patent dataset (Test 4)
Classifier   Method            Dim=150  Dim=250  Dim=350  Dim=450  Dim=550  Dim=650  Dim=750  Dim=850  Dim=950
Xgboost      IG_New&Xgboost    0.665    0.670    0.670    0.675    0.708    0.704    0.700    0.700    0.700
             IG&Xgboost        0.648    0.660    0.660    0.662    0.690    0.690    0.674    0.670    0.670
SVM          IG_New&SVM        0.664    0.670    0.665    0.673    0.671    0.674    0.674    0.674    0.635
             IG&SVM            0.595    0.652    0.584    0.594    0.635    0.604    0.585    0.592    0.592
NB           IG_New&NB         0.660    0.660    0.660    0.667    0.667    0.654    0.663    0.660    0.650
             IG&NB             0.625    0.594    0.610    0.653    0.660    0.622    0.600    0.615    0.602
Tab.5 Comparison results (F1) of feature selection on aclImdb dataset (Test 5)
n    Num=160  Num=200  Num=240
6    0.710    0.715    0.729
7    0.714    0.715    0.731
8    0.726    0.730    0.741
9    0.717    0.730    0.745
10   0.719    0.728    0.741
Tab.6 Comparison results (F1) of number of unlabeled training subsets on aclImdb dataset (Test 6)
|DU|     Num=160  Num=200  Num=240
4 000    0.726    0.730    0.741
5 500    0.730    0.736    0.745
7 000    0.725    0.741    0.741
8 500    0.725    0.741    0.741
10 000   0.732    0.745    0.645
Tab.7 Comparison results (F1) of sample size of unlabeled training sets on aclImdb dataset (Test 7)
Classification method      Num=160  Num=200  Num=240  Num=360
IG_New&Tri-training_New    0.726    0.730    0.741    0.759
IG_New&Tri-training        0.710    0.724    0.735    0.738
IG_New&Xgboost             0.675    0.681    0.685    0.692
IG_New&SVM                 0.673    0.676    0.679    0.684
IG_New&NB                  0.667    0.675    0.680    0.689
Tab.8 Comparison results (F1) of classification method selection on aclImdb dataset (Test 8)
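All scores in Tabs.1-8 are F1 values. As a reference for how such a score is obtained, the following is a minimal macro-averaged F1 sketch; whether the paper macro- or micro-averages over categories is not stated in this abstract, so the averaging choice here is an assumption:

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: per-class F1 scores averaged with equal weight,
    so small categories count as much as large ones."""
    classes = set(y_true) | set(y_pred)
    scores = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(scores) / len(scores)
```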
[1] You-wei WANG,Li-zhou FENG. Improved AdaBoost algorithm using group degree and membership degree based noise detection and dynamic feature selection[J]. Journal of ZheJiang University (Engineering Science), 2021, 55(2): 367-376.
[2] Ya-jing WANG,Qun WANG,Bo-wen LI,Zhi-wen LIU,Yuan-yuan PIAO,Tao YU. Seizure prediction based on pre-ictal period selection of EEG signal[J]. Journal of ZheJiang University (Engineering Science), 2020, 54(11): 2258-2265.
[3] LIU Ru-hui, HUANG Wei-ping, WANG Kai, LIU Chuang, LIANG Jun. Semi-supervised constraint ensemble clustering by fast search and find of density peaks[J]. Journal of ZheJiang University (Engineering Science), 2018, 52(11): 2191-2200.
[4] HUANG Zheng-yu, JIANG Xin-long, LIU Jun-fa, CHEN Yi-qiang, GU Yang. Fusion feature based semi-supervised manifold localization method[J]. Journal of ZheJiang University (Engineering Science), 2017, 51(4): 655-662.
[5] FENG Xiao yue, LIANG Yan chun, LIN Xi xun, GUAN Ren chu. Research and development of never-ending language learning[J]. Journal of ZheJiang University (Engineering Science), 2017, 51(1): 82-88.
[6] LIN Yi-ning, WEI Wei, DAI Yuan-ming. Semi-supervised Hough Forest tracking method[J]. Journal of ZheJiang University (Engineering Science), 2013, 47(6): 977-983.
[7] ZHU Xiao-en, HAO Xin, XIA Shun-ren. Feature selection algorithm based on Levy flight[J]. Journal of ZheJiang University (Engineering Science), 2013, 47(4): 638-643.
[8] YAO Fu-tian, QIAN Yun-tao, LI Ji-ming. Semi-supervised learning based Gaussian processes for
hyperspectral image classification
[J]. Journal of ZheJiang University (Engineering Science), 2012, 46(7): 1295-1300.
[9] PAN Jun, KONG Fan-sheng, WANG Rui-qin. Semi-supervised clustering with weighted pairwise
constraints projection
[J]. Journal of ZheJiang University (Engineering Science), 2011, 45(5): 934-940.
[10] LI Wei-tao, ZHOU Xiao-jie, CHAI Tian-you. Gabor filter and latent semantic analysi based
burning state recognition
[J]. Journal of ZheJiang University (Engineering Science), 2011, 45(12): 2120-2126.
[11] ZHANG Yu-hong, HU Xue-gang, YANG Qiu-jie. A feature selection approach suitable for
data stream classification
[J]. Journal of ZheJiang University (Engineering Science), 2011, 45(12): 2247-2251.
[12] SHANG Jian, DIAO Li-Jie, YUE Heng, CHAI Tian-You. Soft sensor for ball mill load based on multisource data
feature fusion
[J]. Journal of ZheJiang University (Engineering Science), 2010, 44(7): 1406-1413.
[13] XIE Jian-Fang, BO Xiao-Hong, WANG Zheng-Xiao, et al. Scheduling feature selection based on immune binary partial swarm optimization[J]. Journal of ZheJiang University (Engineering Science), 2009, 43(12): 2203-2207.