Computer Technology, Control Technology
Model combination algorithm based on consensus maximization |
DONG Li-yan, ZHU Qi, LI Yong-li

1. College of Computer Science and Technology, Jilin University, Changchun 130012, China
2. School of Computer Science and Technology, Northeast Normal University, Changchun 130117, China
Abstract
Aiming at the problem that the original random forest algorithm does not distinguish the classification strengths of its individual classifiers, the combination scheme was optimized and a model combination algorithm based on consensus maximization was proposed. The new algorithm incorporates both the empirical error and the generalization error of each classifier into the classifier weight calculation, so that each single classifier fully exploits its own characteristics and strengths. As a result, the method reinforces the advantages of well-performing classifiers and suppresses the drawbacks of poorly performing ones. Experimental results show that the optimized algorithm not only improves the performance of the combined classifier, but also improves classification accuracy and generalization ability. This improvement is instructive for enhancing the performance of multi-model combination algorithms of the same type.
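The paper does not state the exact weighting formula in the abstract, but the core idea, folding each base classifier's empirical (training) error and estimated generalization error into its voting weight so that stronger classifiers get a larger say, can be sketched as follows. The error blend (`alpha`) and the log-odds weight are illustrative assumptions for this sketch, not the authors' published method.

```python
import math
from collections import defaultdict

def classifier_weight(emp_err, gen_err, alpha=0.5, eps=1e-9):
    """Combine empirical and generalization error into one voting weight.

    The linear blend of the two errors is an assumed scheme; the paper's
    actual combination may differ.
    """
    err = alpha * emp_err + (1.0 - alpha) * gen_err
    err = min(max(err, eps), 1.0 - eps)          # keep log well-defined
    # AdaBoost-style log-odds weight: low-error classifiers dominate,
    # classifiers near 50% error contribute almost nothing.
    return 0.5 * math.log((1.0 - err) / err)

def weighted_vote(predictions, weights):
    """Weighted majority vote over the base classifiers' predicted labels."""
    scores = defaultdict(float)
    for label, w in zip(predictions, weights):
        scores[label] += w
    return max(scores, key=scores.get)
```

For example, three base classifiers with blended errors of 0.1, 0.4, and 0.2 predicting `['A', 'B', 'A']` would yield `'A'`, since the two accurate classifiers outweigh the weak dissenter; a plain unweighted majority vote would reach the same label here, but the weighted form also breaks ties in favor of the stronger classifiers.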
Published: 06 March 2017

Cite this article:
DONG Li-yan, ZHU Qi, LI Yong-li. Model combination algorithm based on consensus maximization. JOURNAL OF ZHEJIANG UNIVERSITY (ENGINEERING SCIENCE), 2017, 51(2): 416-421.
|
Viewed |
|
|
|
Full text
|
|
|
|
|
Abstract
|
|
|
|
|
Cited |
|
|
|
|
|
Shared |
|
|
|
|
|
Discussed |
|
|
|
|