Front. Inform. Technol. Electron. Eng.  2011, Vol. 12 Issue (8): 647-657    DOI: 10.1631/jzus.C1000437
    
Improving naive Bayes classifier by dividing its decision regions
Zhi-yong Yan, Cong-fu Xu*, Yun-he Pan
Institute of Artificial Intelligence, Zhejiang University, Hangzhou 310027, China
Full text: PDF (236 KB)
Abstract: Classification can be regarded as dividing the data space into decision regions separated by decision boundaries. In this paper we analyze decision tree algorithms and the NBTree algorithm from this perspective. A decision tree can thus be regarded as a classifier tree, in which the classifier on each non-root node is trained in the decision regions of the classifier on its parent node. Likewise, the NBTree algorithm, which generates a classifier tree with the C4.5 algorithm as the root classifier and naive Bayes classifiers as the leaf classifiers, can be regarded as training naive Bayes classifiers in the decision regions of the C4.5 algorithm. We propose a second division (SD) algorithm and three soft second division (SD-soft) algorithms that train classifiers in the decision regions of the naive Bayes classifier. These four novel algorithms all generate two-level classifier trees with the naive Bayes classifier as the root classifier. The SD and the three SD-soft algorithms can make good use of both the information contained in instances near decision boundaries and the information that may be ignored by the naive Bayes classifier. Finally, we conduct experiments on 30 data sets from the UC Irvine (UCI) repository. Experimental results show that the SD algorithm obtains better generalization ability than the NBTree and the averaged one-dependence estimators (AODE) algorithms when the C4.5 algorithm and a support vector machine (SVM) are used as leaf classifiers. Further experiments indicate that the three SD-soft algorithms can achieve better generalization ability than the SD algorithm when their parameter values are selected appropriately.
Key words: Naive Bayes classifier    Decision region    NBTree    C4.5 algorithm    Support vector machine (SVM)
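
The two-level structure described in the abstract admits a compact reconstruction: fit a naive Bayes root, group the training instances by the root's predicted class (its decision regions), and train one leaf classifier per region. Below is a minimal sketch in Python, assuming scikit-learn's GaussianNB as the root and SVC as the leaf learner; the class name SecondDivisionSketch and the pure-region fallback are illustrative assumptions, not the authors' published implementation (which also covers C4.5 leaves and the SD-soft variants).

import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

class SecondDivisionSketch:
    """Two-level classifier tree (hypothetical reconstruction): a naive
    Bayes root divides the data space into decision regions, one per
    predicted class, and a leaf classifier is trained on the training
    instances that fall in each region."""

    def __init__(self, leaf_factory=SVC):
        self.root = GaussianNB()
        self.leaf_factory = leaf_factory  # e.g., SVC or a decision tree learner
        self.leaves = {}

    def fit(self, X, y):
        self.root.fit(X, y)
        regions = self.root.predict(X)  # root decision region of each instance
        for r in np.unique(regions):
            mask = regions == r
            if len(np.unique(y[mask])) < 2:
                self.leaves[r] = None  # pure region: the root's answer suffices
            else:
                self.leaves[r] = self.leaf_factory().fit(X[mask], y[mask])
        return self

    def predict(self, X):
        regions = self.root.predict(X)
        y_pred = np.array(regions)  # default to the root's prediction
        for r, leaf in self.leaves.items():
            mask = regions == r
            if leaf is not None and mask.any():
                y_pred[mask] = leaf.predict(X[mask])
        return y_pred

Usage follows the usual scikit-learn pattern, e.g. SecondDivisionSketch().fit(X_train, y_train).predict(X_test). Routing by the root's predicted class means each leaf sees only the instances of one decision region, so it can concentrate on the boundary structure that the naive Bayes classifier alone gets wrong.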
Received: 2010-12-20    Published: 2011-08-03
CLC:  TP181  
Cite this article:

Zhi-yong Yan, Cong-fu Xu, Yun-he Pan. Improving naive Bayes classifier by dividing its decision regions. Front. Inform. Technol. Electron. Eng., 2011, 12(8): 647-657.

Link to this article:

http://www.zjujournals.com/xueshu/fitee/CN/10.1631/jzus.C1000437        http://www.zjujournals.com/xueshu/fitee/CN/Y2011/V12/I8/647
