Front. Inform. Technol. Electron. Eng.  2010, Vol. 11 Issue (12): 939-947    DOI: 10.1631/jzus.C1000137
    
A regeneratable dynamic differential evolution algorithm for neural networks with integer weights
Jian Bao, Yu Chen, Jin-shou Yu
School of Information Science and Engineering, East China University of Science and Technology, Shanghai 200237, China; Institute of Software and Intelligent Technology, Hangzhou Dianzi University, Hangzhou 310018, China

Abstract  Neural networks with integer weights are better suited to embedded systems and hardware implementations than those with real-valued weights. However, many learning algorithms proposed for training networks with floating-point weights are inefficient at, or incapable of, training networks with integer weights. In this paper, a novel regeneratable dynamic differential evolution algorithm (RDDE) is presented that trains integer-weight networks efficiently. Compared with the conventional differential evolution algorithm (DE), RDDE introduces three new strategies: (1) A regeneratable strategy ensures continued evolution when all individuals have become identical after several iterations and can no longer evolve; in other words, it enables escape from local minima. (2) A dynamic strategy speeds up convergence and simplifies the algorithm by updating the population dynamically. (3) A local greedy strategy improves local search ability when the population approaches the global optimum. Unlike gradient-based algorithms, RDDE does not require gradient information, whose unavailability has been the main obstacle to training networks with integer weights. Experimental results show that RDDE trains integer-weight networks more efficiently.
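The abstract only outlines the three strategies, so the following Python sketch is a rough illustration of how they could fit together in a minimal integer-weight DE loop. All names, parameter values (population size, scale factor, crossover rate, weight bound), and the toy single-neuron fitness are illustrative assumptions, not the paper's actual implementation.

```python
import random

def evaluate(weights, data):
    """Sum of squared errors of a toy linear integer-weight model
    (stand-in fitness used only to exercise the search loop)."""
    return sum((sum(w * x for w, x in zip(weights, xs)) - y) ** 2
               for xs, y in data)

def rdde(data, dim, pop_size=20, bound=16, f=0.6, cr=0.9,
         max_iter=200, seed=0):
    """Hypothetical sketch of a regeneratable dynamic DE over integer weights."""
    rng = random.Random(seed)
    pop = [[rng.randint(-bound, bound) for _ in range(dim)]
           for _ in range(pop_size)]
    fit = [evaluate(ind, data) for ind in pop]
    for _ in range(max_iter):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            # DE/rand/1 mutation and crossover, rounded back onto the
            # integer lattice and clamped to the weight bound
            trial = [pop[i][k] if rng.random() > cr else
                     max(-bound, min(bound,
                         round(pop[a][k] + f * (pop[b][k] - pop[c][k]))))
                     for k in range(dim)]
            tf = evaluate(trial, data)
            if tf <= fit[i]:            # dynamic (steady-state) replacement
                pop[i], fit[i] = trial, tf
        # regeneratable strategy: if the population has collapsed to one
        # point, keep the best individual and re-seed the rest
        if all(ind == pop[0] for ind in pop):
            best = pop[0][:]
            pop = [best] + [[rng.randint(-bound, bound) for _ in range(dim)]
                            for _ in range(pop_size - 1)]
            fit = [evaluate(ind, data) for ind in pop]
    # local greedy strategy: try unit moves around the best solution
    best = min(range(pop_size), key=fit.__getitem__)
    w, fw = pop[best][:], fit[best]
    improved = True
    while improved:
        improved = False
        for k in range(dim):
            for d in (-1, 1):
                cand = w[:]
                cand[k] += d
                cf = evaluate(cand, data)
                if cf < fw:
                    w, fw, improved = cand, cf, True
    return w, fw
```

For example, on data generated by the integer weights (3, -2), such as `rdde([((1, 0), 3), ((0, 1), -2)], dim=2)`, the combination of DE search and the final greedy refinement recovers `[3, -2]` with zero error.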

Key words: Differential evolution; Integer weights; Neural networks; Greedy; Embedded systems; Function approximation
Received: 05 May 2010      Published: 09 December 2010
CLC:  TP183  
Cite this article:

Jian Bao, Yu Chen, Jin-shou Yu. A regeneratable dynamic differential evolution algorithm for neural networks with integer weights. Front. Inform. Technol. Electron. Eng., 2010, 11(12): 939-947.

URL:

http://www.zjujournals.com/xueshu/fitee/10.1631/jzus.C1000137     OR     http://www.zjujournals.com/xueshu/fitee/Y2010/V11/I12/939

