A feedforward neural network (FNN) is an information processing system that simulates human brain function to a certain extent, drawing on the structure of biological neural networks and the working mechanism of biological neurons. The architecture of an FNN is its most important aspect and essentially determines its application performance. This paper proposes a structure self-adaptive algorithm for FNNs (SSAFNN) based on biological mechanisms. First, self-adaptive neuron growing and pruning indexes are proposed based on the ideas of the biological neuron growth factor and neuron competition, respectively; the FNN structure is dynamically adjusted according to the growing and pruning indexes of the hidden neurons. Second, the connection weights of the FNN are automatically adjusted during the self-organizing process and trained by the gradient descent method during the learning process. Third, a theoretical analysis shows that the proposed algorithm not only optimizes the network structure but also guarantees the convergence and performance of the FNN. The proposed SSAFNN is tested on benchmark classification and prediction problems and applied to the engineering problems of anchor bolt non-destructive testing and wastewater effluent ammonia nitrogen prediction. The experimental results reveal the good performance and potential of SSAFNN in industrial applications.
- Feedforward neural network
- neuron growing index
- neuron pruning index
- structure self-adaptive
- Neurons
- Indexes
- Feedforward neural networks
- Biological neural networks
- Training
- Heuristic algorithms
The structure and learning procedure of a feedforward neural network are designed based on the structure of biological neural networks and the working mechanism of biological neurons. Owing to its learning and universal approximation abilities, the FNN has been widely applied in many fields, such as data classification, soft measurement, and nonlinear system modeling.
Traditionally, the FNN structure is designed from human experience and sufficient data. Once determined, the structure is not adjusted; only the connection weights are adjusted for various tasks. On one hand, a large-scale FNN can learn the training samples well, but this often leads to over-fitting and increases computation and storage. On the other hand, a small-scale FNN has better generalization ability but insufficient information processing capability for complex problems. Thus, how to dynamically adjust the structure of an FNN, and especially how to adjust its parameters while the structure is being adjusted, has long been an open problem.
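To make the structural question concrete, the sketch below (a minimal NumPy example with assumed shapes and a tanh hidden layer; the paper's exact architecture may differ) shows how the hidden-layer width H is the structural parameter being traded off:

```python
import numpy as np

def fnn_forward(x, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer FNN.

    The number of hidden neurons H (rows of W1) is the structural
    parameter discussed above: a larger H raises capacity (risking
    over-fitting), a smaller H limits it (risking under-fitting).
    """
    h = np.tanh(W1 @ x + b1)   # hidden activations, shape (H,)
    return W2 @ h + b2         # network output, shape (n_out,)

# Example: 3 inputs, 5 hidden neurons, 1 output
rng = np.random.default_rng(0)
H, n_in, n_out = 5, 3, 1
W1, b1 = rng.normal(size=(H, n_in)), np.zeros(H)
W2, b2 = rng.normal(size=(n_out, H)), np.zeros(n_out)
y = fnn_forward(rng.normal(size=n_in), W1, b1, W2, b2)
```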
The FNN structure self-organizing problem mainly has two aspects: structure growing and structure pruning. Starting from a small-scale neural network, a growing algorithm adds hidden-layer neurons and their connection weights to the other layers, enhancing the approximation ability of the network. Several studies focus on this issue. A growing grid structure was proposed by Fritzke, which can be regarded as a growing self-organizing feature map. Wu and Chow proposed a growing neural network based on a genetic algorithm. Li et al. proposed a forward neural network structure growing algorithm based on the hidden-layer activation function and its derivative. A pruning algorithm deletes redundant neurons and their connection weights in a large-scale neural network, giving the FNN a more suitable structure and improving its efficiency. For example, the optimal brain surgeon (OBS) method for neuron pruning was proposed by Hassibi and Stork; Zeng et al. used the sensitivity analysis of neurons to prune hidden neurons; and Jiang et al. implemented a nonlinear inversion of resistivity imaging using a pruning Bayesian neural network. There are also several neural network self-adaptive or self-organizing studies that combine the advantages of growing and pruning algorithms. Islam et al. proposed the adaptive merging and growing algorithm (AMGA) for designing artificial neural networks (ANNs). Hsu et al. proposed an adaptive growing-and-pruning neural network control (AGPNNC) system for modeling a piezoelectric ceramic motor. Han and Qiao and Qiao et al. proposed two kinds of self-organizing algorithms, constructing and pruning (CP) and the hybrid constructing and pruning strategy (HCPS), for optimizing the structure of FNNs.
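The grow-then-prune idea shared by these methods can be sketched as follows. The criteria used here (a near-zero new output weight for growing, a simple output-weight magnitude threshold for pruning) are illustrative placeholders, not the indexes of any specific cited method:

```python
import numpy as np

def grow_neuron(W1, b1, W2, rng):
    """Add one hidden neuron: append a random row to W1 and a
    zero column to W2, so the network output is initially unchanged."""
    W1 = np.vstack([W1, rng.normal(scale=0.1, size=(1, W1.shape[1]))])
    b1 = np.append(b1, 0.0)
    W2 = np.hstack([W2, np.zeros((W2.shape[0], 1))])
    return W1, b1, W2

def prune_neurons(W1, b1, W2, tol=1e-3):
    """Delete hidden neurons whose outgoing weights are negligible
    (a simple magnitude criterion standing in for sensitivity-based
    pruning indexes)."""
    keep = np.abs(W2).sum(axis=0) > tol
    return W1[keep], b1[keep], W2[:, keep]

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2 = np.array([[0.8, 0.0, -0.5, 1e-6]])
W1, b1, W2 = grow_neuron(W1, b1, W2, rng)   # now 5 hidden neurons
W1, b1, W2 = prune_neurons(W1, b1, W2)      # negligible ones removed
```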
The existing self-organizing FNN algorithms analyzed above overcome some shortcomings of FNN structure self-organization, such as deciding when to stop structural adjustment. However, several disadvantages remain to be addressed: (1) the theoretical link between structural self-organizing algorithms and biological mechanisms has not been explained; (2) the ability to restructure an FNN to solve complex problems needs to be improved; and (3) there are often several computation parameters to be selected for different problems, so the universality of structure self-organizing algorithms needs to improve.
In this paper, we propose a new algorithm, called the structure self-adaptive algorithm for feedforward neural networks (SSAFNN). For the neuron growing phase, an adaptive neuron growing index is derived from the concept of the biological neuron growth factor; for the neuron pruning phase, an adaptive neuron pruning index is derived from the concept of biological neuron competition. The connection weights of the FNN are automatically adjusted during the self-organizing process and trained by the gradient descent method with adaptive learning rates during the learning process. The contributions of this work are as follows:
- Both the adaptive growing and pruning algorithms are derived from biological mechanisms. What distinguishes SSAFNN from other methods is the complete independence of the growing and pruning mechanisms, a design that helps improve the ability of the FNN to solve complex problems.
- The adaptive design of the growing index, pruning index, and learning rate improves the universality of the FNN for solving different problems.
- A theoretical analysis guarantees the convergence of both the SSAFNN structure self-organizing process and the learning process.
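A minimal sketch of such a training loop is given below. The network shapes, the toy regression task, and the "bold driver" learning-rate adaptation are all illustrative assumptions, not the paper's exact scheme; SSAFNN's own adaptive learning rates and growing/pruning indexes are defined in the sections that follow.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(64, 1))     # toy 1-D regression inputs
Y = np.sin(3 * X)                            # toy regression targets

H, lr = 8, 0.05                              # hidden size, initial rate
W1, b1 = rng.normal(size=(H, 1)), np.zeros((H, 1))
W2, b2 = rng.normal(size=(1, H)), 0.0

mses = []
for epoch in range(200):
    Hid = np.tanh(W1 @ X.T + b1)             # hidden activations, (H, N)
    out = W2 @ Hid + b2                      # outputs, (1, N)
    err = out - Y.T
    mses.append(float(np.mean(err ** 2)))
    # Backpropagated gradients of the mean squared error
    gW2 = err @ Hid.T / len(X)
    gb2 = float(err.mean())
    delta = (W2.T @ err) * (1.0 - Hid ** 2)  # tanh derivative, (H, N)
    gW1 = delta @ X / len(X)
    gb1 = delta.mean(axis=1, keepdims=True)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
    # "Bold driver" adaptation (an illustrative stand-in for the paper's
    # adaptive learning rates): grow lr while the error falls, halve it
    # when the error rises.
    if epoch > 0:
        lr = lr * 1.02 if mses[-1] < mses[-2] else lr * 0.5
```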
The subsequent sections are organized as follows: Section II gives the basic concepts of FNNs; the adaptive growing and pruning indexes are designed in Section III; Section IV describes the structure self-adaptive process and the learning process of SSAFNN in detail; Section V presents the experimental results and discussion of SSAFNN on benchmark tests and in engineering applications; finally, Section VI draws the conclusions.
In this paper, a biological mechanism based structure self-adaptive algorithm for feedforward neural networks is proposed. The adaptive neuron growing and pruning indexes are designed based on the concepts of the biological neuron growth factor and neuron competition, respectively. The effectiveness of SSAFNN has been demonstrated both by theoretical analysis and by several benchmark tests on classification and prediction problems. The proposed SSAFNN is also applied to two engineering problems: anchor bolt non-destructive testing and effluent ammonia nitrogen prediction. All the benchmark and engineering experiments show that SSAFNN effectively optimizes the network structure and performs well. The improvements of SSAFNN can be summarized as follows:
- Following the idea of biological neural networks, two different mechanisms are designed for neuron growing and pruning, respectively. These mechanisms effectively optimize the network structure and improve FNN performance on different problems.
- The proposed SSAFNN effectively improves performance on complex problems without losing efficiency on simple problems.
Notice that the difference between the growing and pruning mechanisms leads to an inconsistency in the appropriate range of hidden nodes, and the initialization of the neural network still affects problem-solving performance. Further research along these directions is expected.
FULL Paper PDF file: "A Biological Mechanism Based Structure Self-Adaptive Algorithm for Feedforward Neural Network and Its Engineering Applications," in IEEE Access, vol. 7, pp. 25111-25122, 2019.