Abstract
To reduce the complexity of a single-hidden-layer multilayer feedforward neural network, a new two-hidden-layer MFNN (THL-MFNN) with a combined structure of a radial basis function network (RBFN) and multilayer perceptrons (MLPs) is proposed, and its associated training method is discussed. The proposed THL-MFNN can be easily constructed and efficiently trained by online recursive methods. The performance of the proposed THL-MFNN with P/4+2 = 18 hidden nodes and 34 weights equals that of an optimum Bayesian equalizer using an RBFN with P = 64 hidden nodes and 64 weights. The role of each layer in the proposed THL-MFNN is analyzed theoretically, and the feasibility of a further reduced structure is shown.
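The abstract describes the THL-MFNN as a feedforward network with two hidden layers that combines an RBFN-style layer with MLP-style layers, used here as a reduced-complexity alternative to a full Bayesian RBFN equalizer. The sketch below is a minimal illustration of that idea only, assuming a Gaussian RBF first hidden layer, a sigmoidal second hidden layer, and a linear output; the layer sizes, fixed centres, and forward-pass details are illustrative assumptions, not the exact structure or training method reported in the paper.

```python
import numpy as np

class THLMFNN:
    """Illustrative two-hidden-layer network: RBF layer followed by a sigmoidal layer."""

    def __init__(self, n_inputs=2, n_rbf=16, n_sigmoid=2, seed=0):
        rng = np.random.default_rng(seed)
        # First hidden layer: Gaussian RBF units with fixed centres and a common width
        # (an assumption; the paper's construction of this layer may differ).
        self.centres = rng.standard_normal((n_rbf, n_inputs))
        self.width = 1.0
        # Second hidden layer: sigmoidal (MLP-style) units acting on the RBF outputs.
        self.W1 = rng.standard_normal((n_sigmoid, n_rbf)) * 0.1
        self.b1 = np.zeros(n_sigmoid)
        # Linear output producing a scalar equalizer decision variable.
        self.w2 = rng.standard_normal(n_sigmoid) * 0.1
        self.b2 = 0.0

    def forward(self, x):
        # Gaussian RBF activations of the first hidden layer.
        d2 = np.sum((self.centres - x) ** 2, axis=1)
        phi = np.exp(-d2 / (2.0 * self.width ** 2))
        # Sigmoidal activations of the second hidden layer.
        h = 1.0 / (1.0 + np.exp(-(self.W1 @ phi + self.b1)))
        # Linear output; its sign would give the symbol decision in equalization.
        return self.w2 @ h + self.b2


net = THLMFNN()
print(net.forward(np.array([0.5, -0.3])))  # raw equalizer output for one input vector
```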
Original language | English |
---|---|
Pages | 1675-1680 |
Number of pages | 6 |
Publication status | Published - 2001 |
Event | International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC, United States (Duration: 2001 Jul 15 → 2001 Jul 19) |
Other
Other | International Joint Conference on Neural Networks (IJCNN'01) |
---|---|
Country/Territory | United States |
City | Washington, DC |
Period | 2001/7/15 → 2001/7/19 |
All Science Journal Classification (ASJC) codes
- Software
- Artificial Intelligence