Abstract
In this paper, we propose a new learning algorithm for multilayer neural networks. In the backpropagation learning algorithm, weights are adjusted to reduce an error or cost function that reflects the difference between the computed and desired outputs. In the proposed learning algorithm, we instead consider each term of the output layer as a function of the weights and adjust the weights directly so that the output layer produces the desired outputs. Experiments show that the proposed algorithm consistently performs better than the backpropagation learning algorithm.
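The abstract contrasts gradient-based error reduction with adjusting the output-layer weights directly to hit the desired outputs. The sketch below illustrates that contrast under one reading of the abstract: "adjusting the weights directly" is taken to mean solving the output-layer weights from the hidden activations and the inverse-activation targets by least squares. The network shape, the `sigmoid`/`logit` helpers, and all data are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(y):
    # inverse of the sigmoid; y must lie strictly in (0, 1)
    return np.log(y / (1.0 - y))

rng = np.random.default_rng(0)

# Toy data: 4 inputs, 2 outputs, 100 samples (illustrative only).
X = rng.normal(size=(100, 4))
T = sigmoid(rng.normal(size=(100, 2)))      # desired outputs in (0, 1)

W1 = rng.normal(scale=0.1, size=(4, 8))     # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(8, 2))     # hidden -> output weights

H = sigmoid(X @ W1)                         # hidden activations
Y = sigmoid(H @ W2)                         # current network outputs

# Backpropagation-style step: move W2 down the gradient of the squared error.
lr = 0.1
grad_W2 = H.T @ ((Y - T) * Y * (1.0 - Y)) / len(X)
W2_backprop = W2 - lr * grad_W2

# "Direct" adjustment (one possible reading of the abstract): treat each output
# unit as a function of its weights and solve for W2 so that sigmoid(H @ W2)
# matches T, i.e. a least-squares fit of H @ W2 to logit(T).
targets = logit(np.clip(T, 1e-6, 1.0 - 1e-6))
W2_direct, *_ = np.linalg.lstsq(H, targets, rcond=None)

print("error after backprop step:", np.mean((sigmoid(H @ W2_backprop) - T) ** 2))
print("error after direct solve :", np.mean((sigmoid(H @ W2_direct) - T) ** 2))
```

On this toy setup the direct solve typically reaches a much lower output error in a single step than one gradient update, which is the flavor of comparison the abstract describes; the paper's actual update rule may differ from this least-squares interpretation.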
Original language | English
---|---
Pages | 1721-1724
Number of pages | 4
Publication status | Published - 1999
Event | International Joint Conference on Neural Networks (IJCNN'99) - Washington, DC, USA; Duration: 1999 Jul 10 → 1999 Jul 16
Other
Other | International Joint Conference on Neural Networks (IJCNN'99)
---|---
City | Washington, DC, USA |
Period | 99/7/10 → 99/7/16 |
All Science Journal Classification (ASJC) codes
- Software
- Artificial Intelligence