Abstract
In this paper, we propose to train the RBF neural network using a global descent method. Essentially, the method imposes a monotonic transformation on the training objective to improve numerical sensitivity without altering the relative order of its local extrema. A gradient descent search that inherits the global descent property is derived to locate the global solution of an error objective. Numerical examples comparing the global descent algorithm with a gradient-based line-search algorithm show that the proposed global descent algorithm is superior in terms of convergence speed and the quality of the solutions obtained.
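The idea of descending on a monotonically transformed error can be illustrated with a minimal sketch. The code below is not the paper's algorithm: it assumes a Gaussian RBF design matrix, a sum-of-squares error E(w) over the output weights, and uses log(1 + E) as a stand-in monotone transformation with a simple backtracking step rule; all function names and parameter values are illustrative.

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian RBF design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 width^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def sse(w, Phi, y):
    """Sum-of-squares training error E(w)."""
    r = Phi @ w - y
    return 0.5 * np.dot(r, r)

def transformed_objective(w, Phi, y):
    """Stand-in monotone transform T(E) = log(1 + E): it keeps the locations and
    relative order of the extrema of E while rescaling the gradient."""
    return np.log1p(sse(w, Phi, y))

def grad_transformed(w, Phi, y):
    """Chain rule: dT/dw = (dE/dw) / (1 + E)."""
    r = Phi @ w - y
    return (Phi.T @ r) / (1.0 + 0.5 * np.dot(r, r))

def train(Phi, y, step=0.5, iters=2000, seed=0):
    """Gradient descent on the transformed objective with simple backtracking."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=Phi.shape[1])
    for _ in range(iters):
        g = grad_transformed(w, Phi, y)
        t, f0 = step, transformed_objective(w, Phi, y)
        # halve the step until the transformed objective decreases
        while transformed_objective(w - t * g, Phi, y) >= f0 and t > 1e-12:
            t *= 0.5
        w = w - t * g
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-1.0, 1.0, size=(200, 1))
    y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.normal(size=200)
    centers = np.linspace(-1.0, 1.0, 10)[:, None]
    Phi = rbf_design(X, centers, width=0.3)
    w = train(Phi, y)
    print("final sum-of-squares error:", sse(w, Phi, y))
```

Because the transform is strictly increasing, any w that lowers T(E) also lowers E, so the sketch preserves the ordering property the abstract describes; the paper's actual transformation and its global descent guarantees are not reproduced here.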
Original language | English |
---|---|
Pages (from-to) | 96-99 |
Number of pages | 4 |
Journal | Proceedings - International Conference on Pattern Recognition |
Volume | 16 |
Issue number | 2 |
Publication status | Published - 2002 |
All Science Journal Classification (ASJC) codes
- Computer Vision and Pattern Recognition