Abstract
There is no consensus on how to measure the distance between two different neural network architectures. Two kinds of methods are used for this purpose: structural and behavioral distance measures. In this paper, we focus on the latter, which compares networks by their output responses to the same inputs. The output of a neural network can usually be interpreted as a probability distribution over classes given the input signals, provided it is normalized to sum to 1. Information-theoretic distance measures are widely used to quantify the distance between two probability distributions. We adopt such measures within the framework of evolving diverse neural networks to improve its performance. Experimental results on UCI benchmark datasets show that the approach is promising.
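The abstract does not specify which information-theoretic measure is used; a minimal sketch of the general idea, assuming a symmetrized Kullback-Leibler divergence over softmax-normalized outputs, might look as follows (the function names and the toy linear "networks" below are hypothetical, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Turn raw network outputs into probability distributions (rows sum to 1).
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    # Kullback-Leibler divergence KL(p || q), clipped to avoid log(0).
    p, q = np.clip(p, eps, 1.0), np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q), axis=-1)

def behavioral_distance(net_a, net_b, inputs):
    # Mean symmetrized KL divergence between the two networks' output
    # distributions over the same set of input patterns.
    p, q = softmax(net_a(inputs)), softmax(net_b(inputs))
    return float(np.mean(kl(p, q) + kl(q, p)))

# Toy usage: two "networks" as fixed random linear maps (4 inputs, 3 classes).
w_a, w_b = rng.normal(size=(4, 3)), rng.normal(size=(4, 3))
x = rng.normal(size=(100, 4))
print(behavioral_distance(lambda v: v @ w_a, lambda v: v @ w_b, x))
```

In a diversity-driven evolutionary setting, such a pairwise behavioral distance could serve as a diversity term when selecting candidate networks, though the specific selection scheme is described in the paper itself.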
Original language | English |
---|---|
Title of host publication | Neural Information Processing - 14th International Conference, ICONIP 2007, Revised Selected Papers |
Pages | 1007-1016 |
Number of pages | 10 |
Edition | PART 2 |
Publication status | Published - 2008 |
Event | 14th International Conference on Neural Information Processing, ICONIP 2007 - Kitakyushu, Japan. Duration: 2007 Nov 13 → 2007 Nov 16 |
Publication series
Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
---|---|
Number | PART 2 |
Volume | 4985 LNCS |
ISSN (Print) | 0302-9743 |
ISSN (Electronic) | 1611-3349 |
Other
Other | 14th International Conference on Neural Information Processing, ICONIP 2007 |
---|---|
Country/Territory | Japan |
City | Kitakyushu |
Period | 2007 Nov 13 → 2007 Nov 16 |
Bibliographical note
Funding Information: This research was supported by the Brain Science and Engineering Research Program sponsored by the Korean Ministry of Commerce, Industry and Energy.
All Science Journal Classification (ASJC) codes
- Theoretical Computer Science
- Computer Science (all)