Abstract
In this paper, we analyze the estimation of probability density functions when the number of training samples is limited, assuming normal distributions. As the dimensionality of the data grows, classifier performance deteriorates when the number of training samples is inadequate, and this problem becomes more pressing as high-dimensional data such as hyperspectral images become widely available. The key factor in designing a classifier is the estimation of probability density functions, which, in the case of the Gaussian maximum-likelihood (ML) classifier, are completely determined by the mean vectors and covariance matrices. We provide an in-depth analysis of this estimation as a function of the number of training samples and offer a guideline for choosing the dimensionality of the data for a given set of training samples.
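To make the quantities in the abstract concrete, the sketch below (not part of the paper; all names, sizes, and data are illustrative assumptions) estimates a mean vector and covariance matrix per class, classifies a sample with a Gaussian ML discriminant, and shows how the sample covariance degenerates when the number of training samples approaches the dimensionality of the data.

```python
# Minimal sketch of a Gaussian ML classifier: each class is summarized by an
# estimated mean vector and covariance matrix, and a sample is assigned to the
# class with the largest log-likelihood. Illustrative only.
import numpy as np

def fit_class(samples):
    """Estimate the mean vector and covariance matrix from training rows."""
    return samples.mean(axis=0), np.cov(samples, rowvar=False)

def log_likelihood(x, mean, cov):
    """Gaussian log-likelihood of x, up to an additive constant."""
    d = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (logdet + d @ np.linalg.solve(cov, d))

rng = np.random.default_rng(0)
p, n = 10, 200                            # dimensionality and samples per class (assumed values)
X0 = rng.standard_normal((n, p))          # class 0 training data
X1 = rng.standard_normal((n, p)) + 1.5    # class 1 training data (shifted mean)
params = [fit_class(X0), fit_class(X1)]

x = rng.standard_normal(p) + 1.5          # test sample drawn from class 1
scores = [log_likelihood(x, m, c) for m, c in params]
print("assigned class:", int(np.argmax(scores)))

# When the number of training samples is close to (or below) p, the sample
# covariance becomes ill-conditioned or singular -- the limited-sample regime
# the paper analyzes.
for n_small in (8, 12, 50):
    _, cov = fit_class(rng.standard_normal((n_small, p)))
    print(n_small, "training samples -> covariance condition number", np.linalg.cond(cov))
```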
| Original language | English |
| --- | --- |
| Pages | 458-462 |
| Number of pages | 5 |
| Publication status | Published - 2004 |
| Event | Proceedings of the Seventh IASTED International Conference on Computer Graphics and Imaging - Kauai, HI, United States |
| Duration | 2004 Aug 17 → 2004 Aug 19 |
Other

| Other | Proceedings of the Seventh IASTED International Conference on Computer Graphics and Imaging |
| --- | --- |
| Country/Territory | United States |
| City | Kauai, HI |
| Period | 04/8/17 → 04/8/19 |
All Science Journal Classification (ASJC) codes
- Engineering (all)