Abstract
Although machine learning shows state-of-the-art performance in a variety of fields, there is a lack of theoretical understanding of how it works. Recently, theoretical approaches have been actively studied, and one line of results concerns the margin and its distribution. In this paper, we focus on the role of the margin under perturbations of inputs and parameters. We show generalization bounds for two cases, a linear model for binary classification and neural networks for multi-class classification, when the inputs carry normally distributed random noise. For binary classification, the additional generalization term caused by the random noise is related to the margin and is exponentially inversely proportional to the noise level. For neural networks, the additional generalization term depends on (input dimension) × (norms of the input and the weights). These results are obtained within the PAC-Bayesian framework. This paper considers random noise and the margin together, which should be helpful for a better understanding of model sensitivity and the construction of robust generalization.
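The abstract's claim for the binary case can be made concrete with a toy experiment. The following Python sketch is illustrative only and is not the paper's construction: the weight vector `w`, the input `x`, the noise level `sigma`, and the unit-norm normalization are all assumptions made for the example. It estimates how often isotropic Gaussian input noise flips the sign of the margin y⟨w, x⟩ of a linear classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): a linear binary classifier
# f(x) = <w, x> with labels y in {-1, +1}. The margin of an example is
# y * <w, x>; a positive margin means a correct classification.
d = 20                             # input dimension (illustrative choice)
w = rng.normal(size=d)
w /= np.linalg.norm(w)             # unit-norm weight vector

x = rng.normal(size=d)
y = 1.0 if w @ x >= 0 else -1.0    # label the point consistently with w
clean_margin = y * (w @ x)

# Perturb the input with isotropic Gaussian noise of level sigma, as in the
# abstract's setting, and estimate how often the margin sign flips.
sigma = 0.5
n_trials = 10_000
noise = rng.normal(scale=sigma, size=(n_trials, d))
noisy_margins = y * ((x + noise) @ w)
flip_rate = np.mean(noisy_margins <= 0)

print(f"clean margin:                       {clean_margin:.3f}")
print(f"misclassified noisy copies (est.):  {flip_rate:.3f}")
print(f"Gaussian tail bound exp(-m^2/2s^2): "
      f"{np.exp(-clean_margin**2 / (2 * sigma**2)):.3f}")
```

Because `w` has unit norm, the noise projects onto a one-dimensional Gaussian, so the flip probability is Φ(−m/σ) ≤ exp(−m²/(2σ²)) for margin m: it decays exponentially as the margin grows relative to the noise level, consistent with the exponential dependence on the noise level described above.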
Original language | English
---|---
Pages (from-to) | 217-233
Number of pages | 17
Journal | Journal of the Korean Mathematical Society
Volume | 59
Issue number | 2
DOIs | 
Publication status | Published - 2022 Mar
Bibliographical note
Publisher Copyright: © 2022 Korean Mathematical Society.
All Science Journal Classification (ASJC) codes
- General Mathematics