Hi Jun Choe, Hayeong Koh, Jimin Lee

Research output: Contribution to journal › Article › peer-review


Although machine learning shows state-of-the-art performance in a variety of fields, it lacks a theoretical understanding of how machine learning works. Recently, theoretical approaches have been actively studied, and among the results are those on margin and its distribution. In this paper, we focus in particular on the role of margin under perturbations of inputs and parameters. We show a generalization bound for two cases, a linear model for binary classification and neural networks for multi-classification, when the inputs are perturbed by normally distributed random noise. The additional generalization term caused by the random noise is related to the margin and is exponentially inversely proportional to the noise level for binary classification. For neural networks, the additional generalization term depends on (input dimension) × (norms of the input and weights). These results are derived using the PAC-Bayesian framework. This paper considers random noise and margin together, which should contribute to a better understanding of model sensitivity and the construction of robust generalization.
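As a minimal illustration of the setting the abstract describes, the sketch below computes the margin y·⟨w, x⟩ of a fixed linear binary classifier and checks how often that margin stays positive when the input receives Gaussian noise. All names and values here are hypothetical and chosen for illustration; this is not the authors' construction or bound, only the clean-margin-versus-noise-level trade-off it concerns.

```python
import numpy as np

rng = np.random.default_rng(0)

w = np.array([1.0, -2.0, 0.5])   # fixed linear classifier (illustrative)
x = np.array([0.4, -0.3, 1.0])   # a single input
y = 1                            # its binary label in {-1, +1}

def margin(w, x, y):
    """Signed margin y * <w, x>; positive means correct classification."""
    return y * np.dot(w, x)

clean = margin(w, x, y)          # margin on the clean input

# Perturb the input with N(0, sigma^2 I) noise and estimate how often the
# classification survives: a larger clean margin relative to the noise
# level sigma * ||w|| makes the prediction more robust.
sigma = 0.1
noise = rng.normal(0.0, sigma, size=(10_000, x.size))
noisy_margins = y * (x + noise) @ w
survival = np.mean(noisy_margins > 0)
```

Here the clean margin (1.5) is several times larger than the standard deviation of the noise in the score (sigma·‖w‖ ≈ 0.23), so nearly all noisy inputs remain correctly classified; shrinking the margin or raising sigma degrades this fraction.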

Original language: English
Pages (from-to): 217-233
Number of pages: 17
Journal: Journal of the Korean Mathematical Society
Issue number: 2
Publication status: Published - 2022 Mar

Bibliographical note

Publisher Copyright:
© 2022 Korean Mathematical Society.

All Science Journal Classification (ASJC) codes

  • General Mathematics


Research topics of 'Margin-based Generalization for Classifications with Input Noise'.
