Augmenting data for sarcasm detection with unlabeled conversation context

Hankyol Lee, Youngjae Yu, Gunhee Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Citations (Scopus)

Abstract

We present CRA (Contextual Response Augmentation), a novel data augmentation technique that utilizes conversational context to generate meaningful training samples. We also mitigate the issue of unbalanced context lengths by changing the model's input-output format so that it can handle varying context lengths effectively. A model trained with the proposed augmentation technique participated in the sarcasm detection shared task of FigLang2020 and won, achieving the best performance on both the Reddit and Twitter datasets.
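The abstract describes two ideas: a fixed input format that absorbs varying context lengths, and augmentation that mines extra samples from the unlabeled conversation thread. A minimal illustrative sketch of both is below; the function names, separator token, and the choice to pair context sub-threads with the original label are assumptions for illustration, not the authors' released implementation.

```python
# Hypothetical sketch (not the paper's code): normalize variable-length
# conversation context into one input format, and derive extra training
# samples from the unlabeled context thread.

def build_input(context, response, max_turns=3, sep=" [SEP] "):
    """Concatenate up to `max_turns` most recent context turns with the
    response, so long and short threads share one input format."""
    recent = context[-max_turns:]  # keep only the most recent turns
    return sep.join(list(recent) + [response])

def augment_with_context(context, response, label):
    """Illustrative context-based augmentation: each earlier turn, paired
    with the thread that precedes it, becomes an additional sample that
    reuses the original label (an assumption made here for simplicity)."""
    samples = [(build_input(context, response), label)]
    for i in range(1, len(context)):
        samples.append((build_input(context[:i], context[i]), label))
    return samples
```

One conversation with n context turns thus yields n training samples instead of one, which is the kind of multiplication the abstract's "meaningful samples for training" refers to.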

Original language: English
Title of host publication: ACL 2020 - Figurative Language Processing, Proceedings of the 2nd Workshop
Publisher: Association for Computational Linguistics (ACL)
Pages: 12-17
Number of pages: 6
ISBN (Electronic): 9781952148125
Publication status: Published - 2020
Event: 2nd Workshop on Figurative Language Processing 2020 at the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020 - Virtual, Online, United States
Duration: 2020 Jul 9 → …

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
ISSN (Print): 0736-587X

Conference

Conference: 2nd Workshop on Figurative Language Processing 2020 at the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020
Country/Territory: United States
City: Virtual, Online
Period: 20/7/9 → …

Bibliographical note

Publisher Copyright:
© 2020 Association for Computational Linguistics.

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics
