Abstract
Existing machine reading comprehension models are reported to be brittle to adversarially perturbed questions when optimized only for accuracy, which led to the creation of new reading comprehension benchmarks, such as SQuAD 2.0, that contain such questions. However, despite the super-human accuracy of existing models on such datasets, it is still unclear how a model predicts the answerability of a question, potentially due to the absence of a shared annotation explaining it. To address this absence, we release the SQuAD2-CR dataset, which contains annotations on unanswerable questions from the SQuAD 2.0 dataset, to enable an explanatory analysis of model predictions. Specifically, we annotate (1) an explanation of why the most plausible answer span cannot be the answer and (2) which part of the question causes unanswerability. We share intuitions and experimental results on how this dataset can be used to analyze and improve the interpretability of existing reading comprehension models.
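To make the two annotation types concrete, the sketch below shows what one annotated record could look like. This is an illustration only: the field names, question, and values are hypothetical and do not reflect the official SQuAD2-CR release format, which should be checked against the dataset's own documentation.

```python
# Hypothetical sketch of one SQuAD2-CR-style record; field names and values
# are illustrative assumptions, not the dataset's actual schema.
example_record = {
    "question": "What year did the city become the capital of the province?",
    "plausible_answer_span": "1857",
    "is_answerable": False,
    # (1) explanation of why the most plausible span cannot be the answer
    "explanation": "The context gives 1857 as the year the city was founded, "
                   "not the year it became the provincial capital.",
    # (2) the part of the question that causes unanswerability
    "unanswerable_cause_span": "became the capital of the province",
}


def describe(record: dict) -> str:
    """Return a short human-readable summary of one annotated record."""
    return (
        f"Q: {record['question']}\n"
        f"Plausible (wrong) span: {record['plausible_answer_span']}\n"
        f"Why it fails: {record['explanation']}\n"
        f"Question part causing unanswerability: {record['unanswerable_cause_span']}"
    )


if __name__ == "__main__":
    print(describe(example_record))
```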
Original language | English |
---|---|
Title of host publication | LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings |
Editors | Nicoletta Calzolari, Frederic Bechet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Helene Mazo, Asuncion Moreno, Jan Odijk, Stelios Piperidis |
Publisher | European Language Resources Association (ELRA) |
Pages | 5425-5432 |
Number of pages | 8 |
ISBN (Electronic) | 9791095546344 |
Publication status | Published - 2020 |
Event | 12th International Conference on Language Resources and Evaluation, LREC 2020 - Marseille, France; Duration: 2020 May 11 → 2020 May 16 |
Publication series
Name | LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings |
---|---|
Conference
Conference | 12th International Conference on Language Resources and Evaluation, LREC 2020 |
---|---|
Country/Territory | France |
City | Marseille |
Period | 2020 May 11 → 2020 May 16
Bibliographical note
Publisher Copyright: © European Language Resources Association (ELRA), licensed under CC-BY-NC
All Science Journal Classification (ASJC) codes
- Language and Linguistics
- Education
- Library and Information Sciences
- Linguistics and Language