Document Modeling with Graph Attention Networks for Multi-grained Machine Reading Comprehension
- Bo Zheng,
- Haoyang Wen,
- Yaobo Liang,
- Nan Duan,
- Wanxiang Che,
- Daxin Jiang (姜大昕),
- Ming Zhou,
- Ting Liu
ACL 2020 | Published by Association for Computational Linguistics
Natural Questions is a challenging new machine reading comprehension benchmark with two-grained answers: a long answer (typically a paragraph) and a short answer (one or more entities inside the long answer). Despite their effectiveness on this benchmark, existing methods treat the two sub-tasks separately during training and ignore their dependencies. To address this issue, we present a novel multi-grained machine reading comprehension framework that models documents according to their hierarchical nature, at different levels of granularity: documents, paragraphs, sentences, and tokens. We utilize graph attention networks to obtain representations at these levels so that they can be learned simultaneously. The long and short answers are then extracted from the paragraph-level and token-level representations, respectively. In this way, we can model the dependencies between the two-grained answers so that they provide evidence for each other. We jointly train the two sub-tasks, and our experiments show that our approach significantly outperforms previous systems on both the long- and short-answer criteria.
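To make the abstract's architecture concrete, below is a minimal sketch, in PyTorch, of graph attention over a hierarchical document graph with a paragraph-level head for the long answer and a token-level span head for the short answer. This is an illustrative assumption of how such a model could be wired, not the authors' implementation: the layer sizes, edge layout, class names (`GraphAttentionLayer`, `MultiGrainedReader`), and head designs are all hypothetical.

```python
# Hedged sketch (not the paper's code): one graph attention layer over a
# hierarchical document graph (tokens -> sentences -> paragraphs -> document),
# plus a paragraph-level long-answer head and a token-level short-answer head.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """Single-head graph attention in the style of Velickovic et al. (2018)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, h, adj):
        # h:   [N, in_dim] features for all nodes (token, sentence, paragraph, document)
        # adj: [N, N] binary adjacency of the hierarchy; assumed to include self-loops
        z = self.proj(h)
        n = z.size(0)
        zi = z.unsqueeze(1).expand(n, n, -1)
        zj = z.unsqueeze(0).expand(n, n, -1)
        # Attention logits e_ij = LeakyReLU(a^T [z_i ; z_j]), masked to hierarchy edges
        e = F.leaky_relu(self.attn(torch.cat([zi, zj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)
        return F.elu(alpha @ z)  # updated representations for every granularity


class MultiGrainedReader(nn.Module):
    """Toy multi-grained heads: long answer from paragraph nodes,
    short answer span from token nodes (illustrative layout only)."""

    def __init__(self, hidden=128):
        super().__init__()
        self.gat = GraphAttentionLayer(hidden, hidden)
        self.long_head = nn.Linear(hidden, 1)   # score per paragraph node
        self.span_head = nn.Linear(hidden, 2)   # start/end logits per token node

    def forward(self, node_feats, adj, token_idx, para_idx):
        nodes = self.gat(node_feats, adj)
        long_logits = self.long_head(nodes[para_idx]).squeeze(-1)  # [num_paragraphs]
        start_end = self.span_head(nodes[token_idx])               # [num_tokens, 2]
        return long_logits, start_end[:, 0], start_end[:, 1]
```

Because both heads read from the same graph-updated node features, a joint loss (e.g., summing the long-answer and span losses) lets the paragraph-level and token-level predictions inform one another during training, which is the dependency the abstract emphasizes.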