Two-Stage Synthesis Networks for Transfer Learning in Machine Comprehension

  • David Golub,
  • Po-Sen Huang,
  • Xiaodong He,
  • Li Deng

EMNLP 2017

Published by ACL

We develop a technique for transfer learning in machine comprehension (MC) using a novel two-stage synthesis network (SynNet). Given a high-performing MC model in one domain, our technique aims to answer questions about documents in another domain, for which no labeled question-answer pairs are available. Using the proposed SynNet with a model pretrained on the SQuAD dataset, we achieve an F1 measure of 46.6% on the challenging NewsQA dataset, approaching the performance of in-domain models (F1 measure of 50.0%) and outperforming the out-of-domain baseline by 7.6%, without using the provided annotations.
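The overall transfer recipe implied by the abstract can be sketched as follows: use the SynNet, trained on the labeled source domain, to synthesize question-answer pairs over unlabeled target-domain paragraphs, then adapt the pretrained MC model with that pseudo-labeled data. The sketch below is illustrative only; the class and method names (`SynNet`, `generate_answers`, `generate_question`, `finetune`) are hypothetical placeholders and not the released implementation, and the assumption that the two stages synthesize answers first and then questions is our reading of the two-stage design.

```python
# Minimal sketch of the transfer-learning recipe, assuming a two-stage
# answer-then-question synthesis. All names here are hypothetical.

from typing import List, Tuple


class SynNet:
    """Two-stage synthesis network trained on a labeled source domain (e.g., SQuAD)."""

    def generate_answers(self, paragraph: str) -> List[str]:
        """Stage 1 (assumed): propose likely answer spans from an unlabeled target-domain paragraph."""
        raise NotImplementedError

    def generate_question(self, paragraph: str, answer: str) -> str:
        """Stage 2 (assumed): synthesize a question conditioned on the paragraph and a proposed answer."""
        raise NotImplementedError


def synthesize_qa_pairs(syn_net: SynNet, paragraphs: List[str]) -> List[Tuple[str, str, str]]:
    """Build pseudo-labeled (paragraph, question, answer) triples for the target domain."""
    triples = []
    for paragraph in paragraphs:
        for answer in syn_net.generate_answers(paragraph):
            question = syn_net.generate_question(paragraph, answer)
            triples.append((paragraph, question, answer))
    return triples


def transfer(mc_model, syn_net: SynNet, target_paragraphs: List[str]):
    """Adapt a source-domain MC model (e.g., SQuAD-trained) to a new domain such as
    NewsQA using only unlabeled paragraphs plus synthesized QA pairs."""
    pseudo_data = synthesize_qa_pairs(syn_net, target_paragraphs)
    mc_model.finetune(pseudo_data)  # hypothetical fine-tuning API
    return mc_model
```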