TP-N2F: Tensor Product Representation for Natural To Formal Language Generation
- Kezhen Chen,
- Qiuyuan Huang,
- Hamid Palangi,
- Paul Smolensky,
- Kenneth D. Forbus,
- Jianfeng Gao
Thirty-third Conference on Neural Information Processing Systems (NeurIPS) 2019, workshop |
Best Paper Award at the KR2ML Workshop at NeurIPS 2019.
Generating a formal language represented by relational tuples, such as Lisp programs or mathematical expressions, from a natural-language input is an extremely challenging task because it requires explicitly capturing discrete symbolic structural information from the input in order to generate the output. Most state-of-the-art neural sequence models do not explicitly capture such structural information and thus do not perform well on these tasks. In this paper, we propose a new encoder-decoder model based on Tensor Product Representations (TPRs) for Natural- to Formal-language generation, called TP-N2F. The encoder of TP-N2F employs TPR 'binding' to encode natural-language symbolic structure in vector space, and the decoder uses TPR 'unbinding' to generate, in symbolic space, a sequence of relational tuples, each consisting of a relation (or operation) and a number of arguments. TP-N2F considerably outperforms LSTM-based Seq2Seq models, establishing new state-of-the-art results on two benchmarks: the MathQA dataset for math problem solving and the AlgoLisp dataset for program synthesis. Ablation studies show that the improvements are mainly attributable to the use of TPRs in both the encoder and the decoder to explicitly capture relational structure information for symbolic reasoning.
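To make the binding/unbinding terminology concrete, below is a minimal NumPy sketch of the underlying TPR algebra: a structure is built by summing outer products of filler and role vectors, and a filler is recovered by contracting the resulting tensor with the corresponding (here orthonormal) role vector. This is only an illustration of the general TPR mechanism, not the paper's architecture; in TP-N2F the filler, role, and unbinding vectors would be produced by learned networks rather than fixed like the vectors named below.

```python
import numpy as np

# Illustrative TPR binding/unbinding (not the TP-N2F model itself).
# Each symbolic constituent is a (filler, role) pair; binding is an outer
# product, and the whole structure is the sum of all bindings.

d_filler, d_role, n = 8, 8, 3
rng = np.random.default_rng(0)

# Orthonormal role vectors, so each role is its own unbinding vector.
roles = np.linalg.qr(rng.normal(size=(d_role, d_role)))[0][:n]
fillers = rng.normal(size=(n, d_filler))

# Binding: T = sum_i  f_i (outer) r_i
T = sum(np.outer(f, r) for f, r in zip(fillers, roles))

# Unbinding: contract T with a role vector to recover its filler.
recovered = T @ roles[1]
assert np.allclose(recovered, fillers[1], atol=1e-6)
```

With non-orthonormal or learned roles, exact recovery would instead use dual (unbinding) vectors satisfying r_i · u_j = δ_ij; the abstract's encoder-side binding and decoder-side unbinding refer to learned analogues of these two operations.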