Stylized Knowledge-Grounded Dialogue Generation via Disentangled Template Rewriting
- Qingfeng Sun
- Can Xu
- Huang Hu
- Yujing Wang
- Jian Miao
- Xiubo Geng
- Yining Chen
- Fei Xu
- Daxin Jiang (姜大昕)
NAACL 2022
Current Knowledge-Grounded Dialogue Generation (KDG) models specialize in producing rational and factual responses. However, to establish long-term relationships with users, a KDG model also needs the capability to generate responses in a desired style or attribute. Thus, we study a new problem: Stylized Knowledge-Grounded Dialogue Generation (SKDG). It presents two challenges: (1) How to train an SKDG model where no