Accumulating Conversational Skills using Continual Learning
- Sungjin Lee
IEEE SLT 2018
While neural conversational models have led to promising advances, reducing the need for hand-crafted features and the errors induced by traditional, complex system architectures, training neural models from scratch requires an enormous amount of data. If pre-trained models can be reused when they share much in common with a new task, the amount of required data can be cut down significantly. To achieve this goal, we adopt a neural continual learning algorithm that allows a conversational agent to accumulate skills across different tasks in a data-efficient way.
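The abstract does not name the specific continual learning algorithm, so the following is only an illustrative sketch of one common approach, elastic weight consolidation (EWC; Kirkpatrick et al., 2017), which penalizes changes to parameters that were important for an earlier task; the `model`, `data_loader`, and weight `lam` here are placeholders, not details from the paper.

```python
# Illustrative sketch of elastic weight consolidation (EWC), one standard
# continual learning technique; NOT necessarily the algorithm used in the
# paper. Assumes a PyTorch classifier `model` and a DataLoader over the
# old task's data.
import torch
import torch.nn.functional as F


def fisher_diagonal(model, data_loader):
    """Estimate the diagonal Fisher information of each parameter on the
    old task, used as a per-parameter importance weight."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for inputs, targets in data_loader:
        model.zero_grad()
        loss = F.cross_entropy(model(inputs), targets)
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / len(data_loader) for n, f in fisher.items()}


def ewc_penalty(model, fisher, old_params, lam=1000.0):
    """Quadratic penalty keeping parameters near their old-task values,
    weighted by how important each parameter was for the old task."""
    penalty = 0.0
    for n, p in model.named_parameters():
        penalty = penalty + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return lam / 2.0 * penalty


# After finishing the old task, snapshot its parameters and Fisher weights:
#   old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
#   fisher = fisher_diagonal(model, old_task_loader)
# Then train on the new task with the consolidated objective, so the agent
# acquires the new skill without overwriting the old one:
#   loss = F.cross_entropy(model(x), y) + ewc_penalty(model, fisher, old_params)
```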