Manisha Thakkar, Nitin N Pise
Task-oriented dialogue (TOD) systems are becoming increasingly popular due to their wide acceptance at both personal and enterprise levels. These systems assist users in accomplishing an intended task, taking natural language input as text or speech. Designing TOD systems is complex because maintaining the dialogue flow throughout a conversation is difficult, and neural TOD systems require large amounts of task-specific annotated data. Recent advances in Pretrained Language Models (PLMs) have shown promising results in overcoming this data scarcity. In this paper, we study the application of transformer-based PLMs to TOD system tasks and compare their performance.
Task-oriented dialogue system, End-to-End systems, Pretrained Language Models, Text-to-Text Transfer Transformer (T5)
Cite this paper
Manisha Thakkar, Nitin N Pise. (2023) Leveraging Transformer-based Pretrained Language Model for Task-oriented Dialogue System. International Journal of Computers, 8, 1-4.