
AUTHOR(S):

Manisha Thakkar, Nitin N Pise

 

TITLE

Leveraging Transformer-based Pretrained Language Model for Task-oriented Dialogue System


ABSTRACT

Task-oriented dialogue (TOD) systems are becoming increasingly popular due to their wide acceptance at both the personal and enterprise levels. These systems assist users in accomplishing an intended task, taking natural language input as text or speech. Designing TOD systems is complex because maintaining the dialogue flow over the course of a conversation is difficult, and neural TOD systems require large amounts of task-specific annotated data. Recent advances in Pretrained Language Models (PLMs) have shown promising results in overcoming this data scarcity. In this paper, we studied the application of transformer-based PLMs to TOD system tasks and compared their performance.
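A text-to-text PLM such as T5 can only consume TOD subtasks (e.g., dialogue state tracking) once the dialogue history and belief state are flattened into a single string. The sketch below illustrates one possible serialization; the speaker tags, `<belief>` marker, and `domain-slot=value` format are illustrative assumptions, not the paper's exact scheme.

```python
def serialize_turn(history, belief_state):
    """Flatten a dialogue turn into one text-to-text input string.

    history: list of (speaker, utterance) pairs in turn order.
    belief_state: dict mapping (domain, slot) -> value.
    The tagging scheme here is illustrative only; T5-based TOD
    pipelines each define their own serialization format.
    """
    parts = [f"<{speaker}> {utterance}" for speaker, utterance in history]
    slots = ", ".join(
        f"{domain}-{slot}={value}"
        for (domain, slot), value in sorted(belief_state.items())
    )
    return " ".join(parts) + f" <belief> {slots}"


history = [
    ("user", "I need a cheap hotel in the north."),
    ("system", "Sure, for how many nights?"),
    ("user", "Three nights, please."),
]
belief = {
    ("hotel", "price"): "cheap",
    ("hotel", "area"): "north",
    ("hotel", "stay"): "3",
}
print(serialize_turn(history, belief))
```

The resulting string can be fed directly to a sequence-to-sequence model, with the target side being another serialized string (e.g., the updated belief state or the system response), which is what makes a single pretrained model reusable across TOD subtasks.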

KEYWORDS

Task-oriented dialogue system, End-to-end systems, Pretrained Language Models, Text-to-Text Transfer Transformer (T5)

 

Cite this paper

Manisha Thakkar, Nitin N Pise. (2023) Leveraging Transformer-based Pretrained Language Model for Task-oriented Dialogue System. International Journal of Computers, 8, 1-4

 

Copyright © 2023 Author(s) retain the copyright of this article.
This article is published under the terms of the Creative Commons Attribution License 4.0