Quality Assessment of Automatic Paraphrasing Tool for English: An Analysis at the Syntactic Level

Abstract

Paraphrasing tools are widely regarded as useful educational resources that support academic work; both professionals and students use them to ease their writing tasks. The effectiveness of these tools, however, needs to be evaluated. This study assesses the quality of automatic paraphrasing by examining the syntactic similarities and differences between original and paraphrased text. The data comprise QuillBot's paraphrases of both literary and non-literary texts. Syntactic features were studied with the corpus tool AntConc, and the Hirst–St-Onge (HSO) measure in WordNet was used to quantify the relatedness between sentences at the syntactic level. Many variations were found between the original and the paraphrased text: QuillBot's automatic paraphrase of non-literary text is closer to the original than its paraphrase of literary text. The syntactic modifications observed include changes in word order, tense, voice, number, and grammatical category; these modifications sometimes distorted the message, while at other times they elaborated it. Automatic paraphrases should therefore be manually revised and rechecked rather than accepted as given. While tools such as QuillBot may be relied on for paraphrasing non-literary text, their automatic paraphrases of literary content need to be manually verified and revised.

Keywords: Automatic paraphrasing, Syntactic analysis, QuillBot, HSO measure, WordNet
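The HSO measure mentioned above scores a chain of WordNet relations by its length and by how often the chain changes direction (upward, downward, or horizontal links). A minimal sketch of that scoring idea is given below; the constants C = 8 and k = 1 follow the standard formulation, while the path encodings in the usage comments are hypothetical examples rather than real WordNet paths.

```python
# Sketch of the Hirst-St-Onge (HSO) medium-strong path score:
#   score = C - path_length - k * direction_changes
# A path is encoded as a list of step directions ('up', 'down', 'horizontal'),
# e.g. hypernym links are 'up' and hyponym links are 'down'. This is an
# illustrative simplification, not a full HSO implementation.

C, K = 8, 1  # constants from the standard HSO formulation

def hso_score(directions):
    """Score one relation path from its step directions.

    Returns 0 for an empty path; otherwise C minus the path length
    minus K times the number of direction changes, floored at 0.
    """
    if not directions:
        return 0
    turns = sum(1 for a, b in zip(directions, directions[1:]) if a != b)
    return max(0, C - len(directions) - K * turns)

# Hypothetical paths: a short one-link path scores higher (more related)
# than a longer path that changes direction.
print(hso_score(["up"]))                # short direct path
print(hso_score(["up", "up", "down"]))  # longer path with one turn
```

Shorter, straighter paths yield higher relatedness, which is why near-synonyms linked by one or two relations score close to the maximum while distant concepts score near zero.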
