
Paraphrase Generation Model Integrating Transformer Architecture, Part-of-Speech Features, and Pointer Generator Network


In recent years, hardware advancements have enabled natural language processing tasks that were previously difficult to achieve because of their intensive computational requirements. This study focuses on paraphrase generation, which entails rewriting a sentence with different words and sentence structures while preserving its original meaning. Paraphrasing increases sentence diversity, thereby improving the performance of downstream tasks such as question-answering systems and machine translation. This study proposes a novel paraphrase generation model that combines the Transformer architecture with part-of-speech features; the model is trained on a Chinese corpus. These new features are incorporated to improve the performance of the Transformer architecture, and a pointer generator network is used when the training data contain low-frequency words, allowing the model to focus on informative input words according to their attention distributions.
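The abstract outlines two mechanisms: part-of-speech features fused into the Transformer's input representation, and a pointer generator network that mixes a vocabulary (generation) distribution with a copy distribution derived from the cross-attention weights, so rare source words can be reproduced directly. Below is a minimal PyTorch sketch of both ideas; all class names, tensor shapes, and the concatenation-based POS fusion are illustrative assumptions rather than the paper's reported implementation, and the copy mixture follows the standard pointer-generator formulation of See et al. (2017).

import torch
import torch.nn as nn


class PosAwareEmbedding(nn.Module):
    # Fuses part-of-speech features with token embeddings before the
    # Transformer encoder (assumed fusion: concatenation + projection).
    def __init__(self, vocab_size, num_pos_tags, d_model, d_pos=32):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(num_pos_tags, d_pos)
        self.proj = nn.Linear(d_model + d_pos, d_model)

    def forward(self, token_ids, pos_tag_ids):
        fused = torch.cat([self.tok_emb(token_ids), self.pos_emb(pos_tag_ids)], dim=-1)
        return self.proj(fused)


class PointerGeneratorHead(nn.Module):
    # Output layer that mixes a vocabulary distribution with a copy
    # distribution built from the decoder's cross-attention weights.
    def __init__(self, d_model, vocab_size):
        super().__init__()
        self.vocab_proj = nn.Linear(d_model, vocab_size)  # generation scores
        self.p_gen_proj = nn.Linear(d_model, 1)           # soft copy/generate switch

    def forward(self, dec_state, attn_weights, src_token_ids):
        # dec_state:     (batch, tgt_len, d_model)  decoder hidden states
        # attn_weights:  (batch, tgt_len, src_len)  cross-attention distribution
        # src_token_ids: (batch, src_len)           source token ids, incl. rare words
        p_vocab = torch.softmax(self.vocab_proj(dec_state), dim=-1)
        p_gen = torch.sigmoid(self.p_gen_proj(dec_state))  # (batch, tgt_len, 1)

        # Scatter attention mass onto the vocabulary positions of the source
        # tokens so low-frequency words can be copied directly.
        index = src_token_ids.unsqueeze(1).expand(-1, dec_state.size(1), -1)
        copy_dist = torch.zeros_like(p_vocab).scatter_add(-1, index, attn_weights)

        # Final distribution: weighted mixture of generating and copying.
        return p_gen * p_vocab + (1.0 - p_gen) * copy_dist

When p_gen is close to 0 for a decoding step, nearly all probability mass comes from the attention-weighted source tokens, which is how a copy mechanism of this kind can emit low-frequency words that a fixed output vocabulary would otherwise miss.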

Keywords: transformer architecture; generation model; paraphrase generation

Journal Title: IEEE Access
Year Published: 2023


