In recent years, hardware advancements have enabled natural language processing tasks that were previously difficult to achieve because of their intensive computational requirements. This study focuses on paraphrase generation, which entails rewriting a sentence using different words and sentence structures while preserving its original meaning. Paraphrasing increases sentence diversity, thereby improving the performance of downstream tasks, such as question-answering systems and machine translation. This study proposes a novel paraphrase generation model that combines the Transformer architecture with part-of-speech features; the model is trained on a Chinese corpus. The part-of-speech features are incorporated to improve the performance of the Transformer architecture, and a pointer-generator network is used to handle low-frequency words in the training data. This allows the model to focus on informative input words according to their attention distributions.
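The abstract does not specify how the part-of-speech features and the copy mechanism are wired together, so the following is only a minimal sketch of one plausible arrangement: POS-tag embeddings summed with token embeddings at the encoder input, and a pointer-generator gate that mixes the decoder's vocabulary distribution with a copy distribution derived from attention over the source. All names here (POSTransformerPointerGen, num_pos_tags, p_gen_proj, and so on) are hypothetical, and positional encodings and attention masks are omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class POSTransformerPointerGen(nn.Module):
    """Sketch: Transformer whose encoder input combines token and POS-tag
    embeddings, with a pointer-generator head that mixes the decoder's
    vocabulary distribution with a copy distribution over source tokens.
    Positional encodings and attention masks are omitted for brevity."""

    def __init__(self, vocab_size, num_pos_tags, d_model=256, nhead=8,
                 num_layers=4):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_tag_emb = nn.Embedding(num_pos_tags, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.copy_attn = nn.MultiheadAttention(d_model, nhead,
                                               batch_first=True)
        self.vocab_proj = nn.Linear(d_model, vocab_size)
        self.p_gen_proj = nn.Linear(d_model, 1)  # generate-vs-copy gate

    def forward(self, src_ids, src_pos, tgt_ids):
        # POS-tag embeddings are summed with token embeddings here;
        # concatenation followed by a projection is an equally common choice.
        src = self.tok_emb(src_ids) + self.pos_tag_emb(src_pos)
        tgt = self.tok_emb(tgt_ids)
        memory = self.transformer.encoder(src)
        dec = self.transformer.decoder(tgt, memory)

        # Attention of each decoder step over the source provides the copy
        # distribution: (batch, tgt_len, src_len), averaged over heads.
        _, attn_weights = self.copy_attn(dec, memory, memory)

        p_gen = torch.sigmoid(self.p_gen_proj(dec))           # (B, T, 1)
        vocab_dist = F.softmax(self.vocab_proj(dec), dim=-1)  # (B, T, V)

        # Scatter copy probabilities onto the vocabulary ids of the source
        # tokens, so low-frequency source words can still be reproduced.
        copy_dist = torch.zeros_like(vocab_dist)
        copy_dist.scatter_add_(
            2, src_ids.unsqueeze(1).expand_as(attn_weights), attn_weights)

        return p_gen * vocab_dist + (1.0 - p_gen) * copy_dist
```

A quick smoke test of the sketch, again with made-up sizes:

```python
# Hypothetical usage: 2 sentences, 10 source tokens, 7 target tokens.
model = POSTransformerPointerGen(vocab_size=30000, num_pos_tags=40)
probs = model(torch.randint(0, 30000, (2, 10)),  # source token ids
              torch.randint(0, 40, (2, 10)),     # source POS-tag ids
              torch.randint(0, 30000, (2, 7)))   # target token ids
print(probs.shape)  # torch.Size([2, 7, 30000])
```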
               