
Towards unsupervised text multi-style transfer with parameter-sharing scheme



Abstract Text style transfer is an important task in the field of natural language generation. Because parallel data are scarce, addressing this problem in an unsupervised manner is challenging. Existing methods mainly focus on the two-style transfer task, i.e., transferring from one source style to one target style. In this paper, we first propose the task of unsupervised text multi-style transfer, which addresses efficient text transfer from a source style to multiple target styles. To tackle this new task, we present a novel model based on the Non-Autoregressive Transformer (NAT). The model consists of two parts: a parameter-shared style-independent module and a style-dependent module. In practice, we only need to reinitialize the parameters of the style-dependent modules and retrain the whole model, which converges quickly. Experimental results show that our model not only performs well on the two-style transfer task but also achieves promising results in the multi-style scenario.
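The parameter-sharing scheme described above can be illustrated with a minimal sketch: one shared style-independent parameter set serves all target styles, while each target style gets its own small style-dependent parameter set. Adding a new target style only reinitializes that style's parameters, leaving the shared module intact. All class names, sizes, and the combination rule below are illustrative assumptions, not the paper's actual NAT architecture.

```python
import random


class MultiStyleTransferSketch:
    """Toy sketch of the parameter-sharing scheme: a shared
    style-independent module plus one style-dependent module per
    target style. Sizes and the mixing rule are hypothetical."""

    def __init__(self, hidden_size=4, seed=0):
        self.rng = random.Random(seed)
        self.hidden_size = hidden_size
        # Style-independent parameters, shared across all target styles.
        self.shared = [self.rng.gauss(0, 1) for _ in range(hidden_size)]
        # One parameter vector per target style.
        self.style_params = {}

    def add_style(self, style):
        # Only the style-dependent module is (re)initialized; the shared
        # module keeps its parameters, which is why retraining for a new
        # target style can converge quickly.
        self.style_params[style] = [
            self.rng.gauss(0, 1) for _ in range(self.hidden_size)
        ]

    def transfer(self, features, style):
        # Stand-in for decoding: combine shared and style-specific weights.
        weights = [s + t for s, t in zip(self.shared, self.style_params[style])]
        return sum(f * w for f, w in zip(features, weights))


model = MultiStyleTransferSketch()
model.add_style("formal")
shared_before = list(model.shared)
model.add_style("humorous")            # extend to a new target style
assert model.shared == shared_before   # shared module is untouched
```

The key property the sketch demonstrates is that extending the model to an additional target style does not perturb the shared style-independent parameters.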

Keywords: style transfer; unsupervised text; multi-style transfer

Journal Title: Neurocomputing
Year Published: 2021


