Abstract: Text style transfer is an important task in the field of natural language generation. Because of the lack of parallel data, addressing this problem in an unsupervised manner is a challenge. Existing methods mainly focus on the two-style transfer task, i.e., transferring from one source style to one target style. In this paper, we first propose the task of unsupervised text multi-style transfer, which addresses the problem of efficiently transferring text from a source style to multiple target styles. To tackle this new task, we present a novel model based on the Non-Autoregressive Transformer (NAT). The model consists of two parts: a parameter-shared style-independent module and a style-dependent module. In practice, when adding a new style we only need to reinitialize the parameters of the style-dependent module and retrain the whole model, which converges quickly. Experimental results show that our model not only performs well on the two-style transfer task but also achieves good results in the multi-style scenario.
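To illustrate the split the abstract describes, the following is a minimal numpy sketch of a parameter-shared style-independent module combined with per-style style-dependent modules. It is not the authors' implementation: the dimensions, function names, and style labels are all hypothetical, and the NAT backbone is reduced to a single shared projection for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 8  # hypothetical hidden size
STYLES = ["formal", "informal", "archaic"]

# Style-independent module: parameters shared across all target styles.
shared_W = rng.normal(size=(D, D))

# Style-dependent modules: one projection per target style.
style_W = {s: rng.normal(size=(D, D)) for s in STYLES}

def transfer(h, style):
    """Map a source representation h through the shared module,
    then through the chosen style-dependent module."""
    return np.tanh(h @ shared_W) @ style_W[style]

# Supporting a new target style only requires reinitializing (and then
# retraining) its style-dependent module; the shared module is kept.
style_W["poetic"] = rng.normal(size=(D, D))

h = rng.normal(size=(D,))
out = transfer(h, "poetic")
print(out.shape)  # (8,)
```

The point of the split is that the expensive, style-agnostic representation machinery is learned once, while each additional target style adds only a small set of parameters.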