
Hierarchical attention based long short-term memory for Chinese lyric generation


Automating lyric generation faces the challenge of producing output that is meaningful and semantically related to a given scenario. Traditional keyword- or template-based lyric generation systems ignore the patterns and styles of individual lyricists and suffer from improper lyric construction and maintenance. A Chinese lyric generation system is proposed that learns the patterns and styles of particular lyricists and generates lyrics automatically. A long short-term memory (LSTM) network processes each lyric line and generates the next line word by word. A hierarchical attention model captures contextual information at both the sentence and document level, learning high-level representations of each lyric line and of the entire document. An LSTM decoder then decodes this semantic contextual information into lyric lines word by word. Evaluation of the automatically generated lyrics shows that the proposed method correctly captures the patterns and styles of a given lyricist, fits the intended scenarios, and outperforms state-of-the-art models.
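The abstract describes a word-level LSTM encoder per lyric line, attention pooling at the sentence and document levels, and an LSTM decoder that emits the next line word by word. The following is a minimal PyTorch-style sketch of that kind of architecture; the class names, dimensions, and additive-attention formulation are illustrative assumptions, not the authors' released code.

```python
# Illustrative sketch of a hierarchical-attention LSTM lyric generator.
# All names, dimensions, and the attention formulation are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Attention(nn.Module):
    """Additive attention that pools a sequence of hidden states into one vector."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.score = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, states):                # states: (batch, seq, hidden)
        energy = self.score(torch.tanh(self.proj(states)))   # (batch, seq, 1)
        weights = F.softmax(energy, dim=1)
        return (weights * states).sum(dim=1)  # (batch, hidden)


class HierarchicalLyricModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Word-level encoder: one LSTM pass per lyric line.
        self.line_encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.word_attn = Attention(hidden_dim)   # sentence-level attention
        # Document-level encoder over the sequence of line vectors.
        self.doc_encoder = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.line_attn = Attention(hidden_dim)   # document-level attention
        # Decoder generates the next line word by word, conditioned on context.
        self.decoder = nn.LSTM(embed_dim + hidden_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, lines, decoder_input):
        # lines: (batch, n_lines, line_len) token ids of previous lyric lines
        # decoder_input: (batch, tgt_len) token ids of the next line (teacher forcing)
        b, n_lines, line_len = lines.shape
        words = self.embed(lines.view(b * n_lines, line_len))
        word_states, _ = self.line_encoder(words)          # (b*n_lines, line_len, h)
        line_vecs = self.word_attn(word_states).view(b, n_lines, -1)
        doc_states, _ = self.doc_encoder(line_vecs)        # (b, n_lines, h)
        context = self.line_attn(doc_states)               # (b, h)

        tgt = self.embed(decoder_input)                    # (b, tgt_len, e)
        ctx = context.unsqueeze(1).expand(-1, tgt.size(1), -1)
        dec_states, _ = self.decoder(torch.cat([tgt, ctx], dim=-1))
        return self.out(dec_states)                        # (b, tgt_len, vocab)
```

Training such a model would typically minimize cross-entropy between the decoder's output distribution and the next line's tokens; at inference the decoder would be unrolled greedily or with beam search instead of teacher forcing.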

Keywords: long short-term memory; lyric generation; Chinese lyric

Journal Title: Applied Intelligence
Year Published: 2018
