Injecting User Identity Into Pretrained Language Models for Document-Level Sentiment Classification

Abstract: This paper studies the combination of pre-trained language models and user identity information for document-level sentiment classification. In recent years, pre-trained language models (PLMs) such as BERT have achieved state-of-the-art results on many NLP applications, including document-level sentiment classification. Meanwhile, a line of work introduces additional information, such as user identity, for better text modeling. However, most of these works inject user identity into traditional models, and few studies have examined combining pre-trained language models with user identity for even better performance. To address this gap, we propose to unite user identity and PLMs, formulating User-enhanced Pre-trained Language Models (U-PLMs). Specifically, we demonstrate two simple yet effective approaches, embedding-based and attention-based personalization, which inject user identity into different parts of a pre-trained language model and provide personalization from different perspectives. Experiments on three datasets with two backbone PLMs show that our proposed methods outperform the best state-of-the-art baseline with absolute accuracy improvements of up to 3%, 2.8%, and 2.2%, respectively. In addition, our methods encode user identity with plugin modules that are fully compatible with most auto-encoding pre-trained language models.
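
The abstract does not spell out how user identity is injected, but a minimal sketch of the embedding-based variant, assuming a Hugging Face BERT backbone, could look like the code below. The class and parameter names (UserEnhancedBert, num_users, user_ids) are illustrative rather than the paper's own; the attention-based variant would instead condition the attention computation on the user vector rather than the input embeddings.

# Minimal sketch of embedding-based personalization (illustrative, not the
# paper's exact formulation). A learnable per-user vector is added to the
# token embeddings before they enter the BERT encoder, acting as a plugin
# module on top of an otherwise standard PLM classifier.
import torch
import torch.nn as nn
from transformers import BertModel

class UserEnhancedBert(nn.Module):
    def __init__(self, num_users: int, num_classes: int,
                 model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # One learnable vector per user identity.
        self.user_embedding = nn.Embedding(num_users, hidden)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask, user_ids):
        # Look up token embeddings from the PLM's own embedding table.
        token_embeds = self.bert.embeddings.word_embeddings(input_ids)
        # Broadcast the user vector over the sequence dimension.
        user_vec = self.user_embedding(user_ids).unsqueeze(1)   # (batch, 1, hidden)
        personalized = token_embeds + user_vec
        outputs = self.bert(inputs_embeds=personalized,
                            attention_mask=attention_mask)
        # Pool the [CLS] representation for document-level sentiment prediction.
        cls = outputs.last_hidden_state[:, 0]
        return self.classifier(cls)

# Usage (illustrative):
# model = UserEnhancedBert(num_users=10000, num_classes=5)
# logits = model(input_ids, attention_mask, user_ids)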

Keywords: user identity; pre-trained language models; document-level sentiment classification

Journal Title: IEEE Access
Year Published: 2022
