In recent years, BERT-based encoders have been widely used in aspect term sentiment analysis (ATSA) tasks. Many approaches feed the text and the aspect term into a BERT sentence encoder separately in order to obtain context hidden vectors and aspect-word hidden vectors. However, the semantic relevance between these separately extracted context hidden vectors and aspect-word hidden vectors is poor, and they are easily affected by irrelevant words. Therefore, this paper proposes the CGBN model, which uses only the sentence sequence as the input to the BERT encoder; for the first time, context hidden vectors and aspect-word hidden vectors containing rich semantic association information can be extracted simultaneously. In addition, this paper proposes a new interactive gating mechanism called a co-gate. Compared with a general interactive feature-extraction mechanism, it not only effectively reduces the interference of noisy words but also better fuses the information of the context and the aspect term and captures sentiment-related semantic features. To enhance BERT's ability to be fine-tuned on domain data, the pretrained weights of BERT Post-Training (BERT-PT) are used to fine-tune the CGBN model. A domain-adaptation method that combines training sets is also applied, further improving training on the target-domain data. Experiments and analysis demonstrate the validity of the model.
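To make the single-sequence input concrete, the following is a minimal sketch, assuming the Hugging Face transformers API and a bert-base-uncased checkpoint; the example sentence, the character-span lookup for the aspect term, and the mean pooling are hypothetical illustrations rather than the paper's exact procedure:

```python
import torch
from transformers import BertModel, BertTokenizerFast

# Single-pass extraction sketch. The model name, span lookup, and mean
# pooling are illustrative assumptions, not details taken from the paper.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

sentence = "The battery life is great but the screen is dim"
aspect = "battery life"

enc = tokenizer(sentence, return_tensors="pt", return_offsets_mapping=True)
offsets = enc.pop("offset_mapping")[0]            # (seq_len, 2) char span per token
with torch.no_grad():
    hidden = model(**enc).last_hidden_state       # (1, seq_len, hidden)

# Mark the aspect term's tokens by character span; special tokens have (0, 0).
start = sentence.index(aspect)
end = start + len(aspect)
aspect_mask = (offsets[:, 0] >= start) & (offsets[:, 1] <= end) & (offsets[:, 1] > 0)

context_vecs = hidden[0]                          # context hidden vectors, one pass
aspect_vecs = hidden[0][aspect_mask]              # aspect hidden vectors, same pass
aspect_pooled = aspect_vecs.mean(dim=0)           # simple mean pooling (assumption)
```

Because both sets of vectors come from the same forward pass over the bare sentence, the aspect representations are contextualized by the surrounding words rather than encoded in isolation.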
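The abstract does not give the co-gate equations, so the module below is only a hypothetical PyTorch sketch of an interactive gating layer in that spirit: sigmoid gates computed jointly from both streams decide how much of each context token and how much aspect information survive the fusion. The layer names and the fusion rule are assumptions.

```python
import torch
import torch.nn as nn

class CoGate(nn.Module):
    """Hypothetical co-gate sketch: context and aspect representations gate
    each other, damping noisy words before the two streams are fused."""

    def __init__(self, hidden: int) -> None:
        super().__init__()
        self.ctx_gate = nn.Linear(2 * hidden, hidden)
        self.asp_gate = nn.Linear(2 * hidden, hidden)

    def forward(self, ctx: torch.Tensor, asp: torch.Tensor) -> torch.Tensor:
        # ctx: (batch, seq_len, hidden) context hidden vectors
        # asp: (batch, hidden) pooled aspect-term vector
        asp_seq = asp.unsqueeze(1).expand_as(ctx)   # broadcast aspect over tokens
        pair = torch.cat([ctx, asp_seq], dim=-1)
        g_ctx = torch.sigmoid(self.ctx_gate(pair))  # how much each context token passes
        g_asp = torch.sigmoid(self.asp_gate(pair))  # how much aspect info each position takes
        return g_ctx * ctx + g_asp * asp_seq        # gated interactive fusion

# Usage: fuse a batch of 2 sentences, 16 tokens, BERT-base hidden size 768.
fused = CoGate(hidden=768)(torch.randn(2, 16, 768), torch.randn(2, 768))
print(fused.shape)  # torch.Size([2, 16, 768])
```

In such a design, the sigmoid outputs act as soft masks, which is one plausible way to realize the abstract's claim of suppressing irrelevant words while fusing context and aspect information.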