
Piecewise convolutional neural networks with position attention and similar bag attention for distant supervision relation extraction

In relation extraction tasks, distant supervision is a very effective method: it automatically generates training data by aligning knowledge bases (KBs) with texts, thereby avoiding the cost of manually labeling data. However, distant supervision inevitably introduces wrongly labeled instances. This paper presents a neural relation extraction method that addresses two problems in the data generated by distant supervision: noisy words within sentences and poor feature information in one-sentence bags. Previous studies mainly focus on sentence-level or bag-level denoising through neural network design. In this paper, we propose a piecewise convolutional neural network with position attention and similar bag attention for distant supervision relation extraction (PCNN-PATT-SBA). First, we propose a position attention mechanism based on the Gaussian distribution, which models the positional relationship between non-entity words and entity words to assign weights to the words of a sentence, reducing the influence of noisy words. In addition, we propose a similar bag attention mechanism based on the feature similarity between different bags, which merges the features of similar bags to compensate for the poor feature information of one-sentence bags. Experimental results on the New York Times dataset demonstrate the effectiveness of the proposed position attention and similar bag attention modules, and our method achieves better relation extraction accuracy than state-of-the-art methods on this dataset. Compared with the bag-of-sentence attention model, the P value increases by 6.9%; compared with selective attention over instances (PCNN-ATT), by 25.6%; and compared with instance-level adversarial training (PCNN-HATT), by 12.1%.
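The abstract does not give the exact formulations, but the two attention modules it describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the `sigma` and `temperature` hyperparameters, and the use of nearest-entity distance and cosine similarity are all assumptions made for the sketch.

```python
import numpy as np

def gaussian_position_attention(seq_len, entity_positions, sigma=2.0):
    """Sketch of Gaussian position attention (illustrative, not the paper's
    exact formula): each token is weighted by its distance to the nearest
    entity token under a Gaussian kernel, then the weights are normalized
    over the sentence. Tokens far from both entities (likely noise) are
    down-weighted."""
    positions = np.arange(seq_len)
    # Distance from each token to its nearest entity token.
    dists = np.min(
        np.abs(positions[:, None] - np.asarray(entity_positions)[None, :]),
        axis=1,
    )
    weights = np.exp(-dists ** 2 / (2.0 * sigma ** 2))
    return weights / weights.sum()

def similar_bag_attention(bag_features, query_idx, temperature=1.0):
    """Sketch of similar bag attention (illustrative): enrich a one-sentence
    bag's feature vector by mixing in the features of similar bags,
    weighted by a softmax over cosine similarity."""
    feats = np.asarray(bag_features, dtype=float)
    q = feats[query_idx]
    norms = np.linalg.norm(feats, axis=1) * np.linalg.norm(q)
    sims = feats @ q / np.clip(norms, 1e-9, None)
    attn = np.exp(sims / temperature)
    attn /= attn.sum()
    # Weighted combination of all bag features as the enriched representation.
    return attn @ feats

# Example: an 8-token sentence with entity mentions at positions 1 and 6.
w = gaussian_position_attention(8, [1, 6])
# Entity tokens receive the largest weights; distant tokens are suppressed.
```

The key design point in both sketches is that the weighting is soft rather than hard: noisy words and dissimilar bags are down-weighted instead of being discarded outright, which matches the denoising role the abstract assigns to the two modules.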

Keywords: attention; relation extraction; distant supervision

Journal Title: Applied Intelligence
Year Published: 2022
