RMAN: Relational multi-head attention neural network for joint extraction of entities and relations

The task of extracting entities and relations has evolved from distributed extraction to joint extraction. The joint model overcomes the disadvantages of the distributed extraction method and strengthens the information interaction between entities and relations. However, existing joint models rarely pay attention to the semantic information between words, which limits their ability to resolve overlapping relations. In this paper, we propose RMAN, a model for the joint extraction of entities and relations that consists of a multi-feature fusion encoder for sentence representation and a decoder for sequence annotation. We first add a multi-head attention layer after a Bi-LSTM to obtain sentence representations, and leverage the attention mechanism to capture relation-based sentence representations. Then, we perform sequence annotation on the sentence representations to obtain entity pairs. Experiments on the NYT-single, NYT-multi, and WebNLG datasets demonstrate that our model can efficiently extract overlapping triples and outperforms other baselines.
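To make the pipeline in the abstract concrete, the following is a minimal PyTorch sketch of the described flow (Bi-LSTM encoder, a multi-head attention layer on top, then token-level sequence annotation). The class name RMANSketch, all dimensions, and the plain linear tagger are illustrative assumptions; the paper's multi-feature fusion and relation-specific attention details are not reproduced here.

import torch
import torch.nn as nn

class RMANSketch(nn.Module):
    """Sketch of the abstract's pipeline: Bi-LSTM -> multi-head
    attention -> per-token tagging. Hyperparameters are assumptions."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128,
                 num_heads=4, num_tags=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bi-LSTM produces contextual word representations.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Multi-head attention layer added after the Bi-LSTM to capture
        # relation-based sentence representations (per the abstract).
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                          batch_first=True)
        # Decoder stand-in: token-level tag scores for sequence
        # annotation of entity pairs; a CRF could replace this.
        self.tagger = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq_len, embed_dim)
        h, _ = self.bilstm(x)       # (batch, seq_len, 2 * hidden_dim)
        a, _ = self.attn(h, h, h)   # self-attention over the sentence
        return self.tagger(a)       # (batch, seq_len, num_tags)

# Usage: tag logits for a batch of two 6-token sentences.
model = RMANSketch(vocab_size=5000)
logits = model(torch.randint(0, 5000, (2, 6)))
print(logits.shape)  # torch.Size([2, 6, 10])

In this reading, the attention output feeds the sequence-annotation step directly, so entity tagging is conditioned on the relation-aware representations rather than on the raw Bi-LSTM states.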

Keywords: entity extraction; joint extraction; attention; entity relations; multi-head

Journal Title: Applied Intelligence
Year Published: 2022
