Relation classification is a crucial ingredient in numerous information-extraction systems and has attracted a great deal of attention in recent years. Traditional approaches rely largely on feature engineering and suffer from limited domain adaptation and error propagation. To overcome these problems, many deep neural network-based methods have been proposed; however, they cannot effectively locate and exploit relation trigger features. To locate relation trigger features and make full use of them, we propose a novel multi-gram convolutional neural network-based self-attention model within a recurrent neural network framework. The multi-gram convolutional neural network attention model learns the adaptive relational semantics of the input, exploiting the fact that a relation can be fully defined by the shortest dependency path between its two entities. From the learned relational semantics, we obtain the corresponding importance distribution over the input sentence and thereby locate the relation trigger features. For effective information propagation and integration, we utilize a bidirectional gated recurrent unit to encode the high-level features during recurrent propagation. Experimental results on two benchmark datasets demonstrate that the proposed model outperforms most state-of-the-art models.
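To make the described pipeline concrete, the following is a minimal PyTorch sketch of an architecture of this kind: multi-gram convolutions score each token, the resulting attention distribution re-weights the sentence, and a bidirectional GRU encodes the weighted sequence for relation classification. All layer sizes, kernel widths, the pooling choice, and the way the scores are combined are illustrative assumptions, not the authors' published configuration, and the sketch omits the shortest-dependency-path input the abstract refers to.

```python
# Hedged sketch of a multi-gram CNN self-attention + BiGRU relation classifier.
# Hyperparameters (emb_dim, hidden, grams, n_relations) are assumed, not from
# the paper; the goal is only to illustrate the data flow the abstract describes.
import torch
import torch.nn as nn


class MultiGramAttnRC(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=128,
                 n_relations=19, grams=(2, 3, 4, 5)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # One 1-D convolution per n-gram width; padding keeps the output
        # roughly sequence-length so per-token scores line up with tokens.
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, hidden, k, padding=k // 2) for k in grams
        )
        # Maps concatenated multi-gram features to one score per token.
        self.attn_score = nn.Linear(hidden * len(grams), 1)
        self.bigru = nn.GRU(emb_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, n_relations)

    def forward(self, tokens):                          # tokens: (B, T)
        x = self.embed(tokens)                          # (B, T, E)
        # Multi-gram CNN features over the token sequence.
        feats = [torch.relu(c(x.transpose(1, 2))) for c in self.convs]
        feats = [f[..., :x.size(1)] for f in feats]     # trim even-kernel pad
        feats = torch.cat(feats, dim=1).transpose(1, 2)  # (B, T, H*|grams|)
        # Importance distribution over the sentence: tokens carrying
        # relation-trigger n-grams should receive higher weight.
        alpha = torch.softmax(self.attn_score(feats).squeeze(-1), dim=1)
        weighted = x * alpha.unsqueeze(-1)              # re-weight embeddings
        # BiGRU integrates the attention-weighted features in both directions.
        out, _ = self.bigru(weighted)                   # (B, T, 2H)
        sent = out.max(dim=1).values                    # max-pool over time
        return self.classifier(sent)                    # relation logits
```

For example, `MultiGramAttnRC(vocab_size=20000)(torch.randint(0, 20000, (4, 30)))` returns a `(4, 19)` tensor of relation logits for a batch of four 30-token sentences.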