Recurrent networks with attention and convolutional networks for sentence representation and classification


In this paper, we propose a bi-attention, a multi-layer attention, and a text representation and classification model based on an attention mechanism and a convolutional neural network (ACNN). The bi-attention uses two attention mechanisms to learn two context vectors: a forward RNN with attention learns the forward context vector $\overrightarrow{\mathbf{c}}$, and a backward RNN with attention learns the backward context vector $\overleftarrow{\mathbf{c}}$; the two are then concatenated to obtain the context vector $\mathbf{c}$. The multi-layer attention is a stack of bi-attention layers. In the ACNN, the context vector $\mathbf{c}$ is obtained by the bi-attention, a convolution operation is performed on $\mathbf{c}$, and a max-pooling operation reduces the dimension, converting the text into a low-dimensional sentence vector $\mathbf{m}$. Finally, a softmax classifier is used for text classification. We test our model on 8 benchmark text classification datasets, and our model achieves better than or equal performance compared with state-of-the-art methods.
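For concreteness, below is a minimal sketch of the pipeline the abstract describes, with a single bi-attention layer. PyTorch, GRU cells, an additive attention scorer, and all layer sizes (emb_dim, hid_dim, n_filters) are illustrative assumptions; the abstract does not specify these choices.

# Sketch of the ACNN pipeline from the abstract. Assumptions: PyTorch,
# GRU encoders, additive attention, illustrative hyperparameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveRNN(nn.Module):
    """One directional RNN whose hidden states are pooled by attention
    into a single context vector."""
    def __init__(self, emb_dim, hid_dim, reverse=False):
        super().__init__()
        self.reverse = reverse
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.score = nn.Linear(hid_dim, 1)  # attention scorer

    def forward(self, x):                        # x: (batch, seq, emb)
        if self.reverse:                         # backward RNN reads right-to-left
            x = torch.flip(x, dims=[1])
        h, _ = self.rnn(x)                       # (batch, seq, hid)
        a = F.softmax(self.score(h), dim=1)      # attention weights over time steps
        return (a * h).sum(dim=1)                # context vector: (batch, hid)

class ACNN(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hid_dim=64,
                 n_filters=100, n_classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.fwd = AttentiveRNN(emb_dim, hid_dim)                # learns forward c
        self.bwd = AttentiveRNN(emb_dim, hid_dim, reverse=True)  # learns backward c
        self.conv = nn.Conv1d(1, n_filters, kernel_size=3, padding=1)
        self.fc = nn.Linear(n_filters, n_classes)

    def forward(self, tokens):                   # tokens: (batch, seq)
        x = self.emb(tokens)
        # bi-attention: concatenate forward and backward context vectors
        c = torch.cat([self.fwd(x), self.bwd(x)], dim=1)
        # convolution over c, then max-pooling down to the sentence vector m
        z = F.relu(self.conv(c.unsqueeze(1)))    # (batch, n_filters, 2*hid)
        m = F.max_pool1d(z, z.size(2)).squeeze(2)  # (batch, n_filters)
        return F.log_softmax(self.fc(m), dim=1)  # softmax classifier

The multi-layer attention variant would stack such bi-attention blocks (which requires each block to emit a sequence rather than a pooled vector); the sketch above keeps a single layer for brevity.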

Keywords: representation; classification; context vector; attention

Journal Title: Applied Intelligence
Year Published: 2018
