Text classification is an essential task in many natural language processing (NLP) applications. In most sentences, only a few words play an important role in classification, while the remaining words have no significant effect on the result; finding these keywords has a strong impact on classification accuracy. In this paper, we propose a network model named RCNNA, recurrent convolutional neural networks with attention, which is modeled on the human conditional reflex for text classification. The model combines a bidirectional LSTM (BLSTM), an attention mechanism, and convolutional neural networks (CNNs) as the receptors, nerve centers, and effectors of the reflex arc, respectively. The receptors capture context information through the BLSTM, the nerve centers extract the important information in the sentence through the attention mechanism, and the effectors capture further key information through the CNN. Finally, the model outputs the classification result via a softmax function. We evaluate the algorithm on four text classification datasets covering Chinese and English, comparing randomly initialized word vectors against pre-trained word vectors. The experiments show that RCNNA achieves the best performance compared with state-of-the-art baseline methods.
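The BLSTM-attention-CNN pipeline described above can be sketched roughly as follows. This is a minimal illustrative implementation in PyTorch, not the authors' code: the class name, layer sizes, additive attention form, and pooling choice are all assumptions made for the sake of a runnable example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RCNNASketch(nn.Module):
    """Hypothetical sketch of the RCNNA pipeline from the abstract:
    BLSTM (receptors) -> attention (nerve centers) -> CNN (effectors) -> softmax."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=64,
                 num_classes=2, num_filters=32, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Receptors: bidirectional LSTM captures left and right context.
        self.blstm = nn.LSTM(embed_dim, hidden_dim,
                             bidirectional=True, batch_first=True)
        # Nerve centers: a simple additive attention scores each time step.
        self.att = nn.Linear(2 * hidden_dim, 1)
        # Effectors: 1-D convolution over the attention-weighted sequence.
        self.conv = nn.Conv1d(2 * hidden_dim, num_filters,
                              kernel_size, padding=1)
        self.fc = nn.Linear(num_filters, num_classes)

    def forward(self, x):                      # x: (batch, seq_len) token ids
        h, _ = self.blstm(self.embed(x))       # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.att(h), dim=1)  # importance per step
        h = h * weights                        # emphasize the key words
        h = self.conv(h.transpose(1, 2))       # (batch, filters, seq_len)
        h = F.max_pool1d(h, h.size(2)).squeeze(2)    # global max pooling
        return F.log_softmax(self.fc(h), dim=1)      # class log-probabilities

model = RCNNASketch(vocab_size=1000)
logits = model(torch.randint(0, 1000, (4, 20)))  # 4 sentences, 20 tokens each
print(logits.shape)  # -> torch.Size([4, 2])
```

The attention weights here realize the abstract's central idea: each BLSTM hidden state is rescaled by a learned importance score before the CNN extracts features, so the few decisive keywords dominate the classification.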