In this paper, we propose an attention-based multi-task neural network model for text classification and sequence tagging, and apply it to named entity recognition and intent analysis of Chinese online medical questions. We found that combining attention with multi-task learning improved performance on both tasks. Our method outperformed baseline methods on named entity recognition and intent analysis, and it is a lightweight solution suitable for deployment on small servers. Furthermore, leveraging the model's capabilities on these two tasks, we built a simple question-answering system for cardiovascular issues. Users and service providers can inspect the logic behind the answers generated by this system.
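The abstract does not detail the architecture, but a common way to realize such a joint model is a shared encoder feeding a token-level tagging head (for NER) and an attention-pooled sentence-level head (for intent classification). The sketch below is a minimal, hypothetical illustration in PyTorch; the BiLSTM encoder, the layer sizes, and names such as `MultiTaskTagger` are our own assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a shared-encoder multi-task model with attention.
# The encoder choice (BiLSTM) and all dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class MultiTaskTagger(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_tags, num_intents):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Shared BiLSTM encoder feeding both task heads.
        self.encoder = nn.LSTM(embed_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # Token-level head for sequence tagging (e.g. NER labels).
        self.tag_head = nn.Linear(2 * hidden_dim, num_tags)
        # Attention pooling plus sentence-level head for intent classification.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.intent_head = nn.Linear(2 * hidden_dim, num_intents)

    def forward(self, token_ids):
        x = self.embed(token_ids)                     # (B, T, E)
        h, _ = self.encoder(x)                        # (B, T, 2H)
        tag_logits = self.tag_head(h)                 # (B, T, num_tags)
        # Attention weights over tokens build the sentence representation.
        weights = torch.softmax(self.attn(h), dim=1)  # (B, T, 1)
        sent = (weights * h).sum(dim=1)               # (B, 2H)
        intent_logits = self.intent_head(sent)        # (B, num_intents)
        return tag_logits, intent_logits

# Usage sketch: a joint loss would sum a token-level and a sentence-level term.
model = MultiTaskTagger(vocab_size=5000, embed_dim=128, hidden_dim=128,
                        num_tags=9, num_intents=6)
tokens = torch.randint(1, 5000, (2, 20))              # dummy batch of token ids
tag_logits, intent_logits = model(tokens)
```

Sharing the encoder lets the tagging and classification objectives regularize each other, which is consistent with the abstract's finding that multi-task learning plus attention improved both tasks.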
               