
An attention-based multi-task model for named entity recognition and intent analysis of Chinese online medical questions


In this paper, we propose an attention-based multi-task neural network model for text classification and sequence tagging, and apply it to named entity recognition and intent analysis of Chinese online medical questions. We found that using both attention and multi-task learning improved performance on these tasks. Our method achieved superior performance in named entity recognition and intent analysis compared with baseline methods, and it is a lightweight solution suitable for deployment on small servers. Furthermore, we took advantage of the model's capabilities on these two tasks and built a simple question-answering system for cardiovascular issues. Users and service providers can monitor the logic of the answers generated by this system.
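
The abstract describes a shared model with attention that serves two task-specific outputs: token-level sequence tagging (named entity recognition) and sentence-level text classification (intent analysis). The sketch below illustrates that general pattern in PyTorch as a minimal example; the BiLSTM encoder, additive attention pooling, layer sizes, and names such as MultiTaskTagger are assumptions for illustration, not the paper's actual architecture.

```python
# Minimal sketch of an attention-based multi-task model: a shared encoder
# feeds a token-level tagging head (NER) and an attention-pooled sentence
# representation for a classification head (intent analysis).
import torch
import torch.nn as nn


class MultiTaskTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, num_intents,
                 emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Shared BiLSTM encoder used by both tasks (assumed encoder choice).
        self.encoder = nn.LSTM(emb_dim, hidden_dim // 2, batch_first=True,
                               bidirectional=True)
        # Additive attention scores for pooling the sentence representation.
        self.attn_score = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, 1))
        # Task-specific heads: per-token entity tags and sentence-level intent.
        self.tag_head = nn.Linear(hidden_dim, num_tags)
        self.intent_head = nn.Linear(hidden_dim, num_intents)

    def forward(self, token_ids, mask):
        h, _ = self.encoder(self.embed(token_ids))       # (B, T, hidden_dim)
        tag_logits = self.tag_head(h)                    # NER tag scores
        scores = self.attn_score(h).squeeze(-1)          # (B, T)
        scores = scores.masked_fill(~mask, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        sentence = (weights * h).sum(dim=1)              # attention pooling
        intent_logits = self.intent_head(sentence)       # intent class scores
        return tag_logits, intent_logits
```

Training such a model would typically minimize a weighted sum of a token-level cross-entropy loss for the tags and a sentence-level cross-entropy loss for the intent, which is one common way to realize multi-task learning over a shared encoder.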

Keywords: multi-task learning; named entity recognition; intent analysis; attention

Journal Title: Journal of Biomedical Informatics
Year Published: 2020
