
Convolutional End-to-End Memory Networks for Multi-Hop Reasoning


Machine reading and comprehension using differentiable reasoning models has recently been studied extensively, and memory networks have demonstrated promising performance on reasoning tasks such as factual reasoning and basic deduction. However, as a natural language understanding model, memory networks still face challenges in the numeric representation of sentences, particularly the text representation method and the effectiveness of the learned vector representations. In this paper, inspired by the convolution mechanism in the computer vision domain, a raw-text representation architecture for question answering, named convolutional end-to-end memory networks (CMemN2N), is proposed. The convolutional architecture of the proposed model abstracts the local information useful for reasoning into a meaningful numeric sentence representation that is passed on to the follow-up sub-tasks. Our experiments show that CMemN2N achieves better results on most of the 20 bAbI tasks, yielding an improved average result compared to the state of the art.
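
To make the described idea concrete, below is a minimal sketch of how a convolutional sentence encoder could replace the usual bag-of-words or position encoding inside an end-to-end memory network (MemN2N) with multiple reasoning hops. It assumes a PyTorch implementation; the class names (ConvSentenceEncoder, CMemN2N), the kernel size, the embedding dimension, and the number of hops are illustrative assumptions, not the authors' released code.

# Sketch of a convolutional end-to-end memory network (assumed architecture,
# not the paper's reference implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvSentenceEncoder(nn.Module):
    """Encode a tokenized sentence into a fixed-size vector with 1-D convolution."""

    def __init__(self, vocab_size, embed_dim, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Convolution over the word dimension extracts local n-gram features.
        self.conv = nn.Conv1d(embed_dim, embed_dim, kernel_size,
                              padding=kernel_size // 2)

    def forward(self, tokens):
        # tokens: (batch, num_sentences, sentence_len) of word IDs
        b, n, l = tokens.shape
        x = self.embed(tokens.view(b * n, l))           # (b*n, l, d)
        x = self.conv(x.transpose(1, 2))                # (b*n, d, l)
        x = F.relu(x).max(dim=2).values                 # max-pool over words -> (b*n, d)
        return x.view(b, n, -1)                         # (b, n, d)


class CMemN2N(nn.Module):
    """End-to-end memory network whose memories come from the conv encoder."""

    def __init__(self, vocab_size, embed_dim=64, hops=3):
        super().__init__()
        self.hops = hops
        self.encoder_in = ConvSentenceEncoder(vocab_size, embed_dim)   # memory keys
        self.encoder_out = ConvSentenceEncoder(vocab_size, embed_dim)  # memory values
        self.encoder_q = ConvSentenceEncoder(vocab_size, embed_dim)    # question
        self.out = nn.Linear(embed_dim, vocab_size)

    def forward(self, story, question):
        # story: (batch, num_sentences, sent_len), question: (batch, q_len)
        m = self.encoder_in(story)                      # (b, n, d)
        c = self.encoder_out(story)                     # (b, n, d)
        u = self.encoder_q(question.unsqueeze(1)).squeeze(1)  # (b, d)
        for _ in range(self.hops):
            # Attention over memories, read, and update the query for the next hop.
            p = F.softmax(torch.bmm(m, u.unsqueeze(2)).squeeze(2), dim=1)
            o = torch.bmm(p.unsqueeze(1), c).squeeze(1)
            u = u + o
        return self.out(u)                              # answer logits over vocab

In use, story would be a batch of padded token-ID sentences of shape (batch, sentences, words) and question a padded token-ID tensor of shape (batch, words); the output is a score over candidate answer words, as in the standard bAbI setup.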

Keywords: convolutional end-to-end memory networks; memory networks

Journal Title: IEEE Access
Year Published: 2019
