This paper introduces a novel method that integrates structural information into the training of deep neural models for solving math word problems. Prior works adopt graph structures to represent the rich information residing in the input sentences; however, they do not consider the different relation types that hold between parts of a sentence. To provide various types of structural information in a uniform way, we propose a graph transformer encoder that integrates heterogeneous graphs built from different input representations. We develop two graph structures: the Dependency Graph, which maintains long-distance lexical dependencies between words and quantities, and the Question Overlap Graph, which captures the gist of the problem body. The two graphs are merged into a single graph before being encoded by the graph transformer. Experimental results show that our method produces competitive results compared to the baselines; our model outperforms state-of-the-art models by nearly three percent in Equation and Answer accuracy on the SVAMP benchmark. Moreover, we discuss how integrating different types of textual characteristics may improve the quality of mathematical logic inference from natural language sentences.
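To make the architecture concrete, below is a minimal sketch of the general technique the abstract describes: two typed graphs (a dependency graph and a question-overlap graph) are merged into one heterogeneous adjacency matrix, and a graph-transformer layer biases and masks its attention by edge type. This is an illustration under assumptions, not the paper's implementation; the relation-type ids (`DEP_EDGE`, `OVERLAP_EDGE`), the per-type attention bias, and the merging rule are all hypothetical choices.

```python
# Hypothetical sketch of a heterogeneous graph-transformer layer.
# Not the paper's code: edge-type names, the learned per-type attention
# bias, and the graph-merging rule are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

NO_EDGE, DEP_EDGE, OVERLAP_EDGE = 0, 1, 2  # hypothetical relation types


class GraphTransformerLayer(nn.Module):
    def __init__(self, d_model: int, num_edge_types: int = 3):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One learned attention bias per relation type, so heterogeneous
        # edges (dependency vs. question overlap) are treated uniformly
        # but not identically.
        self.edge_bias = nn.Embedding(num_edge_types, 1)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor, edge_types: torch.Tensor) -> torch.Tensor:
        # x: (n, d) token states; edge_types: (n, n) long matrix of relation ids.
        n, d = x.shape
        scores = self.q(x) @ self.k(x).T / d ** 0.5          # (n, n)
        scores = scores + self.edge_bias(edge_types).squeeze(-1)
        # Mask unrelated pairs so attention follows the graph; keep the
        # diagonal attendable so isolated nodes still have a valid softmax.
        mask = (edge_types == NO_EDGE) & ~torch.eye(n, dtype=torch.bool, device=x.device)
        scores = scores.masked_fill(mask, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        return self.out(attn @ self.v(x))


# Toy usage: merge the two graphs into a single typed adjacency.
n, d = 5, 16
dep = torch.zeros(n, n, dtype=torch.long)
dep[0, 3] = dep[3, 0] = DEP_EDGE              # e.g., a head-dependent pair
overlap = torch.zeros(n, n, dtype=torch.long)
overlap[1, 4] = overlap[4, 1] = OVERLAP_EDGE  # word shared with the question
merged = torch.maximum(dep, overlap)          # one graph, typed edges preserved
states = GraphTransformerLayer(d)(torch.randn(n, d), merged)
```

Encoding both graphs as one typed adjacency, rather than running two separate encoders, is one plausible reading of the abstract's "encoded as a single graph" step.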