
Exploiting deep representations for natural language processing


Abstract: Advanced neural network models are generally implemented as multiple layers in order to model complex functions and capture complicated linguistic structures at different levels [1]. However, only the top layer of a deep network is typically used in subsequent processing, which misses the opportunity to exploit the useful information embedded in the other layers. In this work, we propose to expose all of these embedded signals with two types of mechanisms, namely deep connections and iterative routing. While deep connections allow better information and gradient flow across layers, iterative routing directly combines the layer representations into a final output with an iterative routing-by-agreement mechanism. Experimental results on both machine translation and language representation tasks demonstrate the effectiveness and universality of the proposed approaches, which indicates the necessity of exploiting deep representations for natural language processing tasks. Each strategy individually boosts performance, and combining them improves it further.
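The routing-by-agreement mechanism the abstract refers to can be illustrated with a minimal sketch: given the representations produced by each layer of a deep network, coupling coefficients are iteratively refined so that layers whose representations agree with the combined output receive higher weight. The function names, the number of iterations, and the capsule-style `squash` nonlinearity below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def squash(v, eps=1e-8):
    # Capsule-style nonlinearity: keeps the direction of v,
    # maps its norm into (0, 1).  (Illustrative choice.)
    n2 = np.sum(v * v)
    return (n2 / (1.0 + n2)) * v / (np.sqrt(n2) + eps)

def route_layers(layer_reps, n_iters=3):
    """Combine per-layer representations (shape [L, d]) into one
    output vector by iterative routing-by-agreement (a sketch)."""
    L, d = layer_reps.shape
    b = np.zeros(L)                       # routing logits, one per layer
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum()   # softmax -> coupling coefficients
        v = squash(c @ layer_reps)        # weighted combination, squashed
        b = b + layer_reps @ v            # "agreement": dot(layer, output)
    return v

# Toy usage: 6 "layers", each producing a 4-dimensional representation.
reps = np.random.default_rng(0).normal(size=(6, 4))
out = route_layers(reps)
```

Each iteration sharpens the coupling coefficients toward the layers that best agree with the current combined output, so the final vector is a data-dependent mixture of all layers rather than just the top one.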

Keywords: deep representations; natural language processing

Journal Title: Neurocomputing
Year Published: 2020


