
Recurrent Translation-Based Network for Top-N Sparse Sequential Recommendation



Fulfilling users’ needs and increasing the retention rate of recommendation systems are challenging, because in most systems the majority of users have consumed only a few items. Translation-based models perform well on such sparse datasets; however, they consider only the user and a single previous item when suggesting the next item. Recurrent neural networks, in contrast, exploit full sequential dependency but perform poorly on sparse datasets. We unify both approaches and propose the Recurrent Translation-based Network (RTN). RTN utilizes the entire sequence of a user’s consumed items rather than limiting item interactions to the most recent one. Experiments on real-world datasets show that RTN outperforms state-of-the-art approaches on sparse datasets.
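The abstract only sketches the architecture, so the snippet below is a minimal, hypothetical illustration of the stated idea (not the authors' implementation): a recurrent encoder summarizes the whole consumed-item sequence, and a per-user translation vector shifts that summary toward the embedding of the next item, in the spirit of translation-based models such as TransRec. All class and parameter names, dimensions, and the exact way the two parts are combined are assumptions.

```python
import torch
import torch.nn as nn

class RecurrentTranslationSketch(nn.Module):
    """Hypothetical sketch combining a sequence encoder with user translation."""

    def __init__(self, num_users, num_items, dim=64):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, dim)   # per-user translation vector
        self.item_emb = nn.Embedding(num_items, dim)   # shared item embeddings
        self.rnn = nn.GRU(dim, dim, batch_first=True)  # encodes the full item history

    def forward(self, user_ids, item_seqs):
        """Score every catalog item as the next item for each (user, history) pair.

        user_ids:  (batch,)      owners of the histories
        item_seqs: (batch, seq)  previously consumed item ids
        returns:   (batch, num_items) scores
        """
        seq_emb = self.item_emb(item_seqs)         # (batch, seq, dim)
        _, h = self.rnn(seq_emb)                   # summary of the whole history
        history = h.squeeze(0)                     # (batch, dim)
        # Translation step (assumed combination): the user vector shifts the
        # history representation toward the next item's embedding.
        query = history + self.user_emb(user_ids)  # (batch, dim)
        return query @ self.item_emb.weight.T      # similarity to all items

# Usage sketch: rank items for one user with a three-item history.
model = RecurrentTranslationSketch(num_users=100, num_items=500)
scores = model(torch.tensor([7]), torch.tensor([[12, 40, 3]]))
top5 = scores.topk(5).indices  # Top-N recommendation
```

The key contrast with a plain translation-based model is that the query is built from the whole encoded sequence rather than from the most recent item alone; how RTN actually fuses the two signals is detailed in the full paper.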

Keywords: recurrent translation-based network; translation; recommendation; translation-based

Journal Title: IEEE Access
Year Published: 2019


