Fulfilling users’ needs and increasing the retention rate of recommendation systems are challenging. In most systems, the majority of users have consumed only a few items. Translation-based models perform well on such sparse datasets, but they recommend the next item from only the user and the single most recently consumed item. Recurrent neural networks, by contrast, exploit sequential dependencies across the whole consumption history but perform poorly on sparse datasets. We unify the two approaches and propose the Recurrent Translation-based Network (RTN). RTN exploits the full sequence of a user’s consumed items rather than limiting item-to-item interactions to the most recent one. Experiments on real-world datasets show that RTN outperforms state-of-the-art approaches on sparse datasets.
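As a rough illustration of the idea described in the abstract (not the authors’ implementation, whose details the abstract does not give), the sketch below encodes a user’s full consumption sequence with a GRU and scores the next item by how close the translated point, user embedding plus sequence encoding, lands to the candidate item embedding. All class names, layer sizes, and the distance-based scoring rule are illustrative assumptions.

```python
# Hedged sketch of a recurrent translation-style recommender.
# Assumptions (not from the paper): GRU encoder, additive translation,
# negative squared distance as the ranking score, embedding size 64.
import torch
import torch.nn as nn


class RecurrentTranslationSketch(nn.Module):
    def __init__(self, num_users, num_items, dim=64):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, dim)
        self.item_emb = nn.Embedding(num_items, dim)
        self.gru = nn.GRU(dim, dim, batch_first=True)

    def forward(self, user_ids, item_seqs, candidate_ids):
        # user_ids: (B,), item_seqs: (B, T), candidate_ids: (B, C)
        u = self.user_emb(user_ids)                      # (B, D)
        seq = self.item_emb(item_seqs)                   # (B, T, D)
        _, h = self.gru(seq)                             # h: (1, B, D)
        query = u + h.squeeze(0)                         # translation-style query point
        cand = self.item_emb(candidate_ids)              # (B, C, D)
        # Rank candidates by negative squared distance to the query point.
        return -((cand - query.unsqueeze(1)) ** 2).sum(-1)


# Tiny usage example with random ids.
model = RecurrentTranslationSketch(num_users=100, num_items=500)
scores = model(torch.tensor([3]), torch.tensor([[10, 42, 7]]), torch.tensor([[5, 6, 7]]))
print(scores.shape)  # torch.Size([1, 3])
```

The key contrast with a plain translation-based model is that the query depends on the whole sequence through the recurrent state, not only on the single most recent item.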
               