
Multi-Group Transfer Learning on Multiple Latent Spaces for Text Classification



Transfer learning aims to leverage valuable information from one domain to improve learning tasks in another domain. Recent studies have indicated that latent information, which is closely related to high-level concepts, is more suitable for cross-domain text classification than raw features. To capture more of the latent information residing in the latent feature space, some previous methods constructed multiple latent feature spaces. However, those methods overlooked that the latent information of different latent spaces may lack the relevance needed to improve the adaptability of transfer learning models, and may even lead to negative knowledge transfer when there is a large discrepancy among the latent spaces. Additionally, because those methods learn the latent space distributions with a direct-promotion strategy, their computational complexity grows exponentially with the number of latent spaces. To tackle these challenges, this paper proposes a Multi-Group Transfer Learning (MGTL) method. MGTL first constructs multiple latent feature spaces and then integrates adjacent spaces with similar latent feature dimensions into latent space groups, yielding multiple groups. To strengthen the relevance among these groups, MGTL ensures that adjacent groups share at least one latent space, so the groups are more closely related than the raw latent spaces. Second, MGTL uses an indirect-promotion strategy to connect the different latent space groups; its computational complexity grows only linearly with the number of groups, which is superior to multiple-latent-space methods based on direct promotion. In addition, an iterative algorithm is proposed to solve the resulting optimization problem. Finally, a set of systematic experiments demonstrates that MGTL outperforms all compared methods.
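The grouping construction described in the abstract (adjacent latent spaces merged into groups, with consecutive groups sharing at least one latent space so that inter-group links grow linearly) can be illustrated with a minimal sketch. The function name, the overlap-of-one choice, and the use of latent dimensions as identifiers are assumptions for illustration, not the paper's actual implementation:

```python
# Hypothetical sketch of the MGTL grouping idea: latent spaces are sorted by
# dimension, and adjacent spaces are bundled into groups such that each pair
# of consecutive groups shares exactly one latent space.
def build_groups(dims, group_size=2):
    """Group adjacent latent spaces (identified here by their dimensions)
    into overlapping groups; consecutive groups share one latent space."""
    dims = sorted(dims)
    step = group_size - 1  # advancing by group_size - 1 leaves one shared space
    return [dims[i:i + group_size] for i in range(0, len(dims) - step, step)]

# Five latent spaces of different dimensions -> four overlapping groups.
groups = build_groups([10, 20, 30, 40, 50], group_size=2)
print(groups)  # [[10, 20], [20, 30], [30, 40], [40, 50]]

# With indirect promotion along the chain of groups, the number of
# inter-group connections is len(groups) - 1, i.e. linear in the
# number of groups rather than exponential in the number of spaces.
print(len(groups) - 1)  # 3
```

Because each group overlaps its neighbor in one latent space, information can propagate indirectly from any group to any other along the chain, which is the intuition behind the linear-complexity claim.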

Keywords: multiple latent; space; transfer learning; latent spaces; latent space

Journal Title: IEEE Access
Year Published: 2020



