Attention-Based Multi-Source Domain Adaptation

Multi-source domain adaptation (MSDA) aims to transfer knowledge from multiple source domains to one target domain. Inspired by single-source domain adaptation, existing methods solve MSDA by aligning the data distributions between the target domain and each source domain. However, aligning the target domain with a dissimilar source domain can harm representation learning. To address this issue, an intuitive approach is to use an attention mechanism that enhances the positive effects of similar domains and suppresses the negative effects of dissimilar domains. We therefore propose Attention-Based Multi-Source Domain Adaptation (ABMSDA), which exploits domain correlations to alleviate the effects of dissimilar domains. To obtain the correlations between source and target domains, ABMSDA first trains a domain recognition model to estimate the probability that each target image belongs to each source domain. Based on these correlations, a Weighted Moment Distance (WMD) is proposed to pay more attention to source domains with higher similarity to the target. Furthermore, an Attentive Classification Loss (ACL) is developed to ensure that the feature extractor generates aligned and discriminative visual representations. Evaluations on two benchmarks demonstrate the effectiveness of the proposed model, e.g., an average improvement of 6.1% on the challenging DomainNet dataset.
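To make the weighting idea concrete, below is a minimal sketch (PyTorch-style Python) of how a weighted moment distance might combine per-domain moment discrepancies using domain-correlation weights. The function name, the choice of raw moments, and the softmax-derived weights are illustrative assumptions, not the paper's exact formulation.

import torch

def weighted_moment_distance(source_feats, target_feat, weights, k_max=2):
    """Illustrative sketch of a weighted moment distance (not the paper's WMD).

    Matches the first k_max raw feature moments of the target batch against
    each source batch, weighting each source domain by its estimated
    similarity to the target (e.g., averaged softmax outputs of a domain
    recognition model).
    """
    wmd = target_feat.new_zeros(())
    for feats, w in zip(source_feats, weights):
        for k in range(1, k_max + 1):
            # k-th raw moment of each feature dimension, averaged over the batch
            src_moment = (feats ** k).mean(dim=0)
            tgt_moment = (target_feat ** k).mean(dim=0)
            # Dissimilar domains (small w) contribute little to the alignment loss
            wmd = wmd + w * torch.norm(src_moment - tgt_moment, p=2)
    return wmd

# Toy usage: 3 source domains, 64-d features, batches of 32
torch.manual_seed(0)
sources = [torch.randn(32, 64) for _ in range(3)]
target = torch.randn(32, 64)
# Hypothetical domain-correlation weights, e.g., softmax over domain-classifier scores
weights = torch.softmax(torch.tensor([1.5, 0.2, -0.8]), dim=0)
print(weighted_moment_distance(sources, target, weights))

In the actual ABMSDA pipeline, such weights would come from the domain recognition model's probability that target images belong to each source domain, and the distance would be minimized jointly with the attentive classification loss.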

Keywords: multi-source domain adaptation; source domain

Journal Title: IEEE Transactions on Image Processing
Year Published: 2021
