
Multiexpert Adversarial Regularization for Robust and Data-Efficient Deep Supervised Learning


Deep neural networks (DNNs) can achieve high accuracy when abundant training data is available from the same distribution as the test data. In practical applications, data deficiency is often a concern. For classification tasks, a shortage of labeled images in the training set often results in overfitting. A further issue is the mismatch between the training and test domains, which degrades model performance. This calls for robust and data-efficient deep learning models. In this work, we propose a deep learning approach called Multi-Expert Adversarial Regularization learning (MEAR), with limited computational overhead, to improve the generalization and robustness of deep supervised learning models. The MEAR framework appends multiple classifier heads (experts) to the feature extractor of the legacy model. MEAR trains the feature extractor in an adversarial fashion, leveraging complementary information from the individual experts as well as their ensemble, so that it is more robust on an unseen test domain. We train state-of-the-art networks with MEAR for two important computer vision tasks, image classification and semantic segmentation. We compare MEAR to a variety of baselines on multiple benchmarks and show that it is competitive with other methods and more successful at learning robust features.
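The multi-expert architecture described in the abstract — a shared feature extractor feeding several classifier heads whose outputs are ensembled — can be sketched minimally as below. All sizes, the number of experts, the linear stand-ins for the backbone and heads, and the logit-averaging ensemble rule are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_extractor(x, W):
    # Shared backbone: one linear layer + ReLU as a stand-in for a DNN.
    return np.maximum(x @ W, 0.0)

def expert_head(h, V):
    # One classifier head ("expert"): linear map from features to class logits.
    return h @ V

# Hypothetical sizes: 8-dim inputs, 16-dim features, 3 classes, 4 experts.
d_in, d_feat, n_cls, n_exp = 8, 16, 3, 4
W = rng.normal(size=(d_in, d_feat))
experts = [rng.normal(size=(d_feat, n_cls)) for _ in range(n_exp)]

x = rng.normal(size=(5, d_in))            # batch of 5 inputs
h = feature_extractor(x, W)               # shared features for all experts
logits = [expert_head(h, V) for V in experts]
ensemble = np.mean(logits, axis=0)        # ensemble prediction: average of expert logits
print(ensemble.shape)
```

In the actual method the feature extractor and the experts are trained jointly in an adversarial fashion; this sketch only shows the forward structure (shared features, per-expert logits, ensembled output) that the regularization operates on.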

Keywords: supervised learning; robustness; data efficiency; deep learning; adversarial regularization

Journal Title: IEEE Access
Year Published: 2022

