Accommodating Multiple Tasks’ Disparities With Distributed Knowledge-Sharing Mechanism

Deep multitask learning (MTL) shares beneficial knowledge across participating tasks, alleviating the impact of extreme learning conditions, such as data scarcity, on their performance. In practice, tasks stemming from different domain sources often have varied complexities and input sizes, for example, in the joint learning of computer vision tasks with RGB and grayscale images. To adapt to these differences, it is appropriate to design networks with suitable representational capacities and to construct neural layers with corresponding widths. Nevertheless, most state-of-the-art methods pay little attention to such situations and in fact fail to handle these disparities. To work with dissimilar network designs across tasks, this article presents a distributed knowledge-sharing framework called tensor ring multitask learning (TRMTL), in which knowledge sharing is decoupled from the original weight matrices. TRMTL is flexible: it is not only capable of sharing knowledge across heterogeneous networks but also able to jointly learn tasks with varied input sizes, significantly improving the performance of data-insufficient tasks. Comprehensive experiments on challenging datasets empirically validate the effectiveness, efficiency, and flexibility of TRMTL in dealing with disparities in MTL.
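The abstract describes TRMTL only at a high level. The following PyTorch sketch illustrates one way such a distributed, core-level sharing scheme could look: each task's layer weight is assembled from a ring of small cores, and individual cores (rather than whole weight matrices) are shared across tasks. This is an assumption-laden illustration, not the authors' implementation; the ranks, mode sizes, the `TRSharedLinear` class, and the choice to share output-side cores while keeping input-side cores private are all hypothetical.

```python
# Minimal sketch of core-level knowledge sharing with tensor ring (TR) layers.
# Not the paper's code; shapes, ranks, and the sharing pattern are assumptions.
import torch
import torch.nn as nn


def tr_cores_to_matrix(cores, out_shape):
    """Contract TR cores G_k of shape (r_k, n_k, r_{k+1}) into a full tensor
    (closing the ring by tracing the boundary ranks), then reshape the result
    into a weight matrix of shape `out_shape`."""
    full = cores[0]                                    # (r_1, n_1, r_2)
    for core in cores[1:]:
        full = torch.einsum('aib,bjc->aijc', full, core)
        r_in, n1, n2, r_out = full.shape
        full = full.reshape(r_in, n1 * n2, r_out)      # (r_1, n_1*...*n_k, r_{k+1})
    # trace over the boundary ranks closes the ring: sum_a full[a, :, a]
    weight = full.diagonal(dim1=0, dim2=2).sum(dim=-1)
    return weight.reshape(out_shape)


class TRSharedLinear(nn.Module):
    """A linear layer whose weight is built from private (task-specific) cores
    for the input modes and shared cores for the output modes."""

    def __init__(self, private_cores, shared_cores, out_shape):
        super().__init__()
        self.private = nn.ParameterList(private_cores)  # owned by this task
        self.shared = nn.ParameterList(shared_cores)    # same Parameter objects in every task
        self.out_shape = out_shape                      # (in_features, out_features)

    def forward(self, x):
        weight = tr_cores_to_matrix(list(self.private) + list(self.shared),
                                    self.out_shape)
        return x @ weight


def make_core(r_in, n, r_out):
    return nn.Parameter(0.1 * torch.randn(r_in, n, r_out))


# Hypothetical setup: two vision tasks with different input widths
# (16x16 RGB vs. 16x16 grayscale) share the output-side cores (8*8 = 64
# output units) while each keeps one private input-side core.
rank = 4
shared = [make_core(rank, 8, rank), make_core(rank, 8, rank)]
layer_rgb = TRSharedLinear([make_core(rank, 768, rank)], shared, (768, 64))
layer_gray = TRSharedLinear([make_core(rank, 256, rank)], shared, (256, 64))

y_rgb = layer_rgb(torch.randn(32, 768))    # -> (32, 64)
y_gray = layer_gray(torch.randn(32, 256))  # -> (32, 64)
```

Because sharing happens at the level of individual cores rather than whole weight matrices, the two layers accept inputs of different widths (768-dimensional RGB versus 256-dimensional grayscale) yet still exchange knowledge through the shared cores, which is the kind of disparity the abstract targets.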

Keywords: accommodating multiple; tasks disparities; knowledge sharing; knowledge; multiple tasks; distributed knowledge

Journal Title: IEEE Transactions on Cybernetics
Year Published: 2022


