Sequential Label Enhancement.

Label distribution learning (LDL) is a novel machine learning paradigm for solving label-ambiguity tasks, in which each label describes an instance to a certain degree. However, obtaining label distributions is costly and the description degrees are difficult to quantify. Most existing work focuses on designing an objective function that recovers all description degrees at once, and seldom considers the sequential nature of the label distribution recovery process. In this article, we formulate the label distribution recovery task as a sequential decision process called sequential label enhancement (Seq_LE), which is more consistent with how humans annotate label distributions. Specifically, a reinforcement learning (RL) agent serially maps each discrete label to its description degree. In addition, we carefully design a joint reward function that drives the agent to fully learn the optimal decision policy. Extensive experiments on 16 LDL datasets are conducted under various evaluation metrics. The experimental results demonstrate convincingly that the proposed sequential label enhancement (LE) outperforms state-of-the-art methods.
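The abstract does not give implementation details, so the following is only a minimal, illustrative sketch of the general idea of recovering a label distribution as a sequential decision process: an agent picks labels one at a time, assigns each a description degree, and receives a reward that scores the resulting distribution. All names here (ToyAgent, joint_reward, run_episode) and the random policy are hypothetical placeholders, not the paper's Seq_LE method; a real agent would be trained with RL against the paper's joint reward.

```python
# Toy sketch of a sequential label-enhancement episode (illustrative only).
import numpy as np

def joint_reward(degrees, logical_labels):
    """Hypothetical stand-in for a joint reward: favor distributions that
    place more mass on labels marked relevant in the logical label vector."""
    dist = degrees / (degrees.sum() + 1e-12)
    return float(dist @ logical_labels)

class ToyAgent:
    """Random policy placeholder; not a trained RL agent."""
    def __init__(self, n_labels, seed=0):
        self.rng = np.random.default_rng(seed)
        self.n_labels = n_labels

    def act(self, remaining):
        label = int(self.rng.choice(sorted(remaining)))   # choose the next discrete label
        degree = float(self.rng.uniform(0.0, 1.0))        # assign its description degree
        return label, degree

def run_episode(agent, logical_labels):
    """Serially map every label to a description degree, then score the result."""
    logical = np.asarray(logical_labels, dtype=float)
    degrees = np.zeros(len(logical))
    remaining = set(range(len(logical)))
    while remaining:                                      # one label per decision step
        label, degree = agent.act(remaining)
        degrees[label] = degree
        remaining.discard(label)
    degrees /= degrees.sum() + 1e-12                      # normalize to a label distribution
    return degrees, joint_reward(degrees, logical)

degrees, reward = run_episode(ToyAgent(n_labels=4), [1, 0, 1, 0])
print(degrees, reward)
```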

Keywords: label distribution; label enhancement; label; sequential label

Journal Title: IEEE Transactions on Neural Networks and Learning Systems
Year Published: 2022
