
Self-Weighted Unsupervised LDA.


Clustering, a central topic in unsupervised learning, has been developed extensively. However, as clustering methods have evolved, models have grown increasingly complex and their parameter counts have grown with them, and in most methods parameter tuning is laborious because of its complexity and unpredictability. Designing a concise model whose parameters can be learned adaptively is therefore a meaningful problem. To tackle it, we develop a novel self-weighted unsupervised linear discriminant analysis method, named SWULDA. The proposed method not only avoids manual parameter tuning but also explains the link between k-means and linear discriminant analysis (LDA). To obtain a superior structural representation, the idea of minimizing the within-class scatter matrix and maximizing the between-class scatter matrix is embedded in the unsupervised model. Moreover, equipped with the proposed quadratic weighted optimization framework, the weight parameter can be learned adaptively. Extensive experiments on several datasets validate the effectiveness of our method.
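The abstract does not give the SWULDA update rules, but the link it draws between k-means and LDA can be illustrated with a minimal sketch: alternate between (a) computing within-class and between-class scatter matrices from the current cluster labels, (b) projecting the data onto the leading discriminant directions, and (c) re-clustering in the projected space. All function names below are illustrative, and the alternation shown is a generic unsupervised-LDA scheme, not the authors' exact algorithm (in particular, the adaptive self-weighting step is omitted).

```python
import numpy as np

def scatter_matrices(X, labels):
    """Within-class scatter S_w and between-class scatter S_b for given labels."""
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(labels):
        Xc = X[labels == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)                      # spread inside cluster c
        diff = (mc - mean).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)                    # spread of cluster means
    return Sw, Sb

def kmeans(X, k, iters=100):
    """Plain Lloyd's k-means with deterministic farthest-first initialisation."""
    centers = [X[0]]
    for _ in range(k - 1):
        dist = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[dist.argmax()])
    centers = np.array(centers, dtype=float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dist.argmin(axis=1)
        new = np.array([X[labels == c].mean(axis=0) if (labels == c).any()
                        else centers[c] for c in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels

def unsupervised_lda_kmeans(X, k, dim, rounds=5, reg=1e-6):
    """Alternate LDA-style projection and k-means clustering (illustrative only)."""
    labels = kmeans(X, k)
    W = np.eye(X.shape[1])[:, :dim]
    for _ in range(rounds):
        Sw, Sb = scatter_matrices(X, labels)
        # Discriminant directions: eigenvectors of (S_w + reg*I)^{-1} S_b,
        # i.e. directions that maximise between- over within-class scatter.
        evals, evecs = np.linalg.eig(
            np.linalg.solve(Sw + reg * np.eye(X.shape[1]), Sb))
        order = np.argsort(-evals.real)[:dim]
        W = evecs[:, order].real
        labels = kmeans(X @ W, k)                          # re-cluster in subspace
    return labels, W
```

Note the design choice the abstract alludes to: minimizing within-class scatter while maximizing between-class scatter is exactly the LDA criterion, and re-estimating the labels with k-means in the discriminant subspace is what ties the two methods together in the unsupervised setting.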

Keywords: self-weighted; unsupervised LDA; method; weighted unsupervised

Journal Title: IEEE transactions on neural networks and learning systems
Year Published: 2021


