
On the rotational invariant L1-norm PCA

Abstract

Principal component analysis (PCA) is a powerful tool for dimensionality reduction. Unfortunately, it is sensitive to outliers, so various robust PCA variants have been proposed in the literature. One of the most frequently applied methods for high-dimensional data reduction is the rotational invariant L1-norm PCA of Ding and coworkers. So far, no convergence proof for this algorithm has been available. The main topic of this paper is to fill this gap. We reinterpret this robust approach as a conditional gradient algorithm and show, moreover, that it coincides with a gradient descent algorithm on Grassmann manifolds. Based on the latter point of view, we prove global convergence of the whole sequence of iterates to a critical point using the Kurdyka-Łojasiewicz property of the objective function, where we have to pay special attention to so-called anchor points, at which the function is not differentiable.
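For context, the objective behind the rotational invariant L1-norm PCA can be written as follows (our notation, not taken from this page): given centered data points x_1, ..., x_n in R^d and a target dimension k, one minimizes the sum of the Euclidean norms of the projection residuals,

    \min_{U \in \mathbb{R}^{d \times k},\ U^\top U = I_k} \ \sum_{i=1}^{n} \bigl\| x_i - U U^\top x_i \bigr\|_2 .

Since the value depends on U only through the subspace span(U), the problem descends to the Grassmann manifold, which is what makes the gradient descent interpretation in the abstract possible; the anchor points mentioned there are exactly the data points with zero residual, where a summand is not differentiable.

Below is a minimal NumPy sketch of the reweighted eigenvector iteration commonly associated with this method: project, weight each sample by its inverse residual norm, and take the top eigenvectors of the weighted covariance. It is an illustration under our own assumptions; the function name, the eps safeguard at anchor points, and the projector-based stopping rule are ours, not taken from the paper.

import numpy as np

def r1_pca(X, k, n_iter=100, eps=1e-10, tol=1e-8):
    """Rotational invariant L1-norm PCA via a reweighted eigensolver (sketch).

    X : (n, d) data matrix, assumed already centered.
    k : target subspace dimension.
    Returns U, a (d, k) matrix with orthonormal columns spanning the fit.
    """
    # Initialize with the ordinary PCA directions.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    U = Vt[:k].T
    for _ in range(n_iter):
        # Residual of each sample after projection onto span(U).
        R = X - (X @ U) @ U.T
        r = np.linalg.norm(R, axis=1)
        # Inverse-residual weights; eps guards the anchor points,
        # i.e. samples lying in span(U), where the objective is non-smooth.
        w = 1.0 / np.maximum(r, eps)
        # Weighted covariance and its top-k eigenvectors (eigh sorts ascending).
        C = (X * w[:, None]).T @ X
        _, vecs = np.linalg.eigh(C)
        U_new = vecs[:, -k:]
        # Compare subspaces via their orthogonal projectors, since the
        # objective is invariant to rotations of the basis U.
        if np.linalg.norm(U_new @ U_new.T - U @ U.T) < tol:
            return U_new
        U = U_new
    return U

Note that comparing projectors U U^T rather than the matrices U themselves mirrors the rotational invariance that places the iteration on the Grassmann manifold.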

Keywords: algorithm; rotational invariant; L1-norm PCA

Journal Title: Linear Algebra and its Applications
Year Published: 2020
