Harden Deep Convolutional Classifiers via K-Means Reconstruction

Adversarial examples are carefully perturbed inputs that aim to mislead deep neural network models into producing unexpected outputs. In this paper, we employ the K-means clustering algorithm as a pre-processing method to defend against adversarial examples. Specifically, we reconstruct adversarial examples at the pixel level according to their cluster assignments, reducing the impact of the injected perturbation. Our approach does not rely on any particular neural network architecture and can be combined with existing pre-processing defenses to provide better protection for modern classifiers. Comprehensive comparisons and evaluations show that models protected by the proposed defense exhibit substantial robustness against strong adversarial attacks. As a by-product of our exploration of ensemble defenses, we find that the order in which defense methods are applied has a crucial impact on the final performance. Additionally, we study the limitations of K-means reconstruction and the effect of the number of clusters to provide an in-depth understanding of pre-processing defenses.
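The pixel-level K-means reconstruction the abstract describes can be sketched as color quantization: cluster an image's pixels, then replace each pixel with its cluster centroid so fine-grained adversarial perturbations are smoothed away. The sketch below is illustrative only (the function name, cluster count, and iteration budget are assumptions, not the authors' implementation), using a plain NumPy K-means rather than any specific library.

```python
import numpy as np

def kmeans_reconstruct(image, n_clusters=8, n_iters=10, seed=0):
    """Sketch of a pixel-level K-means reconstruction defense:
    quantize pixel values with K-means, then rebuild the image
    from the cluster centroids. `image` is an (H, W, C) array."""
    rng = np.random.default_rng(seed)
    pixels = image.reshape(-1, image.shape[-1]).astype(np.float64)

    # Initialize centroids from randomly chosen pixels.
    centroids = pixels[rng.choice(len(pixels), n_clusters, replace=False)]

    for _ in range(n_iters):
        # Assign every pixel to its nearest centroid.
        dists = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=-1)
        labels = dists.argmin(axis=1)
        # Recompute each centroid as the mean of its assigned pixels.
        for k in range(n_clusters):
            members = pixels[labels == k]
            if len(members) > 0:
                centroids[k] = members.mean(axis=0)

    # Replace each pixel with its centroid and restore the image shape.
    return centroids[labels].reshape(image.shape)
```

As a pre-processing defense, the reconstruction would be applied to every input (clean or adversarial) before it reaches the classifier, e.g. `logits = model(kmeans_reconstruct(x))`; choosing `n_clusters` trades off perturbation removal against loss of legitimate image detail, which is the limitation the paper says it studies.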

Keywords: pre-processing; deep convolutional classifiers; adversarial examples; K-means reconstruction

Journal Title: IEEE Access
Year Published: 2020
