
Feature autoencoder for detecting adversarial examples


Deep neural networks (DNNs) have gained widespread adoption in computer vision. Unfortunately, state-of-the-art DNNs are vulnerable to adversarial example (AE) attacks, in which an adversary introduces imperceptible perturbations into a test example to deceive the network. These threats have prompted intensive research on improving DNN robustness via adversarial training, that is, blending the clean data set with adversarial examples during training. However, adversarial attack techniques are open-ended, so adversarial training alone is insufficient to guarantee robustness. To circumvent this limitation, we mitigate adversarial example attacks from another perspective: detecting adversarial examples. We propose the feature autoencoder detector (FADetector), a novel defense framework that exploits feature knowledge. One hallmark of FADetector is that no adversarial examples are required to train the detector. Our extensive evaluation on the MNIST and CIFAR-10 data sets demonstrates that our defense outperforms conventional autoencoder detectors in terms of detection accuracy.
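The abstract does not specify FADetector's architecture, but the general principle behind autoencoder-based detection is that an autoencoder trained only on clean data reconstructs clean inputs well, while adversarially perturbed inputs sit off the learned data manifold and yield large reconstruction errors. The sketch below illustrates this idea on synthetic data, using a linear autoencoder (PCA via SVD) as a stand-in for the paper's detector; all names and the percentile threshold are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "clean" data: 500 samples lying near a 5-D subspace of 50-D space.
d, k, n = 50, 5, 500
basis = rng.standard_normal((d, k))
clean = rng.standard_normal((n, k)) @ basis.T + 0.01 * rng.standard_normal((n, d))

# Linear autoencoder fit on clean data only: the top-k principal directions
# act as tied encoder/decoder weights (no adversarial examples involved).
mean = clean.mean(axis=0)
_, _, vt = np.linalg.svd(clean - mean, full_matrices=False)
components = vt[:k]

def recon_error(x):
    """Squared reconstruction error of x under the linear autoencoder."""
    z = (x - mean) @ components.T          # encode
    x_hat = z @ components + mean          # decode
    return np.sum((x - x_hat) ** 2, axis=-1)

# Detection threshold: 99th percentile of reconstruction error on clean data.
tau = np.percentile(recon_error(clean), 99)

def is_adversarial(x):
    """Flag inputs whose reconstruction error exceeds the clean-data threshold."""
    return recon_error(x) > tau

# A held-out clean example vs. one with an off-manifold perturbation.
x_clean = rng.standard_normal(k) @ basis.T
x_adv = x_clean + 0.5 * rng.standard_normal(d)  # perturbation leaves the subspace
```

In this toy setting the perturbed input's reconstruction error is orders of magnitude above the clean-data threshold, so `is_adversarial(x_adv)` fires while `x_clean` passes; a practical detector would replace the linear map with a trained (possibly feature-space) autoencoder.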

Keywords: adversarial examples; adversarial example detection; feature autoencoder

Journal Title: International Journal of Intelligent Systems
Year Published: 2022
