Adversarial Attacks in Modulation Recognition With Convolutional Neural Networks

Deep learning (DL) models are vulnerable to adversarial attacks: by adding a subtle perturbation that is imperceptible to the human eye, an attacker can lead a convolutional neural network (CNN) to erroneous results, greatly reducing the reliability and security of DL tasks. Given the wide application of modulation recognition in the communication field and the rapid development of DL, this article adds well-designed adversarial perturbations to the input signal to explore the performance of attack methods on modulation recognition, measure the effectiveness of adversarial attacks on signals, and provide an empirical evaluation of the reliability of CNNs. The results indicate that adversarial attacks reduce the accuracy of the target model significantly: when the perturbation factor is 0.001, the accuracy of the model drops by about 50% on average. Among the attacks studied, iterative methods show stronger attack performance than the one-step method. In addition, the consistency of the waveform before and after the perturbation is examined, to assess whether the added adversarial perturbations are small enough (i.e., hard to distinguish by human eyes). This article also aims to inspire researchers to further improve the reliability of CNNs against adversarial attacks.
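The contrast the abstract draws between one-step and iterative attacks can be illustrated with a minimal sketch. The paper does not name its attack methods here, so the following assumes an FGSM-style signed-gradient attack (the canonical one-step method) and its iterative variant, applied to a simple linear classifier standing in for the paper's CNN; all names and parameters are illustrative.

```python
import numpy as np

def loss_grad(x, w, y):
    """Gradient of the logistic loss w.r.t. the input x, for label y in {0, 1}.
    A linear model is used here only as a differentiable stand-in for a CNN."""
    p = 1.0 / (1.0 + np.exp(-np.dot(w, x)))  # sigmoid score
    return (p - y) * w                       # d(loss)/dx

def one_step_attack(x, w, y, eps):
    """One-step (FGSM-style) attack: a single signed-gradient step of size eps."""
    return x + eps * np.sign(loss_grad(x, w, y))

def iterative_attack(x, w, y, eps, steps=10):
    """Iterative attack: many small signed steps, clipped to an eps-ball around x."""
    alpha = eps / steps
    x_adv = x.copy()
    for _ in range(steps):
        x_adv = x_adv + alpha * np.sign(loss_grad(x_adv, w, y))
        x_adv = np.clip(x_adv, x - eps, x + eps)  # stay within the budget
    return x_adv

rng = np.random.default_rng(0)
x = rng.standard_normal(128)  # stand-in for one frame of a modulated signal
w = rng.standard_normal(128)  # stand-in model weights
x_adv = one_step_attack(x, w, y=1, eps=0.001)
x_it = iterative_attack(x, w, y=1, eps=0.001)
print(np.max(np.abs(x_adv - x)))  # perturbation never exceeds the factor eps
```

Both attacks keep every sample within ±eps of the clean waveform, which is why, at a perturbation factor like 0.001, the adversarial signal can remain visually indistinguishable from the original while still degrading classification accuracy.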

Keywords: modulation recognition; perturbation; convolutional neural; adversarial attacks

Journal Title: IEEE Transactions on Reliability
Year Published: 2021
