
Black-Box Audio Adversarial Attack Using Particle Swarm Optimization


The development of artificial neural networks and artificial intelligence has helped to address problems and improve services in various fields, such as autonomous driving, image classification, medical diagnosis, and speech recognition. However, this technology has introduced security threats that differ from existing ones. Recent studies have shown that artificial neural networks can easily be made to malfunction by adversarial examples, which cause a neural network model to behave as the adversary intends. In particular, adversarial examples targeting speech recognition models have been actively studied in recent years. Existing studies have focused mainly on white-box methods. However, most speech recognition services are provided online as black boxes, making such attacks difficult or impossible for adversaries to mount. Black-box attacks face several challenges: they typically have a low success rate and a high risk of detection. In particular, previously proposed genetic algorithm (GA)-based attacks carry a high risk of detection because they require numerous queries. We therefore propose an adversarial attack system based on particle swarm optimization (PSO) to address these problems. The proposed system treats adversarial candidates as particles and obtains adversarial examples through iterative optimization. PSO-based adversarial attacks are more query-efficient and achieve a higher attack success rate than GA-based adversarial methods. In particular, a key feature of our system is temporary particle generation, which maximizes query efficiency to reduce detection risk and avoid wasting system resources. On average, our system achieves a 96% attack success rate with 1416.17 queries, which is 71.41% better in query efficiency and 8% better in success rate than existing GA-based attacks.
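The abstract does not include code, but the core idea can be illustrated with a generic particle swarm optimization loop over audio perturbations. The sketch below is a minimal, assumed implementation in Python/NumPy: the black-box model interface (query_model returning a transcript and a confidence score), the fitness formulation, the perturbation bound epsilon, and all hyperparameters are assumptions for illustration, and the authors' temporary particle generation mechanism is not reproduced here.

import numpy as np

def fitness(audio, target, query_model):
    # Hypothetical black-box scoring: each call is one query to the online
    # speech recognition service. Reward matching the target transcript and
    # lightly penalize large perturbation energy.
    transcript, confidence = query_model(audio)
    match = 1.0 if transcript == target else confidence
    return match - 1e-4 * np.abs(audio).sum()

def pso_attack(original, target, query_model,
               n_particles=25, n_iters=300, epsilon=0.01,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    dim = original.shape[0]          # original: 1-D waveform as a NumPy array

    # Each particle is a bounded perturbation added to the original audio.
    pos = rng.uniform(-epsilon, epsilon, size=(n_particles, dim))
    vel = np.zeros_like(pos)

    pbest = pos.copy()
    pbest_fit = np.array([fitness(original + p, target, query_model) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    gbest_fit = pbest_fit.max()

    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Standard PSO velocity and position update, clipped to the bound.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -epsilon, epsilon)

        for i in range(n_particles):
            fit = fitness(original + pos[i], target, query_model)
            if fit > pbest_fit[i]:
                pbest_fit[i], pbest[i] = fit, pos[i].copy()
        if pbest_fit.max() > gbest_fit:
            gbest_fit = pbest_fit.max()
            gbest = pbest[pbest_fit.argmax()].copy()

    return original + gbest          # best adversarial candidate found

Because every fitness evaluation costs one query, the query budget scales with n_particles times n_iters, which is why the paper's query-reduction mechanisms matter for keeping the attack stealthy.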

Keywords: black box; optimization; adversarial attack; adversarial examples; particle swarm optimization

Journal Title: IEEE Access
Year Published: 2022


