An Interpretive Perspective: Adversarial Trojaning Attack on Neural-Architecture-Search Enabled Edge AI Systems

Abstract: In this article, we propose and analyze a group of adversarial backdoor attack methods on neural-architecture-search (NAS) enabled edge AI systems in the industrial Internet of Things (IIoT) domain. NAS is a popular new way to generate scale-adaptive deep neural networks that can meet the respective requirements of cloud, edge, and terminal AI computing in the IIoT domain. However, since most users on the NAS-enabled edge side are not the creators of the AI models they deploy, the deployed edge AI models may contain vulnerabilities such as backdoors, which can pose serious security issues in IIoT. We propose effective policies for attacking such edge AI systems and offer advice on how to defend them. The most significant threat to third-party pretrained NAS models in IIoT is the backdoor attack, in which the third party introduces a vulnerability through the training dataset. The article designs backdoor attack processes against NAS-enabled edge devices to identify NAS's vulnerability to adversarial trojaning attacks and to interpret those attacks. It shows that the existence of high-impact nodes greatly weakens the robustness of the network: a malicious attacker can quickly paralyze the network by targeting only a few high-impact nodes. Finally, it offers advice and possible solutions for defending against adversarial backdoor attacks on NAS.
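
The attack surface the abstract describes rests on training-data poisoning: a third party supplying a pretrained NAS model can trojan it by stamping a trigger onto a small fraction of training samples and relabeling them with an attacker-chosen target class. The Python sketch below illustrates that poisoning step only; it is not the paper's actual procedure, and all names (stamp_trigger, poison_dataset, TARGET_CLASS, POISON_RATE) and the white-square trigger are hypothetical assumptions for illustration.

    # Minimal sketch of backdoor data poisoning (illustrative, not the
    # paper's method): stamp a trigger patch into a small fraction of
    # training images and flip their labels to a target class.
    import numpy as np

    TARGET_CLASS = 0      # attacker-chosen target label (assumption)
    POISON_RATE = 0.05    # fraction of training samples to poison (assumption)

    def stamp_trigger(image: np.ndarray, size: int = 3) -> np.ndarray:
        """Overwrite a small bottom-right patch with a white-square trigger."""
        poisoned = image.copy()
        poisoned[-size:, -size:, ...] = 1.0  # assumes pixel values in [0, 1]
        return poisoned

    def poison_dataset(images: np.ndarray, labels: np.ndarray, seed: int = 0):
        """Return copies of (images, labels) with a POISON_RATE fraction trojaned."""
        rng = np.random.default_rng(seed)
        n_poison = int(len(images) * POISON_RATE)
        idx = rng.choice(len(images), size=n_poison, replace=False)
        images, labels = images.copy(), labels.copy()
        for i in idx:
            images[i] = stamp_trigger(images[i])  # implant the trigger
            labels[i] = TARGET_CLASS              # relabel to the target class
        return images, labels

The abstract's high-impact-node observation can likewise be illustrated with a toy knockout experiment: rank the hidden units of one weight matrix by an importance heuristic and zero out only the top few, degrading the network far more than removing the same number of random units. The L1-norm ranking below is an assumed stand-in for whatever impact measure the paper uses, and knock_out_high_impact is a hypothetical name.

    # Hedged illustration of "paralyzing" a network via a few high-impact
    # nodes: zero the k hidden units with the largest outgoing-weight mass.
    def knock_out_high_impact(weights: np.ndarray, k: int = 3) -> np.ndarray:
        """Zero the k rows (hidden units) with the largest L1 outgoing weight."""
        impact = np.abs(weights).sum(axis=1)  # per-unit impact score (assumption)
        top_k = np.argsort(impact)[-k:]       # indices of the k highest-impact units
        damaged = weights.copy()
        damaged[top_k, :] = 0.0               # knock those units out
        return damaged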

Keywords: edge systems; adversarial trojaning; enabled edge; neural architecture search

Journal Title: IEEE Transactions on Industrial Informatics
Year Published: 2023
