
On-line Access: 2024-05-01

Received: 2023-12-25

Revision Accepted: 2024-04-07


Citation formats:  BibTeX | RefMan (RIS) | EndNote | GB/T7714


Frontiers of Information Technology & Electronic Engineering (formerly Journal of Zhejiang University SCIENCE C), in press

http://doi.org/10.1631/FITEE.2300867


A comprehensive survey of physical adversarial vulnerabilities in autonomous driving systems


Author(s):  Shuai ZHAO, Boyuan ZHANG, Yucheng SHI, Yang ZHAI, Yahong HAN, Qinghua HU

Affiliation(s):  College of Intelligence and Computing, Tianjin University, China

Corresponding email(s):   yahong@tju.edu.cn

Key Words:  Physical adversarial attacks, Physical adversarial defenses, AI safety, Deep learning, Autonomous driving systems, Data fusion, Adversarial vulnerability


Shuai ZHAO, Boyuan ZHANG, Yucheng SHI, Yang ZHAI, Yahong HAN, Qinghua HU. A comprehensive survey of physical adversarial vulnerabilities in autonomous driving systems[J]. Frontiers of Information Technology & Electronic Engineering, in press. https://doi.org/10.1631/FITEE.2300867

@article{Zhao2024physical,
title="A comprehensive survey of physical adversarial vulnerabilities in autonomous driving systems",
author="Shuai ZHAO and Boyuan ZHANG and Yucheng SHI and Yang ZHAI and Yahong HAN and Qinghua HU",
journal="Frontiers of Information Technology \& Electronic Engineering",
year="2024",
note="in press",
publisher="Zhejiang University Press \& Springer",
doi="10.1631/FITEE.2300867"
}

%0 Journal Article
%T A comprehensive survey of physical adversarial vulnerabilities in autonomous driving systems
%A Shuai ZHAO
%A Boyuan ZHANG
%A Yucheng SHI
%A Yang ZHAI
%A Yahong HAN
%A Qinghua HU
%J Frontiers of Information Technology & Electronic Engineering
%@ 2095-9184
%D 2024
%I Zhejiang University Press & Springer
%R 10.1631/FITEE.2300867

TY  - JOUR
T1  - A comprehensive survey of physical adversarial vulnerabilities in autonomous driving systems
A1  - Shuai ZHAO
A1  - Boyuan ZHANG
A1  - Yucheng SHI
A1  - Yang ZHAI
A1  - Yahong HAN
A1  - Qinghua HU
JO  - Frontiers of Information Technology & Electronic Engineering
SN  - 2095-9184
Y1  - 2024
PB  - Zhejiang University Press & Springer
DO  - 10.1631/FITEE.2300867
ER  -


Abstract: 
Autonomous driving systems (ADSs) have attracted wide attention in the machine learning community. With the help of deep neural networks (DNNs), ADSs have shown satisfactory performance under significant environmental uncertainty and the ability to compensate for system failures without external intervention. However, since DNNs have been proven vulnerable to adversarial attacks, the vulnerability of ADSs has raised serious concerns. In this paper, we present a comprehensive survey of current physical adversarial vulnerabilities in ADSs. We first divide physical adversarial attack and defense methods, according to their deployment restrictions, into three scenarios: real-world, simulated, and digital-world. We then consider the adversarial vulnerabilities of the various sensors in ADSs and categorize the attacks as camera-based, Light Detection and Ranging (LiDAR)-based, and multi-sensor fusion-based. Subsequently, we divide the attack tasks by traffic element. For physical defenses, we establish a taxonomy covering image preprocessing, adversarial detection, and model enhancement for DNN models, thereby achieving full coverage of adversarial defenses. Based on this survey, we finally discuss the open challenges in this research field and provide an outlook on future directions.
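
To make the vulnerability concrete, the sketch below mounts a single-step, digital-world attack (the fast gradient sign method, FGSM) against an off-the-shelf image classifier. It is an illustrative example only, assuming PyTorch and torchvision as available dependencies; it is not a method from the surveyed works, and the chosen model, perturbation budget, and random stand-in image are assumptions made for the sketch.

# Minimal FGSM sketch (assumed dependencies: torch, torchvision >= 0.13).
# A single-step perturbation bounded in the L-infinity norm; this is a
# digital-world illustration of the DNN vulnerability the survey addresses,
# not a physical attack from the surveyed literature.
import torch
import torch.nn.functional as F
from torchvision import models

# Pretrained classifier as the victim model; ImageNet normalization is omitted for brevity.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

def fgsm(image: torch.Tensor, label: torch.Tensor, eps: float = 8 / 255) -> torch.Tensor:
    """Return an adversarial copy of `image` within an L-infinity ball of radius `eps`."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step in the gradient-sign direction that increases the loss, then keep pixels valid.
    return (image + eps * image.grad.sign()).clamp(0.0, 1.0).detach()

# Usage with a random stand-in image; a real attack would start from a preprocessed photo.
x = torch.rand(1, 3, 224, 224)              # one RGB image, pixel values in [0, 1]
y = model(x).argmax(dim=1)                  # treat the current prediction as the label
x_adv = fgsm(x, y)
print("clean:", y.item(), "adversarial:", model(x_adv).argmax(dim=1).item())

Physical attacks of the kind surveyed here replace this single gradient step with an optimization over printable patches or textures, typically averaging the loss over random scales, rotations, and lighting conditions (expectation over transformation) so that the perturbation still fools the sensor pipeline when deployed in the real world.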
