A neural network classifier is fragile in the sense that a small perturbation to its input can cause a misclassification, and an attacker can exploit this fragility by deliberately perturbing an input. For example, in autonomous driving, adding small noise to a Stop sign can make the classifier read it as a Yield sign (see figure). If the output of the neural network is used in a safety-critical system, such as a self-driving car, an adversarial perturbation can compromise safety, e.g. by causing a collision. The objective of this project is to explore adversarial attacks and their consequences in safety-critical motion planning and control problems.
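To make the fragility concrete, here is a minimal sketch of one classical attack, the fast gradient sign method (FGSM), applied to a toy linear softmax classifier. The model, its random weights, and the perturbation budget `eps` are illustrative assumptions for this sketch, not part of the project description.

```python
# Minimal FGSM sketch on a toy 2-class linear softmax classifier.
# The model and all numbers here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 10))          # hypothetical 2-class linear model

def logits(x):
    return W @ x

def xent(x, label):
    # Cross-entropy loss of the softmax classifier at input x.
    z = logits(x)
    return np.log(np.exp(z - z.max()).sum()) + z.max() - z[label]

def input_grad(x, label):
    # Gradient of the cross-entropy loss with respect to the input.
    z = logits(x)
    p = np.exp(z - z.max())
    p /= p.sum()
    onehot = np.eye(2)[label]
    return W.T @ (p - onehot)

x = rng.normal(size=10)
label = int(np.argmax(logits(x)))     # attack the model's own prediction

# FGSM: one signed gradient step, bounded in the infinity norm by eps.
eps = 0.5
x_adv = x + eps * np.sign(input_grad(x, label))

print(xent(x, label), xent(x_adv, label))
```

Because the loss of this toy model is convex in the input, a single signed gradient step is guaranteed to increase it, which is the mechanism an attacker exploits; for deep networks the same step is only a first-order heuristic, but it is often enough to flip the predicted class.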
The student will first explore approaches to adversarial attacks for a particular problem arising in robotics, and will simulate these attack models on a neural network classification/function approximation benchmark. Based on this initial study, the student will explore possible defence mechanisms against the attack and test the resulting models. A theoretically inclined student will advance the theory of modelling and mitigating the attacks for the robotics problem under consideration; a practically inclined student will implement candidate algorithms on a realistic robotics testbed. This work requires a strong background in neural networks and their implementation for function approximation and image or text classification, as well as a basic background in convex optimization and probability. An understanding of reinforcement learning and feedback control systems is an asset. The student will develop skills in analyzing properties of neural networks and their robustness, and will have the chance to develop and apply theory to concrete problems arising in robotics. The project can have a more theoretical or applied focus.
Furthermore, it can be taken as a semester or Master’s project.
To apply, send an email containing (1) one paragraph on your background and fit for the project and (2) your BS and MS transcripts to [email protected].
Students with a suitable track record will be contacted.