Robustness in neural networks

Event details

Date 20.06.2016
Hour 14:00 – 16:00
Speaker El Mahdi El Mhamdi
Category Conferences - Seminars
EDIC Candidacy Exam
Exam President: Prof. Wulfram Gerstner
Thesis Director: Prof. Rachid Guerraoui
Co-examiner: Prof. Viktor Kuncak

Background papers:
Robustness in Multilayer Perceptrons, by Kerlirzin & Vallet.
Maximally Fault Tolerant Neural Networks, by Neti et al.
Implementing Fault-Tolerant Services Using the State Machine Approach: A Tutorial, by Schneider.

Abstract
Neural networks have traditionally been considered robust in the sense that their precision degrades gracefully as neurons fail and can be recovered through additional learning phases. Nevertheless, in the critical applications for which neural networks are now appealing solutions, we require a high level of precision and cannot afford any additional learning at run-time.
In fact, it has been experimentally observed that over-provisioning leads to robustness. Yet, the relation between the amount of over-provisioning and the actual number of failures that can be tolerated has never been precisely established.
We view a multilayer (often called deep) neural network as a distributed system whose neurons can fail independently, and we evaluate its robustness in the absence of any (recovery) learning phase.
We give tight bounds on the number of neurons that can fail without harming the result of a computation. Starting from these preliminary results, we explore more robust distributed learning algorithms for neural networks.
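As a rough illustration of the failure model described above (this is an illustrative sketch, not code from the talk; the network sizes, weights, and failure counts are all hypothetical), the following Python snippet simulates independent crash failures of hidden neurons in a small multilayer perceptron and measures how far the output drifts from the failure-free computation, with no recovery learning:

```python
# Hypothetical sketch: independent neuron crashes in a small MLP,
# measured against the failure-free output. Not the speaker's method.
import numpy as np

rng = np.random.default_rng(0)

# A small feedforward network with random stand-in weights.
sizes = [8, 32, 32, 4]
weights = [rng.normal(0, 1 / np.sqrt(m), size=(m, n))
           for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x, dead=()):
    """Propagate x through the network; neurons listed in `dead`
    (as (layer, index) pairs) output 0, modeling crash failures."""
    a = x
    for layer, w in enumerate(weights):
        a = np.tanh(a @ w)
        for (l, i) in dead:
            if l == layer:
                a[i] = 0.0  # a failed neuron contributes nothing downstream
    return a

x = rng.normal(size=sizes[0])
baseline = forward(x)

# Fail k random hidden neurons (output layer excluded) and
# measure the deviation of the result of the computation.
hidden = [(l, i) for l in range(len(weights) - 1)
          for i in range(sizes[l + 1])]
for k in [1, 2, 4, 8]:
    idx = rng.choice(len(hidden), size=k, replace=False)
    picks = [hidden[j] for j in idx]
    err = np.linalg.norm(forward(x, dead=picks) - baseline)
    print(f"{k} failed neurons -> output deviation {err:.3f}")
```

Running such an experiment over many random failure patterns is one way to observe empirically the graceful degradation, and the effect of over-provisioning, that the abstract refers to.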

Practical information

  • General public
  • Free

Contact

  • Cecilia Chapuis, EDIC

Tags

EDIC candidacy exam