BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Robustness in neural networks
DTSTART:20160620T140000
DTEND:20160620T160000
DTSTAMP:20260407T211002Z
UID:59152eaf317aad065f8047a93b76126ce0a7161ebd7c317f0b347dfa
CATEGORIES:Conferences - Seminars
DESCRIPTION:El Mahdi El Mhamdi\nEDIC Candidacy Exam\nExam President: Prof
 . Wulfram Gerstner\nThesis Director: Prof. Rachid Guerraoui\nCo-examiner
 : Prof. Viktor Kuncak\nBackground papers:\nRobustness in Multilayer Perc
 eptrons\, by Kerlirzin & Vallet\nMaximally Fault Tolerant Neural Network
 s\, by Neti et al.\nImplementing Fault-Tolerant Services Using the State
  Machine Approach: A Tutorial\, by Schneider\nAbstract\nNeural networks
  have traditionally been considered robust in the sense that their preci
 sion degrades gracefully with the failure of neurons and can be compensa
 ted for by additional learning phases. Nevertheless\, in critical applic
 ations for which neural networks are now appealing solutions\, we requir
 e a high level of precision and cannot afford any additional learning at
  run-time.\nIn fact\, it has been experimentally observed that over-prov
 isioning leads to robustness. Yet\, the relation between the amount of o
 ver-provisioning and the actual number of failures to be tolerated has n
 ever been precisely established.\nWe view a multilayer (often called dee
 p) neural network as a distributed system whose neurons can fail indepen
 dently\, and we evaluate its robustness in the absence of any (recovery)
  learning phase.\nWe give tight bounds on the number of neurons that can
  fail without harming the result of a computation. Starting from our pre
 liminary results\, we explore more robust distributed learning algorithm
 s for neural networks.
LOCATION:BC 229 https://plan.epfl.ch/?room=BC%20229
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
