BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Harmonic analysis of deep convolutional neural networks
DTSTART:20170227T140000
DTEND:20170227T150000
DTSTAMP:20260413T235549Z
UID:9ea9999e638334ef8393c5ef5526ff2061a1aeba66e93bac72948633
CATEGORIES:Conferences - Seminars
DESCRIPTION:Prof. Helmut Bölcskei - ETH Zürich - Communication Technolog
 y Laboratory\nDeep convolutional neural networks (DCNNs) have led to break
 through results in numerous machine learning tasks\, yet a comprehensive m
 athematical theory explaining this success seems distant.\n\nIn this talk\
 , we explain how the structure of DCNNs naturally leads to desirable invar
 iance and deformation insensitivity properties for feature extraction and 
 classification.\n\nWe then establish formally that deep neural networks w
 ith arbitrary convolution kernels\,  general Lipschitz-continuous non-lin
 earities (e.g.\, rectified linear units\, shifted logistic sigmoids\, hype
 rbolic tangents\, and modulus functions)\, and a wide range of pooling ope
 rators exhibit vertical translation invariance and small deformation sensi
 tivity. We also analyze energy conservation and energy decay properties of
  DCNNs and identify networks with exponential energy decay. In particular\
 , we provide estimates on the number of layers needed to have most of the 
 input signal energy be contained in the network’s feature maps. On a con
 ceptual level our results establish that deformation insensitivity\, verti
 cal translation invariance\, and energy conservation are guaranteed by th
 e network structure per se rather than the specific convolution kernels\, 
 non-linearities\, and pooling operators. This offers a mathematical explan
 ation for certain aspects of the tremendous practical success of DCNNs.
LOCATION:SV 1717 https://plan.epfl.ch/?room=SV%201717
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
