BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Imaging Seminar: Normalizing Flows and the Power of Patches in Inv
 erse Problems
DTSTART:20230928T170000
DTEND:20230928T180000
DTSTAMP:20260404T091657Z
UID:430783b4fbad77f1c91a90bd637089ed347ca653d06aa73a0c4ef098
CATEGORIES:Conferences - Seminars
DESCRIPTION:Prof. Gabriele Steidl \n<< Registration below >>\n\nAbstract:
 \nLearning neural networks from only a small amount of data is an import
 ant ongoing research topic with tremendous potential for applications. W
 e introduce a regularizer\, called patchNR\, for the variational modelin
 g of inverse problems in imaging\, based on normalizing flows. It involv
 es a normalizing flow learned on patches of very few images. The subsequ
 ent reconstruction method is completely unsupervised\, and the same regularize
 r can be used for different forward operators acting on the same class of
  images. By investigating the distribution of patches versus that of the
  whole image class\, we prove that our variational model is indeed a MAP 
 approach. Numerical examples for low-dose CT\, limited-angle CT and supe
 rresolution of material images demonstrate that our method provides high
 -quality results among unsupervised methods while requiring only very li
 ttle data. Further\, the approach also works if only the low-resolution
  image is 
 available.\n\nIn the second part of the talk\, I will generalize normalizing
  flows to stochastic normalizing flows to improve their expressivity. Norm
 alizing flows\, diffusion normalizing flows and variational autoencoders a
 re powerful generative models. A unified framework for handling these ap
 proaches is that of Markov chains. We consider stochastic normalizing fl
 ows as a pair 
 of Markov chains fulfilling some properties and show how many state-of-the
 -art models for data generation fit into this framework. Indeed\, including
  stochastic layers improves the expressivity of the network and allows fo
 r generating multimodal distributions from unimodal ones. The Markov chai
 n point of view enables us to couple both deterministic layers\, such as
  invertible neural networks\, and stochastic layers\, such as Metropolis
 -Hastings layers\, Langevin layers\, variational autoencoders and diffus
 ion normalizing flows
  in a mathematically sound way. Our framework establishes a useful mathe
 matical tool to combine the various approaches.\n\nJoint work with F. Alte
 krüger\, P. Hagemann\, J. Hertrich\, P. Maass.\n\nBiography:\nGabriele 
 Steidl received her PhD and habilitation in mathematics from the Universit
 y of Rostock. After positions as associate professor of Mathematics at t
 he TU Darmstadt and full professor at the University of Mannheim and TU K
 aiserslautern\, she is currently professor at the TU Berlin. She was a po
 stdoc and visiting professor at the Universities of Debrecen\, Zürich
 \, ENS Cachan/Paris Univ. Paris Est\, Sorbonne/IHP Paris\, and worked as a co
 nsultant for the Fraunhofer ITWM Kaiserslautern. Gabriele Steidl is Edit
 or-in-Chief of the SIAM Journal on Imaging Sciences and a SIAM Fellow. Her res
 earch interests include Harmonic Analysis\, Optimization\, Inverse Problems
  and Machine Learning with applications in Image Processing.\n\nThe talk i
 s followed by an aperitif.\nRegistration appreciated.\n
LOCATION:BM 5202 https://plan.epfl.ch/?room=BM%205202
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
