Learning with Generative Priors

Event details

Date 25.08.2022
Hour 09:00 – 11:00
Speaker Freya Behrens
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Prof. Volkan Cevher
Thesis advisor: Prof. Lenka Zdeborová
Co-examiner: Prof. Michaël Unser

Abstract
Recovering data points from their possibly nonlinear
measurements is a ubiquitous problem in signal processing.
When the measurement process is ill-posed, prior knowledge
is required to identify the original input uniquely. To
specify this knowledge, deep generative models and denoisers
have been successfully employed in place of classical priors such as
sparsity. Based on a recent survey of the topic [Ong+20],
this report reviews some of these methods and applications and
highlights their wide empirical success. We then describe a line
of work that aims to develop a theoretical understanding of these
models via message passing algorithms [Man+17; Pan+20]. Such
an analysis admits a computationally efficient inference algorithm and gives asymptotically exact
predictions of its performance according to order parameters such as the sample size.
However, as it concerns only multi-layered models with random parameters,
this theory still falls short of describing real-world settings
where the generative models are trained on a data distribution
and thus have learned parameters. We conclude the report with
a discussion of open questions and possible research directions
related to this gap.
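To make the setting concrete, the following is a minimal, illustrative sketch (not the report's own method) of inversion with a generative prior: given ill-posed linear measurements y = A·G(z*), we search for a latent z by gradient descent on the measurement misfit. The one-layer generator G(x) = tanh(Wz), the dimensions, and the step size are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
k, n, m = 5, 50, 20  # latent dim, signal dim, measurements (m < n: ill-posed)

# Random generator weights and measurement matrix (illustrative scales)
W = rng.standard_normal((n, k)) / np.sqrt(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)

def G(z):
    """One-layer generative model: x = tanh(W z)."""
    return np.tanh(W @ z)

def loss(z, y):
    """Measurement misfit: 0.5 * ||A G(z) - y||^2."""
    r = A @ G(z) - y
    return 0.5 * r @ r

z_true = rng.standard_normal(k)
y = A @ G(z_true)  # noiseless linear measurements of the generated signal

# Gradient descent on the latent variable z
z = np.zeros(k)
loss_init = loss(z, y)
lr = 0.01
for _ in range(5000):
    x = np.tanh(W @ z)
    r = A @ x - y                            # residual in measurement space
    grad = W.T @ ((1 - x**2) * (A.T @ r))    # chain rule through tanh
    z -= lr * grad

loss_final = loss(z, y)
rel_err = np.linalg.norm(G(z) - G(z_true)) / np.linalg.norm(G(z_true))
print(f"loss: {loss_init:.4f} -> {loss_final:.4f}, relative error: {rel_err:.3f}")
```

Because m < n the measurements alone do not determine the signal, but restricting the search to the range of G makes recovery possible; the message-passing analyses discussed in the report study exactly this regime with random W and A.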

Background papers
- Multi-Layer Generalized Linear Estimation (http://arxiv.org/abs/1701.06981)
- Deep Learning Techniques for Inverse Problems in Imaging (10.1109/JSAIT.2020.2991563; https://arxiv.org/abs/2005.06001)
- Inference With Deep Generative Priors in High Dimensions (10.1109/JSAIT.2020.2986321; https://ieeexplore.ieee.org/document/9061052/)

Practical information

  • General public
  • Free

Tags

EDIC candidacy exam