BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Fall 2023 SDSC PhD fellows workshop
DTSTART:20231128T102500
DTEND:20231128T162000
DTSTAMP:20260502T134707Z
UID:f96876a1ff89fe19dabcd44c780116b480fbb71268b67b92f2dfaabb
CATEGORIES:Conferences - Seminars
DESCRIPTION:Martin Josifoski\, Laura Manduchi\, Clément Vignac\, Andreas 
 Schlaginhaufen\, Aditya Varre\nSince 2019\, the Swiss Data Science Center 
 (SDSC) has awarded fellowships to 19 PhD students working on various aspec
 ts of data science theory and methods at EPFL and ETH Zürich\, from stati
 stics to computer vision\, including machine learning\, statistical signal
  processing\, and natural language processing.\n\nWe are pleased to invite
  you to the PhD fellows' workshop of Fall 2023. These workshops are organi
 zed once per semester\, alternating between EPFL and ETH Zürich\, to prov
 ide a forum for exchange on data science methods and an opportunity for fi
 ve of these fellows to present their work in person. In the afternoon\, th
 e program will also feature four short presentations of SDSC collaborative
  projects. Everyone from the EPFL community is welcome to attend.\n\nDate 
 and Time: Tuesday 28 November 2023\, 10:25am\nRoom: BC 420\n\nProgram (ple
 ase find the abstracts below):\n\n	10:25 - Foreword and session opening –
  Dr. Guillaume Obozinski\, Deputy Chief Data Scientist\, SDSC\n	10:30 -
  Martin Josifoski\, Data Science Lab\, EPFL: \n	\n		Symbolic Intermediate
  Representations for Reasoning and Collaborating AI\n	\n	\n	11:10 - Laura 
 Manduchi\, Medical Data Science group\, ETH Zürich: \n	\n		Tree Variati
 onal Autoencoders\n	\n	\n	12:00 - Lunch\n	13:00 - Clément Vignac\, LTS4\,
  EPFL: \n	\n		Discrete denoising diffusion models for graph generation\n	
 \n	\n	13:40 - Short presentations of SDSC collaborative projects:\n	\n		Tr
 ansformer Models for the Estimation of Transit Time in Watersheds\, Quenti
 n Duchemin\, SDSC.\n		Self-guided ML Algorithms for Real-Time Assimilation
 \, Interpolation and Rendering of Flow Data\, Victor Cohen\, SDSC.\n		Prob
 abilistic State Space Models for Predicting Turbulence on Pitching Airfoil
 s\, Christian Donner\, SDSC.\n		A Graph Matching Approach to Tracking Neur
 ons in Freely-Moving C. elegans\, Corinne Jones\, SDSC.\n	\n	\n	14:30 - C
 offee break\n	14:50 - Andreas Schlaginhaufen\, Sycamore\, EPFL:\n	\n		Iden
 tifiability and Transferability of Rewards in Inverse Reinforcement Learni
 ng\n	\n	\n	15:30 - Aditya Varre\, TML\, EPFL:\n	\n		On the spectral bias o
 f two-layer linear networks\n	\n	\n	16:10 - Closing words\n\n\nAbstracts
 :\n\n\n	Tree Variational Autoencoders\, Laura Manduchi\, Medical Data Scie
 nce group\, ETH Zürich\n\n	\n		Abstract: In this talk\, I will present t
 he Tree Variational Autoencoder (TreeVAE)\, a new generative hierarchical 
 clustering model that learns a flexible tree-based posterior distribution 
 over latent variables. It adapts its architecture to discover the optimal 
 tree for encoding dependencies between latent variables\, and it hierarchica
 lly divides samples according to their intrinsic characteristics\, sheddin
 g light on hidden structures in the data.\n	\n	\n	Discrete denoising diffu
 sion models for graph generation\, Clément Vignac\, LTS4\, EPFL\n	\n		A
 bstract: Graph generation has posed significant challenges in the past yea
 rs. Traditional methods\, such as Variational Autoencoders (VAEs)\, Genera
 tive Adversarial Networks (GANs)\, and Normalizing Flows\, have been prima
 rily limited to generating small graphs\, and often evaluated on molecules
  composed of no more than 9 atoms. In this talk\, we introduce DiGress\, a
  denoising diffusion model that scales to substantially larger graphs\, as
  well as its extension MiDi for molecule generation in 3D. These models le
 verage a discrete diffusion process that progressively edits graphs with n
 oise by adding or removing edges and changing the node fe
 atures. In contrast to Gaussian noise\, a discrete diffusion process prese
 rves the sparsity of graphs\, which enables various
  architectural improvements. We experimentally demonstrate the effectivene
 ss of this approach and propose hypotheses to explain why these models
  outperform GANs and VAEs by such a large margin.\n		References:\n		\n			D
 iGress: Discrete denoising diffusion for graph generation — (ICLR 2023)\
 n			MiDi: Mixed Graph and 3D Denoising diffusion for molecule generation 
 — (ECML 2023)\n		\n		\n	\n	\n	Identifiability and Transferability of Rew
 ards in Regularized Inverse Reinforcement Learning\, Andreas Schlaginhaufe
 n\, Sycamore\, EPFL\n	\n		Abstract: A common reason for favoring inverse r
 einforcement learning (IRL) over imitation learning is that rewards offer 
 a more succinct and transferable description of a task. However\, due to t
 he ill-posedness of the inverse problem\, it is largely unclear whether rew
 ards learned via IRL are indeed transferable to new environments. In this 
 talk\, we focus on regularized IRL and provide sufficient conditions for i
 dentifiability and transferability to new environments.\n	\n	\n	On the spe
 ctral bias of two-layer linear networks\, Aditya Varre\, TML\, EPFL\n	\n		
 Abstract: In this talk I will present the behaviour of two-layer fully con
 nected networks with linear activations trained with gradient flow on the 
 square loss. We further discuss how the optimization process carries an im
 plicit bias on the parameters that depends on the scale of its initializat
 ion. The main result is a variational characterization of the loss minimiz
 ers retrieved by the gradient flow for a specific initialization shape. Th
 is characterization reveals that\, in the small-scale initialization regim
 e\, the linear neural network’s hidden layer is biased toward having a l
 ow-rank structure. To complement our results\, we present a hidden mirror
  flow that tracks the dynamics of the singular values of the weight matri
 ces and describe their time evolution. We support our findings with n
 umerical experiments illustrating the phenomena.\n	\n	\n
LOCATION:BC 420 https://plan.epfl.ch/?room==BC%20420
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
