BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Symbolic representation learning
DTSTART:20230811T130000
DTEND:20230811T150000
DTSTAMP:20260407T064507Z
UID:94a2b72065180e4536cdfb8de87e00d2a38dd5e78211c3b5dc6d2bc7
CATEGORIES:Conferences - Seminars
DESCRIPTION:Mohammad Hossein Amani\nEDIC candidacy exam\nExam president: P
 rof. Antoine Bosselut\nThesis advisor: Prof. Robert West\nCo-examiner: Pro
 f. Tanja Käser\n\nAbstract\nThis proposal explores using discrete latent 
 representations in sequential neural models to enable systematic\ngenerali
 zation under limited labeled data.\nWe study self-supervised learning o
 f discrete symbolic-like representations using variational autoencoders.\n
 We investigate non-parametric statistical methods to model the probability
  distribution of sequences of latent\nrepresentations\, which serves as a 
 natural information bottleneck to encode prior knowledge for data-efficien
 t learning.\nFinally\, to evaluate our approach\, we analyze different not
 ions of systematic generalization and propose using\nformal languages as c
 ustomizable benchmarks for compositionality. Specifically\, we focus on fi
 nite state\ntransducers for generating compositional sequence-to-sequence 
 datasets with adjustable levels of task\ncomplexity/compositionality.\nThe
  key challenges addressed are methods for training discrete representation
 s\, modeling sequence priors for VAEs\,\nand benchmarking compositional ge
 neralization in neural models.\n\nBackground papers\n\n	Neural Discrete Re
 presentation Learning - https://arxiv.org/abs/1711.00937\n	A VAE for tran
 sformers with non-parametric variational information bottleneck - https:/
 /openreview.net/forum?id=6QkjC_cs03X\n	Benchmarking Compositionality wit
 h Formal Languages - https://aclanthology.org/2022.coling-1.525.pdf\n
LOCATION:BC 233 https://plan.epfl.ch/?room=BC%20233
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
