BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Analysis of Transformer Language Models at Finer Granularities
DTSTART:20220825T140000
DTEND:20220825T160000
DTSTAMP:20260505T014829Z
UID:52f7b3cb10316778652cfadb29eb972c5181673454ad5763c1034518
CATEGORIES:Conferences - Seminars
DESCRIPTION:Deniz Bayazit\nEDIC candidacy exam\nExam president: Prof. Bo
 i Faltings\nThesis advisor: Prof. Antoine Bosselut\nCo-examiner: Prof. Ma
 rtin Jaggi\n\nAbstract\nPre-trained language models (LMs)\, particularly Tr
 ansformer-based ones\, achieve strong performance on knowledge-related ta
 sks\, hinting that such models can encode and manipulate structured infor
 mation expressed in natural language.\nHowever\, we do not yet fully unde
 rstand the internal processes of LMs or which parts of the network are re
 sponsible for this ability. Recent work decoupling Transformer processes o
 ften focuses on a surface-level granularity of the model\, making it hard
 er to structurally probe and edit LMs with confidence.\nIn this proposa
 l\, we present three granularities of analysis from prior work: (1) the l
 ayer level\, (2) the neuron level\, and (3) the weight level. At each lev
 el\, we consider how the methods and findings advance our understanding o
 f how Transformer-based LMs process structured information in natural lan
 guage.\nWe argue that finding subnetworks that encode structured informat
 ion in LMs can allow us to manipulate them. Consequently\, we propose a s
 tudy on locating subnetworks of parameters responsible for encoding conce
 ptual knowledge within LMs.\n\nBackground papers\n\nTransformer Feed-Forw
 ard Layers Build Predictions by Promoting Concepts in the Vocabulary Spac
 e\, https://arxiv.org/abs/2203.14680\, preprint on arXiv.\n\nAnalyzing In
 dividual Neurons in Pre-trained Language Models\, https://aclanthology.or
 g/2020.emnlp-main.395/\, EMNLP 2020.\n\nAre Neural Nets Modular? Inspecti
 ng Functional Modularity Through Differentiable Weight Masks\, https://op
 enreview.net/forum?id=7uVcpu-gMD\, ICLR 2021.
LOCATION:BC 233 https://plan.epfl.ch/?room==BC%20233
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
