BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Augment Language Models with Explicit Structured Information
DTSTART:20210826T100000
DTEND:20210826T120000
DTSTAMP:20260407T163642Z
UID:73fed1c57346aeb19496639613d4f2c8bb2ddf0454c4120858daab75
CATEGORIES:Conferences - Seminars
DESCRIPTION:Angeliki Romanou\nEDIC candidacy exam\nexam president:
  Prof. Tanja Käser\nthesis advisor: Prof. Karl Aberer\nco-examiner:
  Prof. Antoine Bosselut\n\nAbstract\nRecent natural language
  approaches try to leverage the factual power of explicit knowledge
  by incorporating retrieval-based techniques into neural language
  models. In this doctoral candidacy proposal\, we present three
  papers aimed at augmenting neural language models with explicit
  nonparametric knowledge. The first paper combines Transformer-like
  pre-trained language models with knowledge bases and shows
  improvements in many downstream tasks. The second paper adds a
  jointly trained\, fully differentiable document retriever into the
  pre-training phase of the language model. The third paper extends
  this approach by incorporating a text retriever in sequence-to-sequence
  (seq2seq) models\, aiming for more factual and knowledge-intensive
  text generation. Although retrieval-augmented models have enjoyed
  great popularity in the NLP research community in recent years\, a
  great deal of work remains to improve the explainability\,
  modularity\, and efficiency of these models. We build upon these
  works to propose a research agenda that tackles the existing
  limitations by creating neural language models that aim for
  generalization power as well as model specificity.\n\nBackground
  papers\n- Peters\, Matthew E.\, et al. "Knowledge enhanced
  contextual word representations." arXiv preprint arXiv:1909.04164
  (2019). [https://arxiv.org/abs/1909.04164]\n- Guu\, Kelvin\, et al.
  "REALM: Retrieval-augmented language model pre-training." arXiv
  preprint arXiv:2002.08909 (2020). [https://arxiv.org/abs/2002.08909]
  \n- Lewis\, Patrick\, et al. "Retrieval-augmented generation for
  knowledge-intensive NLP tasks." arXiv preprint arXiv:2005.11401
  (2020). [https://arxiv.org/abs/2005.11401]
LOCATION:
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
