BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Unsupervised and semi-supervised embeddings for word sequences
DTSTART:20170612T140000
DTEND:20170612T160000
DTSTAMP:20260407T051214Z
UID:a0dea835ea8b7c53b37725560963127e0bd69cfa88e7425d947ccbaa
CATEGORIES:Conferences - Seminars
DESCRIPTION:Prakhar Gupta\nEDIC candidacy exam\nExam president: Prof. Volk
 an Cevher\nThesis advisor: Prof. Martin Jaggi\nCo-examiner: Prof. Boi Falt
 ings\n\nAbstract\nThe recent tremendous success of unsupervised word embed
 dings in a multitude of applications raises the obvious question of wheth
 er similar methods could be derived to improve embeddings (i.e. semantic 
 representations) of word sequences as well. We plan to use a variety of m
 achine learning methods\, as well as to devise new model formulations\, t
 o generate robust representations of word sequences in an unsupervised or
  semi-supervised fashion. We also plan to explore the mathematical underp
 innings of these models.\n\nBackground papers\nSkip-Thought Vectors\, Kir
 os et al.\nDistributed Representations of Sentences and Documents\, Le an
 d Mikolov.\nJointly optimizing word representations for lexical and sente
 ntial tasks with the C-PHRASE model\, Pham et al.
LOCATION:INJ 322 https://plan.epfl.ch/theme/generalite_thm_plan_public?req
 uest_locale=en&room=INJ%20322&domain=places&dim_floor=3&lang=en&dim_lang=e
 n&baselayer_ref=grp_backgrounds&tree_groups=centres_nevralgiques%2Cac
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
