BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Building Multi-task Models: From Multi-task Learning to Model Merg
 ing
DTSTART:20240227T130000
DTEND:20240227T150000
DTSTAMP:20260407T000048Z
UID:bd28f666a2d72bbdf01ed0403bfceff4967bfd556cbfa29a4e00c5da
CATEGORIES:Conferences - Seminars
DESCRIPTION:Ke Wang\nEDIC candidacy exam\nExam president: Prof. Amir Zamir
 \nThesis advisor: Prof. Pascal Frossard\nCo-examiner: Prof. Martin Jaggi\n
 \nAbstract\nMulti-task models efficiently use shared representations to ha
 ndle various tasks with reduced storage requirements. While many multi-tas
 k learning (MTL) methods have been developed to train such models jointly a
 nd deliver good performance\, they often incur non-trivial training costs a
 nd require simultaneous access to all tasks. Recently\, model merging techn
 iques such as task arithmetic have emerged as training-free alternatives th
 at construct multi-task models by directly merging separately fine-tuned mo
 dels. However\, these methods tend to exhibit performance degradation compa
 red to traditional MTL approaches. In this report\, we will explore three m
 ethods for constructing multi-task models: one based on MTL and two based o
 n model merging. Additionally\, we will present our research on understandi
 ng the reasons behind the performance decline observed in model merging met
 hods\, along with our proposed approach to enhance their performance.\n\nBa
 ckground papers\n\n	Towards Impartial Multi-task Learning\, ICLR 2021.\n	Ed
 iting Models with Task Arithmetic\, ICLR 2023.\n	TIES-Merging: Resolving In
 terference When Merging Models\, NeurIPS 2023.\n
LOCATION:ELE 242 https://plan.epfl.ch/?room=ELE%20242
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
