BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Network Classifier
DTSTART:20190827T140000
DTEND:20190827T160000
DTSTAMP:20260410T174215Z
UID:bbf0867ffd9ae7a4caf9a8f5ceda3cf654c3ef44c7be32de93a8bd05
CATEGORIES:Conferences - Seminars
DESCRIPTION:Elsa Rizk\nEDIC candidacy exam\nExam president: Prof. Rüdiger
  Urbanke\nThesis advisor: Prof. Ali Sayed\nCo-examiner: Prof. Volkan
  Cevher\n\nAbstract\nThe surge of data defined over networks calls for
  tools adapted to handle such data. Machine learning algorithms for
  data defined in Euclidean space have already been explored\, and we
  now wish to extend them to data defined over graphs. Some works\,
  such as graph neural networks and representation learning\, have
  already explored this extension. However\, we focus on designing a
  network of classifiers that collaborate towards either a common goal
  or multiple related goals. In this research proposal\, we define the
  problem to be tackled: network classifiers. We present and discuss
  three publications that will help steer the research plan. The first
  paper introduces federated learning\, which is closely linked to
  network classifiers\; the latter encapsulate the problems solved by
  federated learning. Next\, the second paper studies multitask
  networks and develops a distributed optimization algorithm that
  enforces smoothness on the parameter vectors\; it serves as a
  building block for formulating the optimization problem of the
  network classifiers. Finally\, the third paper privatizes distributed
  online learning algorithms using the concept of differential
  privacy\; it will also serve as a building block when privacy is
  important to the network classifiers. Thus\, in our work\, we hope to
  formulate the optimization problem of the network classifiers with
  all its variations\, solve it in a private way\, and study the
  trade-off between privacy and utility.\n\nBackground
  papers\nCommunication-Efficient Learning of Deep Networks from
  Decentralized Data\, by McMahan\, H. B.\, et al.\nDistributed
  Inference over Multitask Graphs under Smoothness\, by Nassif\, R.\,
  et al.\nDifferentially Private Distributed Online Learning\, by Li\,
  C.\, et al.
LOCATION:AAC 1 06 https://plan.epfl.ch/?room==AAC%201%2006
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
