BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:System Support for Decentralized and Federated Learning
DTSTART:20220621T130000
DTEND:20220621T150000
DTSTAMP:20260509T225652Z
UID:c2c59cfd8a189471c7976537af3d472d53a9f7a3d4d6bb932283718c
CATEGORIES:Conferences - Seminars
DESCRIPTION:Rishi Sharma\nEDIC candidacy exam\nExam president: Prof. Kater
 ina Argyraki\nThesis advisor: Prof. Anne-Marie Kermarrec\nCo-examiner: Pro
 f. Martin Jaggi\n\nAbstract\nDeep learning algorithms perform well on a va
 riety of artificial intelligence tasks such as image classification\, text
 recognition\, and recommendation. Traditionally\, these deep neural netw
 orks are trained in data centers on large volumes of data. Movi
 ng this data from the producer to the data centers owned by companies such
  as Google and Amazon poses serious privacy risks. Several collaborative l
 earning approaches with and without a central server have been proposed to
  alleviate some privacy concerns by allowing data to stay with the produce
 r. These come with their own challenges\, including high communication cos
 ts and slow convergence.\n\nWe propose a research plan for improving decen
 tralized learning systems in terms of communication\, computation\, and fa
 ult tolerance. With an optimal selective parameter sharing scheme\, the co
 mmunication costs can be minimized. Improved CPU and GPU utilization\, to
 gether with an optimal overlap between training computation\, the computa
 tion of selective sharing\, and parameter communication\, can speed up
  training. Finall
 y\, by also handling network delays\, packet drops\, and nodes joining and
  leaving\, we can design a fault-tolerant decentralized learning system
  that is also communication- and computation-efficient.\n\nBackground pape
 rs\n1.
  Reza Shokri and Vitaly Shmatikov. 2015. Privacy-Preserving Deep Learning
 . In Proceedings of the 22nd ACM SIGSAC Conference on Computer and Communi
 cations Security (CCS '15). Association for Computing Machinery\, New York
 \, NY\, USA\, 1310–1321.\n\n2. Kevin Hsieh\, Amar Phanishayee\, Onur Mu
 tlu\, and Phillip B. Gibbons. 2020. The non-IID data quagmire of decentral
 ized machine learning. In Proceedings of the 37th International Conference
  on Machine Learning (ICML'20). JMLR.org\, Article 408\, 4387–4398.\n\n3
 . Hyungjun Oh\, Junyeol Lee\, Hyeongju Kim\, and Jiwon Seo. 2022. Out-of-
 order backprop: an effective scheduling technique for deep learning. In Pr
 oceedings of the Seventeenth European Conference on Computer Systems (Euro
 Sys '22). Association for Computing Machinery\, New York\, NY\, USA\, 435
 –452.
LOCATION:BC 133 https://plan.epfl.ch/?room==BC%20133
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
