BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:IEM Distinguished Lecturers Seminar: Kernel-driven and Learnable S
 elf-Supervision over Graphs
DTSTART:20240920T131500
DTEND:20240920T140000
DTSTAMP:20260509T195820Z
UID:8043ce08bef5eae1cb2ebff34884abac5792817a3b2a9cc8722c0048
CATEGORIES:Conferences - Seminars
DESCRIPTION:Prof. Georgios B. Giannakis\, University of Minnesota\, USA\n*
 ** NEW PLACE: BM 5202 ***\n\nCoffee and cookies will be served at 13:00 be
 fore the seminar.\n\nAbstract\nSelf-supervision (SeSu) has gained popular
 ity for “data-hungry” training of machine learning models\, especially
  those involving large-scale graphs\, where labeled samples are scarce or 
 unavailable. Main learning tasks in such setups are ill-posed\, and SeSu r
 enders them well-posed by relying on abundant unlabeled data as input\, to
  yield low-dimensional embeddings of a reference (auxiliary) model output.
  In this talk\, we first outline SeSu approaches\, specialized reference m
 odels\, and their links with (variational) auto-encoders\, regularization\
 , semi-supervised\, transfer\, meta\, and multi-view learning\; but also t
 heir challenges and opportunities when multi-layer graph topologies and mu
 lti-view data are present\, when nodal features are absent\, and when the 
 ad hoc selection of a reference model yields embeddings not optimally desi
 gned for the downstream main learning task. Next\, we present our novel Se
 Su approach which selects the reference model to output either a prescribe
 d kernel or a learnable weighted superposition of kernels from a prescribe
 d dictionary. As a result\, the learned embeddings offer a novel\, reduce
 d-dimensionality estimate of the basis kernel\, and thus an efficient param
 etric estimate of the main learning function at hand that belongs to a rep
 roducing kernel Hilbert space. If time allows\, we will also cover online 
 variants for dynamic settings\, and regret analysis founded on the so-term
 ed neural-tangent-kernel framework to assess how effectively the learned e
 mbeddings approximate the underlying optimal kernel(s). We will wrap up wi
 th numerical tests using synthetic and real datasets to showcase the merit
 s of kernel-driven and learnable (KeLe) SeSu relative to alternatives. Th
 e real-data tests will also compare KeLe-SeSu with auto-encoders and grap
 h neural networks (GNNs)\, and further test KeLe-SeSu on reference maps w
 ith masked inputs and predicted outputs that are popular in large languag
 e models (
 LLMs).\n\nBio\nGeorgios B. GIANNAKIS is a Professor of Electrical and Comp
 uter Engineering at the University of Minnesota\, where he holds a Preside
 ntial Chair. His interests span the areas of statistical learning\, commun
 ications\, and networking - subjects on which he has published over 495 jo
 urnal papers\, 805 conference papers\, 26 book chapters\, two edited books
 \, and two research monographs. His current research focuses on Data Scien
 ce with applications to IoT\, and power networks with renewables. He is th
 e (co-)inventor of 36 issued patents\, and the (co-)recipient of 10 best
  journal paper awards from the IEEE Signal Processing (SP) and Communicatio
 ns Societies\, including the G. Marconi Prize. He received the IEEE-SPS No
 rbert Wiener Society Award (2019)\; EURASIP's A. Papoulis Society Award (2
 020)\; Technical Achievement Awards from the IEEE-SPS (2000) and from EURA
 SIP (2005)\; the IEEE ComSoc Education Award (2019)\; and the IEEE Fourier
  Technical Field Award (2015). He is a member of the Academia Europaea an
 d Greece's Academy of Athens\; a Fellow of the National Academy of Invent
 ors\, the European Academy of Sciences\, UK's Royal Academy of Engineerin
 g\, and EURASIP\; and a Life Fellow of IEEE. He has served the IEEE in se
 veral posts\, including as a Distinguished Lecturer for the IEEE-SPS.
LOCATION:BM 5202 https://plan.epfl.ch/?room==BM%205202 https://epfl.zoom.u
 s/j/67449285011
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
