BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Multiplication Free Neural Network Architectures
DTSTART:20230831T120000
DTEND:20230831T140000
DTSTAMP:20260408T064419Z
UID:90edc004da3cb7cad425bfdaa9adc5bb9f29ec1e30ab3b290c13c8ee
CATEGORIES:Conferences - Seminars
DESCRIPTION:Bettina Messmer\nEDIC candidacy exam\nExam president: Prof. Fra
 nçois Fleuret\nThesis advisor: Prof. Martin Jaggi\nCo-examiner: Prof. Mat
 hieu Salzmann\n\nAbstract\nThe remarkable progress of deep learning models
  has come with increased energy consumption\, impacting the environment a
 nd limiting deployment on resource-constrained hardware\, such as mobile d
 evices. Our research goal is to explore resource-efficient alternatives t
 o state-of-the-art network architectures. Specifically\, we focus on desi
 gns that substantially reduce the number of multiplications. We discuss t
 hree significant works: a baseline architecture that notably reduces the n
 umber of multiplicative parameters\, a theoretical model that reduces the
  number of multiplications to a minimum while retaining universal approxi
 mation capabilities\, and a more general exploration of resource efficien
 cy in the natural language processing (NLP) domain.\n\nBackground papers
 \n\n	AdderNet: Do We Really Need Multiplications in Deep Learning?\n	Min
 -Max-Plus Neural Networks\n	pNLP-Mixer: an Efficient all-MLP Architec
 ture for Language\n
LOCATION:BC 129 https://plan.epfl.ch/?room==BC%20129
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
