BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Interpersonal Human-Robot Interaction & Engagement
DTSTART:20181030T161500
DTEND:20181030T170000
DTSTAMP:20260408T105348Z
UID:82207d5307508a864a374ded9542cde861f986c16b6ea667922cd210
CATEGORIES:Conferences - Seminars
DESCRIPTION:Prof. Mohamed Chetouani heads the IMI2S (Interaction\, Multim
 odal Integration and Social Signal) research group at the Institute for In
 telligent Systems and Robotics (CNRS UMR 7222)\, Sorbonne University. He i
 s currently a Full Professor in Signal Processing and Machine Learning for
  Human-Machine Interaction\, and CSO at Batvoice Technologies. His researc
 h covers social signal processing\, social robotics and interactive machin
 e learning\, with main applications in psychiatry\, psychology\, social ne
 uroscience and education. In 2016 he was a Visiting Professor at the Human
  Media Interaction group of the University of Twente (NL). He is the Deput
 y Director of the Laboratory of Excellence SMART Human/Machine/Human Inter
 actions In The Digital Society\, and coordinates the Autonomy Programme of
  the Institute of Engineering for Health at Sorbonne University. Since 201
 8 he has been the coordinator of the ANIMATAS H2020 Marie Sklodowska-Curie
  European Training Network.\nSynchrony\, engagement and learning are key a
 bilities for sustaining the dynamics of social interaction. In this talk\,
  we address these topics from an interpersonal point of view. In particula
 r\, we introduce interpersonal human-machine interaction schemes and model
 s\, with a focus on definitions\, sensing and evaluation of social signals
  and behaviors. We show how these models are currently applied to detect e
 ngagement in multi-party human-robot interactions\, recognize personality 
 traits and support task learning.
LOCATION:RLC D1 661 https://plan.epfl.ch/?room=RLC%20D1%20661
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
