Special BMI Seminar in Computational Neuroscience // Aran Nayebi & Hans Scherberger


Event details

Date 12.02.2025
Hour 14:00–16:00
Speaker Aran Nayebi, Carnegie Mellon University (CMU), Pittsburgh, PA, USA & Hans Scherberger, German Primate Center (DPZ), Göttingen, Germany
Location Online
Category Conferences - Seminars
Event Language English
Aran Nayebi: Using embodied agents for "why" questions in systems neuroscience

Deep neural networks trained on high-variation tasks ("goals") have had immense success as predictive models of the human and non-human primate visual pathways. More specifically, a positive relationship has been observed between model performance on ImageNet categorization and neural predictivity. Past a point, however, improved categorization performance on ImageNet does not yield improved neural predictivity, even across very different architectures. In this talk, I will present two case studies, in rodents and in primates, that demonstrate a more general correspondence between self-supervised learning of visual representations relevant to high-dimensional embodied control and increased gains in neural predictivity.
In the first study, we develop the (currently) most precise model of the mouse visual system and show that self-supervised, contrastive algorithms outperform supervised approaches in capturing neural response variance across visual areas. By “implanting” these visual networks into a biomechanically realistic rodent body that navigates to rewards in a novel maze environment, we observe that the artificial rodent with a contrastively optimized visual system obtains more reward across episodes than its supervised counterpart. The second case study examines mental simulation in primates: we show that self-supervised video foundation models that predict the future state of their environment, in latent spaces able to support a wide range of sensorimotor tasks, align most closely with human error patterns and with macaque frontal cortex neural dynamics. Taken together, our findings suggest that representations reusable for downstream embodied tasks may be a promising way forward for studying the evolutionary constraints of neural circuits in multiple species.

Hans Scherberger: Coding and decoding of hand movements in the primate brain

Hand function plays an important role in all primate species, and its loss is associated with severe disability. Grasping movements are complex actions for which the primate brain integrates sensory and cognitive signals to generate meaningful behavior. To achieve this computation, specialized brain areas are functionally connected, in particular the anterior intraparietal area (AIP) in parietal cortex, area F5 in premotor cortex, and the hand area of primary motor cortex (M1). This presentation will highlight recent experimental results in non-human primates that characterize how populations of individual neurons in these cortical areas interact to generate grasping movements based on sensory signals, and how such neuronal population signals can be used to decode hand actions, e.g., for operating a neural prosthesis.


Practical information

  • Informed public
  • Free

Organizer

  • EPFL BMI (Host: Alexander Mathis)
