Dissociating curiosity-driven exploration algorithms.

Event details

Date 02.05.2024
Hour 09:00 – 11:00
Speaker Lucas Gruaz
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Prof. Martin Jaggi
Thesis advisor: Prof. Wulfram Gerstner
Co-examiner: Prof. Nicolas Flammarion

Abstract
Exploration is a fundamental concept both in Reinforcement
Learning (RL) and human behavior. In RL, exploration
involves the agent actively seeking information about
its environment to discover optimal strategies for maximizing
rewards. Similarly, in human behavior, exploration manifests
as curiosity, experimentation, and risk-taking, all of which
contribute to learning and adaptation. By exploring the unknown,
both RL agents and humans can discover novel solutions, adapt
to changing circumstances, and ultimately improve their performance
and understanding of the world around them. Various
methods have been developed to encourage exploration in RL
agents. Their designs and application scenarios vary widely, and
their similarity to exploration strategies observed in humans
remains unclear. In this proposal, we review three papers related
to this question. The first paper gives an overview of exploration
techniques in deep RL, the second paper presents a successful
application of such techniques, and the third paper explores human
exploratory behavior, serving as a starting point for assessing
differences from the previously introduced techniques.
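To illustrate one common family of exploration techniques mentioned above, the sketch below shows a count-based novelty bonus, where less-visited states receive a larger intrinsic reward. This is a generic example, not taken from the talk or the reviewed papers; the function and parameter names (`exploration_bonus`, `beta`) are hypothetical.

```python
import math
from collections import defaultdict

def exploration_bonus(counts, state, beta=1.0):
    # Count-based bonus: beta / sqrt(N(s) + 1). The bonus is largest
    # for unvisited states and decays as the state is revisited,
    # nudging the agent toward novelty (a curiosity-style signal).
    return beta / math.sqrt(counts[state] + 1)

# Illustrative run: repeated visits to the same state shrink the bonus.
counts = defaultdict(int)
bonuses = []
for _ in range(4):
    bonuses.append(exploration_bonus(counts, "s0"))
    counts["s0"] += 1
```

In practice such a bonus is added to the environment reward; deep-RL variants replace the raw visit count with learned density or prediction-error estimates, since exact counts are unavailable in large state spaces.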

Background papers

Practical information

  • General public
  • Free

Tags

EDIC candidacy exam
