BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Talk by Richard Lee Davis\, Stanford Graduate School of Education
DTSTART:20200115T111500
DTEND:20200115T120000
DTSTAMP:20260427T203354Z
UID:0244d26cd4c6102f3e629912be51de70a5e2da0fdf84081b7689dce8
CATEGORIES:Conferences - Seminars
DESCRIPTION:Richard Lee Davis is a doctoral student in Learning Sciences a
 nd Technology Design at the Stanford Graduate School of Education. His res
 earch interests include using machine learning to understand student learn
 ing in open-ended\, project-based learning environments. He received an M.
 S. in computer science at Stanford with a focus in artificial intelligence
 \, and is currently working with the Piech Lab at Stanford on the developm
 ent of variational Bayesian methods for estimating student knowledge. He i
 s a recipient of the Stanford Interdisciplinary Graduate Fellowship\, and 
 holds a B.A. in Philosophy and Studio Art with a minor in Physics from the
  University of Virginia. Prior to joining Stanford\, he worked as a softwa
 re and firmware developer while co-directing an artist collective in Phila
 delphia.\nSpecialized Machine Learning Methods for Attacking the Small-Dat
 a Problem in Education\n\nOver the past decade there have been a number of
  breakthroughs in machine learning\, including achieving human-level perfo
 rmance on image classification\, speech recognition\, strategic game playi
 ng\, and text generation. These methods have enormous potential to transfo
 rm education by making it easier for teachers to understand classroom dyna
 mics\, monitor student learning\, grade more quickly and with fewer mistak
 es\, provide feedback to students in real time\, support collaboration\, an
 d measure learning during activities taking place in situated\, active
  learning environments. However\, there are a number of obstacles to overc
 ome before these promises can be realized. One of the most significant has
  to do with the amount of data available—most education datasets are sim
 ply too small to properly train the types of deep neural networks that hav
 e led to recent breakthroughs. In this talk I will discuss promising appro
 aches to overcoming this obstacle. I will describe how combining theoretic
 al insights from the learning sciences with unsupervised learning methods 
 made it possible to find meaningful structure in a small dataset of 40 stu
 dents working on hands-on problems in a makerspace. I will also discuss ho
 w deep probabilistic programming languages can be used to design models of
  students based on domain knowledge\, and then used to perform inference a
 bout student knowledge and question characteristics in small-data regimes.
LOCATION:RLC D1 661 https://plan.epfl.ch/?room==RLC%20D1%20661
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
