BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Decision Trees and CLTs: Inference and Machine Learning
DTSTART:20220603T151500
DTEND:20220603T170000
DTSTAMP:20260511T081215Z
UID:d65fb0ac1f2f196cd470387247f9e85196e203ef0d36b5e8a78dc0e7
CATEGORIES:Conferences - Seminars
DESCRIPTION:Giles Hooker\, Department of Statistics\, UC Berkeley\nThis ta
 lk develops methods of statistical inference based around ensembles of dec
 ision trees: bagging\, random forests\, and boosting. Recent results have 
 shown that when the bootstrap procedure in bagging methods is replaced by 
 sub-sampling\, predictions from these methods can be analyzed using the th
 eory of U-statistics\, which have a limiting normal distribution. Moreover\,
  the limiting variance can be estimated within the sub-sampling structure.\n
 \nUsing this result\, we can compare the predictions made by a mode
 l learned with a feature of interest\, to those made by a model learned wi
 thout it and ask whether the differences between these could have arisen b
 y chance. By evaluating the model at a structured set of points we can als
 o ask whether it differs significantly from an additive model. We demonstr
 ate these results in an application to citizen-science data collected by C
 ornell's Laboratory of Ornithology.\n\nGiven time\, we will examine recent
  developments that extend distributional results to boosting-type estimato
 rs. Boosting allows trees to be incorporated into more structured regressi
 on such as additive or varying coefficient models and often outperforms ba
 gging by reducing bias.
LOCATION:https://epfl.zoom.us/j/66136073806
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
