BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:IC Colloquium: Probabilistic representation learning and scalable
 Bayesian inference
DTSTART;VALUE=DATE-TIME:20171012T161500
DTEND;VALUE=DATE-TIME:20171012T173000
UID:db3bfdce0aa59704867b0c4a4deed50145b6fd81c1211d82778435a8
CATEGORIES:Conferences - Seminars
DESCRIPTION:**By:** Stephan Mandt - Disney Research Pittsburgh\n\n**Video
 of his talk**\n\n**Abstract:** Probabilistic modeling is a powerful
 paradigm which has seen dramatic innovations in recent years. These
 innovations in approximate inference\, mainly due to automatic
 differentiation and stochastic optimization\, have made probabilistic
 modeling scalable and broadly applicable to many complex model classes. I
 start my talk by reviewing the dynamic skip-gram model (ICML 2017) as an
 example of this class. The model results from combining a probabilistic
 interpretation of word embeddings with latent diffusion priors\, and
 allows us to study the dynamics of word embeddings for text data that are
 associated with different time stamps. Our Bayesian approach allows us to
 share information across the time domain\, and is robust even when the
 data at individual points in time are sparse. As a result\, we can
 automatically detect words that change their meanings even in moderately
 sized corpora. Yet\, the model is non-conjugate\, and we therefore have
 to draw on modern variational inference methods to train it efficiently
 on large data. The second part of my talk is therefore devoted to
 advances in variational inference. Here\, I will review our very recent
 perturbative black-box variational inference algorithm (NIPS 2017)\,
 which uses variational perturbation theory from statistical physics to
 construct corrections to the standard variational lower bound. Last\, I
 will demonstrate that simple stochastic gradient descent with a constant
 step size is a form of approximate Bayesian inference (JMLR and ICML
 2016).\n\n**Bio:** Stephan Mandt is a Research Scientist and head of the
 statistical machine learning group at Disney Research Pittsburgh\,
 co-located with CMU. Previously\, he was a postdoctoral researcher with
 David Blei at Columbia University (2014-2016)\, and a PCCM postdoctoral
 fellow at Princeton University (2012-2014). Stephan Mandt holds a Ph.D.
 in theoretical physics from the University of Cologne\, supported by the
 German National Merit Foundation. His interests include large-scale
 probabilistic modeling\, representation learning\, variational
 inference\, and media analytics.\n\n**More information**
LOCATION:BC 420 https://plan.epfl.ch/?room=BC420
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR