Optimal inference in tractable tree generative models

Event details

Date 03.07.2024
Hour 16:00–17:00
Speaker Luca Saglietti (Bocconi)
Location
Category Conferences - Seminars
Event Language English

Hierarchical correlation structures are ubiquitous in real-world data, from images to text, and seem to play a crucial role in human and machine learning. In this work, we consider a tractable generative model that produces discrete-valued sequences by collecting the leaves of a tree generated according to a specific stochastic rule (grammar). Belief Propagation (BP), a physics-inspired message-passing algorithm, performs exact inference on this class of models when the topology of the tree (i.e., the so-called "factor graph") is known. In this setting, a simple heuristic called Expectation-Maximization (E-M) can be applied to efficiently infer the generative grammar from a sufficient number of sequences. However, the BP approach no longer applies in the more realistic setting where the topology is allowed to vary and is not known at inference time. In this work, we derive an algorithm that combines BP with dynamic programming to solve this problem exactly and efficiently, exploiting a recursive relationship between the partition functions of the various sub-trees contained in the generative tree. We then show that an analogous E-M approach can be implemented to achieve grammar reconstruction and optimal inference in this setting. We believe this novel algorithm might provide a basis for interpreting how deep networks learn to perform these tasks.
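To illustrate the known-topology case the abstract describes, here is a minimal sketch of exact BP (sum-product) on a fixed tree: upward messages are computed recursively from the leaves, yielding the posterior over the root symbol given the observed leaves. The tree structure, the transition matrix `M` standing in for the grammar, and all names are illustrative assumptions, not the speaker's implementation.

```python
import numpy as np

def upward_message(node, M, leaf_obs, q):
    """Return m[s] = P(observed leaves below `node` | node's symbol = s).

    `node` is a dict {"name": str, "children": list}; leaves carry an
    observed symbol in `leaf_obs`. `M[s, s']` is the (assumed) grammar
    rule P(child symbol = s' | parent symbol = s), with q symbols.
    """
    if not node["children"]:                 # leaf: delta on the observed symbol
        m = np.zeros(q)
        m[leaf_obs[node["name"]]] = 1.0
        return m
    m = np.ones(q)
    for child in node["children"]:
        mc = upward_message(child, M, leaf_obs, q)
        m *= M @ mc                          # marginalize over the child's symbol
    return m

# Toy example: a root with two observed leaf children, q = 2 symbols.
q = 2
M = np.array([[0.9, 0.1],
              [0.2, 0.8]])
tree = {"name": "root",
        "children": [{"name": "a", "children": []},
                     {"name": "b", "children": []}]}
obs = {"a": 0, "b": 0}
prior = np.array([0.5, 0.5])

m_root = upward_message(tree, M, obs, q)
posterior = prior * m_root
posterior /= posterior.sum()                 # P(root symbol | observed leaves)
print(posterior)
```

Because the factor graph is a tree, this single upward pass is exact; the harder setting discussed in the talk, where the topology itself is unknown, additionally requires summing over candidate sub-tree topologies via dynamic programming.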

Practical information

  • Informed public
  • Free

Organizer

  • Lénaïc Chizat