Continual Learning on Dynamic Graphs and Its Applications

Event details

Date 10.06.2022
Hour 09:00–11:00
Speaker Shaobo Cui
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Prof. Matthias Grossglauser
Thesis advisor: Prof. Boi Faltings
Thesis co-advisor: Prof. Antoine Bosselut
Co-examiner: Prof. Robert West

Abstract
Current graph neural networks (GNNs) lack scalability when the graph evolves over time (addition/deletion of edges or nodes, changes of edge weights, etc.). Retraining on the whole graph from scratch at every time step is time-consuming and impractical in real-world scenarios. Inspired by the observation that most alterations influence only certain local parts of the graph while the majority remains unchanged, we design an incremental learning approach for evolving graphs based on subgraphs. Specifically, we relearn the influenced subgraphs while retaining performance on the unaffected parts of the graph.
We conduct experiments on a citation network dataset and demonstrate the feasibility of the proposed method. Furthermore, we design a more precise continual learning approach that takes the degree of evolution of each subgraph into account.
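A minimal sketch of the locality idea described above, assuming influence is confined to a k-hop neighborhood of the changed edges (the function name, adjacency representation, and radius parameter are illustrative, not the talk's actual implementation): after a batch of edge updates, only the nodes returned here would need their subgraph representations relearned, while embeddings elsewhere are kept.

```python
from collections import deque

def affected_subgraph(adj, changed_edges, k=2):
    """Return the set of nodes within k hops of any endpoint of a changed edge.

    adj: dict mapping node -> set of neighbor nodes (graph after the update)
    changed_edges: iterable of (u, v) pairs that were added or removed
    k: assumed locality radius of influence (hyperparameter)
    """
    # Seed the search with every endpoint of a changed edge.
    seeds = {node for edge in changed_edges for node in edge}
    seen = set(seeds)
    frontier = deque((node, 0) for node in seeds)
    # Breadth-first expansion, stopping at depth k.
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue
        for neighbor in adj.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return seen
```

On a chain graph 0–1–2–3–4–5, changing edge (0, 1) with k=1 marks only {0, 1, 2} for retraining; the rest of the graph is untouched, which is the source of the claimed savings over full retraining.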

Background papers
  1. DyRep: Learning Representations over Dynamic Graphs (https://par.nsf.gov/servlets/purl/10099025)
  2. DyKgChat: Benchmarking Dialogue Generation Grounding on Dynamic Knowledge Graphs (https://arxiv.org/abs/1910.00610)
  3. Graph Meta Learning via Local Subgraphs (https://arxiv.org/abs/2006.07889)

Practical information

  • General public
  • Free
