BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Graph Embedding Methods for Scalable Knowledge Graph Completion
DTSTART:20220905T090000
DTEND:20220905T110000
DTSTAMP:20260428T211153Z
UID:d01bdee1e7dfd7144d721bd6a22d3caf22db433acf867e158c94b1fb
CATEGORIES:Conferences - Seminars
DESCRIPTION:Andrej Janchevski\nEDIC candidacy exam\nExam president: Prof.
  Pierre Vandergheynst\nThesis advisor: Prof. Volkan Cevher\nCo-examiner:
  Prof. Matthias Grossglauser\n\nAbstract\nKnowledge graphs have recently
  attracted significant attention from both industry and academia in
  scenarios that require exploiting large-scale heterogeneous data
  collections. They have found applications across a diverse range of
  domains\, from social media to the telecommunications industry.
  Consequently\, many companies in various sectors have begun building
  and maintaining their own knowledge graphs for internal use in recent
  years. All of these applications involve reasoning over the
  heterogeneous relational data the graph stores in order to infer novel
  information not already present. We refer to this process as knowledge
  graph completion.\nHowever\, modern knowledge graph data poses an
  additional challenge: the graphs can contain hundreds of millions of
  entities or more. At such scales\, a delicate balance is required
  between model performance on one hand and computational cost on the
  other\, and a model that achieves this balance has yet to be
  proposed.\nIn this discussion\, we will present three published
  techniques for scaling up knowledge graph completion\, based on graph
  coarsening\, end-to-end learnable graph clustering\, and improved
  sampling of knowledge graph training queries\, and discuss their
  benefits and limitations. The gaps in scientific knowledge that these
  and other works on the topic leave open motivate our proposed research
  directions.\n\nBackground papers\n\n	Chen\, Haochen\,
  Bryan Perozzi\, Yifan Hu\, and Steven Skiena. “HARP: Hierarchical Repre
 sentation Learning for Networks.” Proceedings of the AAAI Conference on 
 Artificial Intelligence 32\, no. 1 (April 26\, 2018). https://doi.org/10.1
 609/aaai.v32i1.11849.\n	Ying\, Zhitao\, Jiaxuan You\, Christopher Morris\,
  Xiang Ren\, Will Hamilton\, and Jure Leskovec. “Hierarchical Graph Repr
 esentation Learning with Differentiable Pooling.” In Advances in Neural 
 Information Processing Systems\, Vol. 31. Curran Associates\, Inc.\, 2018.
  https://proceedings.neurips.cc/paper/2018/hash/e77dbaf6759253c7c6d0efc569
 0369c7-Abstract.html.\n	Ren\, Hongyu\, Hanjun Dai\, Bo Dai\, Xinyun Chen\,
  Denny Zhou\, Jure Leskovec\, and Dale Schuurmans. “SMORE: Knowledge Gra
 ph Completion and Multi-Hop Reasoning in Massive Knowledge Graphs.” arXi
 v\, November 1\, 2021. https://doi.org/10.48550/arXiv.2110.14890.\n
LOCATION:ELE 111 https://plan.epfl.ch/?room==ELE%20111
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
