Commonsense Fact Linking for Natural Language Inference and Understanding


Event details

Date 24.08.2022
Hour 16:00-18:00
Speaker Silin Gao
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Prof. Karl Aberer
Thesis advisor: Prof. Antoine Bosselut
Co-examiner: Prof. Robert West

Natural language processing (NLP) systems require commonsense knowledge to better infer from and understand the contexts (e.g., dialogues and narratives) they process. However, most NLP systems link facts from commonsense knowledge graphs using simple heuristics, which fail to achieve strong performance, leaving commonsense fact linking under-explored. Our goal is to study commonsense fact linking more rigorously in order to enable stronger NLP systems. In this proposal, we first introduce three papers related to our research topic. The first paper provides background on commonsense knowledge graphs, while the second and third papers illustrate how commonsense facts from knowledge graphs are applied in current NLP applications. We then present our proposed commonsense fact linking benchmark, which addresses current challenges in this technique, and finally point out potential directions for future work.

Background papers
1. COMET-ATOMIC 2020: On Symbolic and Neural Commonsense Knowledge Graphs
Jena D. Hwang, Chandra Bhagavatula, Ronan Le Bras, Jeff Da, Keisuke Sakaguchi, Antoine Bosselut, Yejin Choi

2. Think Before You Speak: Explicitly Generating Implicit Commonsense Knowledge for Response Generation
Pei Zhou, Karthik Gopalakrishnan, Behnam Hedayatnia, Seokhwan Kim, Jay Pujara, Xiang Ren, Yang Liu, Dilek Hakkani-Tur

3. A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation
Jian Guan, Fei Huang, Zhihao Zhao, Xiaoyan Zhu, Minlie Huang

Practical information

  • General public
  • Free
