Commonsense Fact Linking for Natural Language Inference and Understanding

Event details
Date | 24.08.2022 |
Hour | 16:00 › 18:00 |
Speaker | Silin Gao |
Location | |
Category | Conferences - Seminars |
EDIC candidacy exam
Exam president: Prof. Karl Aberer
Thesis advisor: Prof. Antoine Bosselut
Co-examiner: Prof. Robert West
Abstract
Natural language processing (NLP) systems require commonsense knowledge to better infer from and understand the contexts (e.g., dialogues and narratives) they process. However, most NLP systems link facts from commonsense knowledge graphs using simple heuristics, which fall short of promising performance and leave commonsense fact linking under-explored. Our goal is to study commonsense fact linking more rigorously in order to enable stronger NLP systems. In this proposal, we first introduce three papers related to our research topic. The first paper provides background on commonsense knowledge graphs, while the second and third papers illustrate how commonsense facts from knowledge graphs are applied in current NLP applications. We then present our proposed commonsense fact linking benchmark, which addresses current challenges in this technique, and finally point out potential directions for future work.
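To make the kind of "simple heuristic" fact linking mentioned above concrete, here is a minimal, purely illustrative sketch (the miniature knowledge graph, the ATOMIC-style relation names, and all function names are hypothetical, not from the talk): candidate (head, relation, tail) facts are ranked by word overlap with the context, a surface heuristic of the sort the abstract argues falls short.

```python
# Toy heuristic commonsense fact linker: rank (head, relation, tail)
# facts from a miniature knowledge graph by word overlap with a context.
# Illustrative only; relation names mimic ATOMIC-style labels.

def tokenize(text):
    """Lowercase a string and split it into a set of bare words."""
    return set(text.lower().replace(".", "").replace(",", "").split())

def link_facts(context, facts, top_k=2):
    """Return up to top_k facts whose head/tail words overlap the context."""
    ctx = tokenize(context)
    scored = []
    for head, relation, tail in facts:
        overlap = len(ctx & (tokenize(head) | tokenize(tail)))
        scored.append((overlap, (head, relation, tail)))
    scored.sort(key=lambda pair: -pair[0])
    return [fact for score, fact in scored[:top_k] if score > 0]

toy_kg = [
    ("PersonX goes to a restaurant", "xIntent", "to eat dinner"),
    ("PersonX buys a car", "xNeed", "to have money"),
    ("PersonX eats dinner", "xEffect", "PersonX feels full"),
]

print(link_facts("Alice went to a restaurant and ordered dinner.", toy_kg))
```

Note how the heuristic rewards overlap on uninformative words like "to" and "a" just as much as on content words, one reason such linkers mis-rank facts and why a more rigorous treatment of fact linking is worthwhile.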
Background papers
1. COMET-ATOMIC 2020: On Symbolic and Neural Commonsense Knowledge Graphs
Jena D. Hwang, Chandra Bhagavatula, Ronan Le Bras, Jeff Da, Keisuke Sakaguchi, Antoine Bosselut, Yejin Choi
https://arxiv.org/pdf/2010.05953.pdf
2. Think Before You Speak: Explicitly Generating Implicit Commonsense Knowledge for Response Generation
Pei Zhou, Karthik Gopalakrishnan, Behnam Hedayatnia, Seokhwan Kim, Jay Pujara, Xiang Ren, Yang Liu, Dilek Hakkani-Tur
https://arxiv.org/pdf/2110.08501.pdf
3. A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation
Jian Guan, Fei Huang, Zhihao Zhao, Xiaoyan Zhu, Minlie Huang
https://arxiv.org/pdf/2001.05139.pdf
Practical information
- General public
- Free