Deep learning models for dependency parsing and retrieval tasks
Event details
Date | 28.08.2019 |
Hour | 14:00 › 16:00 |
Speaker | Alireza Mohammadshahi |
Location | |
Category | Conferences - Seminars |
EDIC candidacy exam
Exam president: Prof. Martin Jaggi
Thesis advisor: Prof. Karl Aberer
Thesis co-advisor: Prof. James Henderson
Co-examiner: Dr. Martin Rajman
Abstract
Natural language processing (NLP) is one of the most important technologies of the information age, and understanding complex language utterances is a crucial part of artificial intelligence. NLP applications range widely, from machine translation and question answering to parsing and retrieval. In recent years, the striking success of deep learning across a wide variety of NLP applications has set a new benchmark for future advances. In this report, I concentrate on two critical NLP applications, namely dependency parsing and image-caption retrieval, and describe existing models for these challenging tasks.
Background papers
Transition-Based Dependency Parsing with Stack Long Short-Term Memory, by Dyer, C., et al.
Image Pivoting for Learning Multilingual Multimodal Representations, by Gella, S., et al.
Graph-based Dependency Parsing with Bidirectional LSTM, by Wang, W., and Chang, B.
Practical information
- General public
- Free