Learning the Underlying Structure of NLP Tasks


Event details

Date 09.07.2019
Hour 10:00 – 12:00
Speaker Jean-Baptiste Cordonnier
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Dr. François Fleuret
Thesis advisor: Prof. Martin Jaggi
Co-examiner: Prof. Robert West

Abstract
Progress in Natural Language Processing has been driven by the quest for architectures that capture the structure of text and, more recently, by novel pre-training tasks on large text corpora. In this report, we discuss two important papers that shifted NLP researchers' attention toward better semi-supervised tasks and downstream training using Multi-Task Learning. We then step back and examine how the Computer Vision community studies relationships between visual tasks and which of those lessons apply to NLP. Finally, I outline my research proposal for the rest of the doctoral studies.

Background papers
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, by Devlin, J., et al.
Multi-Task Deep Neural Networks for Natural Language Understanding, by Liu, X., et al.
Taskonomy: Disentangling Task Transfer Learning, by Zamir, A., et al.

Practical information

  • General public
  • Free

Tags

EDIC candidacy exam
