On the theory of contrastive self-supervised learning


Event details

Date 16.08.2023
Hour 13:00 – 15:00
Speaker Oguz Yüksel
Location
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Prof. Volkan Cevher
Thesis advisor: Prof. Nicolas Flammarion
Co-examiner: Prof. Lenka Zdeborova

Abstract
Self-supervised learning has overtaken classical supervised learning by using
pretext tasks to exploit the large unlabeled datasets needed to train big
models. These pretext tasks allow the learning of representations that, after
adaptation with a limited number of task-specific labeled samples, yield good
downstream performance across various tasks. Contrastive learning in
particular has proven highly effective and has seen wide adoption among
practitioners due to its simplicity and success. As a result, there has been
growing interest in developing a theory of contrastive learning. In this
write-up, some of the recent progress on the theory of contrastive learning
is summarized, along with a follow-up on important open questions and
proposals for future investigations.
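
To make the objective concrete, below is a minimal NumPy sketch of the NT-Xent contrastive loss popularized by SimCLR [1]; the function name, temperature value, and toy inputs are illustrative choices, not part of the talk materials.

    import numpy as np

    def nt_xent_loss(z1, z2, temperature=0.5):
        # z1, z2: (N, d) embeddings of two augmented views of each sample.
        N = z1.shape[0]
        z = np.concatenate([z1, z2], axis=0)              # (2N, d)
        z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize rows
        sim = z @ z.T / temperature                       # scaled cosine similarities
        np.fill_diagonal(sim, -np.inf)                    # exclude self-pairs
        # The positive for row i is row i+N (and vice versa).
        pos = np.concatenate([np.arange(N, 2 * N), np.arange(N)])
        log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
        return -log_prob[np.arange(2 * N), pos].mean()

    # Toy usage with random embeddings standing in for encoder outputs.
    rng = np.random.default_rng(0)
    z1, z2 = rng.normal(size=(8, 16)), rng.normal(size=(8, 16))
    print(nt_xent_loss(z1, z2))

Each embedding is pulled toward its positive (the other view of the same sample) and pushed away from all other embeddings in the batch, which act as negatives.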

Background papers
[1] T. Chen, S. Kornblith, M. Norouzi, and G. Hinton, "A simple framework for contrastive learning of visual representations," in International Conference on Machine Learning. PMLR, 2020, pp. 1597–1607.
[2] N. Saunshi, O. Plevrakis, S. Arora, M. Khodak, and H. Khandeparkar, "A theoretical analysis of contrastive unsupervised representation learning," in International Conference on Machine Learning. PMLR, 2019, pp. 5628–5637.
[3] J. Z. HaoChen, C. Wei, A. Gaidon, and T. Ma, "Provable guarantees for self-supervised deep learning with spectral contrastive loss," Advances in Neural Information Processing Systems, vol. 34, pp. 5000–5011, 2021.
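
For reference, the population-level objective analyzed in [3] is the spectral contrastive loss, which (stated here informally, following the paper's notation) reads

\[
\mathcal{L}_{\mathrm{spectral}}(f) \;=\; -2\,\mathbb{E}_{x,\,x^{+}}\!\left[ f(x)^{\top} f(x^{+}) \right] \;+\; \mathbb{E}_{x,\,x'}\!\left[ \bigl( f(x)^{\top} f(x') \bigr)^{2} \right],
\]

where $(x, x^{+})$ is a positive pair of augmentations of the same datapoint and $x, x'$ are drawn independently.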

Practical information

  • General public
  • Free

Tags

EDIC candidacy exam
