Second order statistics in deep learning

Event details

Date 26.06.2017
Hour 09:30 - 11:30
Speaker Kaicheng Yu
Category Conferences - Seminars

EDIC candidacy exam
Exam president: Prof. Martin Jaggi
Thesis advisor: Prof. Pascal Fua
Thesis co-advisor: Dr. Mathieu Salzmann
Co-examiner: Prof. Robert West

Abstract
Convolutional neural networks (CNNs) have recently proven effective in many visual recognition tasks. However, a CNN is built from weighted summations, which prevents the network from explicitly learning second-order statistics.
In this report, I will study three related works: region covariance descriptors for pedestrian detection, a general back-propagation algorithm that supports gradient descent through matrix-valued layers, and DenseNet, a recent state-of-the-art deep network architecture. I will demonstrate their relevance to our future research goal, which is to introduce second-order statistics into deep networks. In addition, I will briefly discuss our current research on global covariance descriptors in CNNs and our future research plan.
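For context, second-order pooling replaces the usual first-order (weighted-sum or average) aggregation of CNN feature maps with a per-image channel covariance matrix. The following is only a minimal illustrative sketch of such a covariance descriptor in PyTorch, not the speaker's actual method; the function name and tensor shapes are hypothetical.

import torch

def covariance_pooling(feature_maps):
    """Illustrative second-order (covariance) pooling over CNN feature maps.

    feature_maps: tensor of shape (batch, channels, height, width).
    Returns a (batch, channels, channels) covariance matrix per sample,
    capturing pairwise channel correlations instead of a weighted sum.
    """
    b, c, h, w = feature_maps.shape
    x = feature_maps.reshape(b, c, h * w)       # each spatial position is one observation
    x = x - x.mean(dim=2, keepdim=True)         # centre the channel responses
    cov = x @ x.transpose(1, 2) / (h * w - 1)   # batched (b, c, c) covariance matrices
    return cov

# Example: pool the output of a convolutional layer (hypothetical sizes)
features = torch.randn(8, 64, 14, 14)
print(covariance_pooling(features).shape)       # torch.Size([8, 64, 64])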

Background papers
Pedestrian Detection via Classification on Riemannian Manifolds, Tuzel O., et al.
Matrix Backpropagation for Deep Networks with Structured Layers, Ionescu C., et al.
Densely Connected Convolutional Networks, Huang G., et al.

Practical information

  • General public
  • Free

Tags

EDIC candidacy exam