Diffusion Models for Neural Network Weight Representation

Event details

Date 11.11.2024
Hour 14:30 – 16:30
Speaker Zhuoqian Yang
Location
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Prof. Alexandre Alahi
Thesis advisor: Prof. Mathieu Salzmann
Thesis co-advisor: Prof. Sabine Süsstrunk
Co-examiner: Prof. Haitham Hassanieh

Abstract
Diffusion models traditionally rely on grid-based representations,
which limit their scalability and adaptability. This research explores
the potential of diffusion models trained on neural network weights
as an implicit representation. Specifically, we propose a novel method
that uses Low-Rank Adaptation (LoRA) weights, rather than full MLP
weights, as the space in which the diffusion model operates. This
addresses the parameter redundancy and representation ambiguity of
full MLP weights: by generating LoRA weights, the proposed model can
efficiently encode high-fidelity signals at reduced computational
cost, while the representation remains constrained to a subspace
defined by a shared base model. This research has potential
applications in the efficient generation of 2D, 3D, and 4D data.
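
To make the idea concrete, below is a minimal sketch in PyTorch of the
setup the abstract describes: a small coordinate MLP acts as the implicit
representation, each of its linear layers carries a frozen shared base
weight plus a low-rank LoRA update W = W0 + BA (as in background paper
[3]), and only the flattened LoRA factors form the vector a diffusion
model would be trained to generate. All class and function names here
(LoRALinear, ImplicitField, lora_vector) are illustrative assumptions,
not taken from the papers.

import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen base linear layer plus a trainable low-rank update B @ A."""
    def __init__(self, in_dim, out_dim, rank=4):
        super().__init__()
        self.base = nn.Linear(in_dim, out_dim)
        self.base.weight.requires_grad_(False)  # shared base model stays fixed
        self.base.bias.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(rank, in_dim) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_dim, rank))  # zero init: update starts at the base

    def forward(self, x):
        # y = x @ W0^T + x @ (B A)^T
        return self.base(x) + x @ self.A.T @ self.B.T

class ImplicitField(nn.Module):
    """Tiny coordinate MLP, e.g. mapping (x, y) -> RGB for a 2D signal."""
    def __init__(self, rank=4, hidden=64):
        super().__init__()
        self.layers = nn.ModuleList([
            LoRALinear(2, hidden, rank),
            LoRALinear(hidden, hidden, rank),
            LoRALinear(hidden, 3, rank),
        ])

    def forward(self, coords):
        h = coords
        for i, layer in enumerate(self.layers):
            h = layer(h)
            if i < len(self.layers) - 1:
                h = torch.relu(h)
        return h

def lora_vector(model):
    """Flatten only the LoRA factors: this is the diffusion model's data space."""
    return torch.cat([p.flatten() for n, p in model.named_parameters()
                      if n.endswith((".A", ".B"))])

field = ImplicitField(rank=4)
total = sum(p.numel() for p in field.parameters())
print(f"full MLP params: {total}, LoRA params: {lora_vector(field).numel()}")

The zero initialization of B follows the LoRA paper, so every sample
begins exactly at the base model; the diffusion model then only has to
cover the much smaller low-rank subspace rather than the full,
redundancy-laden MLP weight space targeted by HyperDiffusion [2].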

Background papers
[1] Rombach et al., High-Resolution Image Synthesis with Latent Diffusion Models. 
https://arxiv.org/abs/2112.10752

[2] Erkoç et al., HyperDiffusion: Generating Implicit Neural Fields with Weight-Space Diffusion.
https://arxiv.org/abs/2303.17015

[3] Hu et al., LoRA: Low-Rank Adaptation of Large Language Models. 
https://arxiv.org/abs/2106.09685
 

Practical information

  • General public
  • Free

Tags

EDIC candidacy exam