Score Forgetting Distillation: A Swift, Data-Free Method for Machine Unlearning in Diffusion Models

We propose Score Forgetting Distillation (SFD), a fast, data-free machine unlearning method for diffusion models. SFD uses a teacher–student score distillation objective to make the student diffusion model rapidly forget specified classes or concepts (including individual celebrities and the broader concept of nudity) while preserving its generative quality on the rest of the data distribution. As a side effect, the distilled student supports up to 1000× faster sampling.
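The core idea — steering the student's samples for a forget class toward the teacher's score for a replacement concept, using no training data — can be illustrated with a deliberately tiny 1-D Gaussian toy. This is a hedged sketch under strong simplifying assumptions (a closed-form Gaussian "teacher" score, a one-parameter "student" generator, hypothetical class names), not the paper's actual objective or implementation:

```python
# Toy, data-free sketch of score-forgetting-style distillation.
# Assumptions (not from the paper): each class is a 1-D unit-variance
# Gaussian, so the teacher's score for class c is simply mu_c - x.
import numpy as np

rng = np.random.default_rng(0)

# Pretrained "teacher" class means (hypothetical classes).
CLASS_MEANS = {"forget": 4.0, "replace": -2.0, "keep": 1.0}

def teacher_score(x, cls):
    """Score (grad of log-density) of N(mu_cls, 1) at x."""
    return CLASS_MEANS[cls] - x

# One-step "student" generator: noise z plus a learnable per-class shift,
# initialized to match the teacher.
student_shift = dict(CLASS_MEANS)

def student_generate(z, cls):
    return z + student_shift[cls]

def unlearn(steps=200, lr=0.1, n=512):
    """Data-free forgetting: move the student's forget-class samples
    toward the teacher's *replacement*-class distribution."""
    for _ in range(steps):
        z = rng.standard_normal(n)
        x = student_generate(z, "forget")
        # The teacher's replacement-class score is the only training
        # signal -- no real images of either class are used.
        g = teacher_score(x, "replace").mean()
        student_shift["forget"] += lr * g
    return student_shift["forget"]
```

After running `unlearn()`, the student's forget-class shift converges near the replacement mean (here -2.0) while the untouched "keep" class is unchanged, mirroring the intended behavior: the forbidden concept is overwritten, the rest of the model is preserved.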

arXiv · Code

Recommended citation: T. Chen, S. Zhang, and M. Zhou. Score Forgetting Distillation: A Swift, Data-Free Method for Machine Unlearning in Diffusion Models. ICLR, 2025. · https://arxiv.org/abs/2409.11219