arXiv preprint – Compositional Abilities Emerge Multiplicatively: Exploring Diffusion Models on a Synthetic Task


In this episode we discuss Compositional Abilities Emerge Multiplicatively: Exploring Diffusion Models on a Synthetic Task
by Maya Okawa, Ekdeep Singh Lubana, Robert P. Dick, and Hidenori Tanaka. The paper investigates how conditional diffusion models generalize compositionally by studying their ability to generate novel combinations of concepts in a controlled synthetic environment. Key findings include that compositional ability hinges on the structure of the data-generating process, and that compositional performance emerges suddenly once the model becomes proficient at the individual constituent tasks. The authors also show that concepts seen rarely during training are harder to compose into new outputs, shedding light on generative models' capabilities from the perspective of data availability and structure.

