Breaking the Algorithmic Ceiling: Efficient Generative Pre-training with Inductive Moment Matching (IMM)

2025-03-12

Luma Labs introduces Inductive Moment Matching (IMM), a pre-training technique aimed at the stagnation of algorithmic innovation in generative pre-training. IMM outperforms diffusion models in sample quality while improving sampling efficiency by more than tenfold. By conditioning the network on the target timestep in addition to the current one, IMM makes each inference iteration more flexible and sidesteps the linear-interpolation constraint of diffusion model inference (a minimal sketch of this interface follows below). Experiments demonstrate state-of-the-art FID scores on ImageNet and CIFAR-10, along with improved training stability. This work marks a meaningful advance in generative pre-training algorithms and lays groundwork for future multi-modal foundation models.
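
To make the target-timestep idea concrete, here is a minimal few-step sampling sketch in PyTorch. The `imm_sample` function, the `model(x, t, s)` calling convention, and the example timestep schedule are illustrative assumptions about what such an interface could look like, not Luma's actual implementation.

```python
import torch

def imm_sample(model, x_T, timesteps):
    """Hypothetical few-step sampler in the spirit of IMM.

    Unlike a diffusion denoiser eps_theta(x_t, t), the network here is
    assumed to take both the current timestep t and the target timestep s,
    i.e. f_theta(x_t, t, s), so each iteration can jump directly toward s
    instead of taking a small denoising step.
    """
    x = x_T
    # Walk down a coarse schedule, pairing each timestep with its target.
    for t, s in zip(timesteps[:-1], timesteps[1:]):
        t_b = torch.full((x.shape[0],), t, device=x.device)
        s_b = torch.full((x.shape[0],), s, device=x.device)
        # One network evaluation maps the sample at time t toward time s.
        x = model(x, t_b, s_b)
    return x

# Example usage (assumed shapes and schedule, for illustration only):
# x_T = torch.randn(16, 3, 32, 32)
# samples = imm_sample(model, x_T, timesteps=[1.0, 0.5, 0.1, 0.0])
```

Because the target timestep is an explicit input, the same trained network can, in principle, be run with a long schedule for quality or a very short one for speed, which is how a greater-than-tenfold gain in sampling efficiency over standard diffusion inference becomes possible.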