Improving Consistency Models with Generator-Augmented Flows
Thibaut Issenhuth and 5 other authors
Abstract: Consistency models imitate the multi-step sampling of score-based diffusion in a single forward pass of a neural network. They can be learned in two ways: consistency distillation and consistency training. The former relies on the true velocity field of the corresponding differential equation, approximated by a pre-trained neural network. In contrast, the latter uses a single-sample Monte Carlo estimate of this velocity field. We show that the resulting estimation error induces a discrepancy between consistency distillation and consistency training that persists even in the continuous-time limit. To alleviate this issue, we propose a novel flow that transports noisy data toward the corresponding outputs of a consistency model. We prove that this flow reduces both the previously identified discrepancy and the noise-data transport cost. Consequently, our method not only accelerates the convergence of consistency training but also improves its overall performance. The code is available at: this https URL.
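To make the idea concrete, the following is a minimal PyTorch sketch of how a generator-augmented coupling could be built, based solely on the abstract. The linear interpolant, the model signature f(x_t, t), and the helper name generator_augmented_endpoints are illustrative assumptions, not the authors' implementation.

```python
import torch

def generator_augmented_endpoints(f, x0, t):
    """Build one coupling of a generator-augmented flow (sketch).

    Hypothetical reading of the abstract: noisy data x_t are mapped
    through the consistency model f to an output x_hat, and the flow
    then transports the noise z toward x_hat instead of toward the
    original data sample x0. The linear schedule below is an assumption.
    """
    z = torch.randn_like(x0)                        # noise endpoint of the flow
    t_ = t.view(-1, *([1] * (x0.dim() - 1)))        # broadcast t over data dims
    x_t = (1.0 - t_) * x0 + t_ * z                  # noisy data along the forward flow
    with torch.no_grad():
        x_hat = f(x_t, t)                           # consistency-model output for x_t
    # The augmented flow replaces the data endpoint x0 with x_hat, so
    # later interpolants move z toward x_hat rather than toward x0.
    return x_hat, z
```

A training step could then form interpolants (1 - s) * x_hat + s * z and apply the usual consistency loss between adjacent time points, with x_hat standing in for the data sample, which is one way the abstract's reduced noise-data transport cost could manifest in practice.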
Submission history
From: Thibaut Issenhuth
[v1] Thu, 13 Jun 2024 20:22:38 UTC (1,682 KB)
[v2] Mon, 14 Oct 2024 09:21:15 UTC (3,327 KB)
[v3] Wed, 5 Feb 2025 15:57:34 UTC (3,424 KB)
[v4] Wed, 2 Jul 2025 14:42:54 UTC (2,310 KB)