Paper Detail

Calibri: Enhancing Diffusion Transformers via Parameter-Efficient Calibration

Danil Tokhchukov, Aysel Mirzoeva, Andrey Kuznetsov, Konstantin Sobolev

Hugging Face score: 6.8

Published 2026-03-25 · First seen 2026-03-27

General AI

Abstract

In this paper, we uncover the hidden potential of Diffusion Transformers (DiTs) to substantially enhance generative performance. Through an in-depth analysis of the denoising process, we demonstrate that introducing a single learned scaling parameter can significantly improve the performance of DiT blocks. Building on this insight, we propose Calibri, a parameter-efficient approach that optimally calibrates DiT components to elevate generative quality. Calibri frames DiT calibration as a black-box reward optimization problem, solved efficiently with an evolutionary algorithm while modifying only ~100 parameters. Experimental results reveal that despite its lightweight design, Calibri consistently improves performance across various text-to-image models. Notably, Calibri also reduces the number of inference steps required for image generation while maintaining high-quality outputs.
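The abstract's core idea, treating the ~100 per-component scales as a black-box reward optimization problem solved by an evolutionary algorithm, can be sketched as follows. The toy reward function, population size, and step-size schedule here are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

def reward(scales):
    # Stand-in for the paper's black-box reward (e.g., a quality or
    # preference score of images generated with the given per-component
    # scales); this toy quadratic peaks at scale = 1.1 for every block.
    return -float(np.sum((scales - 1.1) ** 2))

def calibrate(n_params=100, generations=200, pop=16, sigma=0.02, seed=0):
    # Simple (1+lambda)-style evolutionary search over one scaling
    # parameter per DiT component, initialized at the identity
    # (scale = 1.0). A sketch of black-box reward optimization, not
    # the paper's exact algorithm.
    rng = np.random.default_rng(seed)
    best = np.ones(n_params)
    best_r = reward(best)
    for _ in range(generations):
        # Sample a population of Gaussian perturbations of the elite.
        cands = best + sigma * rng.standard_normal((pop, n_params))
        rewards = [reward(c) for c in cands]
        i = int(np.argmax(rewards))
        if rewards[i] > best_r:  # keep the best candidate seen so far
            best, best_r = cands[i], rewards[i]
        sigma *= 0.995           # slowly anneal the mutation step size
    return best, best_r
```

In a real setting the reward call would run the diffusion model end to end, which is exactly why a gradient-free evolutionary search over so few parameters is attractive.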

Workflow Status

Review status
pending
Role
unreviewed
Read priority
later
Vote
Not set.
Saved
no
Collections
Not filed yet.
Next action
Not filled yet.

Reading Brief

No structured notes yet. Add `summary_sections`, `why_relevant`, `claim_impact`, or `next_action` in `papers.jsonl` to enrich this view.
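A minimal `papers.jsonl` record using those fields might look like the line below (one JSON object per line; the field values, and the `id` and `title` keys, are illustrative assumptions, only the four field names come from this page):

```json
{"id": "2603.24800", "title": "Calibri: Enhancing Diffusion Transformers via Parameter-Efficient Calibration", "summary_sections": ["Method: calibration as black-box reward optimization"], "why_relevant": "Parameter-efficient tuning of DiT blocks", "claim_impact": "Fewer inference steps at comparable quality", "next_action": "Skim the method section"}
```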

Why It Surfaced

No ranking explanation is available yet.

Tags

No tags.

BibTeX

@misc{tokhchukov2026calibri,
  title = {Calibri: Enhancing Diffusion Transformers via Parameter-Efficient Calibration},
  author = {Danil Tokhchukov and Aysel Mirzoeva and Andrey Kuznetsov and Konstantin Sobolev},
  year = {2026},
  abstract = {In this paper, we uncover the hidden potential of Diffusion Transformers (DiTs) to significantly enhance generative tasks. Through an in-depth analysis of the denoising process, we demonstrate that introducing a single learned scaling parameter can significantly improve the performance of DiT blocks. Building on this insight, we propose Calibri, a parameter-efficient approach that optimally calibrates DiT components to elevate generative quality. Calibri frames DiT calibration as a black-box reward optimization problem, which is efficiently solved using an evolutionary algorithm and modifies just ~100 parameters. Experimental results reveal that despite its lightweight design, Calibri consistently improves performance across various text-to-image models. Notably, Calibri also reduces the inference steps required for image generation, all while maintaining high-quality outputs.},
  url = {https://huggingface.co/papers/2603.24800},
  keywords = {Diffusion Transformers, denoising process, learned scaling parameter, parameter-efficient approach, DiT blocks, generative quality, black-box reward optimization, evolutionary algorithm, code available, huggingface daily},
  eprint = {2603.24800},
  archivePrefix = {arXiv},
}

Metadata

{}