Score-based generative models, informally referred to as diffusion models,
continue to grow in popularity across several important domains and tasks.
While they provide high-quality and diverse samples from empirical
distributions, important questions remain on the reliability and
trustworthiness of these sampling procedures for their responsible use in
critical scenarios. Conformal prediction is a modern tool to construct
finite-sample, distribution-free uncertainty guarantees for any black-box
predictor. In this work, we focus on image-to-image regression tasks and we
present a generalization of the Risk-Controlling Prediction Sets (RCPS)
procedure, which we term K-RCPS, and which allows one to (i) provide entrywise
calibrated intervals for future samples of any diffusion model, and (ii)
control a certain notion of risk with respect to a ground-truth image with
minimal mean interval length. Unlike existing conformal risk control
procedures, ours relies on a novel convex optimization approach that allows for
multidimensional risk control while provably minimizing the mean interval
length. We illustrate our approach on two real-world image denoising problems:
on natural images of faces as well as on computed tomography (CT) scans of the
abdomen, demonstrating state-of-the-art performance.
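
To make the calibration idea concrete, the following is a minimal sketch of the single-parameter RCPS procedure that K-RCPS generalizes: a scalar λ that widens heuristic per-pixel intervals is chosen so that an upper confidence bound on the miscoverage risk stays below a target level α with probability at least 1 - δ. The function names, the interval parameterization, and the Hoeffding bound are illustrative assumptions, not the paper's exact construction; in particular, the K-dimensional λ and the convex optimization step of K-RCPS are not shown.

```python
import numpy as np

def empirical_risk(lower, upper, targets, lam):
    # Fraction of ground-truth pixels falling outside the lambda-widened
    # intervals (images are assumed to share the same shape, so this equals
    # the mean of per-image miscoverage fractions).
    half_width = lam * (upper - lower) / 2.0  # one of many possible widenings
    lo = lower - half_width
    hi = upper + half_width
    miss = (targets < lo) | (targets > hi)
    return miss.mean()

def hoeffding_ucb(risk_hat, n, delta):
    # (1 - delta) upper confidence bound on a [0, 1]-bounded mean risk.
    return risk_hat + np.sqrt(np.log(1.0 / delta) / (2.0 * n))

def calibrate_lambda(lower, upper, targets, alpha=0.1, delta=0.1,
                     lambdas=np.linspace(0.0, 5.0, 501)):
    # Scan lambda from largest to smallest; since widening the intervals can
    # only decrease the risk, return the smallest lambda whose risk UCB is
    # still below alpha.
    n = targets.shape[0]
    lam_hat = lambdas[-1]
    for lam in lambdas[::-1]:
        ucb = hoeffding_ucb(empirical_risk(lower, upper, targets, lam), n, delta)
        if ucb > alpha:
            break
        lam_hat = lam
    return lam_hat
```

In this sketch, `lower` and `upper` would be heuristic per-pixel interval endpoints obtained from samples of the diffusion model on a held-out calibration set, and `targets` the corresponding ground-truth images; the calibrated λ is then applied to intervals for future samples.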