PriorGuide: Test-Time Prior Adaptation for Simulation-Based Inference

ICLR 2026
1University of Helsinki, 2ELLIS Institute Finland, 3Aalto University, 4DataCrunch

TLDR: PriorGuide enables efficient incorporation of arbitrary priors at inference time for amortized diffusion-based simulation-based inference, without retraining the model.

PriorGuide Overview

Abstract

Amortized simulator-based inference offers a powerful framework for tackling Bayesian inference in computational fields such as engineering or neuroscience, increasingly leveraging modern generative methods like diffusion models to map observed data to model parameters or future predictions. These approaches yield posterior or posterior-predictive samples for new datasets without requiring further simulator calls after training on simulated parameter-data pairs. However, their applicability is often limited by the prior distribution(s) used to generate model parameters during this training phase.

To overcome this constraint, we introduce PriorGuide, a technique specifically designed for diffusion-based amortized inference methods. PriorGuide leverages a novel guidance approximation that enables flexible adaptation of the trained diffusion model to new priors at test time, crucially without costly retraining. This allows users to readily incorporate updated information or expert knowledge post-training, enhancing the versatility of pre-trained inference models.

Method Overview

PriorGuide takes a diffusion-based amortized SBI model trained on a broad training prior and adapts it to a new target prior at inference time by adjusting the score guidance during the diffusion sampling process. The key insight is that the score of the target posterior can be decomposed into the original trained score plus a guidance term derived from the prior ratio between the new and training priors.
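Concretely, for Gaussian training and target priors the prior-ratio score has a simple closed form, so the decomposition can be sketched directly. A minimal illustration (all function and variable names here are illustrative, not the paper's implementation, and for simplicity the ratio score is evaluated at the current sample, ignoring the diffusion-time correction):

```python
import numpy as np

def gaussian_score(theta, mu, sigma):
    """Score (gradient of log-density) of an isotropic Gaussian N(mu, sigma^2 I)."""
    return (mu - theta) / sigma**2

def guided_score(theta, base_score, train_prior, new_prior):
    """Trained model score plus the score of the log prior ratio.

    train_prior and new_prior are (mean, std) pairs of isotropic Gaussians.
    """
    (mu_t, sig_t), (mu_n, sig_n) = train_prior, new_prior
    # grad log [p_new(theta) / p_train(theta)]
    ratio_score = gaussian_score(theta, mu_n, sig_n) - gaussian_score(theta, mu_t, sig_t)
    return base_score + ratio_score
```

With identical priors the ratio score vanishes and the trained score is recovered unchanged, which is the sanity check that guidance only acts where the two priors disagree.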

We approximate this guidance term analytically using a Gaussian mixture model (GMM) representation of the prior ratio, combined with a Gaussian approximation of the reverse transition kernel. This yields a closed-form guidance update that can be efficiently computed at each diffusion step.

Test-Time Refinement

PriorGuide supports improving sampling quality through corrective Langevin dynamics steps interleaved with the diffusion process. This transforms sampling into an annealed MCMC process, enabling a principled trade-off: users can invest more computational resources at inference time to achieve higher inference fidelity.
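One way to picture the interleaving: after each diffusion step, a few unadjusted Langevin steps driven by the (guided) score nudge samples toward the current target density. A toy sketch of such a corrector (function and argument names are illustrative, not the paper's implementation):

```python
import numpy as np

def langevin_correct(theta, score_fn, step_size, n_steps, rng):
    """Corrective Langevin steps targeting the density whose score is score_fn."""
    for _ in range(n_steps):
        noise = rng.standard_normal(theta.shape)
        theta = theta + step_size * score_fn(theta) + np.sqrt(2.0 * step_size) * noise
    return theta
```

In the annealed-MCMC view, this corrector runs at each noise level with the noise-level-appropriate guided score, so additional corrector steps buy inference fidelity at the cost of extra score-network evaluations.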

The Pareto frontiers show that combining moderate diffusion steps (N ∼ 25–50) with increasing Langevin corrections yields the best posterior inference under a fixed computational budget.

Key Results

Posterior Inference

We evaluate PriorGuide on SBI problems ranging from established benchmarks to real models from engineering and neuroscience: Two Moons, Ornstein-Uhlenbeck Process (OUP), Turin radio propagation model, Gaussian Linear (10D and 20D), and a Bayesian Causal Inference (BCI) model. PriorGuide consistently improves inference accuracy over the base Simformer model across all scenarios and achieves leading performance in most cases, with the largest gains under stronger prior beliefs.

Posterior inference results

Posterior Predictive Inference

PriorGuide extends readily to posterior predictive distributions under new priors. On forecasting/retrocasting tasks with the OUP and Turin models, PriorGuide generates reliable posterior predictive samples that closely match the true data, achieving performance on par with or better than existing methods.

Posterior predictive distributions

Test-Time Compute Trade-off

PriorGuide supports a principled trade-off between computational cost and inference accuracy. By adjusting the number of diffusion steps and Langevin corrections, users can calibrate sampling quality to their computational budget. The Pareto frontiers demonstrate that sample quality generally improves with more compute, and the best results come from combining moderate diffusion steps with increasing Langevin corrections.

Pareto frontiers of test-time compute

Related Links

PriorGuide builds upon and relates to several lines of work in simulation-based inference and diffusion models.

Simformer trains a diffusion model on the joint distribution of parameters and data, enabling flexible amortized inference. PriorGuide extends Simformer by enabling test-time prior adaptation.

ACE (Amortized Conditioning Engine) is a neural process-based method that supports prior adaptation through pre-training on a meta-prior of factorized histograms.

For a comprehensive discussion of related work, see Appendix A.1 of the paper.

BibTeX

@inproceedings{yang2026priorguide,
  title     = {PriorGuide: Test-Time Prior Adaptation for Simulation-Based Inference},
  author    = {Yang Yang and Severi Rissanen and Paul Edmund Chang and Nasrulloh Ratu Bagus Satrio Loka and Daolang Huang and Arno Solin and Markus Heinonen and Luigi Acerbi},
  booktitle = {The Fourteenth International Conference on Learning Representations},
  year      = {2026},
  url       = {https://openreview.net/forum?id=G4I23g5Ugh}
}