Yang Yang
Conference
Efficient Autoregressive Inference for Transformer Probabilistic Models
We propose an architecture extension to existing tabular foundation models that generates joint predictive samples ~20x faster, with minimal increase in training overhead and minimal drop in performance.
Conor Hassan, Nasrulloh Loka, Cen-You Li, Daolang Huang, Paul E. Chang, Yang Yang, Francesco Silvestrin, Samuel Kaski, Luigi Acerbi
PDF
Cite
Code
PriorGuide: Test-Time Prior Adaptation for Simulation-Based Inference
PriorGuide enables efficient incorporation of arbitrary priors at inference time for amortized diffusion-based simulation-based inference, without retraining the model.
Yang Yang, Severi Rissanen, Paul E. Chang, Nasrulloh Loka, Daolang Huang, Arno Solin, Markus Heinonen, Luigi Acerbi
PDF
Cite
Code
Probabilistic Multi-Dimensional Classification
We propose a formal framework for probabilistic multi-dimensional classification (MDC) in which learning an optimal multi-dimensional classifier can be decomposed, without loss of generality, into learning a set of (smaller) single-variable multi-class probabilistic classifiers and a directed acyclic graph. Current and future developments in both probabilistic classification and graphical model learning can directly enhance our framework, which is flexible and provably optimal.
Vu-Linh Nguyen, Yang Yang, Cassio De Campos
PDF
Cite
Code
Bayesian Structure Scores for Probabilistic Circuits
In this paper, we develop Bayesian structure scores for deterministic probabilistic circuits (PCs), i.e., the structure likelihood with parameters marginalized out, which are well known as rigorous objectives for structure learning in probabilistic graphical models. When used within a greedy cutset algorithm, our scores effectively protect against overfitting and yield a fast and almost hyper-parameter-free structure learner, distinguishing our approach from previous ones.
Yang Yang, Gennaro Gala, Robert Peharz
PDF
Cite
Code