Pareto-Conditioned Diffusion Models for Offline Multi-Objective Optimization

Abstract

Multi-objective optimization (MOO) arises in many real-world applications where trade-offs between competing objectives must be carefully balanced. In the offline setting, where only a static dataset is available, the main challenge is generalizing beyond observed data. We introduce Pareto-Conditioned Diffusion (PCD), a novel framework that formulates offline MOO as a conditional sampling problem. By conditioning directly on desired trade-offs, PCD avoids the need for explicit surrogate models. To effectively explore the Pareto front, PCD employs a reweighting strategy that focuses on high-performing samples and a reference-direction mechanism to guide sampling towards novel, promising regions beyond the training data. Experiments on standard offline MOO benchmarks show that PCD achieves highly competitive performance and, importantly, demonstrates greater consistency across diverse tasks than existing offline MOO approaches.
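As an illustration of the two ingredients the abstract names, here is a minimal sketch, under assumed details not given in the abstract: the reweighting is shown as an exponential tilt on a linearly scalarized objective, and the reference direction is a point on the probability simplex encoding the desired trade-off. The function name, the scalarization, and the temperature parameter are all illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def pareto_conditioning_weights(Y, w, temperature=1.0):
    """Per-sample training weights for objectives Y of shape (n, m), maximized.

    Y is scalarized along the reference direction w (a point on the
    probability simplex); weights are an exponential tilt so that samples
    scoring well on the chosen trade-off dominate. This is a hypothetical
    stand-in for PCD's reweighting strategy, not its actual definition.
    """
    w = np.asarray(w, dtype=float)
    w = w / w.sum()                              # project onto the simplex
    scores = Y @ w                               # linear scalarization
    scores = (scores - scores.mean()) / (scores.std() + 1e-8)
    logits = scores / temperature
    weights = np.exp(logits - logits.max())      # numerically stable tilt
    return weights / weights.sum()

# Toy usage: two competing objectives; a direction favouring objective 0.
rng = np.random.default_rng(0)
Y = rng.normal(size=(100, 2))
weights = pareto_conditioning_weights(Y, w=[0.8, 0.2])
```

In a conditional diffusion model, such weights could act as importance weights on the training loss, while the direction `w` itself is fed to the network as the conditioning signal; lowering `temperature` concentrates mass on the best samples for that trade-off.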

Publication
In International Conference on Learning Representations 2026
Severi Rissanen
PhD student in Machine Learning

My research interests are in generative modelling, especially inductive biases in diffusion generative models, inference-time constraints for diffusion models, and applications to the natural sciences.