Task (from an ML perspective)
Image generation
Related work
• To learn a nonlinear function that transforms a simple reference dist. into the target dist., serving as the data-generating mechanism
◦ Likelihood-based models
▪ e.g., VAE
▪ (−) Consistency results require that the data dist. lie within the model family, which is often hard to satisfy in practice
◦ Implicit generative models
▪ e.g., GAN
◦ Models in which the evolving time goes to infinity at the population level
▪ e.g., flow-based models, stochastic differential equations (SDEs)
▪ (−) Need strong assumptions to achieve model consistency: the target must be log-concave or satisfy the log-Sobolev inequality
Goal
To generate high-quality images
• with consistency under mild assumptions
Consistency?
Problem definition
• Schrödinger Bridge tackles the problem by interpolating a reference distribution to a target distribution based on the Kullback-Leibler divergence.
• Formulated via an SDE on a finite time interval [0, 1] with a time-varying drift term at the population level (a sketch of the standard formulation is given below)
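A minimal sketch of the standard SBP formulation (my notation, not necessarily the paper's): with reference path measure $\mathbb{W}$ (Wiener measure) and endpoint marginals $\mu_0$ (simple reference dist.) and $\mu_1$ (target data dist.),

\[
  Q^{\star} \;=\; \operatorname*{arg\,min}_{Q \,:\, Q_0 = \mu_0,\ Q_1 = \mu_1} D_{\mathrm{KL}}\!\left(Q \,\Vert\, \mathbb{W}\right),
\]

and at the population level $Q^{\star}$ is the law of an SDE on $[0, 1]$ with a time-varying drift $b(x, t)$:

\[
  \mathrm{d}X_t = b(X_t, t)\,\mathrm{d}t + \mathrm{d}W_t, \qquad X_0 \sim \mu_0,\ X_1 \sim \mu_1.
\]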
Jargon
σ-field (i.e., σ-algebra)
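For reference, the standard definition (mine, not quoted from the paper): a collection $\mathcal{F} \subseteq 2^{\Omega}$ is a $\sigma$-algebra on $\Omega$ if

\[
  \Omega \in \mathcal{F}, \qquad A \in \mathcal{F} \Rightarrow A^{c} \in \mathcal{F}, \qquad A_1, A_2, \ldots \in \mathcal{F} \Rightarrow \textstyle\bigcup_{i \ge 1} A_i \in \mathcal{F}.
\]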
Methods
Two-stage Schrödinger Bridge algorithm
obtained by plugging a drift term estimated with a deep score estimator and a deep density estimator into the Euler-Maruyama method (a minimal sampling sketch follows the outline below)
• Datasets: CIFAR-10, CelebA
• Method
◦ Input: observed data points
◦ Output: estimated data dist.
◦ Generative process: draw samples from the learned data dist.
◦ Detailed method
▪ Background: problem formulation of the Schrödinger bridge problem (SBP)
[Proposed method]: two-stage approach
Consistency proof
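To make the sampling step concrete, here is a minimal Euler-Maruyama sketch; it assumes a hypothetical drift_fn(x, t) standing in for the drift built from the deep score and density estimators (which are not shown), so it illustrates the numerical scheme rather than the paper's actual implementation.

import numpy as np

def euler_maruyama_sample(drift_fn, x0, n_steps=1000, t0=0.0, t1=1.0, rng=None):
    """Simulate dX_t = b(X_t, t) dt + dW_t on [t0, t1] with Euler-Maruyama.

    drift_fn(x, t) is assumed to return the estimated drift b(x, t);
    in the two-stage setup it would be built from the deep score and
    density estimators (omitted here).
    """
    rng = np.random.default_rng() if rng is None else rng
    dt = (t1 - t0) / n_steps
    x = np.asarray(x0, dtype=float).copy()
    for k in range(n_steps):
        t = t0 + k * dt
        dw = rng.standard_normal(x.shape) * np.sqrt(dt)  # Brownian increment
        x = x + drift_fn(x, t) * dt + dw                  # Euler-Maruyama update
    return x

# Toy usage: a hand-written drift pushing samples toward +2
# (a stand-in for a learned drift, not a trained model).
if __name__ == "__main__":
    toy_drift = lambda x, t: 4.0 * (2.0 - x)
    x0 = np.random.default_rng(0).standard_normal(5)  # simple reference samples
    print(euler_maruyama_sample(toy_drift, x0, n_steps=500))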
Results
• Image interpolation
• Image inpainting