Research
-
Maximum Likelihood Training of Score-Based Diffusion Models (Research/Generative Model_2, 2025. 2. 15. 16:59)
https://arxiv.org/pdf/2101.09258
https://github.com/yang-song/score_flow
Oct 2021 (NeurIPS 2021)
Abstract: Score-based diffusion models synthesize samples by reversing a stochastic process that diffuses data to noise, and are trained by minimizing a weighted combination of score matching losses. The log-likelihood of score-based diffusion models can be tractably computed through a connection to cont..
-
Score-based Generative Modeling through Stochastic Differential Equations (Research/Generative Model_2, 2025. 2. 15. 11:35)
https://arxiv.org/pdf/2011.13456
https://github.com/yang-song/score_sde
Feb 2021 (ICLR 2021)
Abstract: Creating noise from data is easy; creating data from noise is generative modeling. We present a stochastic differential equation (SDE) that smoothly transforms a complex data distribution to a known prior distribution by slowly injecting noise, and a corresponding reverse-time SDE that transforms th..
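The forward half of this idea ("creating noise from data") can be illustrated with a variance-preserving discretization on toy 1-D data; the linear beta schedule and step count below are assumptions for illustration, not the paper's exact setup:

```python
import numpy as np

def vp_forward(x0, betas, rng=None):
    """Discretized VP diffusion: x_{k+1} = sqrt(1 - beta_k) * x_k + sqrt(beta_k) * noise."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.array(x0, dtype=float)
    for beta in betas:
        x = np.sqrt(1.0 - beta) * x + np.sqrt(beta) * rng.standard_normal(x.shape)
    return x

# Toy "data": a far-from-Gaussian bimodal distribution.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-5, 0.1, 5000), rng.normal(5, 0.1, 5000)])
betas = np.linspace(1e-4, 0.2, 500)   # assumed linear schedule
noised = vp_forward(data, betas, rng)
print(noised.mean(), noised.std())    # approaches N(0, 1)
```

After enough steps the data distribution is destroyed and the samples are indistinguishable from the Gaussian prior; the reverse-time SDE in the paper runs this process backwards using the learned score.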
-
Improved Techniques for Training Score-Based Generative Models (Research/Generative Model_2, 2025. 2. 15. 09:20)
https://arxiv.org/pdf/2006.09011
https://github.com/ermongroup/ncsnv2
Oct 2020 (NeurIPS 2020)
Abstract: Score-based generative models can produce high quality image samples comparable to GANs, without requiring adversarial optimization. However, existing training procedures are limited to images of low resolution (typically below 32 × 32), and can be unstable under some settings. We provide a new the..
-
Generative Modeling by Estimating Gradients of the Data Distribution (Research/Generative Model_2, 2025. 2. 15. 07:27)
https://arxiv.org/pdf/1907.05600
https://github.com/ermongroup/ncsn
Oct 2020 (NeurIPS 2019)
Abstract: We introduce a new generative model where samples are produced via Langevin dynamics using gradients of the data distribution estimated with score matching. Because gradients can be ill-defined and hard to estimate when the data resides on low-dimensional manifolds, we perturb the data with different..
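The sampling mechanism the abstract refers to (Langevin dynamics driven by the score ∇x log p(x)) can be sketched on a toy target where the score is known in closed form; the closed-form score here is a stand-in for the paper's trained score network, and the step size is an arbitrary choice:

```python
import numpy as np

def langevin_sample(score, x0, step=0.01, n_steps=2000, rng=None):
    """Unadjusted Langevin dynamics: x <- x + (step/2) * score(x) + sqrt(step) * noise."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x + 0.5 * step * score(x) + np.sqrt(step) * rng.standard_normal(x.shape)
    return x

# Toy target N(3, 1): its score is -(x - 3), so samples should settle around 3.
score = lambda x: -(x - 3.0)
samples = langevin_sample(score, x0=np.zeros(5000))
print(samples.mean(), samples.std())  # ≈ 3, ≈ 1
```

The paper's annealed variant runs this loop at a decreasing sequence of noise levels, using a score network conditioned on each level.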
-
FFJORD: Free-Form Continuous Dynamics for Scalable Reversible Generative Models (Research/Generative Model_2, 2025. 1. 31. 08:44)
https://arxiv.org/pdf/1810.01367
https://github.com/rtqichen/ffjord
(ICLR 2019)
Abstract: A promising class of generative models maps points from a simple distribution to a complex distribution through an invertible neural network. Likelihood-based training of these models requires restricting their architectures to allow cheap computation of Jacobian determinants. Alternatively, the Jacobian trace c..
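The truncated sentence is about estimating the Jacobian trace; FFJORD does this with Hutchinson's stochastic trace estimator, which needs only Jacobian-vector products. A minimal sketch, using a fixed linear map in place of a network Jacobian so the exact trace is checkable:

```python
import numpy as np

def hutchinson_trace(jvp, dim, n_probes=2000, rng=None):
    """Hutchinson's estimator: E[v^T J v] = tr(J) for v with i.i.d. Rademacher entries."""
    rng = np.random.default_rng(0) if rng is None else rng
    est = 0.0
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=dim)
        est += v @ jvp(v)
    return est / n_probes

# Toy case: J is an explicit matrix, so jvp(v) = J @ v and tr(J) is known exactly.
rng = np.random.default_rng(1)
J = rng.standard_normal((5, 5))
estimate = hutchinson_trace(lambda v: J @ v, dim=5, rng=rng)
print(estimate, np.trace(J))  # estimate ≈ exact trace
```

In FFJORD the `jvp` comes from autodiff (one vector-Jacobian product per probe), which is what removes the architectural restrictions needed for cheap exact determinants.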
-
Neural Ordinary Differential Equations (Research/Generative Model_2, 2025. 1. 30. 13:27)
https://arxiv.org/pdf/1806.07366
https://github.com/rtqichen/torchdiffeq
Dec 2019 (NeurIPS 2018)
Abstract: We introduce a new family of deep neural network models. Instead of specifying a discrete sequence of hidden layers, we parameterize the derivative of the hidden state using a neural network. The output of the network is computed using a blackbox differential equation solver. These continuous-de..
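The core mechanic (parameterize dh/dt and hand it to a black-box solver) can be sketched without a neural network: below, a hand-fixed dynamics function stands in for the trained f, and a fixed-step RK4 integrator stands in for the adaptive solver; a real Neural ODE would also backpropagate through the solve via the adjoint method:

```python
import numpy as np

def rk4_solve(f, h0, t0, t1, n_steps=100):
    """Integrate dh/dt = f(h, t) from t0 to t1 with fixed-step RK4 (the 'black-box solver')."""
    h, t = np.array(h0, dtype=float), t0
    dt = (t1 - t0) / n_steps
    for _ in range(n_steps):
        k1 = f(h, t)
        k2 = f(h + 0.5 * dt * k1, t + 0.5 * dt)
        k3 = f(h + 0.5 * dt * k2, t + 0.5 * dt)
        k4 = f(h + dt * k3, t + dt)
        h = h + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
    return h

# Stand-in "network": f(h, t) = -h, whose exact solution is h(1) = h(0) * exp(-1).
h1 = rk4_solve(lambda h, t: -h, h0=np.array([1.0, 2.0]), t0=0.0, t1=1.0)
print(h1)  # ≈ [0.3679, 0.7358]
```

The hidden state at depth t1 is the ODE solution, so "number of layers" becomes a solver tolerance rather than an architectural choice.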
-
[VDMs] Recurring Equations (Research/Generative Model_2, 2025. 1. 27. 13:23)
https://arxiv.org/pdf/2401.06281
Notes compiled for later reference; derived line by line several times..
2.1. Forward Process: Gaussian Diffusion
2.1.1. Linear Gaussian Transitions: q(zt | zs)
2.1.2. Top-down Posterior: q(zs | zt, x)
2.1.3. Learning the Noise Schedule
2.2. Reverse Process: Discrete-Time Generative Model
2.2.1. Generative Transitions: p(zs | zt)
Two other notable parameterizations not elaborated upon in this article ..
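For quick reference, the linear Gaussian transition in 2.1.1 has the standard VDM form (in the usual $\alpha_t, \sigma_t$ notation, for $t > s$):

```latex
q(z_t \mid z_s) = \mathcal{N}\!\left(z_t;\;
    \frac{\alpha_t}{\alpha_s}\, z_s,\;
    \left(\sigma_t^2 - \frac{\alpha_t^2}{\alpha_s^2}\,\sigma_s^2\right) I\right)
```

i.e. two linear Gaussian steps compose into another linear Gaussian step, which is what makes the top-down posterior in 2.1.2 tractable.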
-
Progressive Distillation for Fast Sampling of Diffusion Models (Research/Diffusion, 2025. 1. 23. 18:13)
https://arxiv.org/pdf/2202.00512
https://github.com/google-research/google-research/tree/master/diffusion_distillation
Feb 2022 (ICLR 2022)
Abstract: Diffusion models have recently shown great promise for generative modeling, outperforming GANs on perceptual quality and autoregressive models at density estimation. A remaining downside is their slow sampling time: generating high quality samples takes many hundreds or th..