[On-going] Flow Matching | Generative Model/Generative Model_2 | 2025. 7. 12. 09:08
This, too, starts from the Neural ODE. A neural network learns the dynamics of the probability path (the vector field, which FM calls a velocity field); in other words, FM is also a Neural ODE. Once trained, you generate samples by integrating along those learned dynamics. The difference is that in CNF (Neural ODE) the loss gradient for the parameter update is obtained via a backward ODESolve (the adjoint method), whereas FM explicitly minimizes an MSE loss against the target vector field u_t of the probability path p_t ..
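The training/sampling split described above can be sketched in a few lines. A toy NumPy sketch (my own illustration, not from the post), using the common straight-line conditional path x_t = (1-t)·x0 + t·x1, whose target velocity is u_t = x1 - x0; a real setup would replace the oracle closure with a neural network:

```python
import numpy as np

rng = np.random.default_rng(0)

def cfm_loss(model, x0, x1, t):
    """Conditional Flow Matching loss for the straight-line path
    x_t = (1 - t) * x0 + t * x1, whose target velocity is u_t = x1 - x0.
    `model(x_t, t)` predicts the velocity field; the loss is a plain MSE
    against the conditional target -- no adjoint/backward ODE solve needed."""
    x_t = (1.0 - t)[:, None] * x0 + t[:, None] * x1
    u_t = x1 - x0                  # target velocity along the path
    v_pred = model(x_t, t)         # predicted velocity
    return np.mean((v_pred - u_t) ** 2)

def euler_sample(model, x0, n_steps=100):
    """After training, generate by integrating dx/dt = v(x, t) from t=0 to 1."""
    x, dt = x0.copy(), 1.0 / n_steps
    for i in range(n_steps):
        t = np.full(x.shape[0], i * dt)
        x = x + dt * model(x, t)
    return x

# toy check: the *oracle* velocity v(x, t) = x1 - x0 drives x0 exactly to x1
x0 = rng.standard_normal((4, 2))   # "noise" samples
x1 = rng.standard_normal((4, 2))   # "data" samples
oracle = lambda x, t: x1 - x0
print(cfm_loss(oracle, x0, x1, rng.uniform(size=4)))   # 0.0 for the oracle
print(np.allclose(euler_sample(oracle, x0), x1))       # True
```

The oracle makes the loss exactly zero, which is the point of the objective: the network is regressed directly onto u_t instead of being trained through the ODE solver.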
-
Review of Flow-based Models | Generative Model/Generative Model_2 | 2025. 7. 9. 15:33
Reviewing generative models. VAE, Diffusion, and score-based models went smoothly, but Normalizing Flow still feels subtle every time I revisit it. Something Professor Pieter Abbeel said is a great comfort (it comforted me last time, and it comforts me again this time haha): even though he teaches the course every year, it takes him quite a while to convince himself again before each lecture. (Same here, Professor! It feels fuzzy every time I review it.) What's strange is that the idea, simple distribution ----> transformation ----> complex distribution, is so simple, yet I don't know why it's so hard to picture. If anything, the math for VAE, diffusion, ..
-
Maximum Likelihood Training of Score-Based Diffusion Models | Generative Model/Generative Model_2 | 2025. 2. 15. 16:59
https://arxiv.org/pdf/2101.09258
https://github.com/yang-song/score_flow
Oct 2021 (NeurIPS 2021)
Abstract: Score-based diffusion models synthesize samples by reversing a stochastic process that diffuses data to noise, and are trained by minimizing a weighted combination of score matching losses. The log-likelihood of score-based diffusion models can be tractably computed through a connection to cont..
-
Score-based Generative Modeling through Stochastic Differential Equations | Generative Model/Generative Model_2 | 2025. 2. 15. 11:35
https://arxiv.org/pdf/2011.13456
https://github.com/yang-song/score_sde
Feb 2021 (ICLR 2021)
Abstract: Creating noise from data is easy; creating data from noise is generative modeling. We present a stochastic differential equation (SDE) that smoothly transforms a complex data distribution to a known prior distribution by slowly injecting noise, and a corresponding reverse-time SDE that transforms th..
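The forward/reverse SDE pair the abstract describes can be exercised numerically. A toy 1-D Euler-Maruyama sketch (my own illustration, assuming a constant-beta VP-type SDE and N(0, 1) data, so the true score -x is available in closed form and stands in for a trained score network):

```python
import numpy as np

rng = np.random.default_rng(42)

# Forward SDE:  dx = -0.5 * beta * x dt + sqrt(beta) dw        (data -> noise)
# Reverse SDE:  dx = [-0.5 * beta * x - beta * score(x, t)] dt + sqrt(beta) dw_bar
# For N(0, 1) data this SDE leaves p_t = N(0, 1) invariant, so the exact
# score is score(x, t) = -x; a trained network would replace this lambda.
beta, n_steps, T = 2.0, 1000, 1.0
dt = T / n_steps
score = lambda x, t: -x

x = rng.standard_normal(20000)   # start from the prior at t = T
for i in range(n_steps):         # Euler-Maruyama, integrating from T down to 0
    t = T - i * dt
    drift = -0.5 * beta * x - beta * score(x, t)
    x = x - dt * drift + np.sqrt(beta * dt) * rng.standard_normal(x.shape)

print(x.mean(), x.std())  # both should land near the data stats (0, 1)
```

The noise term keeps its sign under time reversal; only the drift is corrected by the score term, which is the whole content of the reverse-time SDE.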
-
Improved Techniques for Training Score-Based Generative Models | Generative Model/Generative Model_2 | 2025. 2. 15. 09:20
https://arxiv.org/pdf/2006.09011
https://github.com/ermongroup/ncsnv2
Oct 2020 (NeurIPS 2020)
Abstract: Score-based generative models can produce high quality image samples comparable to GANs, without requiring adversarial optimization. However, existing training procedures are limited to images of low resolution (typically below 32 × 32), and can be unstable under some settings. We provide a new the..
-
Generative Modeling by Estimating Gradients of the Data Distribution | Generative Model/Generative Model_2 | 2025. 2. 15. 07:27
https://arxiv.org/pdf/1907.05600
https://github.com/ermongroup/ncsn
Oct 2020 (NeurIPS 2019)
Abstract: We introduce a new generative model where samples are produced via Langevin dynamics using gradients of the data distribution estimated with score matching. Because gradients can be ill-defined and hard to estimate when the data resides on low-dimensional manifolds, we perturb the data with different..
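The Langevin dynamics sampler at the heart of this paper is only a few lines. A toy NumPy sketch (my own illustration: a single fixed step size and the analytic score of N(0, 1) stand in for the paper's annealed noise scales and score-matching network):

```python
import numpy as np

rng = np.random.default_rng(0)

def langevin_sample(score, x_init, step_size=0.01, n_steps=2000, rng=rng):
    """Plain Langevin dynamics: x <- x + (eps/2) * score(x) + sqrt(eps) * z.
    Iterated long enough, samples approach the distribution whose score
    (gradient of the log-density) is supplied; the paper estimates that
    score with a network and anneals the noise scale."""
    x = x_init.copy()
    for _ in range(n_steps):
        z = rng.standard_normal(x.shape)
        x = x + 0.5 * step_size * score(x) + np.sqrt(step_size) * z
    return x

# score of N(0, 1) is -x; start far away at x = 5 and let the chain relax
samples = langevin_sample(lambda x: -x, np.full(20000, 5.0))
print(samples.mean(), samples.std())  # should approach the N(0, 1) stats
```

Note the sampler only ever needs the score, never the normalized density itself, which is what makes score estimation a sufficient target for generation.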
-
[VDMs] Equations that keep reappearing | Generative Model/Generative Model_2 | 2025. 1. 27. 13:23
https://arxiv.org/pdf/2401.06281
Organized for later reference; derived line by line several times.
2.1. Forward Process: Gaussian Diffusion
2.1.1. Linear Gaussian Transitions: q(z_t | z_s)
2.1.2. Top-down Posterior: q(z_s | z_t, x)
2.1.3. Learning the Noise Schedule
2.2. Reverse Process: Discrete-Time Generative Model
2.2.1. Generative Transitions: p(z_s | z_t)
Two other notable parameterizations not elaborated upon in this article ..
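For quick reference, the two transitions from §2.1.1 and §2.1.2, as I recall them from the VDM paper with q(z_t | x) = N(α_t x, σ_t² I) (worth double-checking against the paper):

```latex
q(z_t \mid z_s) = \mathcal{N}\!\left(\alpha_{t|s}\, z_s,\; \sigma_{t|s}^2 I\right),
\qquad
\alpha_{t|s} = \frac{\alpha_t}{\alpha_s},
\qquad
\sigma_{t|s}^2 = \sigma_t^2 - \alpha_{t|s}^2\, \sigma_s^2

q(z_s \mid z_t, x) = \mathcal{N}\!\left(\mu_Q,\; \sigma_Q^2 I\right),
\qquad
\sigma_Q^2 = \frac{\sigma_{t|s}^2\, \sigma_s^2}{\sigma_t^2},
\qquad
\mu_Q = \frac{\alpha_{t|s}\, \sigma_s^2}{\sigma_t^2}\, z_t
      + \frac{\alpha_s\, \sigma_{t|s}^2}{\sigma_t^2}\, x
```

The posterior follows from Bayes' rule applied to the two linear Gaussians q(z_t | z_s) and q(z_s | x), which is exactly the line-by-line derivation the post refers to.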
-
Variational Diffusion Models | Generative Model/Generative Model_2 | 2025. 1. 23. 00:21
https://arxiv.org/pdf/2107.00630
https://github.com/google-research/vdm
Jul 2021 (NeurIPS 2021)
★ ★ ★ ★ Read this first: Demystifying Variational Diffusion Models, https://arxiv.org/pdf/2401.06281 ★ ★ ★ ★
The equations in the 「Variational Diffusion Models」 paper carry over into Classifier-free guidance, distillation, Imagen, and so on (the same equations and concepts also recur in the score-based model line of papers ..