Research/Generative Model
-
Variational autoencoders | Research/Generative Model 2024. 5. 10. 22:55
Reviewing this again... ※ https://www.jeremyjordan.me/variational-autoencoders/ A variational autoencoder (VAE) provides a probabilistic manner for describing an observation in latent space. Thus, rather than building an encoder which outputs a single value to describe each latent state attribute, we'll formulate our encoder to describe a probability distribution for each latent attribute. Intuition: To provide a..
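To make the excerpt's point concrete, here is a minimal sketch (layer sizes and names are my own, not from the post) of an encoder that outputs a mean and log-variance per latent attribute and samples via the reparameterization trick:

import torch
import torch.nn as nn

class GaussianEncoder(nn.Module):
    # Outputs a distribution q(z|x) per latent attribute instead of a point estimate.
    def __init__(self, in_dim=784, hidden=256, latent=20):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent)      # mean of q(z|x)
        self.logvar = nn.Linear(hidden, latent)  # log-variance of q(z|x)

    def forward(self, x):
        h = self.backbone(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps keeps sampling differentiable.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return z, mu, logvar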
-
Understanding Generative Adversarial Networks (GANs) | Research/Generative Model 2024. 5. 5. 13:54
https://towardsdatascience.com/understanding-generative-adversarial-networks-gans-cd6e4651a29 Introduction: Yann LeCun described it as “the most interesting idea in the last 10 years in Machine Learning”. Of course, such a compliment coming from such a prominent researcher in the deep learning area is always a great advertisement for the subject we are talking about! And, indeed, Generative Adversa..
-
How to Train Your Energy-Based Models | Research/Generative Model 2024. 5. 4. 16:25
https://arxiv.org/pdf/2101.03288 Abstract: Energy-Based Models (EBMs), also known as non-normalized probabilistic models, specify probability density or mass functions up to an unknown normalizing constant. Unlike most other probabilistic models, EBMs do not place a restriction on the tractability of the normalizing constant, and are thus more flexible to parameterize and can model a more expressive f..
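A quick illustration of "up to an unknown normalizing constant": in the sketch below (the architecture is illustrative, not from the paper), a network defines a scalar energy E_theta(x), and the density p_theta(x) = exp(-E_theta(x)) / Z_theta is specified without ever computing Z_theta.

import torch
import torch.nn as nn

# Any network mapping x to a scalar can serve as the energy function E_theta(x).
energy = nn.Sequential(nn.Linear(2, 64), nn.SiLU(), nn.Linear(64, 1))

def unnormalized_log_prob(x):
    # -E_theta(x) equals log p_theta(x) up to the additive constant log Z_theta,
    # so the relative plausibility of points can be compared without knowing Z_theta.
    return -energy(x).squeeze(-1)

x = torch.randn(8, 2)
print(unnormalized_log_prob(x))  # higher value = lower energy = more plausible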
-
Understanding Diffusion Models: A Unified Perspective | Research/Generative Model 2024. 5. 3. 22:20
https://arxiv.org/pdf/2208.11970 1. Introduction: Generative Models Given observed samples x from a distribution of interest, the goal of a generative model is to learn to model its true data distribution p(x). Once learned, we can generate new samples from our approximate model at will. Furthermore, under some formulations, we are able to use the learned model to evaluate the likelihood of obse..
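A toy example (mine, not the paper's) of the two capabilities the excerpt mentions, sampling and likelihood evaluation, using a single Gaussian as the "model" of p(x):

import numpy as np
from scipy.stats import norm

data = np.random.normal(loc=3.0, scale=1.5, size=1000)  # observed samples x
mu, sigma = data.mean(), data.std()                     # maximum-likelihood fit

new_samples = norm.rvs(loc=mu, scale=sigma, size=5)     # generate new samples
log_lik = norm.logpdf(data, loc=mu, scale=sigma).sum()  # evaluate likelihood
print(new_samples, log_lik)

Real generative models replace the Gaussian with a far more expressive parameterization, but the interface (sample, score) is the same.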
-
Variational Inference with Normalizing Flows on MNIST | Research/Generative Model 2024. 4. 30. 22:37
https://towardsdatascience.com/variational-inference-with-normalizing-flows-on-mnist-9258bbcf8810 Introduction: In this post, I will explain what normalizing flows are and how they can be used in variational inference and designing generative models. The material in this article mostly comes from [Rezende and Mohamed, 2015], which I believe is the first paper that introduces the concept of flow-ba..
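For reference, a minimal sketch of the planar flow layer from [Rezende and Mohamed, 2015] (variable names are mine): f(z) = z + u * tanh(w^T z + b), with change-of-variables term log|1 + u^T psi(z)| where psi(z) = (1 - tanh^2(w^T z + b)) * w.

import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.1)
        self.w = nn.Parameter(torch.randn(dim) * 0.1)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):                  # z: (batch, dim)
        lin = z @ self.w + self.b          # w^T z + b, shape (batch,)
        f = z + self.u * torch.tanh(lin).unsqueeze(-1)
        psi = (1 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w
        log_det = torch.log(torch.abs(1 + psi @ self.u) + 1e-8)
        return f, log_det  # stack several layers, summing the log_det terms

(The paper additionally constrains u so the map stays invertible; that detail is omitted here.)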
-
Understanding Diffusion Probabilistic Models (DPMs) | Research/Generative Model 2024. 4. 27. 16:04
https://towardsdatascience.com/understanding-diffusion-probabilistic-models-dpms-1940329d6048 Introduction: Modelling complex probability distributions is a central problem in machine learning. While this problem can appear in different forms, one of the most common settings is the following: given a complex probability distribution described only by some available samples, how can one generate a ..
-
[DDPM] Denoising Diffusion Probabilistic Models - Theory to Implementation | Research/Generative Model 2024. 4. 22. 15:12
https://learnopencv.com/denoising-diffusion-probabilistic-models/ Diffusion probabilistic models are an exciting new area of research showing great promise in image generation. In retrospect, diffusion-based generative models were first introduced in 2015 and popularized in 2020 when Ho et al. published the paper “Denoising Diffusion Probabilistic Models” (DDPM). DDPMs are responsible for making..
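The forward (noising) process that DDPM training builds on can be sampled in closed form at any timestep; here is a minimal sketch (the schedule values follow Ho et al. 2020, the rest is illustrative):

import torch

T = 1000
betas = torch.linspace(1e-4, 0.02, T)          # linear beta schedule
alpha_bar = torch.cumprod(1.0 - betas, dim=0)  # cumulative product of (1 - beta_t)

def q_sample(x0, t):
    # q(x_t | x_0) = N(sqrt(alpha_bar_t) * x_0, (1 - alpha_bar_t) * I)
    eps = torch.randn_like(x0)                 # the noise the network learns to predict
    return alpha_bar[t].sqrt() * x0 + (1 - alpha_bar[t]).sqrt() * eps, eps

x0 = torch.randn(1, 3, 32, 32)                 # stand-in for a training image
xt, eps = q_sample(x0, t=500)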