sander.ai

proceedings.neurips.cc

jaketae.github.io
In this short post, we will take a look at the variational lower bound, also referred to as the evidence lower bound, or ELBO for short. While I referenced the ELBO in a previous blog post on VAEs, the proofs and formulations presented there seem somewhat overly convoluted in retrospect. One might consider this a gentler, more refined recap of the topic. For the remainder of this post, I will use the terms "variational lower bound" and "ELBO" interchangeably to refer to the same concept. I was heavily inspired by Hugo Larochelle's excellent lecture on deep belief networks.
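For reference, the quantity the post discusses is standardly written as follows (this is the textbook formulation, not taken from the linked post): for a latent-variable model p_\theta(x, z) and a variational distribution q_\phi(z | x), the log evidence is bounded below by the ELBO.

```latex
% Standard ELBO derivation via Jensen's inequality
\log p_\theta(x)
  = \log \int p_\theta(x, z)\, dz
  = \log \mathbb{E}_{q_\phi(z \mid x)}\!\left[ \frac{p_\theta(x, z)}{q_\phi(z \mid x)} \right]
  \geq \mathbb{E}_{q_\phi(z \mid x)}\!\left[ \log p_\theta(x, z) - \log q_\phi(z \mid x) \right]
  =: \mathrm{ELBO}(\theta, \phi; x)
```

The gap between the two sides is exactly the KL divergence \mathrm{KL}(q_\phi(z \mid x) \,\|\, p_\theta(z \mid x)), which is why maximizing the ELBO tightens the bound as q approaches the true posterior.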
christopher-beckham.github.io
Techniques for label conditioning in Gaussian denoising diffusion models.
jxmo.io
A primer on variational autoencoders (VAEs) culminating in a PyTorch implementation of a VAE with discrete latents.