iclr-blogposts.github.io
yang-song.net
This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, have several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood ...
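As a rough illustration of the "Langevin-type sampling" step (a minimal sketch, not the post's actual code), assuming a hypothetical trained noise-conditional score network `score_model(x, sigma)` and a decreasing list of noise levels `sigmas`:

```python
import torch

@torch.no_grad()
def annealed_langevin_sample(score_model, sigmas, shape, n_steps=100, eps=2e-5):
    """Run a short Langevin chain at each noise level, from the largest
    sigma down to the smallest, using the learned score as the drift."""
    x = torch.rand(shape)                        # arbitrary initialization
    for sigma in sigmas:                         # sigmas sorted large -> small
        step = eps * (sigma / sigmas[-1]) ** 2   # per-level step size
        for _ in range(n_steps):
            grad = score_model(x, sigma)         # approximates grad_x log p_sigma(x)
            x = x + 0.5 * step * grad + (step ** 0.5) * torch.randn_like(x)
    return x
```

The noise schedule, steps per level, and initialization all matter in practice; the linked post discusses these choices in detail.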
jxmo.io
A primer on variational autoencoders (VAEs) culminating in a PyTorch implementation of a VAE with discrete latents.
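For context only, a minimal sketch of one common way to make the latents discrete while keeping training differentiable, a Gumbel-softmax relaxation; the sizes and names below are placeholders and not necessarily what the linked primer uses.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiscreteVAE(nn.Module):
    """Sketch of a VAE with categorical (discrete) latents, relaxed with
    Gumbel-softmax so the sampling step stays differentiable."""
    def __init__(self, x_dim=784, n_cats=10, n_latents=20, hidden=256):
        super().__init__()
        self.n_cats, self.n_latents = n_cats, n_latents
        self.enc = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, n_latents * n_cats))
        self.dec = nn.Sequential(nn.Linear(n_latents * n_cats, hidden), nn.ReLU(),
                                 nn.Linear(hidden, x_dim))

    def forward(self, x, tau=1.0):
        # x is assumed to be flattened and scaled to [0, 1]
        logits = self.enc(x).view(-1, self.n_latents, self.n_cats)
        z = F.gumbel_softmax(logits, tau=tau, hard=False)   # relaxed one-hot samples
        x_logits = self.dec(z.view(x.size(0), -1))
        recon = F.binary_cross_entropy_with_logits(
            x_logits, x, reduction='sum') / x.size(0)
        q = F.softmax(logits, dim=-1)
        # KL between q(z|x) and a uniform categorical prior
        kl = (q * torch.log(q * self.n_cats + 1e-10)).sum(dim=(1, 2)).mean()
        return recon + kl                                    # negative ELBO
```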
www.depthfirstlearning.com
A curriculum of questions and reading materials on normalizing flows, variational inference, and generative models, covering the use of normalizing flows to enhance variational posteriors, the inference gap, and the implementation of models such as NICE and RealNVP.
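To make the RealNVP reference concrete, here is a minimal, simplified affine coupling layer (a sketch, not the curriculum's reference implementation). Stacking a few such layers on top of a base posterior sample, and adding the log-determinant terms to the ELBO, is the basic mechanism by which flows enrich a variational posterior.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """RealNVP-style coupling: half of the dimensions pass through unchanged
    and parameterize an affine transform of the other half, so the Jacobian
    log-determinant is cheap to compute."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(nn.Linear(self.d, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * (dim - self.d)))

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=-1)
        s = torch.tanh(s)                       # keep scales bounded for stability
        y2 = x2 * torch.exp(s) + t
        log_det = s.sum(dim=-1)                 # contribution to log |det J|
        return torch.cat([x1, y2], dim=-1), log_det

    def inverse(self, y):
        y1, y2 = y[:, :self.d], y[:, self.d:]
        s, t = self.net(y1).chunk(2, dim=-1)
        s = torch.tanh(s)
        x2 = (y2 - t) * torch.exp(-s)
        return torch.cat([y1, x2], dim=-1)
```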
sander.ai
More thoughts on diffusion guidance, with a focus on its geometry in the input space.
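For reference, the standard classifier-free guidance rule that such discussions build on, written as a plain extrapolation in the model's output space; the post's own geometric analysis goes well beyond this one-liner.

```python
import torch

def guided_prediction(eps_cond: torch.Tensor, eps_uncond: torch.Tensor, w: float) -> torch.Tensor:
    """Classifier-free guidance: extrapolate from the unconditional prediction
    towards (and past) the conditional one. w = 1 recovers the conditional
    model, w = 0 the unconditional one."""
    return eps_uncond + w * (eps_cond - eps_uncond)
```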