Explore: select a destination

You are here: iclr-blogposts.github.io
- yang-song.net (1.9 parsecs away): This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, have several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood ...
- blog.evjang.com (2.6 parsecs away): This is a tutorial on common practices in training generative models that optimize likelihood directly, such as autoregressive models and ...
- christopher-beckham.github.io (2.8 parsecs away): Techniques for label conditioning in Gaussian denoising diffusion models
- kyunghyuncho.me (11.3 parsecs away)
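The score-based excerpt above (learn a score function, then sample with Langevin-type dynamics) can be illustrated with a minimal sketch. To keep it self-contained, the learned score network is replaced by the analytic score of a standard Gaussian, `-x`; the names `score` and `langevin_sample` and all step-size/step-count values are hypothetical choices for illustration, not code from any of the linked posts.

```python
import numpy as np

def score(x):
    # Score (gradient of the log density) of a standard Gaussian:
    # grad_x log N(x; 0, I) = -x. A toy stand-in for a learned score model.
    return -x

def langevin_sample(score_fn, x0, step_size=0.05, n_steps=500, seed=0):
    """Unadjusted Langevin dynamics:
    x <- x + eps * score(x) + sqrt(2 * eps) * z,  z ~ N(0, I)."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step_size * score_fn(x) + np.sqrt(2.0 * step_size) * noise
    return x

# Run many independent chains in parallel, starting far from the mode;
# the resulting samples should approximate N(0, 1).
samples = langevin_sample(score, x0=np.full((5000,), 5.0))
print(samples.mean(), samples.std())
```

With a nonzero step size the chain's stationary distribution is slightly biased (variance 2/(2 - eps) for this Gaussian), which is why score-based samplers anneal the step size or correct with Metropolis-Hastings steps.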