Explore >> Select a destination


You are here

christopher-beckham.github.io
| | yang-song.net
7.9 parsecs away

Travel
| | This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, has several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood ...
| | tiao.io
11.6 parsecs away

Travel
| | An in-depth practical guide to variational encoders from a probabilistic perspective.
| | fa.bianp.net
10.6 parsecs away

Travel
| | The Langevin algorithm is a simple and powerful method to sample from a probability distribution. It's a key ingredient of some machine learning methods such as diffusion models and differentially private learning. In this post, I'll derive a simple convergence analysis of this method in the special case when the ...
| | www.analyticsvidhya.com
34.7 parsecs away

Travel
| ProGAN is an extension of the training process of GAN that allows the generator models to train with stability in python