Explore: select a destination

You are here: sander.ai

jxmo.io (10.5 parsecs away)
A primer on variational autoencoders (VAEs) culminating in a PyTorch implementation of a VAE with discrete latents.

christopher-beckham.github.io (9.0 parsecs away)
Techniques for label conditioning in Gaussian denoising diffusion models.

yang-song.net (6.4 parsecs away)
This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, have several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood ... (A minimal Langevin sampling sketch follows this list.)

djalil.chafai.net (79.9 parsecs away)
Convergence in law to a constant. Let \( (X_n)_{n\geq1} \) be a sequence of random variables defined on a common probability space \( (\Omega,\mathcal{A},\mathbb{P}) \), taking their values in a metric space \( (E,d) \) equipped with its Borel sigma-field. It is well known that if \( (X_n)_{n\geq1} \) converges in law as \( n\rightarrow\infty \) to some Dirac...
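
For reference, the classical fact this teaser leads into (a standard result, stated here from general knowledge rather than quoted from the post): if \( (X_n)_{n\geq1} \) converges in law to the Dirac mass \( \delta_c \) at a constant \( c\in E \), then \( \mathbb{P}(d(X_n,c)>\varepsilon)\rightarrow0 \) as \( n\rightarrow\infty \) for every \( \varepsilon>0 \); that is, convergence in law to a constant is equivalent to convergence in probability to that constant.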
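
The yang-song.net blurb above describes the core sampling loop of score-based models. As a minimal illustration (not that post's implementation: the toy mixture, step size, and function names here are invented for this sketch), the following NumPy code runs unadjusted Langevin dynamics using the analytic score of a 2-D Gaussian mixture; in a real score-based model, score(x) would be a trained network \( s_\theta(x)\approx\nabla_x\log p(x) \):

    import numpy as np

    # Toy target: an equal-weight mixture of two isotropic 2-D Gaussians at +/-MU.
    MU = np.array([2.0, 2.0])

    def score(x):
        # Analytic score of the mixture: grad_x log p(x) = sum_k w_k(x) * (m_k - x),
        # where w_k(x) are the posterior component weights. In a real score-based
        # model this function would be a trained network (an assumption of this
        # sketch, not code from the post).
        log_w = np.stack([-0.5 * np.sum((x - MU) ** 2, axis=-1),
                          -0.5 * np.sum((x + MU) ** 2, axis=-1)])
        w = np.exp(log_w - log_w.max(axis=0))
        w = w / w.sum(axis=0)
        return w[0][:, None] * (MU - x) + w[1][:, None] * (-MU - x)

    def langevin_sample(n_samples=1000, n_steps=500, eps=0.05, seed=0):
        # Unadjusted Langevin dynamics: x <- x + (eps/2) * score(x) + sqrt(eps) * z.
        rng = np.random.default_rng(seed)
        x = 4.0 * rng.standard_normal((n_samples, 2))  # broad initialization
        for _ in range(n_steps):
            x += 0.5 * eps * score(x) + np.sqrt(eps) * rng.standard_normal(x.shape)
        return x

    samples = langevin_sample()
    print(samples.mean(axis=0))  # approx. [0, 0]: the two modes balance out

With a single fixed noise scale, chains mix slowly between well-separated modes; annealing the noise level across a sequence of perturbed distributions, as the post describes, is what lets samples cross between them.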