Explore: select a destination

You are here: jaketae.github.io

yang-song.net (2.6 parsecs away)
This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, have several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood ...
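The excerpt above mentions generating samples with Langevin-type sampling given a score function. A minimal sketch of that idea, using the known analytic score of a standard Gaussian, s(x) = -x, as a stand-in for a learned score network (the function names and step sizes here are illustrative assumptions, not from the linked post):

```python
import numpy as np

# Stand-in for a learned score network: the score of a standard Gaussian.
def score(x):
    return -x

def langevin_sample(x0, score_fn, step=0.01, n_steps=2000, rng=None):
    """Iterate x <- x + (step/2) * score(x) + sqrt(step) * noise."""
    rng = rng if rng is not None else np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + 0.5 * step * score_fn(x) + np.sqrt(step) * noise
    return x

# Starting away from the mode, iterates drift toward the target distribution.
samples = np.array(
    [langevin_sample([2.0], score, rng=np.random.default_rng(i)) for i in range(200)]
)
print(samples.mean(), samples.std())  # roughly 0 and 1 for a standard Gaussian target
```

With a real score-based model, `score` would be a neural network trained by score matching on noise-perturbed data, but the sampling loop itself looks much the same.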
akosiorek.github.io (2.5 parsecs away)

Machine learning is all about probability. To train a model, we typically tune its parameters to maximise the probability of the training dataset under the mo...
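The excerpt above describes training as maximising the probability of the data under the model, i.e. maximum likelihood. A tiny self-contained sketch (the Gaussian setup and grid search are illustrative assumptions, not from the linked post): for a unit-variance Gaussian with unknown mean, the likelihood-maximising parameter is simply the sample mean.

```python
import numpy as np

# Synthetic "training dataset" drawn from N(3, 1).
rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=1.0, size=10_000)

def log_likelihood(mu, x):
    """Log-probability of the dataset under N(mu, 1), up to an additive constant."""
    return -0.5 * np.sum((x - mu) ** 2)

# "Train" by searching candidate parameters for the one that maximises
# the data's log-likelihood; the argmax lands at the sample mean.
grid = np.linspace(0.0, 6.0, 601)
mle = grid[np.argmax([log_likelihood(mu, data) for mu in grid])]
print(mle, data.mean())  # both close to 3.0
```

In practice the grid search is replaced by gradient ascent on the log-likelihood, which is exactly what minimising cross-entropy loss does.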
sander.ai (3.1 parsecs away)

Perspectives on diffusion, or how diffusion models are autoencoders, deep latent variable models, score function predictors, reverse SDE solvers, flow-based models, RNNs, and autoregressive models, all at once!
vankessel.io (15.8 parsecs away)

A blog for my thoughts. Mostly philosophy, math, and programming.