Explore: select a destination

You are here: christopher-beckham.github.io

sander.ai (1.0 parsecs away)

More thoughts on diffusion guidance, with a focus on its geometry in the input space.

www.nicktasios.nl (2.3 parsecs away)

In the Latent Diffusion Series of blog posts, I'm going through all components needed to train a latent diffusion model to generate random digits from the MNIST dataset. In the third, and last, post, …

yang-song.net (1.7 parsecs away)

This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, have several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood …
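The Langevin-type sampling mentioned in that snippet can be sketched in a few lines. This is a minimal illustration, not code from the linked post: the score function here is the known closed-form score of a standard Gaussian (in a real score-based model it would be a learned neural network), and the step size and chain length are arbitrary choices.

```python
import math
import random

def score(x):
    # Score of a standard Gaussian target: d/dx log N(x; 0, 1) = -x.
    # In score-based generative models this would be a trained network.
    return -x

def langevin_sample(n_steps=2000, step=0.01, x0=5.0, seed=0):
    """Unadjusted Langevin dynamics:
    x <- x + (step / 2) * score(x) + sqrt(step) * z,  z ~ N(0, 1).
    With enough steps, the iterate approximately follows the target density."""
    rng = random.Random(seed)
    x = x0
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)
        x = x + 0.5 * step * score(x) + math.sqrt(step) * z
    return x

# Run many independent chains; the samples should cluster near 0
# with roughly unit spread, matching the Gaussian target.
samples = [langevin_sample(seed=s) for s in range(200)]
mean = sum(samples) / len(samples)
print(round(mean, 2))
```

The same update rule drives annealed Langevin dynamics in score-based generative models, where sampling sweeps through a sequence of noise levels with a score network trained at each one.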
blog.vstelt.dev (12.2 parsecs away)

[AI summary] The article explains the process of building a neural network from scratch in Rust, covering forward and backward propagation, matrix operations, and code implementation.
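The forward/backward propagation that summary describes can be sketched generically. This is not the Rust code from the article; it is a tiny single-hidden-layer network in plain Python trained on XOR, with arbitrary choices for hidden size, learning rate, and epoch count.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# XOR: the classic problem that requires a hidden layer.
DATA = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

rng = random.Random(42)
W1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(4)]  # 2 -> 4
b1 = [0.0] * 4
W2 = [rng.uniform(-1, 1) for _ in range(4)]                      # 4 -> 1
b2 = 0.0

def forward(x):
    # Forward propagation: affine transform + sigmoid at each layer.
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    y = sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2)
    return h, y

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in DATA) / len(DATA)

loss_before = mse()
lr = 0.5
for _ in range(5000):
    for x, t in DATA:
        h, y = forward(x)
        # Backward propagation: chain rule through the output sigmoid,
        # then through each hidden unit, updating weights as we go.
        dy = (y - t) * y * (1 - y)
        for j in range(4):
            dh = dy * W2[j] * h[j] * (1 - h[j])
            W2[j] -= lr * dy * h[j]
            for i in range(2):
                W1[j][i] -= lr * dh * x[i]
            b1[j] -= lr * dh
        b2 -= lr * dy

loss_after = mse()
preds = [round(forward(x)[1]) for x, _ in DATA]
print(loss_before, "->", loss_after)
```

A library implementation would express the same passes as matrix operations over batches, which is the route the linked article takes in Rust.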