You are here: distill.pub

sander.ai (1.6 parsecs away)
Thoughts on the tension between iterative refinement as the thing that makes diffusion models work, and our continual attempts to make it _less_ iterative.
yang-song.net (1.9 parsecs away)
This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, have several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood ...
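The sampling loop described in that summary is simple enough to sketch. Below is a minimal, illustrative annealed Langevin sampler in Python; the names (`score_fn`, `annealed_langevin_sampling`), the geometric noise schedule, and the step-size rule are assumptions for illustration, not code taken from the linked post.

```python
import numpy as np

def annealed_langevin_sampling(score_fn, x_init, sigmas, n_steps=100, eps=2e-5, rng=None):
    """Sample with annealed Langevin dynamics using a learned score function.

    score_fn(x, sigma) is assumed to approximate grad_x log p_sigma(x), the score
    of the data distribution perturbed with Gaussian noise of std sigma.
    `sigmas` runs from large to small noise levels (the annealing schedule).
    """
    if rng is None:
        rng = np.random.default_rng()
    x = np.asarray(x_init, dtype=float)
    for sigma in sigmas:
        # Illustrative step-size rule: scale the base step eps to the current noise level.
        alpha = eps * (sigma / sigmas[-1]) ** 2
        for _ in range(n_steps):
            z = rng.standard_normal(size=x.shape)
            # Langevin update: half-step of gradient ascent on log-density plus injected noise.
            x = x + 0.5 * alpha * score_fn(x, sigma) + np.sqrt(alpha) * z
    return x

# Toy check: for data ~ N(0, I), the noise-perturbed score is -x / (1 + sigma**2),
# so the sampler should pull an over-dispersed initialization back toward N(0, I).
toy_score = lambda x, sigma: -x / (1.0 + sigma ** 2)
sigmas = np.geomspace(10.0, 0.01, num=10)
samples = annealed_langevin_sampling(toy_score, np.random.randn(1000, 2) * 10.0, sigmas)
```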
resources.paperdigest.org (1.7 parsecs away)
The Conference on Neural Information Processing Systems (NIPS) is one of the top machine learning conferences in the world. The Paper Digest team analyzes all papers published at NIPS in past years and presents the 15 most influential papers for each year. This ranking list is automatically constructed ...
www.chrisritchie.org (25.4 parsecs away)
[AI summary] A blog post discussing the simulation of artificial life with neural networks, focusing on agent behavior, population dynamics, and future development goals.