Explore >> Select a destination


You are here: distill.pub
resources.paperdigest.org
1.7 parsecs away
The Conference on Neural Information Processing Systems (NIPS) is one of the top machine learning conferences in the world. The Paper Digest team analyzes all papers published at NIPS in past years and presents the 15 most influential papers for each year. This ranking list is automatically constructed…
sander.ai
1.6 parsecs away
Thoughts on the tension between iterative refinement as the thing that makes diffusion models work, and our continual attempts to make it _less_ iterative.
yang-song.net
1.9 parsecs away
This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, have several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood…
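The snippet above describes the core loop of score-based generative modeling: given a score function, samples are drawn with Langevin-type updates. A minimal sketch of that sampling step, using a toy target whose score is known in closed form (a standard normal, score(x) = -x) in place of a learned score network:

```python
import numpy as np

def langevin_sample(score_fn, x0, step=0.05, n_steps=1000, rng=None):
    """Unadjusted Langevin dynamics:
    x <- x + (step / 2) * score(x) + sqrt(step) * gaussian_noise."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + 0.5 * step * score_fn(x) + np.sqrt(step) * noise
    return x

# Toy target: standard normal N(0, 1), whose score is exactly -x.
score = lambda x: -x
samples = langevin_sample(score, x0=np.zeros(5000), rng=0)
print(samples.mean(), samples.std())  # roughly 0 and 1
```

In the actual method the closed-form `score` is replaced by a neural network trained with score matching across many noise levels; the sampling loop itself stays essentially this simple.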
www.v7labs.com
12.6 parsecs away
Autoencoders are a type of neural network that can be used for unsupervised learning. Explore different types of autoencoders and learn how they work.