Explore: select a destination


You are here: sander.ai

- christopher-beckham.github.io (10.1 parsecs away): Techniques for label conditioning in Gaussian denoising diffusion models.
- resources.paperdigest.org (9.4 parsecs away): The Conference on Neural Information Processing Systems (NIPS) is one of the top machine learning conferences in the world. The Paper Digest team analyzes all papers published at NIPS in past years and presents the 15 most influential papers for each year. This ranking list is automatically constructed…
- yang-song.net (6.7 parsecs away): This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, have several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood… (a rough illustrative sketch of this Langevin-type sampling step appears after this list.)
- comsci.blog (55.1 parsecs away): In this blog post, we will learn about vision transformers (ViT) and implement an MNIST classifier with one. We will go step by step and understand every part of the vision transformer clearly, and you will see the motivations of the original paper's authors behind some parts of the architecture.
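
The yang-song.net entry above mentions generating samples with Langevin-type sampling from a learned score function. As a rough illustration only (not code from that post), here is a minimal unadjusted Langevin dynamics loop; the `score_fn` below is a toy stand-in (the exact score of a standard normal), whereas a real score-based model would use a trained, noise-conditioned network and anneal over decreasing noise levels.

```python
import numpy as np

def score_fn(x):
    # Toy stand-in: the score of N(0, I) is grad_x log p(x) = -x.
    # In a score-based generative model this would be a trained neural network.
    return -x

def langevin_sample(score_fn, x0, step_size=1e-2, n_steps=1000, rng=None):
    """Run unadjusted Langevin dynamics starting from x0:
    x <- x + (step_size / 2) * score(x) + sqrt(step_size) * noise."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + 0.5 * step_size * score_fn(x) + np.sqrt(step_size) * noise
    return x

# Start far from the target distribution and let the dynamics pull the sample back.
sample = langevin_sample(score_fn, x0=np.full(2, 5.0))
print(sample)  # after enough steps, roughly a draw from N(0, I)
```

With a fixed step size this chain is only approximately correct; score-based models typically anneal the noise level and step size across iterations, which is what makes the approach practical on high-dimensional data.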