

You are here: folinoid.com
yang-song.net (2.8 parsecs away)

This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, have several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood ...
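The recipe in that snippet (learn a score function, then draw samples with Langevin-type sampling) can be sketched in a few lines. This is a minimal illustration, not the linked post's method: it substitutes an analytic score for a learned network by targeting a 1-D standard normal, and the step size, chain count, and iteration count are arbitrary choices.

```python
import numpy as np

def score(x):
    # Score of the target N(0, 1): the gradient of its log density is -x.
    # In a score-based generative model this would be a trained network.
    return -x

def langevin_sample(n_chains=1000, n_steps=500, eps=0.1, seed=0):
    """Unadjusted Langevin dynamics: x <- x + (eps/2) * score(x) + sqrt(eps) * noise."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_chains)  # arbitrary initialization
    for _ in range(n_steps):
        noise = rng.normal(size=n_chains)
        x = x + 0.5 * eps * score(x) + np.sqrt(eps) * noise
    return x

samples = langevin_sample()
print(samples.mean(), samples.std())  # both should land near the target's 0 and 1
```

Note the discretization bias: with a finite step size the chain's stationary distribution only approximates the target, which is one reason practical samplers anneal the noise level.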
evjang.com (5.6 parsecs away)

This blog post outlines a key engineering principle I've come to believe strongly in for building general AI systems with deep learning. This principle guides my present-day research tastes and day-to-day design choices in building large-scale, general-purpose ML systems. Discoveries around Neural Scaling Laws, unsupervised pretraining on Internet-scale datasets, and other work on Foundation Models have pointed to a simple yet exciting narrative for making progress in Machine Learning: Large amounts of d...
jxmo.io (3.4 parsecs away)

A primer on variational autoencoders (VAEs) culminating in a PyTorch implementation of a VAE with discrete latents.
fodsi.us (21.3 parsecs away)

[AI summary] The ML4A Virtual Workshop explores how machine learning enhances classical algorithms through data-driven approaches, featuring talks on deep generative models, model-based deep learning, and learning-augmented algorithms.