sander.ai
You are here

distill.pub
What we'd like to find out about GANs that we don't know yet.

xcorr.net
2022 was the year of generative AI models: DALL-E 2, MidJourney, Stable Diffusion, and Imagen all showed that it's possible to generate grounded, photorealistic images. These generative AIs are instances of conditional denoising diffusion probabilistic models, or DDPMs. Despite these flashy applications, DDPMs have thus far had little impact on neuroscience. An oil painting of...

yang-song.net
This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, have several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood ...
(A minimal sketch of the Langevin-type sampling step follows the list below.)

www.hhyu.org
Science, programming, books, and other interesting stuff
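
As a rough illustration of the Langevin-type sampling mentioned in the yang-song.net entry, here is a minimal sketch. It assumes a learned score function is already available; the name `score_fn` and the toy Gaussian score used below are placeholders for illustration, not code from that post.

```python
import numpy as np

def langevin_sample(score_fn, x_init, step_size=1e-2, n_steps=1000, seed=0):
    """Unadjusted Langevin dynamics: repeatedly move along the score
    (the gradient of the log density) and inject Gaussian noise."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x_init, dtype=float).copy()
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + 0.5 * step_size * score_fn(x) + np.sqrt(step_size) * noise
    return x

# Toy check: for a standard Gaussian the score is -x, so samples should
# drift toward the origin and fluctuate with roughly unit variance.
sample = langevin_sample(lambda x: -x, x_init=np.full(2, 5.0), n_steps=5000)
```

In an actual score-based generative model, `score_fn` would be a neural network trained on many noise-perturbed versions of the data distribution, and sampling would anneal the step size as the noise level decreases.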