sander.ai

distill.pub
What we'd like to find out about GANs that we don't know yet.

yang-song.net
This blog post focuses on a promising new direction for generative modeling. We can learn score functions (gradients of log probability density functions) on a large number of noise-perturbed data distributions, then generate samples with Langevin-type sampling. The resulting generative models, often called score-based generative models, have several important advantages over existing model families: GAN-level sample quality without adversarial training, flexible model architectures, exact log-likelihood ...

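As a rough illustration of the Langevin-type sampling the post describes, here is a minimal sketch. It assumes a learned score function is available as `score_fn`; the toy example uses the analytic score of a standard Gaussian, grad log p(x) = -x, rather than a trained model, and `langevin_sample` and its parameters are illustrative names, not the post's API.

```python
import numpy as np

def langevin_sample(score_fn, x0, step_size=1e-2, n_steps=1000, rng=None):
    """Unadjusted Langevin dynamics:
    x_{t+1} = x_t + (eps / 2) * score(x_t) + sqrt(eps) * z,  z ~ N(0, I)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        z = rng.standard_normal(x.shape)
        x = x + 0.5 * step_size * score_fn(x) + np.sqrt(step_size) * z
    return x

# Toy check: sampling with the exact score of a standard Gaussian,
# grad log p(x) = -x, should produce roughly standard-normal samples.
sample = langevin_sample(lambda x: -x, x0=np.zeros(2))
```

In the score-based models the post surveys, `score_fn` would be a neural network trained on data perturbed at many noise levels, with sampling annealed from high to low noise rather than run at a single step size as above.
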
www.depthfirstlearning.com
A set of questions and reading materials on normalizing flows, variational inference, and generative models, covering the use of normalizing flows to enhance variational posteriors, the inference gap, and the implementation of models like NICE and RealNVP.

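Since the reading list names NICE and RealNVP, here is a minimal sketch of the affine coupling transform RealNVP is built from (NICE uses the additive special case, s = 0). The names `affine_coupling_forward`, `s_fn`, and `t_fn` are illustrative; in a real model, `s_fn` and `t_fn` would be small neural networks.

```python
import numpy as np

def affine_coupling_forward(x, s_fn, t_fn):
    """RealNVP-style affine coupling: pass half of x through unchanged,
    scale and shift the other half conditioned on it.
    y1 = x1;  y2 = x2 * exp(s(x1)) + t(x1);  log|det J| = sum(s(x1))."""
    x1, x2 = np.split(x, 2, axis=-1)
    s, t = s_fn(x1), t_fn(x1)
    y2 = x2 * np.exp(s) + t
    return np.concatenate([x1, y2], axis=-1), s.sum(axis=-1)

def affine_coupling_inverse(y, s_fn, t_fn):
    """Exact inverse: the unchanged half y1 lets us recompute s and t."""
    y1, y2 = np.split(y, 2, axis=-1)
    s, t = s_fn(y1), t_fn(y1)
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2], axis=-1)

# Stand-ins for the networks that would normally produce s and t.
s_fn = lambda h: 0.1 * h
t_fn = lambda h: h - 1.0

x = np.random.default_rng(0).standard_normal((4, 8))
y, logdet = affine_coupling_forward(x, s_fn, t_fn)
assert np.allclose(x, affine_coupling_inverse(y, s_fn, t_fn))
```

The design point these readings emphasize is that coupling layers invert in closed form and have a triangular Jacobian, so both sampling and exact log-likelihood stay cheap.
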
iamtrask.github.io
A machine learning craftsmanship blog.