yang-song.net
lilianweng.github.io
[Updated on 2021-09-19: Highly recommend this blog post on score-based generative modeling by Yang Song (author of several key papers in the references).] [Updated on 2022-08-27: Added classifier-free guidance, GLIDE, unCLIP and Imagen.] [Updated on 2022-08-31: Added latent diffusion model.] [Updated on 2024-04-13: Added progressive distillation, consistency models, and the Model Architecture section.]
sander.ai
Perspectives on diffusion, or how diffusion models are autoencoders, deep latent variable models, score function predictors, reverse SDE solvers, flow-based models, RNNs, and autoregressive models, all at once!
sander.ai
Thoughts on the tension between iterative refinement as the thing that makes diffusion models work, and our continual attempts to make it _less_ iterative.
haifengl.wordpress.com
Generative artificial intelligence (GenAI), especially ChatGPT, has captured everyone's attention. Transformer-based large language models (LLMs), trained on a vast quantity of unlabeled data at scale, demonstrate the ability to generalize to many different tasks. To understand why LLMs are so powerful, we will take a deep dive into how they work in this post. LLM Evolutionary Tree...