Explore » Select a destination

You are here: peterbloem.nl

fanpu.io (18.3 parsecs away)
An aperiodic and irreducible Markov chain will eventually converge to a stationary distribution. This is used in many machine learning applications, such as Markov Chain Monte Carlo (MCMC) methods, where random walks on Markov chains are used to obtain a good estimate of the log likelihood of the partition function of a model, which is hard to compute directly because it is #P-hard (even harder than NP-hard). However, one common problem is that it is unclear how many steps we should take before we are guaranteed that the Markov chain has converged to its stationary distribution. In this post, we see how the spectral gap of the transition matrix of the Markov chain relates to its mixing time.
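The relationship sketched in that excerpt can be checked numerically: the second-largest eigenvalue magnitude of the transition matrix sets the convergence rate, and the spectral gap is one minus that magnitude. The 3-state matrix below is an invented toy example (not taken from the linked post), chosen to be aperiodic and irreducible:

```python
import numpy as np

# Toy row-stochastic transition matrix (illustrative only); positive
# entries make the chain irreducible and aperiodic.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

# For such a chain the largest eigenvalue is 1; the spectral gap
# 1 - |lambda_2| controls how fast P^t converges.
mags = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
spectral_gap = 1.0 - mags[1]

# Stationary distribution: left eigenvector of P for eigenvalue 1,
# normalized to sum to 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

# Every row of P^t approaches pi at a rate governed by |lambda_2|^t.
Pt = np.linalg.matrix_power(P, 50)
print(spectral_gap)
print(np.allclose(Pt, np.tile(pi, (3, 1)), atol=1e-8))
```

A larger spectral gap means fewer steps are needed before the chain's distribution is close to stationary, which is exactly the mixing-time question the post addresses.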
neuralnetworksanddeeplearning.com (19.3 parsecs away)
www.daniellowengrub.com (18.4 parsecs away)
lilianweng.github.io (84.8 parsecs away)
[Updated on 2021-09-19: Highly recommend this blog post on score-based generative modeling by Yang Song (author of several key papers in the references).] [Updated on 2022-08-27: Added classifier-free guidance, GLIDE, unCLIP and Imagen.] [Updated on 2022-08-31: Added latent diffusion model.] [Updated on 2024-04-13: Added progressive distillation, consistency models, and the Model Architecture section.]