667-per-cm.net
erikbern.com

I made a New Year's resolution: every plot I make during 2018 will contain uncertainty estimates. Nine months in and I have learned a lot, so I put together a summary of some of the most useful methods.
weisser-zwerg.dev

A series about Monte Carlo methods and generating samples from probability distributions.
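As a taste of the kind of technique such a series covers, here is a minimal inverse-transform sampling sketch: if U is uniform on (0, 1) and F is a CDF, then F⁻¹(U) is distributed according to F. The choice of the Exponential distribution and the NumPy usage below are illustrative, not taken from the linked series.

```python
import numpy as np

# Inverse-transform sampling for Exponential(rate):
# CDF is F(x) = 1 - exp(-rate * x), so F^{-1}(u) = -log(1 - u) / rate.
rng = np.random.default_rng(42)
rate = 2.0
u = rng.random(100_000)            # U ~ Uniform(0, 1)
x = -np.log(1.0 - u) / rate        # x ~ Exponential(rate)

print(x.mean())  # should be close to 1/rate = 0.5
```

The same recipe works for any distribution whose inverse CDF is available in closed form; when it is not, methods like rejection sampling or MCMC take over.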
gregorygundersen.com

[AI summary] Hamiltonian Monte Carlo (HMC) is a Markov Chain Monte Carlo (MCMC) method that leverages Hamiltonian dynamics to generate samples from a probability distribution. Unlike traditional MCMC methods that rely on random walks, HMC introduces auxiliary momentum variables and simulates a physical system to produce correlated samples with higher efficiency. The method uses gradient information of the log density to guide the sampling process, enabling faster exploration of the target distribution and higher acceptance rates. The implementation of HMC involves defining the potential and kinetic energy functions, performing leapfrog integration to approximate the Hamiltonian dynamics, and using the Metropolis-Hastings acceptance criterion. An example using...
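The steps the summary describes — auxiliary momenta, leapfrog integration of the dynamics, and a Metropolis-Hastings correction — fit in a few dozen lines. This is a generic sketch of the algorithm, not the code from the linked post; the function names and the Gaussian target are illustrative.

```python
import numpy as np

def hmc_sample(log_prob, grad_log_prob, x0, n_samples=1000,
               step_size=0.1, n_leapfrog=20, seed=0):
    """Draw samples via Hamiltonian Monte Carlo.

    Potential energy is U(x) = -log_prob(x); kinetic energy is
    K(p) = p.p / 2 with standard-normal momenta.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)        # resample momenta
        x_new, p_new = x.copy(), p.copy()
        # Leapfrog integration: half momentum step, alternating full
        # position/momentum steps, final half momentum step.
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new
            p_new += step_size * grad_log_prob(x_new)
        x_new += step_size * p_new
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        # Metropolis-Hastings acceptance on total energy H = U + K.
        h_old = -log_prob(x) + 0.5 * p @ p
        h_new = -log_prob(x_new) + 0.5 * p_new @ p_new
        if rng.random() < np.exp(h_old - h_new):
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# Example target: standard 2-D Gaussian, log p(x) = -x.x / 2 + const.
log_prob = lambda x: -0.5 * x @ x
grad_log_prob = lambda x: -x
draws = hmc_sample(log_prob, grad_log_prob, x0=np.zeros(2))
```

Because the leapfrog integrator nearly conserves the Hamiltonian, the acceptance rate stays high even for long trajectories, which is what lets HMC outrun random-walk samplers.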
www.v7labs.com

Autoencoders are a type of neural network that can be used for unsupervised learning. Explore different types of autoencoders and learn how they work.