gregorygundersen.com

tiao.io
One weird trick to make exact inference in Bayesian logistic regression tractable.

www.depthfirstlearning.com
[AI summary] A curriculum of questions and readings on normalizing flows, variational inference, and generative models. It covers the use of normalizing flows to enhance variational posteriors, the inference gap, and the implementation of models such as NICE and RealNVP.
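
Since the summary above is only a topic list, a small sketch may help anchor it. Below is a minimal RealNVP-style affine coupling layer in NumPy; the toy "networks" are fixed linear maps standing in for the small MLPs RealNVP actually uses, and all shapes and values are illustrative assumptions rather than anything taken from the readings.

```python
import numpy as np

def affine_coupling_forward(x, scale_net, shift_net):
    """RealNVP-style affine coupling: keep the first half of the coordinates,
    and affinely transform the second half conditioned on the first.
    Returns y and the log|det Jacobian| of the map."""
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    s, t = scale_net(x1), shift_net(x1)
    y2 = x2 * np.exp(s) + t
    log_det = s.sum(axis=-1)            # Jacobian is triangular, so its log-det is sum(s)
    return np.concatenate([x1, y2], axis=-1), log_det

def affine_coupling_inverse(y, scale_net, shift_net):
    """Exact inverse of the coupling layer above."""
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    s, t = scale_net(y1), shift_net(y1)
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2], axis=-1)

# Toy "networks": fixed linear maps standing in for small MLPs (illustrative only).
rng = np.random.default_rng(0)
W_s = 0.1 * rng.standard_normal((2, 2))
W_t = 0.1 * rng.standard_normal((2, 2))
scale_net = lambda h: h @ W_s
shift_net = lambda h: h @ W_t

x = rng.standard_normal((5, 4))
y, log_det = affine_coupling_forward(x, scale_net, shift_net)
x_rec = affine_coupling_inverse(y, scale_net, shift_net)
print(np.allclose(x, x_rec))  # True: coupling layers are exactly invertible
```
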
fa.bianp.net
The Langevin algorithm is a simple and powerful method to sample from a probability distribution. It's a key ingredient of some machine learning methods such as diffusion models and differentially private learning. In this post, I'll derive a simple convergence analysis of this method in the special case when the ...

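To make the excerpt concrete, here is a minimal sketch of the unadjusted Langevin algorithm it refers to: gradient steps on log p plus injected Gaussian noise. The target (a standard Gaussian), step size, and iteration counts are illustrative choices of mine, not values from the post.

```python
import numpy as np

def langevin_sample(grad_log_p, x0, step=1e-2, n_iters=5_000, rng=None):
    """Unadjusted Langevin algorithm: follow the gradient of log p,
    plus Gaussian noise scaled so the iterates approximately sample p."""
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    samples = np.empty((n_iters,) + x.shape)
    for k in range(n_iters):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_p(x) + np.sqrt(2.0 * step) * noise
        samples[k] = x
    return samples

# Illustrative target: standard Gaussian, for which grad log p(x) = -x.
chain = langevin_sample(lambda x: -x, x0=np.zeros(2), step=1e-2, n_iters=10_000)
print(chain[2_000:].mean(axis=0), chain[2_000:].std(axis=0))  # roughly 0 and 1
```
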
sriku.org
[AI summary] The article explains how to generate random numbers that follow a specific probability distribution using a uniform random number generator, focusing on methods involving inverse transform sampling and handling both continuous and discrete cases.
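
A minimal sketch of inverse transform sampling along the lines the summary describes, assuming an exponential target for the continuous case and a small hand-picked distribution for the discrete case; neither example comes from the article itself.

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(size=10_000)          # uniform draws on [0, 1)

# Continuous case: the exponential(rate) CDF is F(x) = 1 - exp(-rate * x),
# so the inverse CDF is F^{-1}(u) = -log(1 - u) / rate.
rate = 2.0
exp_samples = -np.log1p(-u) / rate
print(exp_samples.mean())             # roughly 1 / rate = 0.5

# Discrete case: invert the CDF by searching the cumulative probabilities.
values = np.array([0, 1, 2])
probs = np.array([0.2, 0.5, 0.3])
cdf = np.cumsum(probs)
disc_samples = values[np.searchsorted(cdf, u)]
print(np.bincount(disc_samples) / len(u))   # roughly [0.2, 0.5, 0.3]
```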