www.behind-the-enemy-lines.com
www.randomservices.org
[AI summary] The text presents a comprehensive overview of the beta-Bernoulli process and its related statistical properties. Key concepts include: 1) The Bayesian estimator of the probability parameter $p$ based on Bernoulli trials, which is $\frac{a + Y_n}{a + b + n}$, where $a$ and $b$ are the parameters of the beta prior distribution. 2) The stochastic process $\boldsymbol{Z}$ with $Z_n = \frac{a + Y_n}{a + b + n}$, which is a martingale and central to the theory of the beta-Bernoulli process. 3) The distribution of the trial number of the $k$th success, $V_k$, which follows a beta-negative binomial distribution. 4) The mean and variance of $V_k$, derived using conditional expectations. 5) The connection between the beta distribution and the negative binomial distributi...
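The estimator in the summary above is the posterior mean of $p$ under a Beta$(a, b)$ prior: after $n$ trials with $Y_n$ successes, the posterior is Beta$(a + Y_n,\, b + n - Y_n)$, whose mean is $(a + Y_n)/(a + b + n)$. A minimal sketch of that update (function name is illustrative, not from the linked page):

```python
import random

def bayes_estimate(a, b, trials):
    """Posterior mean of p under a Beta(a, b) prior after observing
    a list of 0/1 Bernoulli trials: (a + successes) / (a + b + n)."""
    y = sum(trials)   # Y_n: number of successes
    n = len(trials)
    return (a + y) / (a + b + n)

# Simulate Bernoulli(p) trials; the estimate approaches p as n grows.
random.seed(0)
p = 0.3
trials = [1 if random.random() < p else 0 for _ in range(10_000)]
print(bayes_estimate(1, 1, trials))  # close to 0.3 for large n
```

With a uniform prior ($a = b = 1$) this reduces to the familiar Laplace rule of succession, $(Y_n + 1)/(n + 2)$.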
a.exozy.me
Puns, probability, and more probability
jaketae.github.io
So far on this blog, we have looked at the mathematics behind distributions, most notably the binomial, Poisson, and Gamma, with a little bit of the exponential. These distributions are interesting in and of themselves, but their true beauty shines through when we analyze them under the light of Bayesian inference. In today's post, we first develop an intuition for conditional probabilities to derive Bayes' theorem. From there, we motivate the method of Bayesian inference as a means of understanding probability.
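The derivation the post describes ends at Bayes' theorem, $P(A \mid B) = P(B \mid A)\,P(A)/P(B)$. A small numerical sketch (the diagnostic-test numbers below are hypothetical, chosen only for illustration):

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Classic diagnostic-test illustration with made-up rates:
# 1% prevalence, 99% sensitivity, 5% false-positive rate.
p_a = 0.01
p_b_given_a = 0.99
p_b = p_b_given_a * p_a + 0.05 * (1 - p_a)  # total probability of a positive test
print(bayes(p_b_given_a, p_a, p_b))  # posterior P(disease | positive), about 0.17
```

Even with a highly sensitive test, the low prior drags the posterior down, which is exactly the kind of intuition the conditional-probability derivation is meant to build.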
www.hamza.se
A walkthrough of implementing a neural network from scratch in Python, exploring what makes these seemingly complex systems actually quite straightforward.
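To give a flavor of that "actually quite straightforward" claim, here is a minimal forward pass of a tiny 2-3-1 network in plain Python; this is a generic sketch, not code from the linked walkthrough, and the layer sizes and weights are arbitrary:

```python
import math
import random

def dense(x, w, b):
    """One fully connected layer: y_j = sum_i x_i * w[i][j] + b[j]."""
    return [sum(xi * wij for xi, wij in zip(x, col)) + bj
            for col, bj in zip(zip(*w), b)]

def sigmoid(v):
    """Elementwise logistic activation."""
    return [1 / (1 + math.exp(-z)) for z in v]

# Forward pass through a 2-input, 3-hidden, 1-output network
# with random weights and zero biases.
random.seed(0)
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
b1 = [0.0] * 3
w2 = [[random.uniform(-1, 1)] for _ in range(3)]
b2 = [0.0]

hidden = sigmoid(dense([0.5, -0.2], w1, b1))
out = sigmoid(dense(hidden, w2, b2))
print(out)  # a single value in (0, 1)
```

The whole network is just alternating matrix-vector products and elementwise nonlinearities; training adds backpropagation on top, but the forward computation really is this small.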