mikespivey.wordpress.com
statisticaloddsandends.wordpress.com
If $latex Z_1, \dots, Z_n$ are independent $latex \text{Cauchy}(0, 1)$ variables and $latex w = (w_1, \dots, w_n)$ is a random vector independent of the $latex Z_i$'s with $latex w_i \geq 0$ for all $latex i$ and $latex w_1 + \dots + w_n = 1$, it is well-known that $latex \displaystyle\sum_{i=1}^n w_i Z_i$ also has a $latex...
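The stability fact in this excerpt is easy to spot-check numerically. The sketch below is my own illustration (not from the linked post): it draws nonnegative weights summing to 1 and compares empirical quantiles of the convex combination against the Cauchy(0, 1) quantile function tan(pi*(p - 1/2)).

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 200_000
w = rng.dirichlet(np.ones(n))      # nonnegative weights summing to 1
Z = rng.standard_cauchy((m, n))    # independent Cauchy(0, 1) draws
S = Z @ w                          # convex combinations, one per row

# Compare empirical quantiles of S with the Cauchy(0, 1) quantile
# function F^{-1}(p) = tan(pi * (p - 1/2)).
ps = np.array([0.1, 0.25, 0.5, 0.75, 0.9])
emp = np.quantile(S, ps)
theo = np.tan(np.pi * (ps - 0.5))
print(np.max(np.abs(emp - theo)))  # small Monte Carlo error
```

The quantile comparison avoids moment-based checks, since a Cauchy variable has no finite mean or variance.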
ckrao.wordpress.com
In this post I would like to prove the following identity, motivated by this tweet. $latex \displaystyle n! \prod_{k=0}^n \frac{1}{x+k} = \frac{1}{x\binom{x+n}{n}} = \sum_{k=0}^n \frac{(-1)^k \binom{n}{k}}{x+k}$ The first of these equalities is straightforward by the definition of binomial coefficients. To prove the second, we make use of partial fractions. We write the expansion $latex \displaystyle...
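The outer equality in this excerpt can be spot-checked numerically; the sketch below is my own illustration (not part of the linked post), with `lhs` evaluating the product form and `rhs` the alternating partial-fraction sum.

```python
from math import comb, prod

def lhs(x, n):
    """n! * prod_{k=0}^{n} 1/(x+k)  (the product form)."""
    return prod(range(1, n + 1)) / prod(x + k for k in range(n + 1))

def rhs(x, n):
    """sum_{k=0}^{n} (-1)^k C(n,k) / (x+k)  (partial-fraction form)."""
    return sum((-1) ** k * comb(n, k) / (x + k) for k in range(n + 1))

print(lhs(2.5, 6), rhs(2.5, 6))  # the two values agree
```

A non-integer `x` such as 2.5 is a useful test case, since the identity is stated for general `x` where none of the denominators vanish.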
mathematicaloddsandends.wordpress.com
The function $latex f(x) = x \log x$ occurs in various places across math/statistics/machine learning (e.g. in the definition of entropy), and I thought I'd put a list of properties of the function here that I've found useful. Here is a plot of the function: $latex f$ is defined on $latex (0, \infty)$. The only...
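Two standard properties of f(x) = x log x that the truncated excerpt points toward (my own additions, easy to verify) are that f(x) tends to 0 as x tends to 0 from the right, and that f attains its minimum value -1/e at x = 1/e, since f'(x) = log x + 1. A quick numerical check:

```python
import math

def f(x):
    return x * math.log(x)

# f'(x) = log(x) + 1 vanishes at x = 1/e, giving minimum value -1/e.
assert abs(f(1 / math.e) - (-1 / math.e)) < 1e-12

# x log x -> 0 as x -> 0+, even though log x -> -infinity.
print(f(1e-12))  # a tiny negative number near 0
```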
cardinalguzman.wordpress.com
Encyclopedia Miscellaneous - 'quality' blogging since August 2011