seanzhang.me
mattkeeter.com
opensourc.es
Bézier curve basics and animations
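As a hedged illustration of the basics the post title refers to (this code is my own sketch, not taken from the post), a Bézier curve can be evaluated with De Casteljau's algorithm: repeatedly interpolate between adjacent control points until one point remains.

```python
def de_casteljau(points, t):
    """Evaluate a Bezier curve with the given control points at t in [0, 1].

    A minimal sketch of De Casteljau's algorithm: each pass replaces the
    point list with pairwise linear interpolations at parameter t.
    """
    pts = [list(p) for p in points]
    while len(pts) > 1:
        pts = [
            [(1 - t) * a + t * b for a, b in zip(p, q)]
            for p, q in zip(pts, pts[1:])
        ]
    return tuple(pts[0])

# The curve interpolates its endpoints:
control = [(0.0, 0.0), (1.0, 2.0), (3.0, 3.0), (4.0, 0.0)]
print(de_casteljau(control, 0.0))  # (0.0, 0.0)
print(de_casteljau(control, 1.0))  # (4.0, 0.0)
```

The same routine works for control points of any dimension, since the interpolation is applied coordinate-wise.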
jaketae.github.io
In this short post, we will take a look at the variational lower bound, also referred to as the evidence lower bound, or ELBO for short. While I have referenced the ELBO in a previous blog post on VAEs, the proofs and formulations presented there seem somewhat overly convoluted in retrospect. One might consider this a gentler, more refined recap of the topic. For the remainder of this post, I will use the terms "variational lower bound" and "ELBO" interchangeably to refer to the same concept. I was heavily inspired by Hugo Larochelle's excellent lecture on deep belief networks.
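A one-line sketch of the bound the excerpt describes, in standard notation (not quoted from the post): for a latent-variable model $p(x, z)$ and any variational distribution $q(z)$,

$$\log p(x) = \mathbb{E}_{q(z)}\left[\log \frac{p(x, z)}{q(z)}\right] + \mathrm{KL}\left(q(z) \,\|\, p(z \mid x)\right) \ge \mathbb{E}_{q(z)}\left[\log \frac{p(x, z)}{q(z)}\right],$$

since the KL divergence is nonnegative; the remaining expectation is the ELBO, and the bound is tight exactly when $q(z) = p(z \mid x)$.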
almostsuremath.com
Given a sequence $X_1, X_2, \ldots$ of real-valued random variables defined on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, it is a standard result that the supremum

$$X \colon \Omega \rightarrow \mathbb{R} \cup \{\infty\}, \qquad X(\omega) = \sup_n X_n(\omega)$$

is measurable. To ensure that this is well-defined, we need to allow $X$ to have values in ...
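The measurability claim in the excerpt follows from a short sublevel-set argument (standard, not quoted from the post): for every $a \in \mathbb{R}$,

$$\{X \le a\} = \bigcap_n \{X_n \le a\} \in \mathcal{F},$$

since $\sup_n X_n(\omega) \le a$ holds if and only if $X_n(\omega) \le a$ for all $n$, and $\mathcal{F}$ is closed under countable intersections.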