gregorygundersen.com
jonathanweisberg.org
Jonathan Weisberg's Homepage
almostsuremath.com
It is quite common to consider functions of a real-valued stochastic process which depend on whether or not it crosses a specified barrier level K. This can involve computing expectations involving a real-valued process X of the form

$$V = {\mathbb E}\left[f(X_T);\; \sup{}_{t\le T} X_t \ge K\right] \qquad (1)$$

for a positive time T and function $f\colon{\mathbb R}\rightarrow{\mathbb R}$. I...
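The excerpt stops at the setup, but a quick simulation shows what the quantity in (1) measures. The sketch below is a Monte Carlo estimate under assumptions not made in the excerpt: X is taken to be a standard Brownian motion, f(x) = max(x, 0) is chosen purely for illustration, and the path is sampled on a discrete grid, which only approximates the running supremum from below.

```python
import numpy as np

rng = np.random.default_rng(0)

def barrier_expectation(f, T=1.0, K=1.0, n_steps=1_000, n_paths=100_000):
    """Estimate E[f(X_T); sup_{t<=T} X_t >= K] from simulated Brownian paths."""
    dt = T / n_steps
    # Brownian increments for all paths at once: shape (n_paths, n_steps).
    dX = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    paths = np.cumsum(dX, axis=1)        # X sampled on the time grid
    X_T = paths[:, -1]                   # terminal values X_T
    crossed = paths.max(axis=1) >= K     # did the discretised path reach the barrier K?
    # The ";" in (1) restricts the expectation to the crossing event.
    return np.mean(f(X_T) * crossed)

print(barrier_expectation(lambda x: np.maximum(x, 0.0)))
```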
almostsuremath.com
The Rademacher distribution is probably the simplest nontrivial probability distribution that you can imagine. This is a discrete distribution taking only the two possible values $\{1,-1\}$, each occurring with equal probability. A random variable X has the Rademacher distribution if

$${\mathbb P}(X=1) = {\mathbb P}(X=-1) = 1/2.$$

A Rademacher sequence is an IID sequence of...
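As a concrete illustration of the definition quoted above, here is a minimal sketch (assuming numpy, which the post itself does not use) that draws a Rademacher sequence and checks the empirical frequencies.

```python
import numpy as np

rng = np.random.default_rng(0)

def rademacher_sequence(n):
    """Return n IID draws from {1, -1}, each value with probability 1/2."""
    return rng.choice([1, -1], size=n)

xs = rademacher_sequence(100_000)
# The distribution has mean 0 and variance 1; the sample moments should be close.
print("empirical P(X = 1):", np.mean(xs == 1))
print("empirical mean:", xs.mean(), " empirical variance:", xs.var())
```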
www.jeremykun.com
Machine learning is broadly split into two camps, statistical learning and non-statistical learning. The latter we've started to get a good picture of on this blog; we approached Perceptrons, decision trees, and neural networks from a non-statistical perspective. And generally "statistical" learning is just that, a perspective. Data is phrased in terms of independent and dependent variables, and statistical techniques are leveraged against the data. In this post we'll focus on the simplest example of thi...