sookocheff.com
A common method of reducing the complexity of n-gram modeling is the Markov Property. The Markov Property states that the probability of a future state depends only on the present state, not on the sequence of events that preceded it. This concept can be elegantly implemented as a Markov chain that stores the probability of transitioning from the current state to each possible next state.
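To make that idea concrete, here is a minimal Python sketch (mine, not code from the linked post) of a first-order word-level chain; build_chain and transition_probabilities are illustrative names, not an API from any of these sites:

from collections import Counter, defaultdict

def build_chain(text):
    """Map each word to a Counter of the words observed to follow it."""
    words = text.split()
    chain = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        chain[current][nxt] += 1
    return chain

def transition_probabilities(chain, word):
    """Normalize the follower counts for `word` into probabilities."""
    followers = chain[word]
    total = sum(followers.values())
    return {w: n / total for w, n in followers.items()}

chain = build_chain("the cat sat on the mat the cat ran")
print(transition_probabilities(chain, "the"))  # {'cat': 0.67, 'mat': 0.33}

Because only the present word is consulted, the table stays small no matter how long the training text is; that is exactly the complexity reduction the Markov Property buys over full n-gram histories.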
blog.jordan.matelsky.com
The Short Story: I made a web-app that, given some starting text, naively tries to predict what words come next. Because the 'training' text was taken from F. Scott Fitzgerald's Tender is the Night (the first 10 chapters), we can (inaccurately) say that this robot talks like Fitzgerald.
gist.github.com
Generate text from an input using a simple Markov chain generator - markov.py
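The gist itself isn't reproduced here, but a generator in that spirit needs only a few lines: build the same kind of transition table, then repeatedly sample a follower of the last word emitted. This is a hypothetical sketch, not the gist's actual markov.py:

import random
from collections import Counter, defaultdict

def generate(text, start, length=20):
    """Build a first-order chain from `text`, then walk it from `start`."""
    words = text.split()
    chain = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        chain[current][nxt] += 1
    output = [start]
    for _ in range(length):
        followers = chain[output[-1]]
        if not followers:  # dead end: the last word never appears mid-text
            break
        # Sample the next word in proportion to how often it followed this one
        output.append(random.choices(list(followers), weights=followers.values())[0])
    return " ".join(output)

print(generate("the cat sat on the mat the cat ran", "the"))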
www.jeremykun.com
Machine learning is broadly split into two camps, statistical learning and non-statistical learning. The latter we've started to get a good picture of on this blog; we approached Perceptrons, decision trees, and neural networks from a non-statistical perspective. And generally "statistical" learning is just that, a perspective. Data is phrased in terms of independent and dependent variables, and statistical techniques are leveraged against the data. In this post we'll focus on the simplest example of thi...