 
      
| Site | Description |
| --- | --- |
| www.eblong.com | |
| benhoyt.com | Describes a simple Markov chain algorithm that generates reasonable-sounding but utterly nonsensical text, and presents some example outputs as well as a Python implementation. |
| blog.jordan.matelsky.com | The Short Story: I made a web app that, given some starting text, naively tries to predict what words come next. Because the "training" text was taken from F. Scott Fitzgerald's Tender Is the Night (the first 10 chapters), we can (inaccurately) say that this robot talks like Fitzgerald. |
| sookocheff.com | A common way to reduce the complexity of n-gram modeling is to apply the Markov Property: the probability of future states depends only on the present state, not on the sequence of events that preceded it. This can be implemented with a Markov chain that stores the probabilities of transitioning to the next state. |
| r2rt.com | |