sookocheff.com
setosa.io
parametricity.com
If I say the word "swimming" to you, you've got a fair bit of information about what word I'm going to say next.
blog.jordan.matelsky.com
The Short Story: I made a web-app that, given some starting text, naively tries to predict what words come next. Because the 'training' text was taken from F. Scott Fitzgerald's Tender is the Night (the first 10 chapters), we can (inaccurately) say that this robot talks like Fitzgerald.
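A minimal sketch of this kind of naive next-word prediction, assuming a simple bigram frequency model (the function names `build_model`, `predict_next`, and `generate` are illustrative, not taken from the linked app):

```python
import random
from collections import defaultdict, Counter

def build_model(text):
    """Count, for each word, which words follow it in the training text."""
    words = text.split()
    follows = defaultdict(Counter)
    for cur, nxt in zip(words, words[1:]):
        follows[cur][nxt] += 1
    return follows

def predict_next(model, word):
    """Return the word most often seen after `word`, or None if unseen."""
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]

def generate(model, start, n=10):
    """Naively chain words, sampling each successor by observed frequency."""
    out = [start]
    for _ in range(n):
        counter = model.get(out[-1])
        if not counter:
            break
        choices, weights = zip(*counter.items())
        out.append(random.choices(choices, weights=weights)[0])
    return " ".join(out)

model = build_model("the night was tender and the night was long")
print(predict_next(model, "the"))  # "night" (seen twice after "the")
```

Trained on ten chapters of a novel instead of one sentence, the same frequency-counting scheme produces text that loosely mimics the source author's word choices.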
rgoswami.me
Explain why using bagging for prediction trees generally improves predictions over regular prediction trees.
Introduction
Bagging (or Bootstrap Aggregation) is one of the most commonly used ensemble methods for improving the predictions of trees. We will broadly follow the historical line of development to understand the process. That is, we will begin by considering the Bootstrap method. This in turn requires knowledge of the Jackknife method, which is understandable from a simple bias-variance perspective.
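The Bootstrap step this excerpt starts from can be sketched quickly: resample the data with replacement many times, recompute a statistic on each resample, and use the spread of those estimates as a standard-error estimate (a minimal illustration, not code from the linked post; the data values are made up):

```python
import random
import statistics

def bootstrap_se(sample, n_resamples=2000, statistic=statistics.mean, seed=0):
    """Estimate the standard error of `statistic` by resampling with replacement."""
    rng = random.Random(seed)
    n = len(sample)
    estimates = [
        statistic([rng.choice(sample) for _ in range(n)])
        for _ in range(n_resamples)
    ]
    return statistics.stdev(estimates)

data = [2.1, 2.5, 1.9, 3.0, 2.7, 2.2, 2.8, 2.4]
se = bootstrap_se(data)  # close to stdev(data) / sqrt(len(data))
```

Bagging applies the same resampling idea to model fitting: train one tree per bootstrap sample and average their predictions, which reduces the variance of a single high-variance tree.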