Explain why using bagging for prediction trees generally improves predictions over regular prediction trees.

Introduction

Bagging (or Bootstrap Aggregation) is one of the most commonly used ensemble methods for improving the predictions of trees. We will broadly follow the historical line of development to understand the process: we begin with the Bootstrap method, which in turn builds on the Jackknife method, itself best understood from a simple bias-variance perspective.
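Since the discussion starts from the Bootstrap method, a minimal sketch may help fix the idea: resample the data with replacement many times, recompute the statistic of interest on each resample, and use the spread of those recomputed values as an estimate of the statistic's sampling variability. The function name `bootstrap_se`, the toy data, and the parameter choices below are illustrative assumptions, not taken from the original post.

```python
import random
import statistics

def bootstrap_se(data, n_resamples=2000, seed=0):
    """Estimate the standard error of the sample mean by the bootstrap:
    draw many resamples of the same size with replacement, compute the
    mean of each, and report the standard deviation of those means."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    n = len(data)
    resampled_means = [
        statistics.fmean(rng.choices(data, k=n))  # sample WITH replacement
        for _ in range(n_resamples)
    ]
    return statistics.stdev(resampled_means)

# Toy data (hypothetical values for illustration)
data = [2.1, 3.4, 1.9, 4.0, 2.8, 3.1, 2.5, 3.7]

boot_se = bootstrap_se(data)
# Classical closed-form estimate s / sqrt(n), for comparison
classical_se = statistics.stdev(data) / len(data) ** 0.5
```

For a statistic as simple as the mean the bootstrap estimate tracks the classical formula closely; its real value is that the same recipe applies to statistics with no convenient closed-form standard error, which is the property bagging later exploits.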