dsaber.com
www.jeremykun.com
Machine learning is broadly split into two camps, statistical learning and non-statistical learning. The latter we've started to get a good picture of on this blog; we approached Perceptrons, decision trees, and neural networks from a non-statistical perspective. And generally "statistical" learning is just that, a perspective. Data is phrased in terms of independent and dependent variables, and statistical techniques are leveraged against the data. In this post we'll focus on the simplest example of this...
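The excerpt is cut off, but its framing (data as independent and dependent variables, the simplest statistical technique) points at simple linear regression. A minimal ordinary-least-squares sketch with made-up data, not code from the linked post:

```python
import numpy as np

# Hypothetical data: a noisy linear relationship between an independent
# variable x and a dependent variable y (values are illustrative only).
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(0, 1.5, size=50)

# Ordinary least squares for y ~ a*x + b:
#   slope a = cov(x, y) / var(x),  intercept b = mean(y) - a * mean(x)
a = np.cov(x, y, bias=True)[0, 1] / np.var(x)
b = y.mean() - a * x.mean()
print(f"estimated slope {a:.2f}, intercept {b:.2f}")
```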
dfm.io
twiecki.io
| | | | | [AI summary] This blog post discusses hierarchical linear regression in PyMC3, highlighting its advantages over non-hierarchical Bayesian modeling. The author explores how hierarchical models can effectively handle multi-level data by leveraging the 'shrinkage-effect', which improves predictions by borrowing strength from related groups. Using the radon dataset, the post compares individual and hierarchical models, demonstrating that the hierarchical approach provides more accurate and robust estimates, especially in cases with limited data. The key takeaway is that hierarchical models balance individual and group-level insights, offering the best of both worlds in data analysis. | |
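For context, a minimal sketch of the kind of hierarchical (partially pooled) regression the summary describes, written against PyMC3's public API. It uses synthetic grouped data rather than the post's radon dataset, and the variable names and priors are illustrative assumptions, not the author's code:

```python
import numpy as np
import pymc3 as pm

# Synthetic grouped data standing in for the radon dataset: each group has its
# own intercept and slope drawn from a shared, population-level distribution.
rng = np.random.default_rng(0)
n_groups, n_per_group = 8, 20
group_idx = np.repeat(np.arange(n_groups), n_per_group)
x = rng.normal(size=n_groups * n_per_group)
true_a = rng.normal(1.0, 0.5, size=n_groups)
true_b = rng.normal(-0.5, 0.3, size=n_groups)
y = true_a[group_idx] + true_b[group_idx] * x + rng.normal(0, 0.5, size=x.size)

with pm.Model() as hierarchical_model:
    # Hyperpriors: group-level intercepts and slopes share common distributions,
    # which is what produces the shrinkage effect described above.
    mu_a = pm.Normal("mu_a", mu=0.0, sigma=10.0)
    sigma_a = pm.HalfNormal("sigma_a", 5.0)
    mu_b = pm.Normal("mu_b", mu=0.0, sigma=10.0)
    sigma_b = pm.HalfNormal("sigma_b", 5.0)

    # One intercept and slope per group, partially pooled toward the hyperpriors.
    a = pm.Normal("a", mu=mu_a, sigma=sigma_a, shape=n_groups)
    b = pm.Normal("b", mu=mu_b, sigma=sigma_b, shape=n_groups)

    eps = pm.HalfNormal("eps", 5.0)
    y_est = a[group_idx] + b[group_idx] * x

    pm.Normal("y_like", mu=y_est, sigma=eps, observed=y)
    trace = pm.sample(1000, tune=1000, target_accept=0.9)
```

Groups with few observations get pulled toward the population-level estimates, which is the borrowing of strength the summary refers to.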
ddarmon.github.io