You are here: distill.pub

fanpu.io (8.8 parsecs away)
Deep learning is currently dominated by parametric models, which have a fixed number of parameters regardless of the size of the training dataset; linear regression models and neural networks are examples. However, it is worth occasionally stepping back and remembering that this is not all there is. Non-parametric models like k-NN, decision trees, or kernel density estimation don't rely on a fixed set of weights, but instead grow in complexity with the size of the data. In this post we'll talk about Gaussian processes: a conceptually important but, in my opinion, under-appreciated non-parametric approach with deep connections to modern-day neural networks. An interesting motivating fact, which we will eventually show, is that neural networks initialized with Gaussian weights are equivalent to Gaussian processes in the infinite-width limit.
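The infinite-width claim in that excerpt is easy to probe numerically. The sketch below is my own illustration, not taken from the linked post: it evaluates a wide one-hidden-layer network with Gaussian weights at a few inputs, across many independent initializations, and prints the empirical covariance of the outputs. For large width, those outputs are approximately jointly Gaussian with a fixed covariance kernel, which is exactly the Gaussian-process behaviour described above. The width, the tanh nonlinearity, and the unit weight variances are arbitrary choices made for the demo.

```python
# Minimal sketch (illustrative assumptions: tanh activation, unit-variance
# Gaussian weights, width H = 10,000) of the wide-network / Gaussian-process
# correspondence mentioned in the excerpt above.
import numpy as np

rng = np.random.default_rng(0)

H = 10_000        # hidden width; the GP correspondence is exact only as H -> infinity
n_draws = 2_000   # number of independent random initializations
x = np.linspace(-3, 3, 5).reshape(-1, 1)  # a few 1-D test inputs

# Sample the scalar output f(x) = (1/sqrt(H)) * sum_j v_j * tanh(w_j * x + b_j)
# for many independent Gaussian initializations of (w, b, v).
outputs = np.empty((n_draws, len(x)))
for i in range(n_draws):
    w = rng.normal(0.0, 1.0, size=(1, H))
    b = rng.normal(0.0, 1.0, size=(1, H))
    v = rng.normal(0.0, 1.0, size=(H, 1))
    outputs[i] = (np.tanh(x @ w + b) @ v).ravel() / np.sqrt(H)

# At large H the joint distribution of (f(x_1), ..., f(x_5)) across
# initializations is close to a multivariate Gaussian with a fixed kernel
# K(x, x'); the empirical covariance below estimates that kernel.
emp_cov = np.cov(outputs, rowvar=False)
print("empirical covariance of f(x) across initializations:")
print(np.round(emp_cov, 3))
```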

pberba.github.io (14.8 parsecs away)
What is HDBSCAN and why does it work

christopher-beckham.github.io (8.3 parsecs away)
Vicinal distributions as a statistical view on data augmentation

www.huber.embl.de (78.3 parsecs away)
If you are a biologist and want to get the best out of the powerful methods of modern computational statistics, this is your book.