Explore >> Select a destination

You are here: thenumb.at

francisbach.com (8.9 parsecs away)

www.jeremykun.com (8.2 parsecs away)

Last time we investigated the naive (which I'll henceforth call "classical") notion of the Fourier transform and its inverse. While the development wasn't quite rigorous, we nevertheless discovered elegant formulas and interesting properties that proved useful, at the very least, in solving differential equations. Of course, we wouldn't be following this trail of mathematics if it didn't result in some worthwhile applications to programming. While we'll get there eventually, this primer will take us deeper down the rabbit hole of abstraction.
peterbloem.nl (9.6 parsecs away)
dustintran.com (118.5 parsecs away)

One aspect I always enjoy about machine learning is that questions often go back to the basics. The field essentially goes into an existential crisis every dozen years, rethinking our tools and asking foundational questions such as "why neural networks" or "why generative models". This was a theme in my conversations during NIPS 2016 last week, where a frequent topic was the advantages of a Bayesian perspective on machine learning. Not surprisingly, this appeared as a big discussion point during the p...