

You are here

newvick.com
windowsontheory.org
2.0 parsecs away

(Updated and expanded 12/17/2021) I am teaching deep learning this week in Harvard's CS 182 (Artificial Intelligence) course. As I'm preparing the back-propagation lecture, Preetum Nakkiran told me about Andrej Karpathy's awesome micrograd package, which implements automatic differentiation for scalar variables in very few lines of code. I couldn't resist using this to show how...
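The scalar automatic differentiation the excerpt describes can be sketched in a few lines of Python. This is a minimal reimplementation in the spirit of micrograd, not Karpathy's actual code; the class and method names are illustrative:

```python
# Minimal scalar reverse-mode autodiff, in the spirit of micrograd.
# Illustrative sketch only; not the actual micrograd implementation.

class Value:
    def __init__(self, data, children=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._children = children        # Values this node depends on
        self._local_grads = local_grads  # d(self)/d(child) for each child

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    visit(c)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for child, local in zip(v._children, v._local_grads):
                child.grad += local * v.grad

x = Value(3.0)
y = x * x + x   # y = x^2 + x, so dy/dx = 2x + 1 = 7 at x = 3
y.backward()
print(x.grad)   # 7.0
```

Each operation records its inputs and local derivatives; `backward` then accumulates gradients by the chain rule, which is exactly back-propagation specialized to scalars.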
comsci.blog
3.6 parsecs away

In this tutorial, we will learn two different methods to implement neural networks from scratch using Python: an extremely simple method (finite differences) and a still very simple method (backpropagation).
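The finite-difference method the excerpt mentions estimates each partial derivative numerically, as (f(w+ε) − f(w−ε)) / 2ε, and plugs the estimates into gradient descent. A hedged sketch of that technique (the toy loss and step size are made up for illustration, and this is not the tutorial's code):

```python
# Finite-difference gradient estimation driving gradient descent.
# Illustrative sketch; the loss function and learning rate are made up.

def numerical_grad(loss, weights, eps=1e-5):
    """Central-difference estimate of d(loss)/d(w_i) for each weight."""
    grads = []
    for i in range(len(weights)):
        w_plus, w_minus = list(weights), list(weights)
        w_plus[i] += eps
        w_minus[i] -= eps
        grads.append((loss(w_plus) - loss(w_minus)) / (2 * eps))
    return grads

# Toy loss with a unique minimum at w = [0.5, 0.0].
def loss(w):
    return (w[0] * 2 - 1) ** 2 + w[1] ** 2

w = [0.0, 1.0]
for _ in range(100):
    g = numerical_grad(loss, w)
    w = [wi - 0.1 * gi for wi, gi in zip(w, g)]

print([round(wi, 3) for wi in w])  # approaches [0.5, 0.0]
```

The appeal is that no calculus is needed; the cost is one extra loss evaluation per weight per step, which is why backpropagation wins for real networks.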
dennybritz.com
4.7 parsecs away

This is the third part of the Recurrent Neural Network Tutorial.
www.ericekholm.com
28.7 parsecs away

Learning maximum likelihood estimation by fitting logistic regression 'by hand' (sort of)
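Fitting logistic regression 'by hand' means maximizing the log-likelihood Σ yᵢ log pᵢ + (1−yᵢ) log(1−pᵢ), with pᵢ = σ(β₀ + β₁xᵢ), yourself rather than calling a library. A hedged sketch of that idea using plain gradient ascent (the toy data and learning rate are invented for illustration, and the post itself may use a different optimizer):

```python
import math

# Maximum likelihood estimation for logistic regression "by hand":
# climb the log-likelihood with plain gradient ascent.
# Illustrative sketch; the data and learning rate are made up.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: binary outcomes that become likelier as x grows.
xs = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]
ys = [0, 0, 1, 0, 1, 1, 1]

b0, b1 = 0.0, 0.0   # intercept and slope
lr = 0.1
for _ in range(2000):
    # Gradient of the log-likelihood: sums of (y_i - p_i) and (y_i - p_i) * x_i.
    g0 = sum(y - sigmoid(b0 + b1 * x) for x, y in zip(xs, ys))
    g1 = sum((y - sigmoid(b0 + b1 * x)) * x for x, y in zip(xs, ys))
    b0 += lr * g0
    b1 += lr * g1

# The fitted curve should give high probability of y=1 for large x.
print(sigmoid(b0 + b1 * 2.0), sigmoid(b0 + b1 * -2.0))
```

Setting both gradient sums to zero recovers the usual MLE score equations, so at convergence this matches what a library fit would report.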