newvick.com

marcospereira.me
In this post we summarize the math behind deep learning and implement a simple network that achieves 85% accuracy classifying digits from the MNIST dataset.

comsci.blog
In this tutorial, we will learn two different methods to implement neural networks from scratch using Python: an extremely simple method (finite difference) and a still very simple method (backpropagation).

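The finite-difference method the comsci.blog tutorial refers to fits in a few lines: perturb one parameter at a time and measure how the loss changes. The sketch below only illustrates that general idea and is not code from the post; the function name finite_difference_grad and the toy one-weight example are invented here.

    import numpy as np

    # Sketch of finite-difference gradient estimation (not from the linked post):
    # nudge each parameter up and down by eps and use the change in the loss
    # to approximate the partial derivative.
    def finite_difference_grad(loss_fn, params, eps=1e-5):
        grad = np.zeros_like(params)
        for i in range(params.size):
            original = params.flat[i]
            params.flat[i] = original + eps
            loss_plus = loss_fn(params)
            params.flat[i] = original - eps
            loss_minus = loss_fn(params)
            params.flat[i] = original  # restore the parameter
            grad.flat[i] = (loss_plus - loss_minus) / (2 * eps)
        return grad

    # Toy usage: fit a single weight w so that w * 3.0 approximates 6.0.
    w = np.array([0.0])
    loss = lambda p: (p[0] * 3.0 - 6.0) ** 2
    for _ in range(100):
        w -= 0.01 * finite_difference_grad(loss, w)
    print(w)  # approaches 2.0

The central difference (L(w+eps) - L(w-eps)) / (2*eps) is used because it is more accurate than a one-sided difference, but the cost still grows linearly with the number of parameters, which is why such tutorials move on to backpropagation.
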
windowsontheory.org
(Updated and expanded 12/17/2021) I am teaching deep learning this week in Harvard's CS 182 (Artificial Intelligence) course. As I was preparing the back-propagation lecture, Preetum Nakkiran told me about Andrej Karpathy's awesome micrograd package, which implements automatic differentiation for scalar variables in very few lines of code. I couldn't resist using this to show how...

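Scalar automatic differentiation of the kind micrograd provides works by recording each operation as a node in a graph and then applying the chain rule in reverse. The following is a minimal sketch of that idea only, not micrograd's actual code or API; the Scalar class and its methods are invented for illustration and support just addition and multiplication of Scalar operands.

    # Minimal sketch of scalar reverse-mode autodiff (micrograd-style idea,
    # not micrograd itself).
    class Scalar:
        def __init__(self, value, parents=()):
            self.value = value
            self.grad = 0.0
            self._parents = parents
            self._backward_fn = lambda: None

        def __add__(self, other):
            out = Scalar(self.value + other.value, (self, other))
            def backward_fn():
                self.grad += out.grad
                other.grad += out.grad
            out._backward_fn = backward_fn
            return out

        def __mul__(self, other):
            out = Scalar(self.value * other.value, (self, other))
            def backward_fn():
                self.grad += other.value * out.grad
                other.grad += self.value * out.grad
            out._backward_fn = backward_fn
            return out

        def backward(self):
            # Topologically order the graph, then run the chain rule in reverse.
            order, visited = [], set()
            def visit(node):
                if node not in visited:
                    visited.add(node)
                    for p in node._parents:
                        visit(p)
                    order.append(node)
            visit(self)
            self.grad = 1.0
            for node in reversed(order):
                node._backward_fn()

    # Usage: for out = a*b + a, d(out)/da = b + 1 = 4 and d(out)/db = a = 2.
    a, b = Scalar(2.0), Scalar(3.0)
    out = a * b + a
    out.backward()
    print(a.grad, b.grad)  # 4.0 2.0
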
www.v7labs.com
Recurrent neural networks (RNNs) are well-suited for processing sequences of data. Explore different types of RNNs and how they work.