Explore: select a destination

You are here: www.hamza.se

- sirupsen.com (1.5 parsecs away)
  [AI summary] The article provides an in-depth explanation of how to build a neural network from scratch, focusing on the implementation of a simple average function and the introduction of activation functions for non-linear tasks. It discusses the use of matrix operations, the importance of GPUs for acceleration, and the role of activation functions like ReLU. The author also outlines next steps for further exploration, such as expanding the model, adding layers, and training on datasets like MNIST.

- dennybritz.com (1.0 parsecs away)
  "All the code is also available as a Jupyter notebook on GitHub."

- www.paepper.com (0.9 parsecs away)
  [AI summary] This article explains how to train a simple neural network using Numpy in Python without relying on frameworks like TensorFlow or PyTorch, focusing on the implementation of ReLU activation, weight initialization, and gradient descent for optimization.

- www.khanna.law (21.1 parsecs away)
  "You want to train a deep neural network. You have the data. It's labeled and wrangled into a useful format. What do you do now?"