You are here: dennybritz.com
blog.ephorie.de (1.0 parsecs away)
[AI summary] The blog post explores the connection between logistic regression and neural networks, demonstrating how logistic regression can be viewed as the simplest form of a neural network through mathematical equivalence and practical examples.
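The equivalence that summary describes can be sketched in a few lines: logistic regression is a weighted sum of the inputs passed through a sigmoid, which is exactly one neuron of a network. A minimal sketch assuming NumPy; the weights here are hypothetical, not taken from the post:

```python
import numpy as np

def sigmoid(z):
    # Logistic function: squashes any real number into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def predict(x, w, b):
    # Logistic regression = a single "neuron": a linear combination
    # of the inputs followed by a sigmoid activation.
    return sigmoid(np.dot(x, w) + b)

# Hypothetical weights for a two-feature example.
w = np.array([2.0, -1.0])
b = 0.5
x = np.array([1.0, 3.0])
print(predict(x, w, b))  # a probability strictly between 0 and 1
```

Stacking many such units, and feeding their outputs into further units, is all that separates this model from a multi-layer network.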
sirupsen.com (1.1 parsecs away)
[AI summary] The article provides an in-depth explanation of how to build a neural network from scratch, focusing on the implementation of a simple average function and the introduction of activation functions for non-linear tasks. It discusses the use of matrix operations, the importance of GPUs for acceleration, and the role of activation functions like ReLU. The author also outlines next steps for further exploration, such as expanding the model, adding layers, and training on datasets like MNIST.
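The ingredients that summary lists (matrix operations plus a ReLU activation) combine into a forward pass of just a few lines. A sketch of one hidden layer, assuming NumPy; the layer sizes and random weights are illustrative, not the article's:

```python
import numpy as np

def relu(z):
    # ReLU zeroes out negative values, giving the network non-linearity.
    return np.maximum(0.0, z)

def forward(x, W1, b1, W2, b2):
    # One hidden layer: matrix multiply, ReLU, then a linear output.
    h = relu(x @ W1 + b1)
    return h @ W2 + b2

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)  # 2 inputs -> 4 hidden units
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)  # 4 hidden units -> 1 output
x = np.array([[1.0, 2.0]])
print(forward(x, W1, b1, W2, b2).shape)  # (1, 1)
```

Because the whole pass is matrix multiplication, it maps directly onto the GPU acceleration the article mentions.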
datadan.io (1.3 parsecs away)
Linear regression and gradient descent are techniques that form the basis of many other, more complicated ML/AI techniques (e.g., deep learning models). They are, thus, building blocks that all ML/AI engineers need to understand.
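Those two building blocks fit together in a few lines: fit a line y = w*x + b by repeatedly stepping the parameters against the gradient of the mean squared error. A minimal sketch assuming NumPy; the data and learning rate are illustrative:

```python
import numpy as np

# Noiseless data generated from y = 2x + 1, which the fit should recover.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0

w, b, lr = 0.0, 0.0, 0.05
for _ in range(2000):
    err = (w * x + b) - y
    # Gradients of the mean squared error with respect to w and b.
    w -= lr * 2.0 * np.mean(err * x)
    b -= lr * 2.0 * np.mean(err)

print(round(w, 2), round(b, 2))  # ≈ 2.0 and 1.0
```

The same loop, with more parameters and a longer chain of gradients, is the training procedure behind the deep learning models the summary alludes to.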
sefiks.com (8.1 parsecs away)
Researchers tend to use activation functions that have meaningful derivatives. That is why sigmoid and hyperbolic tangent are the most common activation functions in the literature. Softplus is a newer function than sigmoid and tanh.
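The "meaningful derivatives" point can be made concrete: all three functions are smooth, and their derivatives have closed forms that are cheap to evaluate during backpropagation. A small sketch assuming NumPy:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def d_sigmoid(z):
    # The sigmoid's derivative is expressible in terms of its own output.
    s = sigmoid(z)
    return s * (1.0 - s)

def d_tanh(z):
    # Likewise for tanh: 1 - tanh(z)^2.
    return 1.0 - np.tanh(z) ** 2

def softplus(z):
    # Softplus is a smooth approximation of ReLU; its derivative
    # is exactly the sigmoid function.
    return np.log1p(np.exp(z))

print(d_sigmoid(0.0), d_tanh(0.0))  # 0.25 1.0
```

That the derivative of softplus is the sigmoid itself is part of why it slots neatly into the same family.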