Explore » Select a destination

You are here: www.arrsingh.com

adl1995.github.io (2.0 parsecs away)
[AI summary] The article explains various activation functions used in neural networks, their properties, and applications, including the binary step, tanh, ReLU, and softmax functions.
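The activation functions named in that summary are easy to state concretely; a minimal NumPy sketch (softmax acts on a whole vector, the others elementwise — the function names here are just illustrative):

```python
import numpy as np

def binary_step(x):
    # 1 where the input is non-negative, else 0.
    return np.where(x >= 0, 1.0, 0.0)

def tanh(x):
    # Squashes inputs into (-1, 1).
    return np.tanh(x)

def relu(x):
    # Zero for negative inputs, identity for positive ones.
    return np.maximum(0.0, x)

def softmax(x):
    # Exponentiate and normalize; subtracting the max is a standard
    # numerical-stability trick and does not change the result.
    e = np.exp(x - np.max(x))
    return e / e.sum()

x = np.array([-2.0, 0.0, 3.0])
print(binary_step(x))              # [0. 1. 1.]
print(relu(x))                     # [0. 0. 3.]
print(np.isclose(softmax(x).sum(), 1.0))  # True: outputs form a distribution
```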
blog.ephorie.de (1.2 parsecs away)
[AI summary] The blog post explores the connection between logistic regression and neural networks, demonstrating through mathematical equivalence and practical examples how logistic regression can be viewed as the simplest form of a neural network.
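The equivalence that post describes can be sketched directly: a single neuron with a sigmoid activation, trained with cross-entropy loss, *is* logistic regression. A minimal NumPy sketch, with toy data and hyperparameters that are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy, linearly separable binary classification data (illustrative).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One "neuron": weights w, bias b, sigmoid activation --
# exactly the logistic regression model y_hat = sigmoid(Xw + b).
w = np.zeros(2)
b = 0.0
lr = 0.5

for _ in range(500):
    y_hat = sigmoid(X @ w + b)
    grad = y_hat - y               # cross-entropy gradient w.r.t. z
    w -= lr * (X.T @ grad) / len(y)
    b -= lr * grad.mean()

acc = ((sigmoid(X @ w + b) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Because the data are linearly separable, this single neuron fits them almost perfectly, which is the point of the equivalence.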
brandinho.github.io (2.2 parsecs away)
[AI summary] This blog post explores the mathematical foundations of popular supervised learning loss functions, focusing on linear, logistic, and softmax regression. It emphasizes that the parameter-update derivatives are consistent across these models even though they use different loss functions: in each case, the derivative of the loss with respect to the linear term $z$ reduces to $\hat{y} - y$. The post also includes Python code examples and highlights the importance of proper matrix shaping and activation functions in neural networks.
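The claim in that summary — that $\partial L / \partial z = \hat{y} - y$ for linear, logistic, and softmax regression — can be checked numerically with finite differences. A minimal NumPy sketch, with all data values illustrative:

```python
import numpy as np

def num_grad(f, z, eps=1e-6):
    """Central finite-difference gradient of scalar function f at z."""
    g = np.zeros_like(z)
    for i in range(z.size):
        zp, zm = z.copy(), z.copy()
        zp[i] += eps
        zm[i] -= eps
        g[i] = (f(zp) - f(zm)) / (2 * eps)
    return g

# Linear regression: y_hat = z, squared-error loss 0.5 * (z - y)^2.
y = np.array([0.7])
z = np.array([0.2])
lin_loss = lambda z: 0.5 * np.sum((z - y) ** 2)
assert np.allclose(num_grad(lin_loss, z), z - y, atol=1e-5)

# Logistic regression: y_hat = sigmoid(z), binary cross-entropy loss.
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
yb = np.array([1.0])
log_loss = lambda z: -np.sum(
    yb * np.log(sigmoid(z)) + (1 - yb) * np.log(1 - sigmoid(z)))
assert np.allclose(num_grad(log_loss, z), sigmoid(z) - yb, atol=1e-5)

# Softmax regression: y_hat = softmax(z), cross-entropy with one-hot y.
def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

y1h = np.array([0.0, 1.0, 0.0])
zs = np.array([0.3, -0.1, 0.8])
ce_loss = lambda z: -np.sum(y1h * np.log(softmax(z)))
assert np.allclose(num_grad(ce_loss, zs), softmax(zs) - y1h, atol=1e-5)

print("dL/dz == y_hat - y holds for all three models")
```

All three assertions pass: each model's loss is paired with an activation whose composition yields the same simple gradient in $z$.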
wtfleming.github.io (14.1 parsecs away)
[AI summary] This post discusses achieving 99.1% accuracy on binary image classification of cats and dogs using an ensemble of ResNet models in PyTorch.