Explore >> Select a destination


You are here: www.hhyu.org

bdtechtalks.com (1.9 parsecs away)

Gradient descent is the main technique for training machine learning and deep learning models. Read all about it.
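As a hedged aside (mine, not from the bdtechtalks.com post): the whole idea fits in a few lines of Python. The objective here is borrowed from the blog.demofox.org excerpt below, and the learning rate and step count are arbitrary choices.

    # Minimal gradient descent on f(x) = x^2 - 6x + 13, whose derivative is 2x - 6.
    def gradient_descent(lr=0.1, steps=100):
        x = 0.0                  # arbitrary starting point
        for _ in range(steps):
            grad = 2 * x - 6     # f'(x)
            x -= lr * grad       # step against the gradient
        return x

    print(gradient_descent())    # approaches 3.0, the minimizer of f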
blog.demofox.org (3.4 parsecs away)

This article explains how these four things fit together and shows some examples of what they are used for. Derivatives are the most fundamental concept in calculus. If you have a function, a derivative tells you how much that function changes at each point. If we start with the function $y = x^2 - 6x + 13$, we can...
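To make the cut-off excerpt concrete (my own worked step, not a quote from the article): differentiating that function term by term with the power rule gives

    $\frac{\mathrm{d}y}{\mathrm{d}x} = 2x - 6$

which is zero at $x = 3$, where $y = 9 - 18 + 13 = 4$. That point is the parabola's minimum, and it is exactly the value the gradient descent sketch above converges to.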
robotchinwag.com (5.2 parsecs away)

Deriving the gradients for the backward pass for matrix multiplication using tensor calculus.
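The result that post derives can be stated compactly; this sketch is my own (shapes chosen arbitrarily, not code from robotchinwag.com). For $C = AB$ with an upstream gradient $\partial L / \partial C$, the backward pass is just two more matrix multiplications.

    import numpy as np

    # Forward: C = A @ B, with A of shape (m, k) and B of shape (k, n).
    A = np.random.randn(4, 3)
    B = np.random.randn(3, 5)
    C = A @ B

    dL_dC = np.random.randn(4, 5)   # upstream gradient, same shape as C

    # Backward pass for matrix multiplication:
    dL_dA = dL_dC @ B.T             # shape (4, 3), matches A
    dL_dB = A.T @ dL_dC             # shape (3, 5), matches B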
theorydish.blog (19.7 parsecs away)

The chain rule is a fundamental result in calculus. Roughly speaking, it states that if a variable $c$ is a differentiable function of intermediate variables $b_1, \ldots, b_n$, and each intermediate variable $b_i$ is itself a differentiable function of $a$, then we can compute the derivative $\frac{\mathrm{d}c}{\mathrm{d}a}$ as...
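The excerpt breaks off, but the statement it is leading into is the standard multivariable chain rule (a textbook identity, not a quote from theorydish.blog):

    $\frac{\mathrm{d}c}{\mathrm{d}a} = \sum_{i=1}^{n} \frac{\partial c}{\partial b_i} \frac{\mathrm{d}b_i}{\mathrm{d}a}$

This sum over intermediate variables is precisely what backpropagation evaluates, which is why the chain rule underlies the gradient computations in the other destinations listed above.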