Explore >> Select a destination


You are here: parametricity.com

www.mathsisfun.com
6.6 parsecs away

stephenmalina.com
5.2 parsecs away

Selected Exercises, 5.A #12. Define $ T \in \mathcal L(\mathcal P_4(\mathbf{R})) $ by $$ (Tp)(x) = xp'(x) $$ for all $ x \in \mathbf{R} $. Find all eigenvalues and eigenvectors of $ T $. Observe that, if $ p = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + a_4 x^4 $, then $$ x p'(x) = a_1 x + 2 a_2 x^2 + 3 a_3 x^3 + 4 a_4 x^4. $$
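The observation in the excerpt gives the answer directly: since $T$ sends $x^k$ to $k x^k$, the monomials are eigenvectors with eigenvalues $0, 1, 2, 3, 4$. A minimal NumPy sketch checking this (the diagonal matrix is my representation of $T$ in the monomial basis $\{1, x, x^2, x^3, x^4\}$, an assumption, not code from the linked post):

```python
import numpy as np

# Matrix of T, where (Tp)(x) = x p'(x) on P_4(R), in the monomial basis
# {1, x, x^2, x^3, x^4}: T sends x^k to k * x^k, so T is diagonal.
T = np.diag([0.0, 1.0, 2.0, 3.0, 4.0])

eigvals = np.linalg.eigvals(T)
print(np.sort(eigvals))  # eigenvalues 0, 1, 2, 3, 4

# Each standard basis vector e_k (i.e. the monomial x^k) is an eigenvector:
for k in range(5):
    e = np.eye(5)[k]
    assert np.allclose(T @ e, k * e)
```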
nhigham.com
6.1 parsecs away

The Cayley-Hamilton Theorem says that a square matrix $A$ satisfies its characteristic equation, that is, $p(A) = 0$ where $p(t) = \det(tI - A)$ is the characteristic polynomial. This statement is not simply the substitution "$p(A) = \det(A - A) = 0$", which is not valid since $t$ must remain a scalar...
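The theorem is easy to check numerically for a small case. A sketch with my own example matrix (not from the linked post), using the $2 \times 2$ characteristic polynomial $p(t) = t^2 - \operatorname{tr}(A)\,t + \det(A)$, where the scalar terms become multiples of the identity when $A$ is substituted:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Characteristic polynomial of a 2x2 matrix: p(t) = t^2 - tr(A) t + det(A).
# Cayley-Hamilton: substituting A (scalars become multiples of I) gives 0.
pA = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)
print(np.allclose(pA, np.zeros((2, 2))))  # True
```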
blog.paperspace.com
27.0 parsecs away

Follow this tutorial to learn what attention in deep learning is and why it is so important in image classification tasks. We then follow up with a demo implementing attention from scratch with VGG.