 
      
- arkadiusz-jadczyk.eu (you are here)
- nhigham.com: A companion matrix $C \in \mathbb{C}^{n\times n}$ is an upper Hessenberg matrix of the form $C = \begin{bmatrix} a_{n-1} & a_{n-2} & \dots & \dots & a_0 \\ 1 & 0 & \dots & \dots & 0 \\ 0 & 1 & \ddots & & \vdots \\ \vdots & & \ddots & 0 & 0 \\ 0 & \dots & 0 & 1 & 0 \end{bmatrix}$ ...
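The defining property of this matrix is that its eigenvalues are the roots of the polynomial $t^n - a_{n-1}t^{n-1} - \dots - a_1 t - a_0$ built from its first row. A minimal NumPy sketch (my own illustration, not code from the linked post) that builds the matrix in exactly the form shown and checks this on a small example:

```python
import numpy as np

def companion(a):
    """Companion matrix in the form above: first row holds
    a_{n-1}, ..., a_0; the subdiagonal is all ones. Its
    characteristic polynomial is t^n - a_{n-1} t^{n-1} - ... - a_0."""
    n = len(a)
    C = np.zeros((n, n))
    C[0, :] = a                 # first row: a_{n-1}, ..., a_0
    C[1:, :-1] = np.eye(n - 1)  # ones on the subdiagonal
    return C

# p(t) = t^2 - 5t + 6 = (t - 2)(t - 3), so a_1 = 5, a_0 = -6
C = companion([5.0, -6.0])
roots = np.sort(np.linalg.eigvals(C))  # the roots 2 and 3
```

Root-finding routines such as `numpy.roots` work this way in reverse: they form the companion matrix of the input polynomial and return its eigenvalues.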
- stephenmalina.com: Selected Exercises, 5.A, 12. Define $T \in \mathcal L(\mathcal P_4(\mathbf{R}))$ by $(Tp)(x) = xp'(x)$ for all $x \in \mathbf{R}$. Find all eigenvalues and eigenvectors of $T$. Observe that, if $p = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + a_4 x^4$, then $x p'(x) = a_1 x + 2 a_2 x^2 + 3 a_3 x^3 + 4 a_4 x^4$.
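The observation in the snippet says it all: $T$ sends $x^k$ to $k x^k$, so in the monomial basis its matrix is diagonal with entries $0, 1, 2, 3, 4$, which are the eigenvalues, with eigenvectors $1, x, x^2, x^3, x^4$. A small NumPy sketch (my own check, not from the linked solution) confirming this in coordinates:

```python
import numpy as np

# In the monomial basis {1, x, x^2, x^3, x^4} of P_4(R), the operator
# (Tp)(x) = x p'(x) sends x^k to k x^k, so its matrix is diagonal.
T = np.diag([0.0, 1.0, 2.0, 3.0, 4.0])

eigenvalues = np.sort(np.linalg.eigvals(T))  # 0, 1, 2, 3, 4

# Eigenvector for eigenvalue k: the coefficient vector of x^k,
# i.e. the k-th standard basis vector.
e2 = np.eye(5)[:, 2]                         # represents p(x) = x^2
assert np.allclose(T @ e2, 2.0 * e2)         # T(x^2) = 2 x^2
```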
- hadrienj.github.io: In this post, we will see special kinds of matrices and vectors: diagonal and symmetric matrices, the unit vector, and the concept of orthogonality.
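Each of the concepts the post previews has a one-line NumPy counterpart; a quick sketch of my own (assumed definitions, not code from the post):

```python
import numpy as np

D = np.diag([1.0, 2.0, 3.0])                 # diagonal matrix: nonzeros only on the diagonal
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])                   # symmetric matrix: S equals its transpose

v = np.array([3.0, 4.0])
u = v / np.linalg.norm(v)                    # unit vector: Euclidean norm 1

a = np.array([1.0, 1.0])
b = np.array([1.0, -1.0])
orthogonal = np.isclose(np.dot(a, b), 0.0)   # orthogonal vectors: dot product is zero
```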
- michael-lewis.com: This is a short summary of some of the terminology used in machine learning, with an emphasis on neural networks. I've put it together primarily to help my own understanding, phrasing it largely in non-mathematical terms. As such it may be of use to others who come from more of a programming than a mathematical background.