mattbaker.blog
www.jeremykun.com
Last time we defined and gave some examples of rings. Recapping, a ring is a special kind of group with an additional multiplication operation that "plays nicely" with addition. The important thing to remember is that a ring is intended to remind us of arithmetic with integers (though not too much: multiplication in a ring need not be commutative). We proved some basic properties, like zero being unique and negation being well-behaved.
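As a one-line illustration of the uniqueness claim (a sketch, not quoted from the post): if $0$ and $0'$ are both additive identities of a ring, then

$$ 0 = 0 + 0' = 0', $$

where the first equality uses that $0'$ is an identity and the second that $0$ is, so the zero element is unique.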
stephenmalina.com
Selected Exercises # 5.A # 12. Define $ T \in \mathcal L(\mathcal P_4(\mathbf{R})) $ by $$ (Tp)(x) = xp'(x) $$ for all $ x \in \mathbf{R} $. Find all eigenvalues and eigenvectors of $ T $. Observe that, if $ p = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + a_4 x^4 $, then $$ x p'(x) = a_1 x + 2 a_2 x^2 + 3 a_3 x^3 + 4 a_4 x^4. $$
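Carrying the exercise one step further than the excerpt quotes: applying $T$ to a monomial gives

$$ T(x^k) = x \cdot \frac{d}{dx} x^k = k x^k, \qquad k = 0, 1, 2, 3, 4, $$

so the eigenvalues of $T$ are $0, 1, 2, 3, 4$, and the eigenvectors for eigenvalue $k$ are exactly the nonzero scalar multiples of $x^k$.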
nickhar.wordpress.com
1. Low-rank approximation of matrices Let $A$ be an arbitrary $n \times m$ matrix. We assume $n \leq m$. We consider the problem of approximating $A$ by a low-rank matrix. For example, we could seek to find a rank $s$ matrix $B$ minimizing $\lVert A - B \rVert$...
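Not part of the post itself, but for concreteness: when the norm in that minimization is the spectral or Frobenius norm, the minimizer is the truncated SVD of $A$ (the Eckart-Young theorem). A minimal numpy sketch, with illustrative dimensions and rank:

import numpy as np

def best_rank_s_approximation(A, s):
    # Truncated SVD: keep the s largest singular values and vectors.
    # By the Eckart-Young theorem this minimizes ||A - B|| over all
    # matrices B of rank at most s, in both spectral and Frobenius norms.
    U, sigma, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :s] @ np.diag(sigma[:s]) @ Vt[:s, :]

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 80))      # an n x m matrix with n <= m, as in the excerpt
B = best_rank_s_approximation(A, s=5)  # the rank s = 5 is illustrative

print(np.linalg.matrix_rank(B))        # 5
sigma = np.linalg.svd(A, compute_uv=False)
# The spectral-norm error of the best rank-s approximation is the (s+1)-st singular value.
print(np.isclose(np.linalg.norm(A - B, ord=2), sigma[5]))    # True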
blog.fastforwardlabs.com
By Chris and Melanie. The machine learning life cycle is more than data + model = API. We know there is a wealth of subtlety and finesse involved in data cleaning and feature engineering. In the same vein, there is more to model-building than feeding data in and reading off a prediction. ML model building requires thoughtfulness both in terms of which metric to optimize for a given problem, and how best to optimize your model for that metric!