antoinevastel.com (you are here)

hadrienj.github.io
In this post, we will see special kinds of matrices and vectors: the diagonal and symmetric matrices, the unit vector, and the concept of orthogonality.
nhigham.com
The pseudoinverse is an extension of the concept of the inverse of a nonsingular square matrix to singular matrices and rectangular matrices. It is one of many generalized inverses, but the one most useful in practice, as it has a number of special properties. The pseudoinverse of a matrix $A\in\mathbb{C}^{m\times n}$ is an ...
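As a quick, hedged sketch (not taken from the linked post), the Moore-Penrose pseudoinverse of a rectangular matrix can be computed with NumPy's `numpy.linalg.pinv`, and one of its defining properties checked numerically:

```python
# A minimal sketch: pseudoinverse of a rectangular matrix with NumPy,
# plus a numerical check of the first Penrose condition A A+ A = A.
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])   # 3x2 rectangular matrix, no ordinary inverse

A_pinv = np.linalg.pinv(A)   # Moore-Penrose pseudoinverse, shape 2x3

print(np.allclose(A @ A_pinv @ A, A))   # True: A A+ A recovers A
```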
blog.georgeshakan.com
Principal Component Analysis (PCA) is a popular technique in machine learning for dimension reduction. It can be derived from Singular Value Decomposition (SVD), which we will discuss in this post. We will cover the math, an example in Python, and finally some intuition. The Math: SVD asserts that any $m \times d$ matrix ...
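A minimal sketch of that derivation (assuming row-wise samples on toy data; not the linked post's code): center the data, take the SVD, and project onto the top right singular vectors.

```python
# A minimal sketch: PCA obtained from the SVD of the centered data matrix.
# Rows of X are samples; the right singular vectors give the principal
# directions, and projecting onto the top-k of them reduces the dimension.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))           # 100 samples, 5 features (toy data)

X_centered = X - X.mean(axis=0)         # PCA requires mean-centered data

# Economy SVD: X_centered = U @ diag(S) @ Vt
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

k = 2                                   # keep the top-2 principal components
components = Vt[:k]                     # principal directions, shape (k, 5)
X_reduced = X_centered @ components.T   # projected data, shape (100, k)

explained_variance = S[:k] ** 2 / (X.shape[0] - 1)  # variance per component
print(X_reduced.shape, explained_variance)
```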
matthewmcateer.me
Important mathematical prerequisites for getting into Machine Learning, Deep Learning, or any of the other spaces.