hadrienj.github.io (you are here)
blog.georgeshakan.com

Principal Component Analysis (PCA) is a popular technique in machine learning for dimension reduction. It can be derived from the Singular Value Decomposition (SVD), which we will discuss in this post. We will cover the math, an example in Python, and finally some intuition. The Math: SVD asserts that any $m \times d$ matrix…
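The excerpt's idea — deriving PCA from the SVD of a centered data matrix — can be sketched in a few lines of NumPy. This is a minimal illustration of the technique, not code from the linked post; the matrix shape and component count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))      # 100 samples, 5 features (arbitrary example data)

# PCA operates on mean-centered columns.
Xc = X - X.mean(axis=0)

# SVD: Xc = U @ diag(S) @ Vt; the rows of Vt are the principal directions.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2
scores = Xc @ Vt[:k].T             # project onto the top-k principal components
explained = S**2 / np.sum(S**2)    # fraction of variance carried by each component
print(scores.shape)                # (100, 2)
```

Because the singular values are sorted in decreasing order, keeping the first $k$ rows of `Vt` keeps the directions of largest variance.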
nhigham.com

The pseudoinverse is an extension of the concept of the inverse of a nonsingular square matrix to singular matrices and rectangular matrices. It is one of many generalized inverses, but the one most useful in practice, as it has a number of special properties. The pseudoinverse of a matrix $A\in\mathbb{C}^{m\times n}$ is an…
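Among the "special properties" the excerpt alludes to are the Penrose conditions, two of which are easy to check numerically. A minimal sketch (the example matrix is an assumption, not taken from the linked post):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])          # rectangular (3x2): no ordinary inverse exists

# Moore-Penrose pseudoinverse, computed internally via the SVD.
A_pinv = np.linalg.pinv(A)

# Two of the four Penrose conditions: A A+ A = A and A+ A A+ = A+.
print(np.allclose(A @ A_pinv @ A, A))            # True
print(np.allclose(A_pinv @ A @ A_pinv, A_pinv))  # True
```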
www.aleksandrhovhannisyan.com

Some systems of equations do not have a unique solution, but we can find an approximate solution using the method of least squares. Applications of this method include linear and polynomial regression.
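As a rough sketch of the method the excerpt describes (the data points are an invented example): an overdetermined system has no exact solution, so least squares finds the $x$ minimizing $\|Ax - b\|_2$, which coincides with the solution of the normal equations $A^T A\, x = A^T b$.

```python
import numpy as np

# Overdetermined system: fit a line a + b*t through three points (no exact fit).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Least-squares solution minimizing ||Ax - b||_2.
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)

# The same solution from the normal equations A^T A x = A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
print(np.allclose(x, x_normal))  # True
```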
datascience.blog.wzb.eu

Modern computers are equipped with processors that allow fast parallel computation at several levels: vector or array operations, which make it possible to execute similar operations simultaneously on a bunch…