Explore: select a destination

You are here: nhigham.com

blog.georgeshakan.com (2.4 parsecs away)
Principal Component Analysis (PCA) is a popular technique in machine learning for dimension reduction. It can be derived from the Singular Value Decomposition (SVD), which we will discuss in this post. We will cover the math, an example in Python, and finally some intuition. The math: the SVD asserts that any $m \times d$ matrix ...
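
As a rough sketch of the idea this destination covers, here is a minimal PCA-via-SVD computation in plain NumPy. The data matrix X, its size, and the number of retained components k are assumptions made for the example, not values from the linked post.

```python
# Minimal sketch: PCA computed from the SVD, using only NumPy.
# X (a 100-by-5 random data matrix) and k (number of components)
# are illustrative assumptions, not taken from the linked post.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))      # n samples, d features
k = 2                                  # principal components to keep

Xc = X - X.mean(axis=0)                # center each feature
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

components = Vt[:k]                    # principal directions (rows of V^T)
scores = Xc @ components.T             # data projected onto those directions
explained_variance = S[:k] ** 2 / (len(X) - 1)
```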

nickhar.wordpress.com (1.3 parsecs away)
1. Low-rank approximation of matrices. Let $A$ be an arbitrary $n \times m$ matrix; we assume $n \leq m$. We consider the problem of approximating $A$ by a low-rank matrix. For example, we could seek to find a rank-$s$ matrix $B$ minimizing $\lVert A - B \rVert$ ...
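
A minimal sketch of the truncated-SVD construction behind this problem (the Eckart-Young theorem): keeping the $s$ largest singular triplets gives the best rank-$s$ approximation in the spectral norm. The matrix A, its dimensions, and the target rank s below are assumptions for the example, not taken from the linked notes.

```python
# Minimal sketch: best rank-s approximation of A via the truncated SVD.
# A (an 8-by-12 random matrix) and s (target rank) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 12))            # an arbitrary n-by-m matrix, n <= m
s = 3                                       # target rank

U, sigma, Vt = np.linalg.svd(A, full_matrices=False)
B = U[:, :s] @ np.diag(sigma[:s]) @ Vt[:s]  # rank-s approximation of A

err = np.linalg.norm(A - B, 2)              # spectral-norm error
print(err, sigma[s])                        # error equals the (s+1)-th singular value
```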

hadrienj.github.io (1.6 parsecs away)
This post will introduce you to two special kinds of matrices: the identity matrix and the inverse matrix. We will use Python/NumPy as a tool to get a better intuition ...
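
A minimal sketch of the two objects the post introduces, using NumPy; the 3-by-3 matrix A below is an assumption chosen only so that it is invertible.

```python
# Minimal sketch: the identity matrix and the matrix inverse in NumPy.
# The 3-by-3 matrix A is an illustrative assumption.
import numpy as np

I3 = np.eye(3)                         # 3-by-3 identity matrix
A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])

A_inv = np.linalg.inv(A)               # inverse, assuming A is invertible

print(np.allclose(A @ A_inv, I3))      # True: A A^{-1} = I
print(np.allclose(A_inv @ A, I3))      # True: A^{-1} A = I
```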

www.ethanepperly.com (27.6 parsecs away)