hadrienj.github.io

blog.georgeshakan.com
Principal Component Analysis (PCA) is a popular technique in machine learning for dimension reduction. It can be derived from the Singular Value Decomposition (SVD), which we will discuss in this post. We will cover the math, an example in Python, and finally some intuition. The Math: SVD asserts that any $m \times d$ matrix ...

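The blurb above outlines the PCA-from-SVD derivation. As a minimal sketch of that idea in NumPy (the data and variable names here are illustrative, not taken from the linked post): center the data, take its SVD, and project onto the leading right singular vectors.

```python
import numpy as np

# Hypothetical data: 6 samples, 3 features (made up for illustration).
X = np.array([[2.5, 2.4, 1.1],
              [0.5, 0.7, 0.9],
              [2.2, 2.9, 1.0],
              [1.9, 2.2, 1.2],
              [3.1, 3.0, 0.8],
              [2.3, 2.7, 1.1]])

# Center each column: PCA assumes zero-mean features.
Xc = X - X.mean(axis=0)

# SVD: Xc = U @ diag(S) @ Vt; the rows of Vt are the principal axes.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the top-k principal components (dimension reduction).
k = 2
scores = Xc @ Vt[:k].T

# The same scores fall out of the SVD factors directly.
assert np.allclose(scores, U[:, :k] * S[:k])
```

Because `Xc = U S Vt` exactly, projecting onto `Vt[:k].T` and scaling `U[:, :k]` by `S[:k]` give the same result, which is the core of the derivation the post advertises.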
vxy10.github.io
Course material for MEC 560: Advanced Control Systems, taught at Stony Brook University by Dr. Vivek Yadav

robotchinwag.com
Deriving the gradients for the backward pass of matrix multiplication using tensor calculus

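The entry above concerns the matmul backward pass. The standard result (the linked post derives it with tensor calculus; the sketch below only states and checks it numerically) is that for $C = AB$ with upstream gradient $\partial L/\partial C$, the gradients are $\partial L/\partial A = (\partial L/\partial C)\,B^\top$ and $\partial L/\partial B = A^\top (\partial L/\partial C)$. Shapes and the loss here are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 5))

# Scalar loss L = sum(C), so the upstream gradient dL/dC is all ones.
dC = np.ones((4, 5))

# Closed-form gradients for C = A @ B:
dA = dC @ B.T   # dL/dA, shape (4, 3)
dB = A.T @ dC   # dL/dB, shape (3, 5)

# Finite-difference check of one entry of dA.
eps = 1e-6
Ap = A.copy()
Ap[0, 0] += eps
numerical = ((Ap @ B).sum() - (A @ B).sum()) / eps
assert abs(numerical - dA[0, 0]) < 1e-4
```

Note the shapes come out right by construction: `dA` matches `A` and `dB` matches `B`, which is a quick sanity check when wiring up a custom backward pass.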
nikita-volkov.github.io
In this post I'm going to highlight the issues of the "Internal" modularisation convention and provide a proper solution to the same set of problems.