hadrienj.github.io
grigory.github.io
Discussion of the class on Foundations of Data Science that I am teaching at IU this Fall.
stephenmalina.com
Matrix Potpourri: As part of reviewing Linear Algebra for my Machine Learning class, I've noticed there's a bunch of matrix terminology that I didn't encounter during my proof-based self-study of LA from Linear Algebra Done Right. This post is mostly intended to consolidate my own understanding and to act as a reference for future me, but if it also helps others in a similar position, that's even better!
blog.georgeshakan.com
Principal Component Analysis (PCA) is a popular technique in machine learning for dimension reduction. It can be derived from Singular Value Decomposition (SVD), which we will discuss in this post. We will cover the math, an example in Python, and finally some intuition. The Math: SVD asserts that any $m \times d$ matrix ...
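To make the PCA-from-SVD connection concrete, here is a minimal NumPy sketch (not the linked post's code; the random data and the choice of two components are illustrative assumptions): center the data, take its SVD, and project onto the top right singular vectors.

```python
# Minimal sketch: PCA derived from the SVD of the centered data matrix.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))          # hypothetical data: 100 samples, 5 features

X_centered = X - X.mean(axis=0)        # PCA requires mean-centered data
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

k = 2                                  # number of principal components to keep (assumed)
components = Vt[:k]                    # rows of Vt are the principal directions
X_reduced = X_centered @ components.T  # project data onto the top-k directions

# Squared singular values, scaled by (n - 1), give the variance along each direction.
explained_variance = (S ** 2) / (X.shape[0] - 1)
print(X_reduced.shape, explained_variance[:k])
```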
www.opinosis-analytics.com
Machine learning model deployment is also known as "putting models into production".