windowsontheory.org
Previous post: ML theory with bad drawings. Next post: What do neural networks learn and when do they learn it; see also all seminar posts and the course webpage. Lecture video (starts on slide 2 since I hit the record button 30 seconds too late - sorry!), slides (pdf), slides (PowerPoint with ink and animation)...
www.3blue1brown.com
An overview of gradient descent in the context of neural networks. This is a method used widely throughout machine learning for optimizing how a computer performs on certain tasks.
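As a rough sketch of the method that overview covers (not code from 3blue1brown), the loop below minimizes a one-variable function by repeatedly stepping against its gradient; the function, learning rate, and starting point are all illustrative assumptions.

```python
# Minimal gradient descent sketch: minimize f(x) = (x - 3)^2.
# The function, learning rate, and starting point are arbitrary
# illustrative choices, not taken from any of the linked pages.

def f(x):
    return (x - 3.0) ** 2

def grad_f(x):
    # Derivative computed by hand: d/dx (x - 3)^2 = 2 * (x - 3).
    return 2.0 * (x - 3.0)

x = 0.0             # starting point
learning_rate = 0.1

for step in range(100):
    x -= learning_rate * grad_f(x)  # move against the gradient

print(x)  # converges toward the minimizer x = 3
```

The same idea scales up to neural networks, where the gradient is taken with respect to all of the network's weights at once.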
datadan.io
Linear regression and gradient descent are techniques that form the basis of many other, more complicated ML/AI techniques (e.g., deep learning models). They are thus building blocks that all ML/AI engineers need to understand.
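To make the link between the two building blocks concrete, here is a minimal sketch (not taken from datadan.io) of fitting a one-feature linear regression by gradient descent on mean squared error; the synthetic data, learning rate, and step count are illustrative assumptions.

```python
import numpy as np

# Sketch: fit y ~ w * x + b by gradient descent on mean squared error.
# The synthetic data, learning rate, and step count are illustrative
# assumptions, not drawn from the linked article.

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * x + 0.5 + rng.normal(scale=0.1, size=100)  # true w = 2.0, b = 0.5

w, b = 0.0, 0.0
lr = 0.1

for _ in range(1000):
    y_pred = w * x + b
    error = y_pred - y
    # Gradients of MSE = mean(error^2) with respect to w and b.
    grad_w = 2.0 * np.mean(error * x)
    grad_b = 2.0 * np.mean(error)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should end up close to 2.0 and 0.5
```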
codeincomplete.com
Personal website of Jake Gordon.