senthil.learntosolveit.com
sander.ai

This is an addendum to my post about typicality, where I try to quantify flawed intuitions about high-dimensional distributions.
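One such counterintuitive fact, which the post's typicality argument rests on, can be checked numerically: samples from a d-dimensional standard Gaussian concentrate near a thin shell of radius √d, far from the mode at the origin. A minimal NumPy sketch (the sample count of 1000 is an arbitrary choice for illustration):

```python
import numpy as np

# Samples from a d-dimensional standard Gaussian have norm close to sqrt(d):
# the mode at the origin is *not* where typical samples live.
rng = np.random.default_rng(0)
for d in (2, 100, 10_000):
    x = rng.standard_normal((1000, d))   # 1000 samples in d dimensions
    norms = np.linalg.norm(x, axis=1)    # distance of each sample from the origin
    print(d, norms.mean() / np.sqrt(d))  # ratio approaches 1 as d grows
```

As d grows, the relative spread of the norms shrinks, so essentially all probability mass sits in the shell.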
teddykoker.com

A few posts back I wrote about a common parameter optimization method known as gradient ascent. In this post we will see how a similar method can be used to create a model that can classify data. This time, instead of using gradient ascent to maximize a reward function, we will use gradient descent to minimize a cost function. Let's start by importing all the libraries we need:
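The excerpt cuts off before the code, but the idea it describes can be sketched as follows, assuming (as such tutorials typically do) a logistic-regression classifier whose cross-entropy cost is minimized by batch gradient descent; the toy data and learning rate below are illustrative choices, not the post's own:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, lr=0.1, steps=1000):
    """Minimize the mean cross-entropy cost of logistic regression."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)                 # predicted probabilities
        grad = X.T @ (p - y) / len(y)      # gradient of the cost w.r.t. w
        w -= lr * grad                     # step *against* the gradient to descend
    return w

# Toy linearly separable data; the second column is a constant bias feature.
X = np.array([[-2.0, 1.0], [-1.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = gradient_descent(X, y)
preds = (sigmoid(X @ w) > 0.5).astype(int)  # → [0, 0, 1, 1]
```

The only change from gradient ascent is the sign of the update: subtracting the gradient walks downhill on the cost instead of uphill on a reward.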
e2eml.school
programminghistorian.org

[AI summary] The text provides an in-depth explanation of using neural networks for image classification, focusing on the Teachable Machine and ml5.js tools. It walks through creating a model, testing it with an image, and displaying results on a canvas. The text also discusses the limitations of the model, the importance of training data, and suggests further resources for learning machine learning.