programmathically.com
You want to train a deep neural network. You have the data. It's labeled and wrangled into a useful format. What do you do now?
www.v7labs.com
A neural network activation function is a function that is applied to the output of a neuron. Learn about different types of activation functions and how they work.
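The snippet's definition can be illustrated with a minimal sketch: a single neuron computes a weighted sum of its inputs, and an activation function is applied to that pre-activation output. The functions shown (ReLU, sigmoid, tanh) are standard choices; the input values, weights, and bias below are illustrative, not taken from the linked article.

```python
import numpy as np

# Common activation functions, applied elementwise to a neuron's
# pre-activation output z = w.x + b.

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

# A single neuron: weighted sum of inputs plus bias, then the activation.
x = np.array([0.5, -1.0, 2.0])   # inputs (illustrative values)
w = np.array([0.4, 0.3, -0.2])   # weights (illustrative values)
b = 0.1                          # bias

z = w @ x + b                    # pre-activation output: -0.4
print(relu(z), sigmoid(z), tanh(z))
```

Each activation shapes the neuron's response differently: ReLU zeroes out negative pre-activations, sigmoid squashes into (0, 1), and tanh into (-1, 1).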
vankessel.io
A blog for my thoughts. Mostly philosophy, math, and programming.
datadan.io
Linear regression and gradient descent are techniques that form the basis of many other, more complicated ML/AI techniques (e.g., deep learning models). They are thus building blocks that all ML/AI engineers need to understand.
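The pairing the snippet describes can be sketched in a few lines: fit a one-variable linear model by gradient descent on the mean squared error. The synthetic data, learning rate, and iteration count are illustrative assumptions, not from the linked article.

```python
import numpy as np

# Fit y ~ w*x + b by gradient descent on mean squared error (MSE).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 100)
y = 3.0 * x + 2.0 + rng.normal(0.0, 0.1, 100)  # true w=3, b=2, small noise

w, b = 0.0, 0.0   # initial parameters
lr = 0.1          # learning rate (illustrative)

for _ in range(500):
    y_hat = w * x + b
    err = y_hat - y
    grad_w = 2.0 * np.mean(err * x)   # dMSE/dw
    grad_b = 2.0 * np.mean(err)       # dMSE/db
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should land near the true values w=3, b=2
```

Because the MSE loss for linear regression is convex, this loop converges to (approximately) the least-squares solution; deep learning applies the same update rule to non-convex losses over millions of parameters.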