Explore >> Select a destination

You are here: programmathically.com

www.khanna.law (1.4 parsecs away)
You want to train a deep neural network. You have the data. It's labeled and wrangled into a useful format. What do you do now?
www.v7labs.com (1.0 parsecs away)
A neural network activation function is a function applied to the output of a neuron. Learn about the different types of activation functions and how they work.
vankessel.io (1.0 parsecs away)
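The snippet above describes activation functions applied to a neuron's output. As a minimal illustration (not tied to any particular site or framework), here are three common ones applied to a scalar pre-activation value:

```python
import math

# Three common activation functions, applied to a neuron's scalar
# pre-activation output z (e.g., z = w . x + b). Illustrative sketch only.

def relu(z):
    # Rectified linear unit: passes positives through, zeroes out negatives.
    return max(0.0, z)

def sigmoid(z):
    # Squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    # Squashes any real input into the open interval (-1, 1).
    return math.tanh(z)

z = 0.5  # hypothetical pre-activation value
print(relu(z), sigmoid(z), tanh(z))
```

Each function is nonlinear, which is what lets stacked layers represent more than a single linear map.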
A blog for my thoughts. Mostly philosophy, math, and programming.
datadan.io (9.6 parsecs away)
Linear regression and gradient descent form the basis of many more complicated ML/AI techniques (e.g., deep learning models). They are therefore building blocks that every ML/AI engineer needs to understand.
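To make the "building block" claim concrete, here is a minimal sketch of linear regression fit by batch gradient descent on mean squared error; the data, learning rate, and step count are illustrative choices, not values from any of the sites above:

```python
# Fit y ~ w*x + b by batch gradient descent on mean squared error.
# Learning rate and step count are arbitrary illustrative choices.
def fit_linear(xs, ys, lr=0.01, steps=5000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2) w.r.t. w and b.
        dw = (2.0 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        db = (2.0 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        # Step against the gradient to reduce the error.
        w -= lr * dw
        b -= lr * db
    return w, b

# Points generated from y = 2x + 1; gradient descent should recover
# parameters close to w = 2, b = 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
w, b = fit_linear(xs, ys)
print(w, b)
```

The same loop of "compute gradient of a loss, step against it" is exactly what training a deep network does, just with many more parameters and a loss computed through many layers.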