yann.lecun.com
questionableengineering.com
John W Grun. Abstract: In this paper, a manually implemented LeNet-5 convolutional NN with an Adam optimizer written in NumPy will be presented. This paper will also cover a description of the data used...
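The abstract mentions a hand-written Adam optimizer in NumPy. As a rough sketch only (the paper's actual variable names and hyperparameter values are not shown in the snippet, so the defaults below are assumptions), an Adam-style update for a single parameter array looks roughly like:

import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam update; hyperparameter defaults are assumed, not taken from the paper.
    m = beta1 * m + (1 - beta1) * grad          # running mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2     # running mean of squared gradients
    m_hat = m / (1 - beta1 ** t)                # bias-corrected mean (t counts from 1)
    v_hat = v / (1 - beta2 ** t)                # bias-corrected variance
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v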
studywolf.wordpress.com
Back in September I had a review article published in Science Robotics, discussing new work from Abadia et al., 2021, titled "A cerebellar-based solution to the nondeterministic time delay problem in robotic control". In my review I talk about the parallels between the brain-inspired approach Abadia et al. used to create a neural circuit...
www.wjst.de
But let your communication be Yea, yea; Nay, nay. For whatsoever is more than these cometh of evil.
teddykoker.com
A few posts back I wrote about a common parameter optimization method known as Gradient Ascent. In this post we will see how a similar method can be used to create a model that can classify data. This time, instead of using gradient ascent to maximize a reward function, we will use gradient descent to minimize a cost function. Let's start by importing all the libraries we need:
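The snippet cuts off right before the post's own code. As a sketch of the idea under assumptions (a logistic-regression cost and plain batch gradient descent; the post's actual model and variable names are not shown here), the method could look like:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, lr=0.1, steps=1000):
    # Minimize the logistic-regression cross-entropy cost by stepping against its gradient.
    w = np.zeros(X.shape[1])            # one weight per input feature
    for _ in range(steps):
        p = sigmoid(X @ w)              # predicted class probabilities
        grad = X.T @ (p - y) / len(y)   # gradient of the cost with respect to w
        w -= lr * grad                  # move downhill
    return w

# Toy usage: four points, two features (the second acts as a bias term), binary labels.
X = np.array([[0.0, 1.0], [1.0, 1.0], [2.0, 1.0], [3.0, 1.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = gradient_descent(X, y)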