Explore


You are here: sefiks.com

serengil.wordpress.com
4.8 parsecs away
Sigmoid function (aka logistic function) is mostly picked as the activation function in neural networks because its derivative is easy to demonstrate. It produces output on the scale [0, 1], whereas its input is meaningful between [-5, +5]; inputs outside this range produce nearly the same output. In this post, we'll walk through the proof of the derivative calculation. Sigmoid function is...
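
The identity the post works toward is the standard one, sketched here for context rather than quoted from the article:

$$\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad \frac{d\sigma}{dx} = \frac{e^{-x}}{\left(1 + e^{-x}\right)^{2}} = \sigma(x)\,\bigl(1 - \sigma(x)\bigr)$$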

www.swyx.io
6.5 parsecs away
That one time we tried to emulate our brains with computer chips

adl1995.github.io
6.2 parsecs away

saturncloud.io
48.2 parsecs away
By combining Dask and PyTorch you can easily speed up training a model across a cluster of GPUs. But how much of a benefit does that bring? This blog post finds out!
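
The linked post benchmarks coordinated distributed training on a GPU cluster; the sketch below is a much lighter-weight illustration of the Dask half of that pairing, fanning independent PyTorch training runs out to local Dask workers. The toy model, data, and learning rates are placeholders, not the post's setup.

```python
# Minimal sketch: dispatch independent PyTorch training runs to Dask workers.
# This is NOT the blog post's DDP-on-a-GPU-cluster setup; the model and data
# below are synthetic placeholders used only to show the dispatch pattern.
import torch
from torch import nn
from dask.distributed import Client, LocalCluster


def train_one(lr: float, steps: int = 200) -> float:
    """Train a tiny regression model and return its final loss."""
    torch.manual_seed(0)
    x = torch.randn(512, 8)
    y = x.sum(dim=1, keepdim=True)  # synthetic target: sum of the features
    model = nn.Linear(8, 1)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()


if __name__ == "__main__":
    # A local cluster stands in for the real GPU cluster discussed in the post.
    cluster = LocalCluster(n_workers=2, threads_per_worker=1)
    client = Client(cluster)
    # Each future is an independent training job running on some worker.
    futures = [client.submit(train_one, lr) for lr in (0.01, 0.05, 0.1)]
    print(client.gather(futures))  # final losses, one per learning rate
    client.close()
    cluster.close()
```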