Explore >> Select a destination


You are here: blog.mecheye.net

probablydance.com (4.6 parsecs away)
The title of this blog post is obvious to any game programmer, but I notice that people outside of games often write clumsy code because they don't know how transform matrices work. Especially when people do some simple 2D rendering code, like if you just want to quickly visualize some data in an HTML canvas...

robotchinwag.com (3.9 parsecs away)
Deriving the gradients for the backward pass for matrix multiplication using tensor calculus

jeskin.net (3.7 parsecs away)
[AI summary] This blog post explains the concept of model, view, and projection matrices in 3D graphics rendering, detailing their roles in transforming 3D objects to 2D screen space and providing examples of their implementation in OpenGL ES.

blog.owulveryck.info (20.7 parsecs away)
You may know how enthusiastic I am about machine learning. A while ago I discovered recurrent neural networks. I have read that this 'tool' allows you to predict the future! Is this a kind of magic? I have read a lot about the 'unreasonable effectiveness' of this mechanism. The literature that gives deep explanations exists and is excellent. There is also a plethora of examples, but most of them use Python and a computation framework. To fully understand how things work (as I am not a data scientist), I needed to write my own tool 'from scratch'. That is what this post is about: a more-or-less 'from scratch' implementation of an RNN in Go that can be applied to a lot of examples.