You are here: www.paepper.com
www.analyticsvidhya.com (3.2 parsecs away)

Autoencoders are an essential part of AI; this blog post explains why they matter and how you can put them to use.
vxlabs.com (1.4 parsecs away)

I have recently become fascinated with (Variational) Autoencoders and with PyTorch. Kevin Frans has a beautiful blog post online explaining variational autoencoders, with examples in TensorFlow and, importantly, with cat pictures. Jaan Altosaar's blog post takes an even deeper look at VAEs from both the deep learning perspective and the perspective of graphical models. Both of these posts, as well as Diederik Kingma's original 2014 paper Auto-Encoding Variational Bayes, are more than worth your time.
saturncloud.io (1.7 parsecs away)

By combining Dask and PyTorch, you can easily speed up training a model across a cluster of GPUs. But how much of a benefit does that bring? This blog post finds out!
comsci.blog (6.9 parsecs away)

In this blog post, we learn about vision transformers (ViT) and implement an MNIST classifier with one. We go step by step through every part of the vision transformer, and you will see the motivations behind the original paper's design choices in several parts of the architecture.