You are here: github.com

blog.fastforwardlabs.com (11.3 parsecs away)

The common approach in machine learning is to train and optimize one task at a time. In contrast, multitask learning (MTL) trains related tasks in parallel, using a shared representation. One advantage of MTL is improved generalization: signals from related tasks keep a model from overfitting to any single task while it is still learning to produce good results on each. MTL is an approach, not a particular algorithm.
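A minimal sketch of the shared-representation idea described above, in plain numpy. All dimensions and the two example tasks (a 1-output regression head and a 3-class classification head) are hypothetical choices for illustration, not taken from the linked post:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 8 examples, 4 input features, 16 shared features.
n, d_in, d_shared = 8, 4, 16

# One shared layer feeds both tasks; its weights receive gradient
# signal from every task's loss during training.
W_shared = rng.normal(size=(d_in, d_shared))

# Task-specific heads: task A regresses one value,
# task B produces logits for 3 classes.
W_task_a = rng.normal(size=(d_shared, 1))
W_task_b = rng.normal(size=(d_shared, 3))

def forward(x):
    h = np.tanh(x @ W_shared)          # shared representation
    return h @ W_task_a, h @ W_task_b  # per-task outputs

x = rng.normal(size=(n, d_in))
y_a, y_b = forward(x)
# Training would sum the per-task losses into one joint loss, so both
# tasks shape W_shared -- that coupling is the regularizing effect.
```

The key design point is that only the heads are task-specific; everything upstream is shared, which is what makes the related tasks act as a regularizer on each other.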
www.jeremymorgan.com (10.0 parsecs away)

Want to learn about PyTorch? Of course you do. This tutorial covers PyTorch basics, creating a simple neural network, and applying it to classify handwritten digits.
www.ntentional.com (10.6 parsecs away)

Highlights from my favorite deep-learning efficiency papers at ICLR 2020.
vxlabs.com (56.4 parsecs away)

I have recently become fascinated with (variational) autoencoders and with PyTorch. Kevin Frans has a beautiful blog post online explaining variational autoencoders, with examples in TensorFlow and, importantly, with cat pictures. Jaan Altosaar's blog post takes an even deeper look at VAEs, from both the deep learning perspective and the perspective of graphical models. Both of these posts, as well as Diederik Kingma's original 2014 paper, Auto-Encoding Variational Bayes, are more than worth your time.
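For a taste of what those posts cover: the piece of a VAE that usually needs the most explanation is the reparameterization trick from Kingma's paper. A minimal numpy sketch (the shapes and the diagonal-Gaussian assumption are illustrative choices, not code from any of the linked posts):

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    # Sample z ~ N(mu, sigma^2) as z = mu + sigma * eps with
    # eps ~ N(0, 1), so z stays differentiable in mu and log_var;
    # this is what lets gradients flow through the sampling step.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

# A batch of 4 latent codes of dimension 2, as an encoder might emit.
mu = np.zeros((4, 2))
log_var = np.zeros((4, 2))
z = reparameterize(mu, log_var)
```

In a full VAE these `mu` and `log_var` tensors come from the encoder, and `z` is fed to the decoder; the trick is that the randomness lives only in `eps`.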