Explore >> Select a destination


You are here: comsci.blog

www.paepper.com
0.9 parsecs away

When you have a big dataset and a complicated machine learning problem, chances are that training your model takes a couple of days, even on a modern GPU. However, the cycle of having a new idea, implementing it, and verifying it should be as quick as possible, so that you can test out new ideas efficiently. If you have to wait a whole week for each training run, iteration becomes very inefficient.
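
One common way to keep that cycle short (an illustration of the point, not code from the linked post) is to sanity-check a new idea by overfitting a tiny fixed subset of the data before committing to the full run: if the loss won't go to zero on 32 examples, the week-long run would have failed anyway. A minimal PyTorch sketch, where `model` and `dataset` are hypothetical stand-ins for your own:

```python
import torch
from torch.utils.data import DataLoader, Subset, TensorDataset

# Hypothetical stand-ins; swap in your real model and dataset.
model = torch.nn.Sequential(torch.nn.Linear(784, 128), torch.nn.ReLU(),
                            torch.nn.Linear(128, 10))
dataset = TensorDataset(torch.randn(1000, 784), torch.randint(0, 10, (1000,)))

# Overfit a tiny fixed subset first: a minutes-long check that the
# idea/implementation can learn at all before the multi-day run.
tiny = Subset(dataset, range(32))
loader = DataLoader(tiny, batch_size=32)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.CrossEntropyLoss()

for step in range(200):
    for x, y in loader:
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
print(f"tiny-subset loss after 200 steps: {loss.item():.4f}")  # should be near 0
```
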
vxlabs.com
1.0 parsecs away

I have recently become fascinated with (Variational) Autoencoders and with PyTorch. Kevin Frans has a beautiful blog post online explaining variational autoencoders, with examples in TensorFlow and, importantly, with cat pictures. Jaan Altosaar's blog post takes an even deeper look at VAEs from both the deep learning perspective and the perspective of graphical models. Both of these posts, as well as Diederik Kingma's original 2014 paper Auto-Encoding Variational Bayes, are more than worth your time.
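
For orientation, here is a minimal sketch of the pieces those posts cover: an encoder producing a mean and log-variance, the reparameterization trick, and the negative-ELBO loss from Kingma's Auto-Encoding Variational Bayes. The layer sizes are illustrative assumptions, not taken from the linked posts:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    """Minimal VAE: 784-dim inputs (e.g. flattened 28x28 images in [0,1])."""
    def __init__(self, x_dim=784, h_dim=400, z_dim=20):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)
        self.mu = nn.Linear(h_dim, z_dim)      # posterior mean
        self.logvar = nn.Linear(h_dim, z_dim)  # posterior log-variance
        self.dec1 = nn.Linear(z_dim, h_dim)
        self.dec2 = nn.Linear(h_dim, x_dim)

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        x_hat = torch.sigmoid(self.dec2(F.relu(self.dec1(z))))
        return x_hat, mu, logvar

def neg_elbo(x, x_hat, mu, logvar):
    # Reconstruction term + KL(q(z|x) || N(0, I)), as in Kingma & Welling 2014
    recon = F.binary_cross_entropy(x_hat, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl
```
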
www.nicktasios.nl
0.9 parsecs away

In the Latent Diffusion Series of blog posts, I'm going through all components needed to train a latent diffusion model to generate random digits from the MNIST dataset. In this first post, we will tr...
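
The excerpt cuts off there, but the core of any latent diffusion setup is a denoising objective in latent space. Below is a generic DDPM-style training step (forward noising plus a noise-prediction MSE loss); `encoder` and `denoiser` are hypothetical stand-ins, and this is a sketch of the general technique, not the post's actual code:

```python
import torch
import torch.nn.functional as F

T = 1000
betas = torch.linspace(1e-4, 0.02, T)           # linear noise schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)  # cumulative product of (1 - beta_t)

def diffusion_training_step(denoiser, encoder, x, opt):
    """One DDPM-style step in latent space.

    `encoder` maps images to latents; `denoiser` predicts the added
    noise eps from (z_t, t). Both are hypothetical stand-ins.
    """
    with torch.no_grad():
        z0 = encoder(x)                                 # clean latents
    t = torch.randint(0, T, (z0.shape[0],))
    eps = torch.randn_like(z0)
    ab = alphas_bar[t].view(-1, *([1] * (z0.dim() - 1)))
    z_t = ab.sqrt() * z0 + (1 - ab).sqrt() * eps        # forward noising q(z_t | z_0)
    loss = F.mse_loss(denoiser(z_t, t), eps)            # predict the noise
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```
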
blog.evjang.com
14.4 parsecs away

This tutorial will show you how to use normalizing flows like MAF, IAF, and Real-NVP to deform an isotropic 2D Gaussian into a complex cl...
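
To give a feel for that, here is a minimal sketch of the kind of affine coupling layer Real-NVP stacks to deform a 2D Gaussian. The network size is my own illustrative choice, not the tutorial's:

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Real-NVP affine coupling for 2D data: transform x2 conditioned on x1.

    Forward: y1 = x1, y2 = x2 * exp(s(x1)) + t(x1); log|det J| = s(x1).
    """
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2))  # outputs (s, t)

    def forward(self, x):
        x1, x2 = x[:, :1], x[:, 1:]
        s, t = self.net(x1).chunk(2, dim=1)
        y2 = x2 * torch.exp(s) + t
        return torch.cat([x1, y2], dim=1), s.squeeze(1)  # y, log|det J|

    def inverse(self, y):
        y1, y2 = y[:, :1], y[:, 1:]
        s, t = self.net(y1).chunk(2, dim=1)
        x2 = (y2 - t) * torch.exp(-s)
        return torch.cat([y1, x2], dim=1)

# Stacking couplings (alternating which half is transformed) and training by
# maximum likelihood under the N(0, I) base gives the flows in the tutorial.
z = torch.randn(256, 2)      # samples from the isotropic 2D Gaussian base
flow = AffineCoupling()
x, logdet = flow(z)          # deformed samples + log-det term for the likelihood
```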