Explore


You are here

tomhume.org
coornail.net
3.6 parsecs away

Neural networks are a powerful tool in machine learning that can be trained to perform a wide range of tasks, from image classification to natural language processing. In this blog post, we'll explore how to teach a neural network to add two numbers together. You can also read this article as a TensorFlow tutorial.
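
That post's task is easy to sketch: generate random pairs, label each with its sum, and fit a tiny dense network. A minimal Keras version follows; the layer sizes, hyperparameters, and data ranges are assumptions of mine, not the post's actual code.

import numpy as np
import tensorflow as tf

# Hypothetical training data: random pairs labelled with their sums.
rng = np.random.default_rng(0)
x_train = rng.uniform(0, 10, size=(10_000, 2)).astype("float32")
y_train = x_train.sum(axis=1, keepdims=True)

# A tiny dense network; even a single linear layer can learn addition exactly.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x_train, y_train, epochs=10, batch_size=32, verbose=0)

print(model.predict(np.array([[3.0, 4.0]], dtype="float32")))  # ~[[7.0]]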
kpzhang93.github.io
5.1 parsecs away

Face detection and alignment in unconstrained environment are challenging due to various poses, illuminations and occlusions. Recent studies show that deep learning approaches can achieve impressive performance on these two tasks. In this paper, we propose a deep cascaded multi-task framework which exploits the inherent correlation between detection and alignment to boost up their performance. In particular, our framework leverages a cascaded architecture with three stages of carefully designed deep convolutional networks to predict face and landmark location in a coarse-to-fine manner. In addition, we propose a new online hard sample mining strategy that further improves the performance in practice. Our method achieves superior accuracy over the state-of-the-art ...
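
The "online hard sample mining" that abstract mentions is simple to illustrate: compute an unreduced per-sample loss over the mini-batch, keep only the hardest fraction (the MTCNN paper keeps the top 70%), and backpropagate through those alone. A hypothetical PyTorch-style sketch, not the paper's code:

import torch

def hard_sample_mining_loss(per_sample_loss: torch.Tensor, keep_ratio: float = 0.7):
    # Average the loss over only the hardest samples in the mini-batch.
    # per_sample_loss: 1-D tensor of unreduced losses, one per sample.
    n_keep = max(1, int(keep_ratio * per_sample_loss.numel()))
    hard_losses, _ = torch.topk(per_sample_loss, n_keep)  # largest losses
    return hard_losses.mean()

# Usage: compute an unreduced loss, then mine the hard samples.
logits = torch.randn(32, 2, requires_grad=True)
labels = torch.randint(0, 2, (32,))
losses = torch.nn.functional.cross_entropy(logits, labels, reduction="none")
hard_sample_mining_loss(losses).backward()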
polukhin.tech
2.1 parsecs away

As the field of Deep Learning continues to grow, the demand for efficient and lightweight neural networks becomes increasingly important. In this blog post, we will explore six lightweight neural network architectures.
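
Most such lightweight designs (MobileNet being the classic example) lean on the depthwise separable convolution, which factors a standard convolution into a cheap per-channel spatial filter followed by a 1x1 channel mix. A minimal PyTorch sketch of that building block, with names of my own choosing:

import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    # Depthwise + pointwise convolution, the core block of MobileNet-style nets.
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        # Spatial filtering, one filter per input channel (groups=in_ch).
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3, stride=stride,
                                   padding=1, groups=in_ch, bias=False)
        # 1x1 convolution mixes information across channels.
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))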
vxlabs.com
14.9 parsecs away

I have recently become fascinated with (Variational) Autoencoders and with PyTorch. Kevin Frans has a beautiful blog post online explaining variational autoencoders, with examples in TensorFlow and, importantly, with cat pictures. Jaan Altosaar's blog post takes an even deeper look at VAEs from both the deep learning perspective and the perspective of graphical models. Both of these posts, as well as Diederik Kingma's original 2014 paper Auto-Encoding Variational Bayes, are more than worth your time.
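
For orientation, here is a compressed PyTorch sketch of the model those posts build up to: an encoder that outputs a mean and log-variance, the reparameterisation trick, and the reconstruction-plus-KL loss from Kingma and Welling's paper. Layer sizes here are arbitrary choices, not anything from the posts.

import torch
import torch.nn as nn

class VAE(nn.Module):
    # Minimal variational autoencoder for flat inputs (e.g. 28x28 images).
    def __init__(self, in_dim=784, latent_dim=20):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 400), nn.ReLU())
        self.to_mu = nn.Linear(400, latent_dim)
        self.to_logvar = nn.Linear(400, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 400), nn.ReLU(),
            nn.Linear(400, in_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterisation trick: sample z while keeping gradients.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction term plus KL divergence to the unit Gaussian prior.
    bce = nn.functional.binary_cross_entropy(recon, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld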