You are here: lukealexdavis.co.uk

polukhin.tech
11.1 parsecs away

As the field of Deep Learning continues to grow, the demand for efficient, lightweight neural networks becomes increasingly important. In this blog post, we will explore six lightweight neural network architectures.
coornail.net
10.8 parsecs away

Neural networks are a powerful tool in machine learning that can be trained to perform a wide range of tasks, from image classification to natural language processing. In this blog post, we'll explore how to teach a neural network to add two numbers together. You can also treat this article as a TensorFlow tutorial.
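A minimal sketch of the idea (in plain NumPy rather than TensorFlow, so every moving part is visible; the variable names and hyperparameters are my own, not the linked post's): a single linear neuron trained with gradient descent can learn addition, with its weights converging to [1, 1].

```python
import numpy as np

# Toy "network": one linear neuron trained to compute x1 + x2.
rng = np.random.default_rng(0)
X = rng.uniform(-10, 10, size=(1000, 2))   # pairs of numbers
y = X.sum(axis=1, keepdims=True)           # targets: their sums

w = np.zeros((2, 1))                       # weights, ideally -> [1, 1]
b = 0.0                                    # bias, ideally -> 0
lr = 0.01
for _ in range(2000):                      # plain gradient descent on MSE
    err = X @ w + b - y
    w -= lr * (X.T @ err) / len(X)
    b -= lr * err.mean()

print(w.ravel(), b)                             # weights close to [1, 1]
print((np.array([[3.0, 4.0]]) @ w + b).item())  # close to 7
```

The same fit could be done in one line with a least-squares solver; spelling out the gradient-descent loop mirrors how a framework like TensorFlow would actually train the layer.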
sander.ai
10.3 parsecs away

Slides for my talk at the Deep Learning London meetup.
www.paepper.com
72.2 parsecs away

Today's paper: Rethinking "Batch" in BatchNorm by Wu & Johnson. "BatchNorm is a critical building block in modern convolutional neural networks. Its unique property of operating on 'batches' instead of individual samples introduces significantly different behaviors from most other operations in deep learning. As a result, it leads to many hidden caveats that can negatively impact model's performance in subtle ways." This quote from the paper's abstract (emphasis mine) caught my attention. Let's explore these subtle ways in which BatchNorm can negatively impact your model's performance! Wu & Johnson's paper can be found on arXiv.
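To make the "batches instead of individual samples" point concrete, here is a small NumPy sketch (my own illustration, not code from the paper or the post): in training mode, BatchNorm normalizes with the statistics of the current batch, so the same input produces different activations depending on which other samples it is batched with.

```python
import numpy as np

def batchnorm_train(x, eps=1e-5):
    # Training-mode BatchNorm (gamma=1, beta=0): normalize each feature
    # with the mean/variance of the *current batch*.
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mu) / np.sqrt(var + eps)

rng = np.random.default_rng(1)
sample = rng.normal(size=(1, 4))                      # one fixed sample
batch_a = np.vstack([sample, rng.normal(0, 1, (31, 4))])  # "easy" batch-mates
batch_b = np.vstack([sample, rng.normal(5, 3, (31, 4))])  # shifted batch-mates

out_a = batchnorm_train(batch_a)[0]
out_b = batchnorm_train(batch_b)[0]

# The same input yields different activations depending on its batch-mates,
# unlike per-sample operations such as ReLU or LayerNorm.
print(np.abs(out_a - out_b).max())
```

This batch dependence is exactly why frameworks keep separate running statistics for inference, and why the train/eval mismatch is one of the hidden caveats the paper dissects.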