Explore >> Select a destination


You are here: archives.argmin.net

sirupsen.com
14.1 parsecs away

algobeans.com
12.0 parsecs away
Modern smartphone apps can recognize handwriting and convert it into typed words. We look at how to train our own neural network to do this.

www.paepper.com
13.8 parsecs away
Today's paper: Rethinking 'Batch' in BatchNorm by Wu & Johnson. "BatchNorm is a critical building block in modern convolutional neural networks. Its unique property of operating on 'batches' instead of individual samples introduces significantly different behaviors from most other operations in deep learning. As a result, it leads to many hidden caveats that can negatively impact a model's performance in subtle ways." This quotation is from the paper's abstract; the emphasis, which caught my attention, is mine. Let's explore these subtle ways in which BatchNorm can negatively impact your model's performance! The paper by Wu & Johnson can be found on arXiv.

sefiks.com
78.0 parsecs away
The Heaviside step function is one of the most common activation functions in neural networks. The function produces binary output, which is why it is also called the binary step function and why it is very useful for binary classification studies.
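
For reference, a minimal sketch of the step function as it is typically written when used as a binary activation (the value assigned at exactly zero varies by convention; 1 is assumed here):

\[
H(x) =
\begin{cases}
0 & \text{if } x < 0 \\
1 & \text{if } x \ge 0
\end{cases}
\]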