You are here: proceedings.neurips.cc

automl.github.io (5.0 parsecs away)

tomhume.org (11.8 parsecs away)

I don't remember how I came across it, but this is one of the most exciting papers I've read recently. The authors train a neural network to predict the next sample in a sequence of MNIST digits presented in digit order. The interesting part is that when they include a proxy for energy usage in the loss function (i.e., train the network to be more energy-efficient), the resulting network seems to exhibit the characteristics of predictive coding: some units appear responsible for predictions, others for encoding prediction error.
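A minimal sketch of the kind of objective described above, assuming an L1 penalty on hidden activations as the energy proxy; the paper's actual architecture, energy proxy, and weighting may well differ, and all names and the penalty weight here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weights for a one-hidden-layer next-digit predictor (28*28 -> 64 -> 10).
W1 = rng.normal(0, 0.05, (28 * 28, 64))
W2 = rng.normal(0, 0.05, (64, 10))

def forward(x):
    """Return class logits and hidden activations for a batch of flattened images."""
    h = np.maximum(x @ W1, 0.0)  # ReLU hidden layer
    return h @ W2, h

def loss_with_energy(x, target, lam=0.01):
    """Cross-entropy on the next-digit prediction plus an activation-energy term."""
    logits, h = forward(x)
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    ce = -log_probs[np.arange(len(target)), target].mean()
    energy = np.abs(h).mean()  # L1 activation magnitude as the energy proxy
    return ce + lam * energy

x = rng.normal(size=(8, 28 * 28))      # stand-in for a batch of MNIST frames
target = rng.integers(0, 10, size=8)   # the "next digit" label for each frame
print(loss_with_energy(x, target))
```

The intuition for why this could induce predictive-coding-like structure: the energy term pressures the network to keep activations sparse, so it pays to represent only what the prediction gets wrong rather than re-encoding the whole input at every step.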
resources.paperdigest.org (4.8 parsecs away)

Download NIPS-2018-Paper-Digests.pdf, highlights of all NIPS 2018 papers. The Conference on Neural Information Processing Systems (NIPS) is one of the top machine learning conferences in the world. In 2018 it was held in Montreal, Canada. There were ~4,800 paper submissions, of which 1009...
jaketae.github.io (27.6 parsecs away)

Recently, a friend recommended a book to me: Deep Learning with Python by Francois Chollet. As an eager learner just starting to fiddle with the Keras API, I decided it was a good starting point. I have just finished the first section of Part 2, on convolutional neural networks and image processing. My impression so far is that the book focuses more on code than on math. The advantage of this approach is that it shows readers how to build neural networks very transparently, and it offers a good introduction to many neural network models, such as CNNs and LSTMs. On the flip side, it may leave some readers wondering why these models work, concretely and mathematically. This point notwithstanding, I've been enjoying the book very much so far, and this post is...