mccormickml.com

bdtechtalks.com
The transformer model has become one of the main highlights of advances in deep learning and deep neural networks.

www.singlelunch.com
This is the blog version of a talk of mine on embedding methods. It's the main slides and what I would say in the talk. Intended audience: anyone interested in embedding methods. I don'...

towardsml.wordpress.com
Unless you have been out of touch with the Deep Learning world, chances are that you have heard about BERT - it has been the talk of the town for the past year. At the end of 2018, researchers at Google AI Language open-sourced a new technique for Natural Language Processing (NLP) called BERT...

jaketae.github.io
Recently, a friend recommended a book to me, Deep Learning with Python by Francois Chollet. As an eager learner just starting to fiddle with the Keras API, I decided it was a good starting point. I have just finished the first section of Part 2 on Convolutional Neural Networks and image processing. My impression so far is that the book is more focused on code than math. The apparent advantage of this approach is that it shows readers how to build neural networks very transparently. It's also a good introduction to many neural network models, such as CNNs or LSTMs. On the flip side, it might leave some readers wondering why these models work, concretely and mathematically. This point notwithstanding, I've been enjoying the book very much so far, and this post is...