magenta.tensorflow.org

eigenfoo.xyz
My current project involves working with deep autoregressive models: a class of remarkable neural networks that aren't usually seen on a first pass through deep learning. These notes are a quick write-up of my reading and research: I assume basic familiarity with deep learning, and aim to highlight general trends and similarities across autoregressive models, instead of commenting on individual architectures. tl;dr: Deep autoregressive models are sequence models, yet feed-forward (i.e. not recurrent); generative models, yet supervised. They are a compelling alternative to RNNs for sequential data, and to GANs for generation tasks.
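To make the "feed-forward, yet sequential" point concrete, here is a minimal sketch (my own illustration, not code from the linked notes) of the autoregressive factorization p(x) = ∏ₜ p(xₜ | x₍₍t₎₎): a single causally masked linear layer, so each position is predicted only from the positions before it, and all positions become supervised targets at once.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 8                              # sequence length
x = rng.normal(size=(T, 1))        # one toy sequence

# Strictly lower-triangular mask: output t may only see inputs 0..t-1.
mask = np.tril(np.ones((T, T)), k=-1)
W = rng.normal(size=(T, T)) * 0.1  # dense weights, masked below

# One masked "layer": every position is predicted from its prefix
# in a single feed-forward pass, with no recurrence.
pred = (W * mask) @ x

# Teacher forcing: all T positions are supervised regression targets
# simultaneously, which is why these models train like ordinary
# supervised networks.
loss = np.mean((pred - x) ** 2)
print(loss)
```

Note that row 0 of the mask is all zeros, so the first element is predicted from nothing, exactly as the chain rule of probability requires.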
swethatanamala.github.io
The authors developed a straightforward application of the Long Short-Term Memory (LSTM) architecture that translates English to French.
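As a rough sketch of that encoder-decoder idea (my own Keras code with assumed vocabulary and dimension sizes, not code from the summarized paper): one LSTM reads the source sentence into a fixed-size state, and a second LSTM, initialized with that state, emits the target sentence token by token.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

src_vocab, tgt_vocab, dim = 10_000, 10_000, 256  # illustrative sizes

# Encoder: read the English sentence, keep only the final LSTM state.
enc_in = layers.Input(shape=(None,))
enc_emb = layers.Embedding(src_vocab, dim)(enc_in)
_, h, c = layers.LSTM(dim, return_state=True)(enc_emb)

# Decoder: generate French, initialized with the encoder's state.
dec_in = layers.Input(shape=(None,))
dec_emb = layers.Embedding(tgt_vocab, dim)(dec_in)
dec_seq = layers.LSTM(dim, return_sequences=True)(dec_emb, initial_state=[h, c])
logits = layers.Dense(tgt_vocab)(dec_seq)

model = Model([enc_in, dec_in], logits)  # trained with teacher forcing
model.compile("adam",
              tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
```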
coornail.net
Neural networks are a powerful tool in machine learning that can be trained to perform a wide range of tasks, from image classification to natural language processing. In this blog post, we'll explore how to teach a neural network to add together two numbers. You can also think of this article as a TensorFlow tutorial.
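Along the lines the post describes, a minimal sketch (my own code, assuming a plain Keras regression setup rather than the post's exact script) is: sample random pairs, use their sum as the label, and fit a small dense network with a mean-squared-error loss.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(10_000, 2)).astype("float32")
y = X.sum(axis=1)                      # label: the sum of the two inputs

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),          # single linear output for the sum
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=64, verbose=0)

print(model.predict(np.array([[3.0, 4.0]], dtype="float32")))  # close to 7
```

Since addition is a linear function, a single Dense(1) layer with no hidden units could fit it exactly; the hidden layer is only there to exercise the usual training loop.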
emptybranchesonthefamilytree.com