evjang.com (you are here)

dennybritz.com
Deep Learning is such a fast-moving field, and the huge number of research papers and ideas can be overwhelming.

iclr-blogposts.github.io
Home to the 2024 ICLR Blogposts track.

gwern.net
On GPT-3: meta-learning, scaling, implications, and deep theory. The scaling hypothesis: neural nets absorb data & compute, generalizing and becoming more Bayesian as problems get harder, manifesting new abilities even at trivial-by-global-standards-scale. The deep learning revolution has begun as foretold.

neptune.ai
Generative modeling is a type of unsupervised learning. In supervised learning, a deep learning model learns to map inputs to outputs; in each iteration, the loss is calculated and the model is optimised using backpropagation. In unsupervised learning, we don't feed the target variables to the deep learning model like...
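The supervised loop that excerpt describes can be sketched in a few lines of plain Python: map an input to an output, compute a loss against the target, and update the parameter with the gradient (the quantity backpropagation computes in a deep network). The toy dataset and learning rate below are illustrative assumptions, not from the linked article.

```python
# Toy dataset: targets follow y = 2x, which the model should recover.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0    # single learnable parameter of the model y_hat = w * x
lr = 0.05  # learning rate (hypothetical choice)

for epoch in range(200):
    for x, y in data:
        y_hat = w * x                # forward pass: map input to output
        loss = (y_hat - y) ** 2      # squared-error loss for this example
        grad = 2 * (y_hat - y) * x   # d(loss)/dw, what backprop computes
        w -= lr * grad               # gradient-descent update

print(round(w, 3))  # converges toward 2.0
```

In unsupervised settings, by contrast, no target `y` is supplied; the loss must be constructed from the inputs alone (e.g. a reconstruction or likelihood objective).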