www.paepper.com

michael-lewis.com
This is a short summary of some of the terminology used in machine learning, with an emphasis on neural networks. I've put it together primarily to help my own understanding, phrasing it largely in non-mathematical terms. As such, it may be of use to others who come from more of a programming background than a mathematical one.

sirupsen.com
[AI summary] The article provides an in-depth explanation of how to build a neural network from scratch, starting from the implementation of a simple average function and then introducing activation functions for non-linear tasks. It discusses the use of matrix operations, the importance of GPUs for acceleration, and the role of activation functions such as ReLU. The author also outlines next steps for further exploration, such as expanding the model, adding layers, and training on datasets like MNIST.

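As an illustration of the approach that summary describes (not the article's own code), here is a minimal sketch of a forward pass built from matrix operations plus a ReLU activation; the NumPy usage, layer sizes, and random weights are assumptions made for this example.

```python
import numpy as np

def relu(x):
    # ReLU zeroes out negative values; this is what gives the
    # network its non-linearity.
    return np.maximum(0.0, x)

def forward(x, w1, b1, w2, b2):
    # One hidden layer: matrix multiply, add bias, apply ReLU,
    # then a plain linear output layer.
    hidden = relu(x @ w1 + b1)
    return hidden @ w2 + b2

# Hypothetical sizes: a batch of 4 inputs with 2 features each,
# a hidden layer of 8 units, and a single output.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 2))
w1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
w2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
print(forward(x, w1, b1, w2, b2).shape)  # -> (4, 1)
```
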
www.nicktasios.nl
In the Latent Diffusion Series of blog posts, I'm going through all components needed to train a latent diffusion model to generate random digits from the MNIST dataset. In this first post, we will tr…

torch.ch
Torch is a scientific computing framework for LuaJIT.