dennybritz.com
michael-lewis.com

This is a short summary of some of the terminology used in machine learning, with an emphasis on neural networks. I've put it together primarily to help my own understanding, phrasing it largely in non-mathematical terms. As such it may be of use to others who come from more of a programming than a mathematical background.
neuralnetworksanddeeplearning.com

[AI summary] The provided text discusses the implementation of a neural network using Theano, focusing on the structure of the network, its layers (FullyConnectedLayer, ConvPoolLayer, SoftmaxLayer), and the training process using stochastic gradient descent (SGD). It also references a paper by C. R. Shu et al. on the application of deep learning in medical image segmentation, particularly in brain tumor detection, and highlights the significance of such advancements in the field of medical imaging and diagnostics.
bytepawn.com

I will show how to solve the standard A x = b matrix equation with PyTorch. This is a good toy problem for showing some of the guts of the framework without involving neural networks.
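The Ax = b entry above lends itself to a small illustration. This is a minimal sketch of the same idea, written in NumPy for portability (the linked post itself uses PyTorch tensors and autograd): solve A x = b by gradient descent on the squared residual, then compare against the direct solver. The matrix and vector values here are made-up example data, not taken from the post.

```python
import numpy as np

# A small, well-conditioned example system A x = b (values invented for this sketch).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

# Minimize f(x) = ||A x - b||^2 by plain gradient descent.
# The gradient of f is 2 * A^T (A x - b).
x = np.zeros(2)
lr = 0.05
for _ in range(2000):
    residual = A @ x - b
    x -= lr * 2.0 * A.T @ residual

# The iterate should agree with the direct solve.
direct = np.linalg.solve(A, b)
print(x)       # close to direct = [2., 3.]
print(direct)
```

In the PyTorch version of this exercise one would mark `x` with `requires_grad=True`, compute the loss, and let autograd supply the gradient instead of writing `2 * A.T @ residual` by hand; the update step is otherwise the same.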
www.nicktasios.nl

In the Latent Diffusion Series of blog posts, I'm going through all components needed to train a latent diffusion model to generate random digits from the MNIST dataset. In the third, and last, post,