windowsontheory.org
(Updated and expanded 12/17/2021) I am teaching deep learning this week in Harvard's CS 182 (Artificial Intelligence) course. As I was preparing the back-propagation lecture, Preetum Nakkiran told me about Andrej Karpathy's awesome micrograd package, which implements automatic differentiation for scalar variables in very few lines of code. I couldn't resist using this to show how...
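A minimal sketch of the micrograd-style idea the post describes: a scalar `Value` records how it was computed and propagates gradients back through that graph. The class and method names here are an illustrative assumption, not micrograd's exact API.

```python
# Sketch of micrograd-style reverse-mode autodiff on scalars (illustrative, not micrograd's API).
class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # set by the operation that produced this Value

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward_fn():
            self.grad += out.grad    # d(a+b)/da = 1
            other.grad += out.grad   # d(a+b)/db = 1
        out._backward = backward_fn
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward_fn():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = backward_fn
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x = Value(3.0)
y = x * x + x        # y = x^2 + x
y.backward()
print(x.grad)        # dy/dx = 2x + 1 = 7.0
```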
thenumb.at
[AI summary] This text provides a comprehensive overview of differentiable programming, focusing on its application in machine learning and image processing. It explains the fundamentals of automatic differentiation, including forward and backward passes, and demonstrates how to implement these concepts in a custom framework. The text also discusses higher-order differentiation and its implementation in frameworks like JAX and PyTorch. A practical example is given using differentiable programming to de-blur an image, showcasing how optimization techniques like gradient descent can be applied to solve real-world problems. The text emphasizes the importance of differentiable programming in enabling efficient and flexible computation for various domains, includ...
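A rough sketch of the de-blurring example described in the summary, written here with PyTorch's autograd rather than the article's custom framework; the box-blur kernel, image size, and optimizer settings are illustrative assumptions.

```python
# Recover a sharp image from a blurred observation by gradient descent,
# using reverse-mode automatic differentiation (PyTorch autograd).
import torch
import torch.nn.functional as F

torch.manual_seed(0)
true_image = torch.rand(1, 1, 32, 32)        # sharp image we pretend not to know
kernel = torch.ones(1, 1, 5, 5) / 25.0       # simple box-blur kernel (assumed known)

def blur(img):
    return F.conv2d(img, kernel, padding=2)

observed = blur(true_image)                  # the blurry measurement

# Treat the unknown sharp image as a parameter and optimize it so that
# blurring the estimate reproduces the observation.
estimate = torch.zeros_like(true_image, requires_grad=True)
optimizer = torch.optim.Adam([estimate], lr=0.05)

for step in range(500):
    optimizer.zero_grad()
    loss = F.mse_loss(blur(estimate), observed)
    loss.backward()                          # autograd computes d(loss)/d(estimate)
    optimizer.step()

print(f"final reconstruction loss: {loss.item():.6f}")
```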
jingnanshi.com
Tutorial on automatic differentiation
blog.fastforwardlabs.com
This article is available as a notebook on GitHub. Please refer to that notebook for a more detailed discussion and for code fixes and updates. Despite all the recent excitement around deep learning, neural networks have a reputation among non-specialists as complicated to build and difficult to interpret. And while interpretability remains an issue, there are now high-level neural network libraries that enable developers to quickly build neural network models without worrying about the numerical details of floating point operations and linear algebra.