iclr-blogposts.github.io

windowsontheory.org
(Updated and expanded 12/17/2021) I am teaching deep learning this week in Harvard's CS 182 (Artificial Intelligence) course. As I was preparing the back-propagation lecture, Preetum Nakkiran told me about Andrej Karpathy's awesome micrograd package, which implements automatic differentiation for scalar variables in very few lines of code. I couldn't resist using this to show how...
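To give a flavor of what such a package does, here is a minimal sketch of reverse-mode automatic differentiation for scalar values, written in the spirit of micrograd rather than copied from it; the class name Value matches micrograd's, but everything else is illustrative.

```python
# Minimal sketch of scalar reverse-mode autodiff, in the spirit of micrograd.
# Illustrative only: the real package supports more operations than + and *.

class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents          # (parent Value, local derivative) pairs

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, ((self, other.data), (other, self.data)))

    def backward(self):
        # Topologically order the graph, then push gradients from output to inputs.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p, _ in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for parent, local in v._parents:
                parent.grad += local * v.grad   # chain rule

x = Value(2.0); y = Value(3.0)
z = x * y + x                                   # z = xy + x
z.backward()
print(z.data, x.grad, y.grad)                   # 8.0, dz/dx = y + 1 = 4.0, dz/dy = x = 2.0
```

Recording each value's parents together with the local derivatives is what lets a single backward pass apply the chain rule across the whole computation graph.
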
jingnanshi.com
Tutorial on automatic differentiation

thenumb.at
[AI summary] This text provides a comprehensive overview of differentiable programming, focusing on its application in machine learning and image processing. It explains the fundamentals of automatic differentiation, including forward and backward passes, and demonstrates how to implement these concepts in a custom framework. The text also discusses higher-order differentiation and its implementation in frameworks like JAX and PyTorch. A practical example is given using differentiable programming to de-blur an image, showcasing how optimization techniques like gradient descent can be applied to solve real-world problems. The text emphasizes the importance of differentiable programming in enabling efficient and flexible computation for various domains, including...
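As a companion to the reverse-mode sketch above, here is a hedged illustration of the forward-mode side of automatic differentiation via dual numbers, followed by the kind of gradient-descent loop the de-blurring example relies on; the names Dual and derivative are mine, not the article's.

```python
# Sketch of forward-mode autodiff with dual numbers: carry (value, derivative)
# through the computation. Names are illustrative, not from the linked article.
import math

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot            # f(x) and f'(x) carried together
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)  # product rule
    __rmul__ = __mul__

def sin(x):
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot) if isinstance(x, Dual) else math.sin(x)

def derivative(f, x):
    return f(Dual(x, 1.0)).dot                   # seed dx/dx = 1

f = lambda x: x * x + sin(x)                     # f(x) = x^2 + sin(x)
print(derivative(f, 1.0))                        # f'(1) = 2 + cos(1) ≈ 2.5403

# Gradient descent driven by the derivative above, in the spirit of the
# article's de-blurring example (here on a toy objective (x - 3)^2):
x = 5.0
for _ in range(100):
    x -= 0.1 * derivative(lambda t: (t - 3.0) * (t - 3.0), x)
print(x)                                         # converges to ≈ 3.0
```
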
jeremykun.com
Hard to believe Sanjeev Arora and his coauthors consider it "a basic tool [that should be] taught to all algorithms students together with divide-and-conquer, dynamic programming, and random sampling." Christos Papadimitriou calls it "so hard to believe that it has been discovered five times and forgotten." It has formed the basis of algorithms in machine learning, optimization, game theory, ...
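Judging by the quotes, which match the Arora–Hazan–Kale survey, the algorithm being praised is the multiplicative weights update method. Here is a minimal sketch of its classic experts setting, assuming standard losses in [0, 1] and a fixed learning rate eta; this is a textbook instance, not code from the linked post.

```python
# Sketch of the multiplicative weights update rule for learning from expert
# advice. Setup (experts, [0, 1] losses, eta) is assumed, not from the post.
import random

def multiplicative_weights(losses, eta=0.1):
    """losses: list of rounds, each a list of per-expert losses in [0, 1]."""
    n = len(losses[0])
    w = [1.0] * n
    total = 0.0
    for round_losses in losses:
        s = sum(w)
        p = [wi / s for wi in w]                       # play experts proportionally
        total += sum(pi * li for pi, li in zip(p, round_losses))
        w = [wi * (1 - eta * li) for wi, li in zip(w, round_losses)]  # penalize losers
    return total, w

# Toy run: expert 0 is usually right (loss 0), expert 1 is usually wrong.
random.seed(0)
rounds = [[0 if random.random() < 0.9 else 1,
           1 if random.random() < 0.9 else 0] for _ in range(200)]
loss, weights = multiplicative_weights(rounds)
print(loss, weights)   # total loss tracks the best expert; expert 0's weight dominates
```
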