Explore

You are here: windowsontheory.org
thenumb.at (2.7 parsecs away)
[AI summary] This text provides a comprehensive overview of differentiable programming, focusing on its application in machine learning and image processing. It explains the fundamentals of automatic differentiation, including forward and backward passes, and demonstrates how to implement these concepts in a custom framework. The text also discusses higher-order differentiation and its implementation in frameworks like JAX and PyTorch. A practical example is given using differentiable programming to de-blur an image, showcasing how optimization techniques like gradient descent can be applied to solve real-world problems. The text emphasizes the importance of differentiable programming in enabling efficient and flexible computation for various domains, includ...
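The de-blurring example that summary describes reduces to gradient descent through a differentiable forward model. As a minimal sketch of that idea (not the linked post's actual code; the box-blur kernel, image size, and names like `estimate` are hypothetical):

```python
import torch
import torch.nn.functional as F

# Hypothetical setup: a known box blur and a simulated blurry observation.
torch.manual_seed(0)
kernel = torch.ones(1, 1, 5, 5) / 25.0           # 5x5 box-blur kernel
sharp = torch.rand(1, 1, 64, 64)                 # stand-in "true" image
observed = F.conv2d(sharp, kernel, padding=2)    # blurry measurement

# Recover the image by descending the gradient of the reconstruction error;
# loss.backward() is the reverse pass that automatic differentiation provides.
estimate = torch.zeros_like(sharp, requires_grad=True)
opt = torch.optim.Adam([estimate], lr=0.05)
for _ in range(500):
    opt.zero_grad()
    loss = F.mse_loss(F.conv2d(estimate, kernel, padding=2), observed)
    loss.backward()
    opt.step()
# 'estimate' now approximates the sharp image.
```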
theorydish.blog (1.8 parsecs away)
The chain rule is a fundamental result in calculus. Roughly speaking, it states that if a variable $c$ is a differentiable function of intermediate variables $b_1,\ldots,b_n$, and each intermediate variable $b_i$ is itself a differentiable function of $a$, then we can compute the derivative $\frac{\mathrm{d} c}{\mathrm{d} a}$ as...
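The excerpt cuts off mid-sentence; the standard statement it is presumably building to is

$$\frac{\mathrm{d} c}{\mathrm{d} a} \;=\; \sum_{i=1}^{n} \frac{\partial c}{\partial b_i}\,\frac{\mathrm{d} b_i}{\mathrm{d} a},$$

i.e., the derivative contributions flowing through each intermediate variable are summed, which is exactly the rule reverse-mode automatic differentiation applies at every node of a computation graph.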
bytepawn.com (3.2 parsecs away)
I will show how to solve the standard Ax = b matrix equation with PyTorch. This is a good toy problem to show some guts of the framework without involving neural networks.
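A minimal sketch of the gradient-descent route such a post typically takes (the system size, learning rate, and step count here are arbitrary assumptions, not the post's values):

```python
import torch

# Treat x as a learnable parameter and minimize the residual ||Ax - b||^2.
torch.manual_seed(0)
A = torch.rand(5, 5) + 5 * torch.eye(5)   # well-conditioned toy system
b = torch.rand(5, 1)

x = torch.zeros(5, 1, requires_grad=True)
opt = torch.optim.SGD([x], lr=0.005)
for _ in range(2000):
    opt.zero_grad()
    loss = torch.sum((A @ x - b) ** 2)
    loss.backward()   # autodiff fills x.grad with d(loss)/dx
    opt.step()

print(torch.allclose(A @ x, b, atol=1e-4))   # True once converged
```

For a one-off linear system, torch.linalg.solve(A, b) is the practical choice; the descent loop is interesting only because it exposes the framework's autodiff machinery.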
windowsontheory.org (25.1 parsecs away)
Previous post: ML theory with bad drawings. Next post: What do neural networks learn and when do they learn it. See also all seminar posts and the course webpage. Lecture video (starts at slide 2 since I hit the record button 30 seconds too late - sorry!) - slides (PDF) - slides (PowerPoint with ink and animation)...