longintuition.com

www.jeremykun.com

Machine learning is broadly split into two camps, statistical learning and non-statistical learning. The latter we've started to get a good picture of on this blog; we approached perceptrons, decision trees, and neural networks from a non-statistical perspective. And generally "statistical" learning is just that, a perspective. Data is phrased in terms of independent and dependent variables, and statistical techniques are leveraged against the data. In this post we'll focus on the simplest example of thi...
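
As a rough illustration of that statistical framing (the synthetic data and ordinary least squares here are my choices; the truncated post may develop a different example):

```python
# A minimal sketch of the statistical perspective: phrase the data as an
# independent variable x and a dependent variable y, then fit y on x by
# ordinary least squares. Synthetic data; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)                # independent variable
y = 3.0 * x + 2.0 + rng.normal(0, 1, size=50)  # dependent variable plus noise

# Design matrix with an intercept column; solve min_b ||Xb - y||^2.
X = np.column_stack([np.ones_like(x), x])
(intercept, slope), *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"estimated fit: y ~ {slope:.2f} * x + {intercept:.2f}")
```
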
akos.ma

From the wonderful book by Ian Stewart, here are the equations themselves; read the book to know more about them.
www.depthfirstlearning.com

[AI summary] The provided text is a detailed exploration of the mathematical and statistical foundations of neural networks, focusing on the Jacobian matrix, its spectral properties, and the implications for dynamical isometry. The key steps and results are as follows:

1. **Jacobian and Spectral Analysis**: The Jacobian matrix $\mathbf{J}$ of a neural network is decomposed into $\mathbf{J} = \mathbf{W}\mathbf{D}$, where $\mathbf{W}$ is the weight matrix and $\mathbf{D}$ is a diagonal matrix of derivatives. The spectral properties of $\mathbf{J}\mathbf{J}^T$ are analyzed using the $S$-transform, which captures the behavior of the eigenvalues of the Jacobian matrix.
2. **$S$-Transform Derivation**: The $S$-transform of $\mathbf{J}\mathbf{J}^T$ is...
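
For concreteness, here is a sketch of the multiplicativity property that analyses of this kind rely on. The layerwise factorization and indices below are my assumptions (the summary is truncated before its derivation), following the standard free-probability treatment of deep-network Jacobians:

```latex
% Assume (not quoted from the summary) that the end-to-end Jacobian of an
% L-layer network factors as J = D_L W_L \cdots D_1 W_1. Because the
% S-transform is multiplicative under free multiplicative convolution,
% the spectrum of J J^T decomposes into per-layer factors:
\[
  S_{J J^T}(z) \;=\; \prod_{l=1}^{L} S_{W_l W_l^T}(z)\, S_{D_l^2}(z).
\]
% Dynamical isometry then amounts to choosing the weights W_l and the
% nonlinearity (hence the derivative matrices D_l) so that the resulting
% spectrum concentrates near 1.
```
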
www.sewcraftycrochet.com

School is out for summer, but during Teacher Appreciation Week, I made this orca as a teacher appreciation gift for my son's teacher. I noti...