francisbach.com
michael-lewis.com
This is a short summary of some of the terminology used in machine learning, with an emphasis on neural networks. I've put it together primarily to help my own understanding, phrasing it largely in non-mathematical terms. As such it may be of use to others who come from more of a programming than a mathematical background.
windowsontheory.org
Previous post: ML theory with bad drawings. Next post: What do neural networks learn and when do they learn it? See also all seminar posts and the course webpage. Lecture video (starts at slide 2 since I hit the record button 30 seconds too late - sorry!) - slides (pdf) - slides (PowerPoint with ink and animation)...
fa.bianp.net
The Langevin algorithm is a simple and powerful method to sample from a probability distribution. It's a key ingredient of some machine learning methods such as diffusion models and differentially private learning. In this post, I'll derive a simple convergence analysis of this method in the special case when the ...
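For context, the iteration that post refers to is roughly: to sample from a density proportional to exp(-f(x)), take a gradient step on f and add Gaussian noise scaled by sqrt(2 × step). The sketch below is only an illustration of that idea, not the post's actual code or analysis; the standard Gaussian target, step size, and function names are my own assumptions.

```python
# Minimal sketch of the (unadjusted) Langevin algorithm: draw approximate
# samples from a target density pi(x) proportional to exp(-f(x)) using only
# the gradient of f. Here f(x) = ||x||^2 / 2, so pi is the standard Gaussian;
# the step size below is an illustrative choice, not a tuned value.
import numpy as np

def grad_f(x):
    # Gradient of f(x) = ||x||^2 / 2 (standard Gaussian target).
    return x

def langevin_sample(n_steps=10_000, step=0.01, dim=2, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    samples = np.empty((n_steps, dim))
    for k in range(n_steps):
        noise = rng.standard_normal(dim)
        # Langevin update: gradient descent on f plus Gaussian noise
        # scaled by sqrt(2 * step).
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * noise
        samples[k] = x
    return samples

samples = langevin_sample()
# Empirical mean and variance should be close to 0 and 1 per coordinate.
print(samples.mean(axis=0), samples.var(axis=0))
```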
reasonabledeviations.com
Academic blog about quantitative finance, programming, maths.