kawine.github.io
windowsontheory.org
Previous post: ML theory with bad drawings. Next post: What do neural networks learn and when do they learn it. See also all seminar posts and the course webpage. Lecture video (starts at slide 2 since I hit the record button 30 seconds too late - sorry!) - slides (pdf) - slides (PowerPoint with ink and animation)...
thenumb.at
[AI summary] This text explores how functions can be treated as vectors, particularly in signal and geometry processing. It discusses representing functions as infinite-dimensional vectors, Fourier transforms in several domains (1D, spherical, and mesh-based), and applying linear algebra to functions for tasks like compression and smoothing. It also covers the mathematical foundations of these ideas, including the Laplace operator, eigenfunctions, and orthonormal bases, and concludes with further reading and acknowledgements of reviewers.
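The functions-as-vectors idea summarized above can be sketched in a few lines: sample a function on a grid so it becomes an ordinary finite-dimensional vector, project it onto the lowest Fourier modes (a truncation in an orthogonal basis), and transform back to get a smoothed version. This is a minimal illustration, not code from the linked post; the grid size and the frequency cutoff `k_max` are arbitrary choices.

```python
import numpy as np

# A function sampled on a grid becomes an ordinary vector of length n.
n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
f = np.sin(x) + 0.3 * np.random.default_rng(0).normal(size=n)  # noisy samples

# Express the vector in the Fourier basis, keep only the lowest
# frequencies (an orthogonal projection), and transform back.
F = np.fft.fft(f)
k_max = 8  # hypothetical cutoff; pick per application
F_smooth = np.zeros_like(F)
F_smooth[:k_max] = F[:k_max]            # non-negative frequencies 0..k_max-1
F_smooth[-k_max + 1:] = F[-k_max + 1:]  # matching negative frequencies
f_smooth = np.fft.ifft(F_smooth).real

# f_smooth is the low-pass (smoothed) version of the sampled function.
```

Because the Fourier basis is orthonormal (up to scaling), dropping high-frequency coefficients is exactly the kind of compression/smoothing projection the summary describes.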
francisbach.com
[AI summary] The blog post analyzes the spectral properties of kernel matrices, focusing on eigenvalues and their estimation with tools such as the matrix Bernstein inequality. It also estimates the number of integer vectors with a given L1 norm and relates these counts to combinatorial structures. The post derives bounds on the difference between true and estimated eigenvalues, highlighting the role of the degrees of freedom and of regularization in kernel methods, and closes with the broader importance of spectral analysis in machine learning and its applications.
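To make the kernel-spectrum discussion concrete, here is a minimal sketch (not taken from the post; the Gaussian bandwidth and the regularization value `lam` are arbitrary illustrative choices): build a kernel matrix from samples, read off its eigenvalues, and compute the degrees of freedom tr((K/n)((K/n) + lam I)^{-1}), the quantity that typically controls regularized eigenvalue estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 1))  # one-dimensional sample

# Gaussian (RBF) kernel matrix on the sample; bandwidth 1 is a hypothetical choice.
sq_dists = (X - X.T) ** 2
K = np.exp(-sq_dists / 2.0)

# Eigenvalues of K/n (descending); these estimate the eigenvalues of the
# underlying kernel integral operator.
eigvals = np.linalg.eigvalsh(K / n)[::-1]

# Degrees of freedom at regularization lam:
# df(lam) = tr((K/n)((K/n) + lam I)^{-1}) = sum_i mu_i / (mu_i + lam).
lam = 0.1
df = np.sum(eigvals / (eigvals + lam))
```

As lam shrinks, df grows toward n, which is one way the post's trade-off between regularization and estimation error shows up numerically.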
teachingbattleground.wordpress.com
Reblogged on WordPress.com