You are here: kawine.github.io

Nearby destinations:

- francisbach.com (6.5 parsecs away) — [AI summary] The blog post discusses the spectral properties of kernel matrices, focusing on the analysis of eigenvalues and their estimation using tools like the matrix Bernstein inequality. It also covers the estimation of the number of integer vectors with a given L1 norm and the relationship between these counts and combinatorial structures. The post includes a detailed derivation of bounds for the difference between true and estimated eigenvalues, highlighting the role of the degrees of freedom and the impact of regularization in kernel methods. Additionally, it touches on the importance of spectral analysis in machine learning and its applications in various domains.
- jxmo.io (6.7 parsecs away) — A primer on variational autoencoders (VAEs), culminating in a PyTorch implementation of a VAE with discrete latents.
- windowsontheory.org (5.3 parsecs away) — From the "ML theory with bad drawings" series; a lecture post with video (starts at slide 2 since the record button was hit 30 seconds too late), slides (PDF and PowerPoint with ink and animation), plus links to all seminar posts and the course webpage...
- sander.ai (25.6 parsecs away) — Perspectives on diffusion, or how diffusion models are autoencoders, deep latent variable models, score function predictors, reverse SDE solvers, flow-based models, RNNs, and autoregressive models, all at once!