live-simons-blog.pantheonsite.io
francisbach.com
[AI summary] The blog post discusses the spectral properties of kernel matrices, focusing on the analysis of eigenvalues and their estimation using tools like the matrix Bernstein inequality. It also covers the estimation of the number of integer vectors with a given L1 norm and the relationship between these counts and combinatorial structures. The post includes a detailed derivation of bounds for the difference between true and estimated eigenvalues, highlighting the role of the degrees of freedom and the impact of regularization in kernel methods. Additionally, it touches on the importance of spectral analysis in machine learning and its applications in various domains.
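As a rough illustration of the "degrees of freedom" quantity mentioned in the summary: in kernel ridge regression it is commonly defined as \( \mathrm{df}(\lambda) = \mathrm{tr}\,K(K+n\lambda I)^{-1} \), which counts (softly) how many eigenvalues of the kernel matrix sit above the regularization level. The sketch below is my own minimal example, not code from the post; the kernel choice and data are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.standard_normal((n, 2))

# Gaussian (RBF) kernel matrix K_ij = exp(-||x_i - x_j||^2 / 2)
sq = np.sum(X**2, axis=1)
D2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
K = np.exp(-D2 / 2.0)

# Eigenvalues of the symmetric kernel matrix, ascending order
eigs = np.linalg.eigvalsh(K)

def degrees_of_freedom(lam):
    # tr(K (K + n*lam*I)^{-1}) computed directly from the spectrum:
    # each eigenvalue mu contributes mu / (mu + n*lam), a value in (0, 1)
    return float(np.sum(eigs / (eigs + n * lam)))

for lam in [1e-1, 1e-3, 1e-5]:
    print(f"lambda = {lam:g}, df = {degrees_of_freedom(lam):.2f}")
```

Shrinking the regularization parameter lets more eigenvalues count as "effective," so the degrees of freedom grow monotonically toward \( n \) as \( \lambda \to 0 \).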
lucatrevisan.wordpress.com
To get a sense of how LaTeX2WP works, and of what is still missing, here is the unedited output of the program given the LaTeX source of my Max Cut paper (the most recent arxiv submission). Note that the \ref commands have become clickable links. You can see that footnotes and bibliography are not supported...
djalil.chafai.net
This post is devoted to a few convex and compact sets of matrices that I like. The set \( {\mathcal{C}_n} \) of correlation matrices. A real \( {n\times n} \) matrix \( {C} \) is a correlation matrix when \( {C} \) is symmetric, positive semidefinite, with unit diagonal. This means that \[ C_{ii}=1, \quad C_{ij}=C_{ji}, \quad \langle Cx,x\rangle\geq0 \] for every \(...
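The three defining conditions of the snippet (unit diagonal, symmetry, positive semidefiniteness) translate directly into a membership test. This is a minimal sketch of my own, assuming the standard characterization that a symmetric matrix is positive semidefinite iff all its eigenvalues are nonnegative; the function name is mine, not from the post.

```python
import numpy as np

def is_correlation_matrix(C, tol=1e-10):
    """Check membership in the set C_n of correlation matrices."""
    C = np.asarray(C, dtype=float)
    symmetric = np.allclose(C, C.T, atol=tol)        # C_ij = C_ji
    unit_diag = np.allclose(np.diag(C), 1.0, atol=tol)  # C_ii = 1
    # PSD <=> all eigenvalues nonnegative (up to tolerance);
    # symmetrize first so eigvalsh is applicable even for near-symmetric input
    psd = bool(np.all(np.linalg.eigvalsh((C + C.T) / 2) >= -tol))
    return symmetric and unit_diag and psd

C = np.array([[1.0, 0.5],
              [0.5, 1.0]])
print(is_correlation_matrix(C))       # True
print(is_correlation_matrix(2 * C))   # False: diagonal is 2, not 1
```

Convexity of \( {\mathcal{C}_n} \) is easy to see from this test: each of the three conditions is preserved under convex combinations of matrices satisfying it.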
stephenmalina.com
Matrix Potpourri: As part of reviewing Linear Algebra for my Machine Learning class, I've noticed there's a bunch of matrix terminology that I didn't encounter during my proof-based self-study of LA from Linear Algebra Done Right. This post is mostly intended to consolidate my own understanding and to act as a reference to future me, but if it also helps others in a similar position, that's even better!