infiniteopt.github.io
francisbach.com
[AI summary] This text discusses the scaling laws of optimization in machine learning, focusing on asymptotic expansions for both strongly convex and non-strongly convex cases. It covers the derivation of performance bounds using techniques like Laplace's method and the behavior of random minimizers. The text also explains the "weird" behavior observed in certain plots, where non-strongly convex bounds become tight under specific conditions. The analysis connects theoretical results to practical considerations in optimization algorithms.
www.randomservices.org
[AI summary] The text covers various topics in probability and statistics, including continuous distributions, empirical density functions, and data analysis. It discusses the uniform distribution, rejection sampling, and the construction of continuous distributions without probability density functions. The text also includes data analysis exercises involving empirical density functions for body weight, body length, and gender-specific body weight.
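The rejection sampling mentioned in the summary above can be sketched in a few lines. This is a minimal illustration, not code from the linked text: the triangular density f(x) = 2x on [0, 1] and its bound of 2 are assumptions chosen for the example.

```python
import random

def rejection_sample(pdf, pdf_max, a, b):
    """Draw one sample from a density pdf supported on [a, b] and
    bounded above by pdf_max, via uniform rejection sampling:
    propose x ~ Uniform(a, b), accept with probability pdf(x)/pdf_max."""
    while True:
        x = random.uniform(a, b)
        if random.uniform(0.0, pdf_max) <= pdf(x):
            return x

# Example (assumed for illustration): triangular density f(x) = 2x on [0, 1].
random.seed(0)
samples = [rejection_sample(lambda x: 2.0 * x, 2.0, 0.0, 1.0)
           for _ in range(10_000)]
# The mean of this density is 2/3, so the sample mean should be close to it.
print(sum(samples) / len(samples))
```

The acceptance test compares a uniform height against the density at the proposed point, so accepted points are distributed according to the target density regardless of its shape, as long as the bound pdf_max is valid.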
matbesancon.xyz
Learning by doing: predicting the outcome.
nhigham.com
The spectral radius $latex \rho(A)$ of a square matrix $latex A\in\mathbb{C}^{n\times n}$ is the largest absolute value of any eigenvalue of $latex A$: $latex \notag \rho(A) = \max\{\, |\lambda|: \lambda~ \mbox{is an eigenvalue of}~ A\,\}. $ For Hermitian matrices (or more generally normal matrices, those satisfying $latex AA^*=A^*A$) the spectral radius is just...
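The definition above is easy to check numerically. A minimal sketch, assuming NumPy; the Hermitian test matrix is an assumption chosen so the eigenvalues (1 and 3) are easy to verify, and for a normal matrix the spectral radius coincides with the matrix 2-norm:

```python
import numpy as np

def spectral_radius(A):
    """Largest absolute value of any eigenvalue of the square matrix A."""
    return max(abs(np.linalg.eigvals(A)))

# Hermitian example (assumed for illustration): eigenvalues are 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

print(spectral_radius(A))        # ≈ 3
print(np.linalg.norm(A, 2))      # 2-norm agrees, since A is normal
```

For non-normal matrices the 2-norm can strictly exceed the spectral radius, which is why the restriction to normal matrices matters in the excerpt above.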