 
      
vladfeinberg.com (you are here)

www.ethanepperly.com

bartwronski.com: "In this blog post, I explore separable convolutional image filters: how can we check if a 2D filter is separable, and how to compute separable approximations to any arbitrary 2D filter represented in a numerical / matrix form using SVD."
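That excerpt mentions checking separability and building separable approximations with the SVD. As a rough illustration of the idea (a minimal NumPy sketch, not code from the linked post): a 2D kernel is separable exactly when it has rank 1, and the leading singular triple gives the best separable approximation in the Frobenius norm.

```python
import numpy as np

def separable_approximation(kernel, tol=1e-10):
    """Best separable (rank-1) approximation of a 2D filter kernel via SVD."""
    U, s, Vt = np.linalg.svd(kernel)
    col = U[:, 0] * np.sqrt(s[0])   # vertical 1D filter
    row = Vt[0, :] * np.sqrt(s[0])  # horizontal 1D filter
    # Separable iff all singular values past the first are (numerically) zero.
    is_separable = s.size < 2 or s[1] <= tol * s[0]
    return col, row, is_separable

# A kernel built as an outer product is exactly separable.
g = np.array([1.0, 2.0, 1.0])
K = np.outer(g, g) / 16.0           # 3x3 binomial blur
col, row, ok = separable_approximation(K)
assert ok and np.allclose(np.outer(col, row), K)
```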
nhigham.com: "A correlation matrix is a symmetric matrix with unit diagonal and nonnegative eigenvalues. In 2000 I was approached by a London fund management company who wanted to find the nearest correlation matrix (NCM) in the Frobenius norm to an almost correlation matrix: a symmetric matrix having a significant number of (small) negative eigenvalues. This problem..."
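The NCM problem in that excerpt involves two constraint sets: the positive semidefinite cone and the matrices with unit diagonal. The sketch below just alternates naive projections between them to convey the geometry; it is not the method the post describes (Higham's alternating projections add a Dykstra correction, and a later Newton method is much faster), and plain alternation need not reach the Frobenius-nearest point.

```python
import numpy as np

def nearest_correlation_naive(A, iters=100):
    """Crude alternating projections toward a correlation matrix.

    Illustrative only: repeatedly (1) project onto the PSD cone by clipping
    negative eigenvalues, then (2) restore the unit diagonal. Unlike the
    Dykstra-corrected or Newton methods, this is not guaranteed to land on
    the Frobenius-norm-nearest correlation matrix.
    """
    X = (A + A.T) / 2.0
    for _ in range(iters):
        w, V = np.linalg.eigh(X)
        X = (V * np.clip(w, 0.0, None)) @ V.T   # PSD projection
        np.fill_diagonal(X, 1.0)                # unit-diagonal projection
    return X

# A symmetric unit-diagonal matrix with one negative eigenvalue (1 - sqrt(2)).
A = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])
X = nearest_correlation_naive(A)
print(np.round(X, 4))
print("smallest eigenvalue:", np.linalg.eigvalsh(X).min())
```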
mathematicaloddsandends.wordpress.com: "The function $f(x) = x \log x$ occurs in various places across math/statistics/machine learning (e.g. in the definition of entropy), and I thought I'd put a list of properties of the function here that I've found useful. Here is a plot of the function: $f$ is defined on $(0, \infty)$. The only..."
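As a quick companion to that excerpt (standard calculus facts, not quoted from the post), the basic behavior of $f(x) = x \log x$ on $(0, \infty)$ follows from its first two derivatives:

$$
f'(x) = \log x + 1, \qquad f''(x) = \frac{1}{x} > 0, \qquad \lim_{x \to 0^+} x \log x = 0,
$$

so $f$ is strictly convex, extends continuously to $f(0) = 0$ (the usual $0 \log 0 = 0$ convention in entropy), and attains its unique minimum at $x = e^{-1}$ with value $f(e^{-1}) = -e^{-1}$.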