lucatrevisan.wordpress.com
thenumb.at
[AI summary] The text discusses the representation of functions as vectors and their applications in domains such as signal processing, geometry, and physics. It explains how functions can be treated as vectors in a vector space, leading to the concepts of eigenfunctions and eigenvalues, which are central to understanding and manipulating signals and geometries. The text also covers different types of Laplacians, including the standard Laplacian, higher-dimensional Laplacians, and the Laplace-Beltrami operator, and their applications in fields like image compression, computer graphics, and quantum mechanics. The discussion includes spherical harmonics, which are used to represent functions on spheres, and their applications in game engines and glo…
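The "functions as vectors" idea in the summary can be made concrete numerically. The sketch below is mine, not from thenumb.at; the grid size, the Dirichlet boundary conditions, and the test function are assumptions chosen for illustration. It samples functions on a grid so they become vectors, builds the discrete 1-D Laplacian as a matrix, and checks that its eigenvalues and eigenvectors approximate the sine eigenfunctions of $d^2/dx^2$ on $[0,\pi]$, the basis behind the compression applications the summary mentions.

```python
# Sketch (illustrative, not from the linked article): treat a sampled function
# as a vector, build the discrete 1-D Laplacian with Dirichlet boundaries, and
# compare its spectrum with the continuous eigenfunctions sin(k x) on [0, pi].
import numpy as np

n = 200
h = np.pi / (n + 1)
x = np.linspace(h, np.pi - h, n)          # interior grid points

# Second-difference Laplacian: (f[i-1] - 2 f[i] + f[i+1]) / h^2
L = (np.diag(np.full(n - 1, 1.0), -1)
     - 2.0 * np.eye(n)
     + np.diag(np.full(n - 1, 1.0), 1)) / h**2

# Symmetric matrix -> real eigenvalues and an orthonormal eigenvector basis
eigvals, eigvecs = np.linalg.eigh(L)

# The continuous problem f'' = lambda f, f(0) = f(pi) = 0 has lambda = -k^2
# with eigenfunction sin(k x); the least negative discrete eigenvalues match.
for k in range(1, 4):
    print(f"discrete: {eigvals[-k]: .4f}   continuous: {-(k**2): .4f}")

# Expanding a sampled function in this eigenbasis and keeping few coefficients
# is the spectral-compression idea the summary alludes to.
f = np.exp(-10 * (x - 1.5) ** 2)          # some function, viewed as a vector
coeffs = eigvecs.T @ f                     # coefficients in the eigenbasis
top = np.argsort(np.abs(coeffs))[-20:]     # 20 largest-magnitude modes
print("energy captured:", np.sum(coeffs[top] ** 2) / np.sum(coeffs ** 2))
```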
terrytao.wordpress.com
A key theme in real analysis is that of studying general functions $f: X \rightarrow {\bf R}$ or $f: X \rightarrow {\bf C}$ by first approximating them b…
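The excerpt cuts off mid-sentence, so the following is only a standard illustration of the approximation theme it names (my example, not necessarily the one the post develops): every measurable $f: X \rightarrow [0,+\infty]$ is the pointwise limit of the increasing sequence of simple functions
\[
f_n := \min\bigl(n,\ 2^{-n}\lfloor 2^n f \rfloor\bigr), \qquad 0 \le f_1 \le f_2 \le \cdots \le f,
\]
so statements about integrals and limits can be proved for the $f_n$ first and then transferred to $f$, for instance by monotone convergence.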
fabricebaudoin.blog
In this section, we consider a diffusion operator $L=\sum_{i,j=1}^n \sigma_{ij}(x) \frac{\partial^2}{\partial x_i \partial x_j} + \sum_{i=1}^n b_i(x) \frac{\partial}{\partial x_i}$, where $b_i$ and $\sigma_{ij}$ are continuous functions on $\mathbb{R}^n$ and, for every $x \in \mathbb{R}^n$, the matrix $(\sigma_{ij}(x))_{1\le i,j\le n}$ is symmetric and non-negative. Our…
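To make the operator concrete: the sketch below is mine, not from the post; the finite-difference step and the particular $\sigma$, $b$, and $f$ are assumptions chosen for illustration. It evaluates $Lf$ at a point by central differences, and with $\sigma = \tfrac{1}{2}I$ and $b = 0$ the operator reduces to half the Laplacian, the generator of Brownian motion.

```python
# Numerical sketch (illustrative only) of
#   L f = sum_ij sigma_ij(x) d2f/dxi dxj + sum_i b_i(x) df/dxi
# evaluated at a point with central finite differences.
import numpy as np

def apply_L(f, sigma, b, x, h=1e-3):
    """Approximate (L f)(x) for f: R^n -> R, sigma: R^n -> (n x n), b: R^n -> R^n."""
    x = np.asarray(x, dtype=float)
    n = x.size
    S, drift = sigma(x), b(x)
    e = np.eye(n)
    val = 0.0
    for i in range(n):
        # first-order (drift) term: b_i(x) * df/dx_i
        val += drift[i] * (f(x + h * e[i]) - f(x - h * e[i])) / (2 * h)
        # diagonal second-order term: sigma_ii(x) * d2f/dx_i^2
        val += S[i, i] * (f(x + h * e[i]) - 2 * f(x) + f(x - h * e[i])) / h**2
        for j in range(i + 1, n):
            # mixed term, counted twice because sigma_ij = sigma_ji
            mixed = (f(x + h * (e[i] + e[j])) - f(x + h * (e[i] - e[j]))
                     - f(x - h * (e[i] - e[j])) + f(x - h * (e[i] + e[j]))) / (4 * h**2)
            val += 2 * S[i, j] * mixed
    return val

# Example: sigma = I/2, b = 0 gives half the Laplacian; on f(x) = |x|^2 in R^2
# the exact value of L f is 2 everywhere.
f = lambda x: np.dot(x, x)
print(apply_L(f, lambda x: 0.5 * np.eye(2), lambda x: np.zeros(2), [0.3, -1.2]))
```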
jeremykun.com
Hard to believe: Sanjeev Arora and his coauthors consider it "a basic tool [that should be] taught to all algorithms students together with divide-and-conquer, dynamic programming, and random sampling." Christos Papadimitriou calls it "so hard to believe that it has been discovered five times and forgotten." It has formed the basis of algorithms in machine learning, optimization, game theory, …
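The excerpt does not name the algorithm, but the quoted endorsements match standard descriptions of the multiplicative weights update method; assuming that is the topic, here is a minimal sketch of the core update in the "prediction with expert advice" setting. The function name, learning rate, and toy losses are mine, not from the post.

```python
# Sketch of the multiplicative weights update rule (assuming this is the
# algorithm the excerpt refers to); names and toy data are illustrative.
import numpy as np

def multiplicative_weights(loss_rounds, eta=0.1):
    """loss_rounds: sequence of length-n arrays with each expert's loss in [0, 1]."""
    loss_rounds = list(loss_rounds)
    n = len(loss_rounds[0])
    w = np.ones(n)                       # one weight per expert
    total = 0.0
    for losses in loss_rounds:
        p = w / w.sum()                  # play the current weights as a distribution
        total += p @ losses              # expected loss suffered this round
        w *= (1.0 - eta) ** losses       # shrink weights in proportion to loss
    return total, w

# Toy run: expert 0 errs 10% of the time, expert 1 errs 60% of the time; the
# weight vector concentrates on the better expert and the total loss tracks it.
rng = np.random.default_rng(0)
rounds = [(rng.random(2) < [0.1, 0.6]).astype(float) for _ in range(500)]
total, w = multiplicative_weights(rounds)
print(total, w / w.sum())
```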