You are here: hadrienj.github.io

thenumb.at (2.7 parsecs away)
[AI summary] This text provides an in-depth exploration of how functions can be treated as vectors, particularly in the context of signal and geometry processing. It discusses the representation of functions as infinite-dimensional vectors, the use of Fourier transforms in various domains (such as 1D, spherical, and mesh-based), and the application of linear algebra to functions for tasks like compression and smoothing. The text also touches on the mathematical foundations of these concepts, including the Laplace operator, eigenfunctions, and orthonormal bases. It concludes with a list of further reading topics and acknowledges the contributions of reviewers.
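The summary mentions smoothing a function via its Fourier transform, treating a sampled function as a vector. A minimal 1D sketch with NumPy (the test signal and the frequency cutoff are arbitrary choices for illustration, not taken from the linked article):

```python
import numpy as np

# Sample a function on a grid: a sampled 1D function is just a vector.
n = 256
t = np.linspace(0, 2 * np.pi, n, endpoint=False)
f = np.sin(t) + 0.3 * np.sin(20 * t)  # smooth part + high-frequency wiggle

# Transform to the Fourier basis, zero out high-frequency coefficients,
# and transform back: a simple low-pass smoothing.
F = np.fft.fft(f)
F[10:-10] = 0  # keep only the lowest few frequency pairs (arbitrary cutoff)
smoothed = np.fft.ifft(F).real
```

Dropping small coefficients instead of high-frequency ones gives the compression variant of the same idea.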
blog.georgeshakan.com (2.7 parsecs away)
Principal Component Analysis (PCA) is a popular technique in machine learning for dimension reduction. It can be derived from the Singular Value Decomposition (SVD), which we will discuss in this post. We will cover the math, an example in Python, and finally some intuition. The Math: SVD asserts that any $m \times d$ matrix...
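The blurb above describes deriving PCA from the SVD; a minimal sketch of that connection in NumPy (the data here is synthetic and purely illustrative):

```python
import numpy as np

# Hypothetical data: 100 samples, 4 features (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))

# Center the data, then take the SVD: Xc = U @ diag(S) @ Vt.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Rows of Vt are the principal directions; projecting onto the
# top k of them gives the first k principal components.
k = 2
scores = Xc @ Vt[:k].T

# Squared singular values (over n - 1) are the variances explained.
explained = S**2 / (len(X) - 1)
```

The singular values arrive sorted in decreasing order, so taking the first `k` rows of `Vt` is exactly the dimension-reduction step.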
austinmorlan.com (2.0 parsecs away)
It took me longer than necessary to understand how a rotation transform matrix rotates a vector through three-dimensional space. Not because it's a difficult concept, but because it is often poorly explained in textbooks. Even the most explanatory book might derive the matrix for a rotation around one axis (e.g., x) but then present the other two matrices without showing their derivation. I'll explain my own understanding of their derivation in hopes that it will enlighten others that didn't catch on right a...
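The blurb above is about 3D rotation matrices; as a quick sanity check of the single-axis case it starts from, here is a sketch (not the linked post's code) of the standard rotation about x, verified on a unit vector:

```python
import numpy as np

def rot_x(theta):
    """Standard rotation matrix about the x axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

# Rotating the y unit vector by 90 degrees about x should land on z.
v = np.array([0.0, 1.0, 0.0])
rotated = rot_x(np.pi / 2) @ v
```

The matrices for rotations about y and z have the same 2x2 cosine/sine block, just placed in the rows and columns of the other two axes.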
www.robertkubinec.com (23.2 parsecs away)
Ordered beta regression can give you comparable, scale-free ATEs that can still be understood in the scale of the original data, all without using logs.