Explore >> Select a destination


You are here: francisbach.com

jeremykun.wordpress.com (7.2 parsecs away)

This post is a sequel to Formulating the Support Vector Machine Optimization Problem. The Karush-Kuhn-Tucker theorem: generic optimization problems are hard to solve efficiently. However, optimization problems whose objective and constraints have special structure often succumb to analytic simplifications. For example, if you want to optimize a linear function subject to linear equality constraints, one can compute...
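The excerpt invokes the Karush-Kuhn-Tucker theorem; as standard background (the textbook statement, not text recovered from the linked post): for the problem \(\min_x f(x)\) subject to \(g_i(x) \le 0\) and \(h_j(x) = 0\), a regular local minimizer \(x^*\) admits multipliers \(\mu_i \ge 0\) and \(\lambda_j\) satisfying

\[
\nabla f(x^*) + \sum_i \mu_i \nabla g_i(x^*) + \sum_j \lambda_j \nabla h_j(x^*) = 0,
\qquad \mu_i\, g_i(x^*) = 0 \ \text{for all } i.
\]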
blogs.princeton.edu (7.0 parsecs away)

Sum of squares optimization is an active area of research at the interface of algorithmic algebra and convex optimization. Over the last decade, it has made significant impact on both d...
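One line of background on what "sum of squares optimization" means (standard material, not taken from the linked post): a polynomial \(p\) is a sum of squares precisely when it admits a Gram-matrix certificate,

\[
p(x) = \sum_k q_k(x)^2 \iff p(x) = z(x)^\top Q\, z(x) \ \text{for some } Q \succeq 0,
\]

where \(z(x)\) is a vector of monomials; searching for such a \(Q\) is a semidefinite program, which is what ties the algebra to convex optimization.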
blog.omega-prime.co.uk (5.5 parsecs away)

The most fundamental technique in statistical learning is ordinary least squares (OLS) regression. If we have a vector of observations \(y\) and a matrix of features associated with each observation \(X\), then we assume the observations are a linear function of the features plus some (iid) random noise, \(\epsilon\):
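The excerpt cuts off at the colon; the display it is leading into is presumably the linear model, together with its classical closed-form solution (standard results, assuming \(X\) has full column rank):

\[
y = X\beta + \epsilon, \qquad \hat{\beta} = \arg\min_\beta \|y - X\beta\|_2^2 = (X^\top X)^{-1} X^\top y.
\]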
xcorr.net (65.9 parsecs away)

2022 was the year of generative AI models: DALL-E 2, MidJourney, Stable Diffusion, and Imagen all showed that it's possible to generate grounded, photorealistic images. These generative AIs are instances of conditional denoising diffusion probabilistic models, or DDPMs. Despite these flashy applications, DDPMs have thus far had little impact on neuroscience.
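For reference, the forward (noising) process that defines a DDPM (the standard formulation, not text from the linked post) gradually corrupts data \(x_0\) with Gaussian noise,

\[
q(x_t \mid x_{t-1}) = \mathcal{N}\!\left(x_t;\ \sqrt{1-\beta_t}\, x_{t-1},\ \beta_t I\right), \qquad t = 1, \dots, T,
\]

and the generative model is trained to reverse this chain step by step.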