francisbach.com
www.jeremykun.com
This post is a sequel to Formulating the Support Vector Machine Optimization Problem. The Karush-Kuhn-Tucker theorem: Generic optimization problems are hard to solve efficiently. However, optimization problems whose objective and constraints have special structure often succumb to analytic simplifications. For example, if you want to optimize a linear function subject to linear equality constraints, you can compute the Lagrangian of the system and find the zeros of its gradient. More generally, optimizing...
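As a quick illustration of that Lagrangian recipe (a minimal sketch, not code from the post; the instance c, A, b below is made up, with c chosen in the row space of A so a finite minimum exists):

import sympy as sp

# Minimize c^T x subject to A x = b. Hypothetical instance: c is chosen in
# the row space of A; otherwise the linear objective is unbounded on the
# feasible set and the gradient of the Lagrangian has no zero.
x1, x2, x3, l1, l2 = sp.symbols('x1 x2 x3 lambda1 lambda2')
x = sp.Matrix([x1, x2, x3])
lam = sp.Matrix([l1, l2])
c = sp.Matrix([1, 2, 3])
A = sp.Matrix([[1, 0, 1], [0, 1, 1]])
b = sp.Matrix([1, 2])

# The Lagrangian of the system: L(x, lambda) = c^T x + lambda^T (A x - b).
L = (c.T * x + lam.T * (A * x - b))[0]

# Find the zeros of its gradient: stationarity in x gives c + A^T lambda = 0,
# and stationarity in lambda recovers the constraint A x = b.
grad = [sp.diff(L, v) for v in list(x) + list(lam)]
print(sp.solve(grad, list(x) + list(lam), dict=True))
# e.g. [{x1: 1 - x3, x2: 2 - x3, lambda1: -1, lambda2: -2}]
# lambda is pinned down uniquely; x only up to the feasible line, since a
# linear objective is constant on {x : A x = b} whenever c = A^T mu.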
fa.bianp.net
There's a fascinating link between minimization of quadratic functions and polynomials, a link that goes deep and allows us to phrase optimization problems in the language of polynomials and vice versa. Using this connection, we can tap into centuries of research in the theory of polynomials and shed new light on ...
jxmo.io
A primer on variational autoencoders (VAEs) culminating in a PyTorch implementation of a VAE with discrete latents.
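The primer's own implementation isn't reproduced here, but a rough sketch of the shape such a model can take (using the common Gumbel-softmax relaxation for discrete latents; every layer size, name, and hyperparameter below is an assumption, not taken from the post):

import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiscreteVAE(nn.Module):
    # VAE whose latent is num_latents categorical variables with num_classes
    # categories each, relaxed via Gumbel-softmax so sampling stays
    # differentiable end to end.
    def __init__(self, input_dim=784, num_latents=20, num_classes=10):
        super().__init__()
        self.num_latents, self.num_classes = num_latents, num_classes
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256), nn.ReLU(),
            nn.Linear(256, num_latents * num_classes))
        self.decoder = nn.Sequential(
            nn.Linear(num_latents * num_classes, 256), nn.ReLU(),
            nn.Linear(256, input_dim))

    def forward(self, x, temperature=1.0):
        logits = self.encoder(x).view(-1, self.num_latents, self.num_classes)
        # Differentiable approximate one-hot sample from each categorical.
        z = F.gumbel_softmax(logits, tau=temperature)
        recon_logits = self.decoder(z.flatten(start_dim=1))
        return recon_logits, logits

def elbo_loss(recon_logits, x, logits):
    # Reconstruction term: Bernoulli likelihood for inputs scaled to [0, 1].
    rec = F.binary_cross_entropy_with_logits(recon_logits, x, reduction='sum')
    # KL(q(z|x) || uniform categorical prior), in closed form:
    # sum_k q_k (log q_k + log K).
    log_q = F.log_softmax(logits, dim=-1)
    kl = (log_q.exp() * (log_q + math.log(logits.size(-1)))).sum()
    return rec + kl

# Usage on a dummy batch:
model = DiscreteVAE()
x = torch.rand(8, 784)
recon_logits, logits = model(x)
elbo_loss(recon_logits, x, logits).backward()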