jaykmody.com
www.johndcook.com
The most obvious way to compute the soft maximum can easily fail due to overflow or underflow. The soft maximum of x and y is defined by g(x, y) = log(exp(x) + exp(y)). The most obvious way to turn the definition above into C code would be:

    double SoftMaximum(double x, double y)
    {
        return log(exp(x) + exp(y));
    }
jxmo.io
A primer on variational autoencoders (VAEs), culminating in a PyTorch implementation of a VAE with discrete latents.
jeremykun.wordpress.com
There are two basic problems in information theory that are very easy to explain. Two people, Alice and Bob, want to communicate over a digital channel over some long period of time, and they know ahead of time the probability that certain messages will be sent. For example, English-language sentences are more likely than...
sander.ai
More thoughts on diffusion guidance, with a focus on its geometry in the input space.