www.moxleystratton.com
r2rt.com
www.paepper.com
[AI summary] This article explains how to train a simple neural network using NumPy in Python, without relying on frameworks such as TensorFlow or PyTorch, focusing on the implementation of ReLU activation, weight initialization, and gradient descent for optimization.
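The article's own code is not reproduced here, but the three ingredients its summary names (ReLU activation, weight initialization, gradient descent) can be sketched in a few lines of plain NumPy. The toy task and all hyperparameters below are illustrative assumptions, not taken from the article:

```python
import numpy as np

# Toy regression task: learn y = x1 + x2 from random inputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 2))
y = X.sum(axis=1, keepdims=True)

# Weight initialization: small random values keep early activations stable.
W1 = rng.normal(scale=0.5, size=(2, 16))
b1 = np.zeros((1, 16))
W2 = rng.normal(scale=0.5, size=(16, 1))
b2 = np.zeros((1, 1))

lr = 0.05
for step in range(500):
    # Forward pass with ReLU activation.
    z1 = X @ W1 + b1
    a1 = np.maximum(z1, 0.0)          # ReLU
    pred = a1 @ W2 + b2
    loss = np.mean((pred - y) ** 2)   # mean squared error

    # Backward pass: gradients written out by hand, no autograd.
    grad_pred = 2.0 * (pred - y) / len(X)
    grad_W2 = a1.T @ grad_pred
    grad_b2 = grad_pred.sum(axis=0, keepdims=True)
    grad_a1 = grad_pred @ W2.T
    grad_z1 = grad_a1 * (z1 > 0)      # ReLU derivative is 0/1 mask
    grad_W1 = X.T @ grad_z1
    grad_b1 = grad_z1.sum(axis=0, keepdims=True)

    # Gradient descent update.
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1
    W2 -= lr * grad_W2
    b2 -= lr * grad_b2

print(f"final loss: {loss:.4f}")
```

The same forward/backward pattern scales to deeper networks; frameworks like PyTorch only automate the backward pass that is written out manually here.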
aimatters.wordpress.com
A few weeks ago, it was announced that Keras would be getting official Google support and would become part of the TensorFlow machine learning library. Keras is a collection of high-level APIs in Python for creating and training neural networks, using either Theano or TensorFlow as the underlying engine. Given my previous posts on implementing an...
jaketae.github.io
In this short post, we will take a look at the variational lower bound, also referred to as the evidence lower bound, or ELBO for short. While I have referenced the ELBO in a previous blog post on VAEs, the proofs and formulations presented there seem somewhat overly convoluted in retrospect. One might consider this a gentler, more refined recap on the topic. For the remainder of this post, I will use the terms "variational lower bound" and "ELBO" interchangeably to refer to the same concept. I was heavily inspired by Hugo Larochelle's excellent lecture on deep belief networks.
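The ELBO the post refers to has a compact standard statement; as one sketch (not necessarily the post's exact notation), for a latent-variable model $p(x, z)$ and any variational distribution $q(z)$, Jensen's inequality gives

```latex
\log p(x)
  = \log \int p(x, z)\, dz
  = \log \mathbb{E}_{q(z)}\!\left[\frac{p(x, z)}{q(z)}\right]
  \;\ge\; \mathbb{E}_{q(z)}\!\left[\log \frac{p(x, z)}{q(z)}\right]
  \;=:\; \mathcal{L}(q),
```

and the gap between the two sides is exactly $\mathrm{KL}\!\left(q(z) \,\|\, p(z \mid x)\right)$, so maximizing the ELBO $\mathcal{L}(q)$ over $q$ simultaneously tightens the bound and drives $q$ toward the true posterior.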