www.nicktasios.nl
In the Latent Diffusion Series of blog posts, I'm going through all components needed to train a latent diffusion model to generate random digits from the MNIST dataset. In the second post, we will bu…

jxmo.io
A primer on variational autoencoders (VAEs) culminating in a PyTorch implementation of a VAE with discrete latents.

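The discrete-latent construction the primer builds up to can be hard to picture from a one-line description. As a rough orientation only (this is my own minimal sketch, not code from the post), the snippet below trains one step of a categorical-latent VAE with the Gumbel-softmax relaxation in PyTorch; the layer sizes, the number of latent categories, and the uniform prior are illustrative assumptions.

```python
# Minimal sketch of a VAE with discrete (categorical) latents, trained via the
# Gumbel-softmax relaxation. Sizes and architecture are illustrative assumptions,
# not the code from the linked post.
import torch
import torch.nn as nn
import torch.nn.functional as F

N_LATENTS, N_CATS = 20, 10   # 20 categorical latent variables, 10 categories each

class DiscreteVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(784, 256), nn.ReLU(),
            nn.Linear(256, N_LATENTS * N_CATS),   # logits for each latent variable
        )
        self.decoder = nn.Sequential(
            nn.Linear(N_LATENTS * N_CATS, 256), nn.ReLU(),
            nn.Linear(256, 784),                  # Bernoulli logits per pixel
        )

    def forward(self, x, tau=1.0):
        logits = self.encoder(x).view(-1, N_LATENTS, N_CATS)
        # Differentiable approximate sample from the categorical posterior.
        z = F.gumbel_softmax(logits, tau=tau, hard=False)
        recon_logits = self.decoder(z.view(-1, N_LATENTS * N_CATS))

        # Reconstruction term: Bernoulli log-likelihood of the input pixels.
        recon = F.binary_cross_entropy_with_logits(
            recon_logits, x, reduction="sum") / x.size(0)
        # KL from the categorical posterior q(z|x) to a uniform prior over N_CATS.
        q = F.softmax(logits, dim=-1)
        kl = (q * (torch.log(q + 1e-10) - torch.log(torch.tensor(1.0 / N_CATS)))) \
                .sum(dim=(1, 2)).mean()
        return recon + kl

# One optimisation step on a dummy batch of flattened 28x28 images.
model = DiscreteVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(32, 784)          # stand-in for binarised MNIST digits
opt.zero_grad()
loss = model(x)
loss.backward()
opt.step()
```

Setting hard=True in F.gumbel_softmax yields straight-through one-hot samples, which is closer in spirit to the fully discrete codes the primer is after.
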
blog.keras.io
[AI summary] The post walks through several types of autoencoders and their applications. It starts with basic autoencoders, then moves on to sparse, deep, and sequence-to-sequence autoencoders. It also covers variational autoencoders (VAEs), explaining their structure and training process. Code examples accompany each type of autoencoder, with tools like TensorBoard used for visualization. The VAE section shows how to generate new data samples and visualize the latent space. The post concludes with references and a note about possible further topics.

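For a sense of how little code the tutorial's starting point takes, here is a minimal sketch of a single-bottleneck autoencoder on MNIST in Keras. It follows the shape of the tutorial's first example (a 32-dimensional code), but treat it as an approximation rather than the post's exact listing.

```python
# Minimal sketch of a fully connected autoencoder compressing flattened MNIST
# digits to a small code; layer sizes follow the tutorial's basic example.
from tensorflow import keras
from tensorflow.keras import layers

encoding_dim = 32                                  # size of the bottleneck code

inputs = keras.Input(shape=(784,))
encoded = layers.Dense(encoding_dim, activation="relu")(inputs)
decoded = layers.Dense(784, activation="sigmoid")(encoded)
autoencoder = keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")

# Train on MNIST pixels scaled to [0, 1]; the digit labels are unused.
(x_train, _), (x_test, _) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0
autoencoder.fit(x_train, x_train, epochs=5, batch_size=256,
                shuffle=True, validation_data=(x_test, x_test))
```
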
www.3blue1brown.com
An overview of what a neural network is, introduced in the context of recognizing hand-written digits.