Auto-Encoding Variational Bayes

Bayes Rule
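
In standard VAE notation, with observation x, latent variable z, and generative parameters \theta, Bayes' rule gives the posterior over the latent variable in terms of the likelihood, the prior, and the marginal likelihood:

p_\theta(z \mid x) = \frac{p_\theta(x \mid z)\, p_\theta(z)}{p_\theta(x)}, \qquad p_\theta(x) = \int p_\theta(x \mid z)\, p_\theta(z)\, dz.

The integral defining p_\theta(x) is intractable for most models of interest, which is what motivates introducing an approximate posterior q_\phi(z \mid x) with variational parameters \phi.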
Derivation of KL Divergence
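
The Kullback-Leibler divergence between the approximate posterior and the true posterior is the expected log-ratio of the two densities under q_\phi:

D_{KL}\big(q_\phi(z \mid x)\,\|\,p_\theta(z \mid x)\big) = \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log \frac{q_\phi(z \mid x)}{p_\theta(z \mid x)}\right] = \int q_\phi(z \mid x)\,\log \frac{q_\phi(z \mid x)}{p_\theta(z \mid x)}\, dz.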
Proof of non-negativity of KL
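
Non-negativity follows from Jensen's inequality applied to the concave logarithm: for any two densities q and p over z,

-D_{KL}(q\,\|\,p) = \mathbb{E}_{q}\!\left[\log \frac{p(z)}{q(z)}\right] \le \log \mathbb{E}_{q}\!\left[\frac{p(z)}{q(z)}\right] = \log \int p(z)\, dz = \log 1 = 0,

so D_{KL}(q\,\|\,p) \ge 0, with equality if and only if q = p almost everywhere.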
Derivation of ELBO
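
Since \log p_\theta(x) does not depend on z, it equals its expectation under q_\phi(z \mid x); writing p_\theta(x) = p_\theta(x, z) / p_\theta(z \mid x) inside the logarithm and splitting the ratio with q_\phi gives the evidence lower bound (ELBO) plus the KL term above:

\log p_\theta(x) = \underbrace{\mathbb{E}_{q_\phi(z \mid x)}\!\left[\log \frac{p_\theta(x, z)}{q_\phi(z \mid x)}\right]}_{\mathcal{L}(\theta, \phi;\, x)} + D_{KL}\big(q_\phi(z \mid x)\,\|\,p_\theta(z \mid x)\big).

Because the KL term is non-negative, \mathcal{L}(\theta, \phi; x) \le \log p_\theta(x), so maximizing the ELBO pushes up a lower bound on the evidence.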
Another representation of ELBO
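
Expanding the joint as p_\theta(x, z) = p_\theta(x \mid z)\, p_\theta(z) gives the form that is actually optimized in a variational autoencoder, an expected reconstruction term minus a KL regularizer toward the prior:

\mathcal{L}(\theta, \phi; x) = \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big] - D_{KL}\big(q_\phi(z \mid x)\,\|\,p_\theta(z)\big).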
Objective function for model update
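
A sketch of the resulting optimization problem, assuming a dataset X = \{x^{(i)}\}_{i=1}^{N} of i.i.d. observations: the generative parameters \theta and the variational parameters \phi are updated jointly to maximize the summed ELBO, typically by minibatch stochastic gradient ascent,

\theta^{*}, \phi^{*} = \arg\max_{\theta,\, \phi} \sum_{i=1}^{N} \mathcal{L}\big(\theta, \phi;\, x^{(i)}\big).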
Reparameterization trick (figure source: https://stats.stackexchange.com/questions/429315/why-is-reparameterization-trick-necessary-for-variational-autoencoders)
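
For a Gaussian encoder q_\phi(z \mid x) = \mathcal{N}\big(z;\, \mu, \sigma^{2} I\big), the sample z is rewritten as a deterministic, differentiable function of \mu, \sigma, and an auxiliary noise variable, so that gradients can propagate through the sampling step to the encoder parameters:

z = \mu + \sigma \odot \epsilon, \qquad \epsilon \sim \mathcal{N}(0, I).

A minimal PyTorch-style sketch of this step (the function and argument names are illustrative, and it assumes the encoder outputs the mean and the log-variance):

import torch

def reparameterize(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    # z = mu + sigma * eps with eps ~ N(0, I); the randomness comes from eps,
    # so gradients flow through mu and logvar.
    std = torch.exp(0.5 * logvar)
    eps = torch.randn_like(std)
    return mu + std * eps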
Closed-form solution of the KL divergence (under the simplifying assumption that the variational distribution and the prior are univariate Gaussians)
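
With q_\phi(z \mid x) = \mathcal{N}(\mu, \sigma^{2}) and the standard normal prior p_\theta(z) = \mathcal{N}(0, 1), the KL term has the closed form

D_{KL}\big(\mathcal{N}(\mu, \sigma^{2})\,\|\,\mathcal{N}(0, 1)\big) = \tfrac{1}{2}\big(\mu^{2} + \sigma^{2} - \log \sigma^{2} - 1\big),

and more generally, for any two univariate Gaussians,

D_{KL}\big(\mathcal{N}(\mu_1, \sigma_1^{2})\,\|\,\mathcal{N}(\mu_2, \sigma_2^{2})\big) = \log \frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^{2} + (\mu_1 - \mu_2)^{2}}{2\sigma_2^{2}} - \frac{1}{2}.

Because this term is available analytically, only the reconstruction term has to be estimated by sampling.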
The resulting estimator for the model
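
Combining the closed-form KL with a Monte Carlo estimate of the reconstruction term, using L reparameterized samples per data point, gives the SGVB estimator of Kingma & Welling (2013) for a data point x^{(i)}:

\tilde{\mathcal{L}}\big(\theta, \phi;\, x^{(i)}\big) = -D_{KL}\big(q_\phi(z \mid x^{(i)})\,\|\,p_\theta(z)\big) + \frac{1}{L} \sum_{l=1}^{L} \log p_\theta\big(x^{(i)} \mid z^{(i,l)}\big), \qquad z^{(i,l)} = \mu^{(i)} + \sigma^{(i)} \odot \epsilon^{(l)}, \quad \epsilon^{(l)} \sim \mathcal{N}(0, I).

In practice L = 1 is often sufficient when the minibatch is reasonably large, and the negative of this estimator is used as the loss minimized by stochastic gradient descent.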
