
Generative Adversarial Networks and Variational Autoencoders in Python, Theano, and Tensorflow

What you will learn

Learn the basic principles of generative models

Build a variational autoencoder in Theano and Tensorflow

Build a GAN (Generative Adversarial Network) in Theano and Tensorflow


Variational autoencoders and GANs are two of the most interesting recent developments in deep learning and machine learning.

Yann LeCun, a deep learning pioneer, has said that the most important development in recent years has been adversarial training, referring to GANs.

GAN stands for generative adversarial network, a setup in which two neural networks compete with each other.
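That competition is usually expressed as a pair of opposing losses: the discriminator D tries to tell real samples from generated ones, while the generator G tries to fool it. Here is a minimal sketch in plain numpy, assuming the original binary cross-entropy formulation (the helper names are my own, not from the course):

```python
import numpy as np

def discriminator_loss(d_real, d_fake):
    # The discriminator wants D(real) -> 1 and D(fake) -> 0.
    return -np.mean(np.log(d_real)) - np.mean(np.log(1.0 - d_fake))

def generator_loss(d_fake):
    # The generator wants D(fake) -> 1, i.e. to fool the discriminator
    # (the "non-saturating" form commonly used in practice).
    return -np.mean(np.log(d_fake))

# A discriminator that separates real from fake achieves low loss...
confident = discriminator_loss(d_real=np.array([0.9]), d_fake=np.array([0.1]))
# ...while a fooled discriminator (everything looks 50/50) does not.
fooled = discriminator_loss(d_real=np.array([0.5]), d_fake=np.array([0.5]))
print(confident < fooled)
```

During training, the two networks take alternating gradient steps, each descending its own loss. That tug-of-war is exactly the competition described above.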

What is unsupervised learning?

Unsupervised learning means we're not trying to map input data to targets; instead, we're trying to learn the structure of the input data itself.
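As a toy illustration (my own example, not from the course): fit a Gaussian to unlabeled 1-D data by maximum likelihood, then sample from the fitted model. There are no targets anywhere; we only learn the data's structure, and that structure is enough to generate new points.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled "training data": 1-D points from an unknown distribution.
data = rng.normal(loc=5.0, scale=2.0, size=10_000)

# Learn the structure of the data: here, just its mean and spread
# (the maximum-likelihood fit of a Gaussian -- no labels involved).
mu_hat = data.mean()
sigma_hat = data.std()

# Once the structure is learned, we can generate new data that
# resembles the original.
new_samples = rng.normal(loc=mu_hat, scale=sigma_hat, size=5)
```

A VAE or GAN does the same thing in spirit, but with neural networks standing in for the simple Gaussian.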

Once we’ve learned that structure, we can do some pretty cool things.

One example is generating poetry – we’ve done examples of this in the past.

But poetry is a very specific thing. What about writing in general?

If we can learn the structure of language, we can generate any kind of text. In fact, big companies are investing heavily in research on having machines write the news.

But what if we go back to poetry and take away the words?

Well then we get art, in general.

By learning the structure of art, we can create more art.

How about art as sound?


If we learn the structure of music, we can create new music.

Imagine the top 40 hits you hear on the radio are songs written by robots rather than humans.

The possibilities are endless!

You might be wondering, “how is this course different from the first unsupervised deep learning course?”

In that first course, we also tried to learn the structure of data, but for a different reason.

We wanted to learn the structure of data in order to improve supervised training, which we demonstrated was possible.

In this new course, we want to learn the structure of data in order to produce more stuff that resembles the original data.

This by itself is really cool, but we’ll also be incorporating ideas from Bayesian Machine Learning, Reinforcement Learning, and Game Theory. That makes it even cooler!

Thanks for reading and I’ll see you in class. =)

“If you can’t implement it, you don’t understand it”

  • Or as the great physicist Richard Feynman said: “What I cannot create, I do not understand”.
  • My courses are the ONLY courses where you will learn how to implement machine learning algorithms from scratch
  • Other courses will teach you how to plug in your data into a library, but do you really need help with 3 lines of code?
  • After doing the same thing with 10 datasets, you realize you didn’t learn 10 things. You learned 1 thing, and just repeated the same 3 lines of code 10 times…

Suggested Prerequisites:

  • Calculus
  • Probability
  • Object-oriented programming
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations
  • Linear regression
  • Gradient descent
  • Know how to build a feedforward and convolutional neural network in Theano or TensorFlow


  • Check out the lecture “Machine Learning and AI Prerequisite Roadmap” (available in the FAQ of any of my courses, including the free Numpy course)


Introduction and Outline
  • Where does this course fit into your deep learning studies?
  • Where to get the code and data
  • How to succeed in this course

Generative Modeling Review
  • What does it mean to Sample?
  • Sampling Demo: Bayes Classifier
  • Gaussian Mixture Model Review
  • Sampling Demo: Bayes Classifier with GMM
  • Why do we care about generating samples?
  • Neural Network and Autoencoder Review
  • Tensorflow Warmup
  • Theano Warmup

Variational Autoencoders
  • Variational Autoencoders Section Introduction
  • Variational Autoencoder Architecture
  • Parameterizing a Gaussian with a Neural Network
  • The Latent Space, Predictive Distributions and Samples
  • Cost Function
  • Tensorflow Implementation (pt 1)
  • Tensorflow Implementation (pt 2)
  • Tensorflow Implementation (pt 3)
  • The Reparameterization Trick
  • Theano Implementation
  • Visualizing the Latent Space
  • Bayesian Perspective
  • Variational Autoencoder Section Summary

Generative Adversarial Networks (GANs)
  • GAN – Basic Principles
  • GAN Cost Function (pt 1)
  • GAN Cost Function (pt 2)
  • Batch Normalization Review
  • Fractionally-Strided Convolution
  • Tensorflow Implementation Notes
  • Tensorflow Implementation
  • Theano Implementation Notes
  • Theano Implementation
  • GAN Summary

Theano and Tensorflow Basics Review
  • (Review) Theano Basics
  • (Review) Theano Neural Network in Code
  • (Review) Tensorflow Basics
  • (Review) Tensorflow Neural Network in Code

Appendix / FAQ
  • What is the Appendix?
  • Windows-Focused Environment Setup 2018
  • How to install Numpy, Theano, Tensorflow, etc…
  • How to Succeed in this Course (Long Version)
  • How to Code by Yourself (part 1)
  • How to Code by Yourself (part 2)
  • Proof that using Jupyter Notebook is the same as not using it
  • Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced?
  • Python 2 vs Python 3
  • Is Theano Dead?
  • What order should I take your courses in? (part 1)
  • What order should I take your courses in? (part 2)
  • BONUS: Where to get discount coupons and FREE deep learning material