This course is the next logical step in my deep learning, data science, and machine learning series. I've done a lot of courses about deep learning, and I just released a course about unsupervised learning, where I talked about clustering and density estimation. So what do you get when you put these two together? Unsupervised deep learning!
In this course we'll start with some very basic material: principal components analysis (PCA) and a popular nonlinear dimensionality reduction technique known as t-SNE (t-distributed stochastic neighbor embedding).
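As a rough illustration of these two techniques (a minimal sketch, not the course's own code), here is how PCA and t-SNE can be applied with scikit-learn; the digits dataset and the 2-dimensional target are just example choices:

```python
# Minimal sketch (illustrative only): PCA and t-SNE with scikit-learn.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

X, y = load_digits(return_X_y=True)

# Linear dimensionality reduction: project onto the top 2 principal components.
X_pca = PCA(n_components=2).fit_transform(X)

# Nonlinear dimensionality reduction: embed into 2 dimensions with t-SNE.
X_tsne = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)

print(X_pca.shape, X_tsne.shape)  # (1797, 2) (1797, 2)
```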
Next, we'll look at a special type of unsupervised neural network called the autoencoder. After describing how an autoencoder works, I'll show you how you can link a bunch of them together to form a deep stack of autoencoders that leads to better performance in a supervised deep neural network. Autoencoders are like a nonlinear form of PCA.
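To make the idea concrete, here is a minimal sketch of a single autoencoder in Keras; the layer sizes, dataset, and hyperparameters are illustrative assumptions, not the course's exact architecture:

```python
# Minimal sketch (illustrative only): a single autoencoder in Keras.
import numpy as np
from tensorflow.keras import layers, models

input_dim, hidden_dim = 784, 64  # e.g. flattened 28x28 images

# Encoder compresses the input; decoder reconstructs it.
inputs = layers.Input(shape=(input_dim,))
encoded = layers.Dense(hidden_dim, activation="relu")(inputs)
decoded = layers.Dense(input_dim, activation="sigmoid")(encoded)

autoencoder = models.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")

# Unsupervised training: the targets are the inputs themselves.
X = np.random.rand(1000, input_dim).astype("float32")  # placeholder data
autoencoder.fit(X, X, epochs=5, batch_size=64, verbose=0)

# The encoder's output can serve as learned features, or the weights can
# initialize the corresponding layer of a deeper supervised network.
encoder = models.Model(inputs, encoded)
features = encoder.predict(X, verbose=0)
```

Stacking amounts to training another autoencoder on `features`, and so on, then using the trained encoders to initialize a supervised network.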
Last, we'll look at restricted Boltzmann machines (RBMs). These are yet another popular unsupervised neural network that you can use in the same way as autoencoders to pretrain your supervised deep neural network. I'll show you an interesting way of training restricted Boltzmann machines, known as Gibbs sampling, a special case of Markov chain Monte Carlo, and I'll demonstrate how, even though this method is only a rough approximation, it still ends up reducing other cost functions, such as the one used for autoencoders. This method is also known as contrastive divergence or CD-k. As in physical systems, we define a quantity called free energy and attempt to minimize it.
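For a flavor of what CD-k looks like, here is a minimal NumPy sketch of one CD-1 update for a binary RBM (a single Gibbs step); the sizes, learning rate, and placeholder data are assumptions for illustration, not the course's implementation:

```python
# Minimal sketch (illustrative only): one contrastive-divergence (CD-1) step for a binary RBM.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden, lr = 784, 128, 0.01
W = rng.normal(0, 0.01, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible bias
b_h = np.zeros(n_hidden)    # hidden bias

def cd1_step(v0):
    """Compute CD-1 gradient estimates for a batch of binary visible vectors v0."""
    # Positive phase: sample hidden units given the data.
    h0_prob = sigmoid(v0 @ W + b_h)
    h0_samp = (rng.random(h0_prob.shape) < h0_prob).astype(float)

    # Negative phase: one Gibbs step back to the visible layer and up again.
    v1_prob = sigmoid(h0_samp @ W.T + b_v)
    h1_prob = sigmoid(v1_prob @ W + b_h)

    # Approximate log-likelihood gradient: data statistics minus model statistics
    # (roughly, pushing the free energy of data down and of reconstructions up).
    batch = v0.shape[0]
    dW = (v0.T @ h0_prob - v1_prob.T @ h1_prob) / batch
    return dW, (v0 - v1_prob).mean(axis=0), (h0_prob - h1_prob).mean(axis=0)

v0 = (rng.random((64, n_visible)) < 0.5).astype(float)  # placeholder binary batch
dW, db_v, db_h = cd1_step(v0)
W += lr * dW; b_v += lr * db_v; b_h += lr * db_h
```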
Instructor Details
Courses: 22
Specification: Unsupervised Deep Learning in Python
2 reviews for Unsupervised Deep Learning in Python
| Price | $16.99 |
| --- | --- |
| Provider | |
| Duration | 10.5 hours |
| Year | 2020 |
| Level | Intermediate |
| Language | English |
| Certificate | Yes |
| Quizzes | No |
Vikram Hegde –
Yes, exactly what I was looking for.
Andrea Perlato –
Well explained!