The ‘Probabilistic Deep Learning with TensorFlow 2’ online course covers building probabilistic models with TensorFlow and incorporating probability distributions into deep learning models such as Bayesian neural networks, normalizing flows, and variational autoencoders. The certification course is offered on the Coursera online education platform, and the course modules are developed by Imperial College London.
This program is an overview of the TensorFlow Probability library and is scheduled to be completed in fifty-two hours. The classes are guided by Dr. Kevin Webster (Senior Teaching Fellow in Statistics) of the Department of Mathematics, Faculty of Natural Sciences, Imperial College London.
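By way of illustration only (this snippet is not part of the course materials, and assumes the tensorflow and tensorflow_probability packages are installed), the TensorFlow Probability library covered in the course represents distributions as objects that can be sampled and queried for log probabilities:

import tensorflow_probability as tfp

tfd = tfp.distributions

# A univariate Normal distribution object, the kind of building block the
# course starts from.
normal = tfd.Normal(loc=0., scale=1.)

# Draw samples and evaluate their log probabilities.
samples = normal.sample(5)
log_probs = normal.log_prob(samples)

print(samples.numpy())
print(log_probs.numpy())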
The ‘Probabilistic Deep Learning with TensorFlow 2’ course offers students a course completion certificate and the flexibility to learn at their preferred pace. Students are engaged through learning methodologies such as video lectures, graded assignments, and quizzes for evaluation.
The fee structure for the ‘Probabilistic Deep Learning with TensorFlow 2’ course is as follows:
Duration: Amount in INR
1 month: Rs. 4,117
3 months: Rs. 8,234
6 months: Rs. 12,352
Eligibility Criteria
The ‘Probabilistic Deep Learning with TensorFlow 2’ online certification course requires students to know the fundamentals of machine learning, understand the deep learning domain, and have a working knowledge of probability and statistics.
Certificate qualifying details
Students of the ‘Probabilistic Deep Learning with TensorFlow 2’ certification by Coursera will receive a course certificate from Imperial College London after successfully completing the online classes, graded quizzes, and graded programming assignments.
What you will learn
Knowledge of deep learning
Programming skills
Mathematical skills
Statistical skills
The ‘Probabilistic Deep Learning with TensorFlow 2’ certification syllabus is designed to help students gain professional skills and knowledge of probabilistic neural networks, aspects of deep learning, generative models, TensorFlow techniques, and probabilistic programming languages (PPL). By the end of the training program, students will be able to build a variational autoencoder that acts as a generative model for a synthetic image dataset they create themselves.
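As a purely illustrative sketch of what such a variational autoencoder can look like when assembled with tfp.layers (this is not the course's own solution; the image size, layer widths, and dataset below are assumed for illustration):

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions
tfpl = tfp.layers

latent_dim = 2            # assumed latent dimensionality
event_shape = (28, 28)    # assumed synthetic image size

# Standard Normal prior over the latent space.
prior = tfd.Independent(tfd.Normal(loc=tf.zeros(latent_dim), scale=1.),
                        reinterpreted_batch_ndims=1)

# Encoder: maps an image to an approximate posterior over the latent code,
# with a KL penalty against the prior added as an activity regulariser.
encoder = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=event_shape),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(tfpl.IndependentNormal.params_size(latent_dim)),
    tfpl.IndependentNormal(
        latent_dim,
        activity_regularizer=tfpl.KLDivergenceRegularizer(prior)),
])

# Decoder: maps a latent code to a distribution over pixels.
decoder = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(latent_dim,)),
    tf.keras.layers.Dense(tfpl.IndependentBernoulli.params_size(event_shape)),
    tfpl.IndependentBernoulli(event_shape),
])

vae = tf.keras.Model(inputs=encoder.inputs,
                     outputs=decoder(encoder.outputs[0]))

# Minimising the negative log-likelihood together with the KL regulariser
# maximises the evidence lower bound (ELBO).
nll = lambda x, rv_x: -rv_x.log_prob(x)
vae.compile(optimizer='adam', loss=nll)
# vae.fit(x_train, x_train, epochs=10)   # x_train: synthetic images in [0, 1]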
The ‘Probabilistic Deep Learning with TensorFlow 2’ online certification course is designed for students, research associates, and industry professionals in the domain who wish to deepen their knowledge of the concepts and techniques involved in deep learning with TensorFlow.
Admission Details
The registration process for the ‘Probabilistic Deep Learning with TensorFlow 2’ online classes is completed through the course website as follows:
Step 1: Visit the official course page on Coursera.
Step 2: Click the ‘Enroll’ option on the course page.
Step 3: Sign in to an existing Coursera account or create a new one.
Step 4: Complete the registration and join the course.
Application Details
Applicants for the ‘Probabilistic Deep Learning with TensorFlow 2’ online program should enter their relevant details on the registration page, such as the name, email address, username, and password of their Coursera account, or create a new account.
They can also sign in using a valid Google, Facebook, or Microsoft account.
The Syllabus
Videos
Welcome to Probabilistic Deep Learning with TensorFlow 2
Interview with Paige Bailey
The TensorFlow Probability library
Univariate distributions
[Coding tutorial] Univariate distributions
Multivariate distributions
[Coding tutorial] Multivariate distributions
The Independent distribution
[Coding tutorial] The Independent distribution
Sampling and log probs
[Coding tutorial] Sampling and log probs
Trainable distributions
[Coding tutorial] Trainable distributions
Wrap up and introduction to the programming assignment
Readings
About Imperial College & the team
How to be successful in this course
Grading policy
Additional readings & helpful references
Quiz
Standard distributions
Programming Assignment
Naive Bayes and logistic regression
Discussion Prompt
Introduce yourself
Ungraded Labs
Univariate distributions
Multivariate distributions
Multivariate Gaussian with full covariance
The Independent distribution
Broadcasting rules
Sampling and log probs
Trainable distributions
Naive Bayes and logistic regression
Plugin
Pre-Course Survey
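The week 1 items above culminate in trainable distributions; as a rough, hedged illustration (not taken from the course notebooks, with a toy dataset assumed only for the example), a distribution's parameters can be stored in tf.Variables and fitted by maximising the log-likelihood of data:

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Toy data drawn from a Normal distribution with "unknown" parameters.
data = tf.random.normal([1000], mean=3.0, stddev=0.5)

# A trainable distribution: the location and (unconstrained) scale are
# tf.Variables that gradient descent can update.
loc = tf.Variable(0.)
unconstrained_scale = tf.Variable(1.)
optimizer = tf.keras.optimizers.Adam(learning_rate=0.05)

for _ in range(200):
    with tf.GradientTape() as tape:
        dist = tfd.Normal(loc=loc, scale=tf.nn.softplus(unconstrained_scale))
        # Negative log-likelihood of the data under the current parameters.
        loss = -tf.reduce_mean(dist.log_prob(data))
    grads = tape.gradient(loss, [loc, unconstrained_scale])
    optimizer.apply_gradients(zip(grads, [loc, unconstrained_scale]))

# The fitted parameters should approach the data-generating values.
print(loc.numpy(), tf.nn.softplus(unconstrained_scale).numpy())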
Videos
Welcome to week 2 - Probabilistic layers and Bayesian neural networks
The need for uncertainty in deep learning models
The DistributionLambda layer
[Coding tutorial] The DistributionLambda layer
Probabilistic layers
[Coding tutorial] Probabilistic layers
The DenseVariational layer
[Coding tutorial] The DenseVariational layer
Reparameterization layers
[Coding tutorial] Reparameterization layers
Wrap up and introduction to the programming assignment
Quiz
Sources of uncertainty
Programming Assignment
Bayesian convolutional neural network
Ungraded Labs
Maximum likelihood estimation
The DistributionLambda layer
Probabilistic layers
Bayes by backprop
The DenseVariational layer
Reparameterization layers
Bayesian convolutional neural network
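As a loose illustration of the week 2 material above (the DistributionLambda layer and the need for uncertainty in predictions), a Keras model can end in a distribution-valued output layer and be trained with a negative log-likelihood loss; the architecture and data below are assumed purely for illustration and are not the course's own:

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions
tfpl = tfp.layers

# A regression model whose output is a Normal distribution rather than a
# point estimate, so the network expresses uncertainty in its predictions.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(1,)),
    tf.keras.layers.Dense(2),   # two parameters: mean and pre-softplus scale
    tfpl.DistributionLambda(
        lambda t: tfd.Normal(loc=t[..., :1],
                             scale=tf.nn.softplus(t[..., 1:]))),
])

# Train by minimising the negative log-likelihood of the targets under the
# predicted distribution.
nll = lambda y, p_y: -p_y.log_prob(y)
model.compile(optimizer='adam', loss=nll)
# model.fit(x_train, y_train, epochs=100)   # hypothetical 1-D regression data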
Videos
Welcome to week 3 - Bijectors and normalising flows
Interview with Doug Kelly
Bijectors
[Coding tutorial] Bijectors
The TransformedDistribution class
[Coding tutorial] The TransformedDistribution class
Subclassing bijectors
[Coding tutorial] Subclassing bijectors
Autoregressive flows
RealNVP
[Coding tutorial] Normalising flows
Wrap up and introduction to the programming assignment
Quiz
Change of variables formula
Programming Assignment
RealNVP
Ungraded Labs
Change of variables formula
Bijectors
Scale bijectors and LinearOperator
The TransformedDistribution class
Subclassing bijectors
Autoregressive flows and RealNVP
Normalising flows
RealNVP
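To ground the week 3 topics above (bijectors and the TransformedDistribution class), the following small sketch, again not drawn from the course materials, pushes a base Normal distribution through an affine bijector so that its log probability applies the change of variables formula automatically:

import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

# A bijector is an invertible transformation with a tractable Jacobian.
# Chain applies right to left, so forward(x) = 2 * x + 1 here.
scale_and_shift = tfb.Chain([tfb.Shift(1.0), tfb.Scale(2.0)])

# Pushing a base Normal through the bijector yields a new distribution whose
# log_prob accounts for the transformation automatically.
base = tfd.Normal(loc=0., scale=1.)
transformed = tfd.TransformedDistribution(distribution=base,
                                          bijector=scale_and_shift)

x = transformed.sample(3)
print(transformed.log_prob(x).numpy())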
Videos
Welcome to week 4 - Variational autoencoders
Encoders and decoders
[Coding tutorial] Encoders and decoders
Minimising KL divergence
[Coding tutorial] Minimising KL divergence
Maximising the ELBO
[Coding tutorial] Maximising the ELBO
KL divergence layers
[Coding tutorial] KL divergence layers
Wrap up and introduction to the programming assignment
Quiz
Variational autoencoders
Programming Assignment
Variational autoencoder for CelebA
Ungraded Labs
Variational autoencoders
Encoders and decoders
Kullback-Leibler divergence
Minimising KL divergence
Full covariance Gaussian approximation
Maximising the ELBO
KL divergence layers
Variational autoencoder for CelebA
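The week 4 material above centres on the Kullback-Leibler divergence used when training variational autoencoders; as a minimal, hedged example (not course material), TensorFlow Probability exposes closed-form KL divergences between distribution objects:

import tensorflow_probability as tfp

tfd = tfp.distributions

# Two univariate Normals; their KL divergence is available in closed form.
# This is the quantity a VAE's encoder is regularised to keep small
# against the prior.
q = tfd.Normal(loc=1., scale=2.)   # e.g. an approximate posterior
p = tfd.Normal(loc=0., scale=1.)   # e.g. a standard Normal prior

print(tfd.kl_divergence(q, p).numpy())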
Videos
Welcome to the Capstone Project
Goodbye video
Peer Review
Capstone Project
Ungraded Lab
Capstone Project
Plugin
Post-Course Survey
Instructors
Dr. Kevin Webster, Imperial College London
Frequently Asked Questions (FAQs)
1: What is the duration of the ‘Probabilistic Deep Learning with TensorFlow 2’ online course?
The course is fifty-two hours long.
2: Can I learn the ‘Probabilistic Deep Learning with TensorFlow 2’ certification course on my own?
Yes, you can learn the course on your own.
3: Which education platform offers the ‘Probabilistic Deep Learning with TensorFlow 2’ online certification course?
The course is offered on the Coursera platform, and the course content is developed by Imperial College London.
4: What are the prerequisites for the ‘Probabilistic Deep Learning with TensorFlow 2’ training?
This advanced program requires students to know the concepts of Python, machine learning, deep learning, statistics, and probability.
5: Who will issue the certificate in the ‘Probabilistic Deep Learning with TensorFlow 2’ online program?
The shareable online certificate is issued by Coursera and Imperial College London.