A deep understanding of deep learning (with Python intro)

BY
Udemy

Develop a thorough understanding of deep learning and learn Python with this Udemy course.

Mode

Online

Fees

₹499 (originally ₹3,499)

Quick Facts

Medium of instruction: English
Mode of learning: Self-study
Mode of delivery: Video and text based

Course overview

A deep understanding of deep learning (with Python intro) is an online deep learning course developed by Mike X Cohen, a neuroscientist, writer, and professor. This short certificate programme helps students gain the fundamental knowledge of deep learning required to explore advanced and new deep learning concepts and theories. The curriculum walks participants through math, NumPy, PyTorch, autoencoders, data types, and many more aspects of deep learning.

The A deep understanding of deep learning (with Python intro) online course, offered by Udemy, is open to anyone interested in learning deep learning and does not demand any prior experience or knowledge; instead, the course introduces concepts such as Python and PyTorch from scratch. Students will go through the process of creating artificial neural networks and building models in PyTorch, and will master the code behind gradient descent along with the calculus underlying it.
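
To give a flavour of the model-building described above, here is a minimal sketch of a small feedforward network in PyTorch; the layer sizes and data are illustrative placeholders, not taken from the course materials.

    import torch
    import torch.nn as nn

    # A minimal feedforward network; all sizes are illustrative placeholders.
    model = nn.Sequential(
        nn.Linear(2, 16),   # input layer: 2 features -> 16 hidden units
        nn.ReLU(),          # nonlinear activation
        nn.Linear(16, 1),   # output layer: one prediction per sample
    )

    x = torch.randn(8, 2)   # a batch of 8 random samples
    print(model(x).shape)   # torch.Size([8, 1])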

Learners will also gain practical exposure through the A deep understanding of deep learning (with Python intro) certification via numerous exercises and practical examples. Udemy confers a certificate of completion at the end, opening up plenty of opportunities in the many industries where deep learning is widely used, such as self-driving cars and medical diagnosis. Those looking to try their hand at deep learning can join the programme by making the payment.

The highlights

  • 100% Online course 
  • Offered by Udemy 
  • 30-Day Money-Back Guarantee
  • Downloadable resource
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of completion

Program offerings

  • 57.5 hours on-demand video
  • 3 articles
  • 1 downloadable resource
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of completion
  • English videos
  • Exercises
  • Projects
  • CodeChallenges
  • 8+ hour Python tutorial

Course and certificate fees

Fees information
₹499 (originally ₹3,499)
Certificate availability

Yes

Certificate providing authority

Udemy

What you will learn

  • Knowledge of deep learning
  • Knowledge of Python
  • Mathematical skill
  • Knowledge of NumPy

After completing the A deep understanding of deep learning (with Python intro) online certification, students will understand the theory and math of deep learning in detail. Participants will also learn Python from the very basics, convolutional networks, feedforward architectures, and the use of GPUs for deep learning.

The syllabus

Introduction

  • How to learn from this course
  • Using Udemy like a pro

Download all course materials

  • Downloading and using the code
  • My policy on code-sharing

Concepts in deep learning

  • What is an artificial neural network?
  • How models "learn"
  • The role of DL in science and knowledge
  • Running experiments to understand DL
  • Are artificial "neurons" like biological neurons?

About the Python tutorial

  • Should you watch the Python tutorial?

Math, numpy, PyTorch

  • PyTorch or TensorFlow?
  • Introduction to this section
  • Spectral theories in mathematics
  • Terms and datatypes in math and computers
  • Converting reality to numbers
  • Vector and matrix transpose
  • OMG it's the dot product!
  • Matrix multiplication
  • Softmax
  • Logarithms
  • Entropy and cross-entropy
  • Min/max and argmin/argmax
  • Mean and variance
  • Random sampling and sampling variability
  • Reproducible randomness via seeding
  • The t-test
  • Derivatives: intuition and polynomials
  • Derivatives find minima
  • Derivatives: product and chain rules
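
As an illustration of the kind of math this section covers, below is a minimal NumPy sketch of softmax and cross-entropy; the input values are arbitrary examples, not course data.

    import numpy as np

    def softmax(z):
        e = np.exp(z - z.max())   # subtract the max for numerical stability
        return e / e.sum()

    z = np.array([1.0, 2.0, 3.0])     # arbitrary example scores
    p = softmax(z)
    print(p, p.sum())                 # probabilities that sum to 1

    # Cross-entropy between a one-hot target and the softmax output
    target = np.array([0.0, 0.0, 1.0])
    print(-np.sum(target * np.log(p)))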

Gradient descent

  • Overview of gradient descent
  • What about local minima?
  • Gradient descent in 1D
  • CodeChallenge: unfortunate starting value
  • Gradient descent in 2D
  • CodeChallenge: 2D gradient ascent
  • Parametric experiments on g.d.
  • CodeChallenge: fixed vs. dynamic learning rate
  • Vanishing and exploding gradients
  • Tangent: Notebook revision history
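
To make the core idea of this section concrete, here is a minimal sketch of gradient descent in 1D on f(x) = x**2; the starting value and learning rate are arbitrary choices, not taken from the course.

    # f(x) = x**2 has derivative 2x and a minimum at x = 0.
    x = 4.0            # arbitrary starting value
    lr = 0.1           # arbitrary fixed learning rate
    for _ in range(100):
        grad = 2 * x   # df/dx evaluated at the current x
        x = x - lr * grad
    print(x)           # very close to the minimum at 0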

ANNs (Artificial Neural Networks)

  • The perceptron and ANN architecture
  • A geometric view of ANNs
  • ANN math part 1 (forward prop)
  • ANN math part 2 (errors, loss, cost)
  • ANN math part 3 (backprop)
  • ANN for regression
  • CodeChallenge: manipulate regression slopes
  • ANN for classifying qwerties
  • Learning rates comparison
  • Multilayer ANN
  • Linear solutions to linear problems
  • Why multilayer linear models don't exist
  • Multi-output ANN (iris dataset)
  • CodeChallenge: more qwerties!
  • Comparing the number of hidden units
  • Depth vs. breadth: number of parameters
  • Defining models using sequential vs. class
  • Model depth vs. breadth
  • CodeChallenge: convert sequential to class
  • Diversity of ANN visual representations
  • Reflection: Are DL models understandable yet?

Overfitting and cross-validation

  • What is overfitting and is it as bad as they say?
  • Cross-validation
  • Generalization
  • Cross-validation -- manual separation
  • Cross-validation -- scikitlearn
  • Cross-validation -- DataLoader
  • Splitting data into train, devset, test
  • Cross-validation on regression
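
As a hint of the tooling covered above, below is a minimal sketch of a train/test split with scikit-learn; the data are random placeholders.

    import numpy as np
    from sklearn.model_selection import train_test_split

    X = np.random.randn(100, 4)        # placeholder data: 100 samples, 4 features
    y = np.random.randint(0, 2, 100)   # placeholder binary labels

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
    print(X_train.shape, X_test.shape)  # (80, 4) (20, 4)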

Regularization

  • Regularization: Concept and methods
  • train() and eval() modes
  • Dropout regularization
  • Dropout regularization in practice
  • Dropout example 2
  • Weight regularization (L1/L2): math
  • L2 regularization in practice
  • L1 regularization in practice
  • Training in mini-batches
  • Batch training in action
  • The importance of equal batch sizes
  • CodeChallenge: Effects of mini-batch size
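
To illustrate two of the regularizers listed above, here is a minimal PyTorch sketch combining dropout with L2 regularization (via the optimizer's weight_decay argument); all sizes and rates are illustrative assumptions.

    import torch.nn as nn
    from torch.optim import SGD

    model = nn.Sequential(
        nn.Linear(10, 32),
        nn.ReLU(),
        nn.Dropout(p=0.5),   # randomly zeroes half the units during training
        nn.Linear(32, 2),
    )
    # weight_decay adds an L2 penalty on the weights
    optimizer = SGD(model.parameters(), lr=0.01, weight_decay=0.01)

    model.train()   # dropout active during training
    model.eval()    # dropout disabled for evaluation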

Metaparameters (activations, optimizers)

  • What are "metaparameters"?
  • The "wine quality" dataset
  • CodeChallenge: Minibatch size in the wine dataset
  • Data normalization
  • The importance of data normalization
  • Batch normalization
  • Batch normalization in practice
  • CodeChallenge: Batch-normalize the qwerties
  • Activation functions
  • Activation functions in PyTorch
  • Activation functions comparison
  • CodeChallenge: Compare relu variants
  • CodeChallenge: Predict sugar
  • Loss functions
  • Loss functions in PyTorch
  • More practice with multioutput ANNs
  • Optimizers (minibatch, momentum)
  • SGD with momentum
  • Optimizers (RMSprop, Adam)
  • Optimizers comparison
  • CodeChallenge: Optimizers and... something
  • CodeChallenge: Adam with L2 regularization
  • Learning rate decay
  • How to pick the right metaparameters

FFNs (Feed-Forward Networks)

  • What are fully-connected and feedforward networks?
  • The MNIST dataset
  • FFN to classify digits
  • CodeChallenge: Binarized MNIST images
  • CodeChallenge: Data normalization
  • Distributions of weights pre- and post-learning
  • CodeChallenge: MNIST and breadth vs. depth
  • CodeChallenge: Optimizers and MNIST
  • Scrambled MNIST
  • Shifted MNIST
  • CodeChallenge: The mystery of the missing 7
  • Universal approximation theorem

More on data

  • Anatomy of a torch dataset and dataloader
  • Data size and network size
  • CodeChallenge: unbalanced data
  • What to do about unbalanced designs?
  • Data oversampling in MNIST
  • Data noise augmentation (with devset+test)
  • Data feature augmentation
  • Getting data into colab
  • Save and load trained models
  • Save the best-performing model
  • Where to find online datasets

Measuring model performance

  • Two perspectives of the world
  • Accuracy, precision, recall, F1
  • APRF in code
  • APRF example 1: wine quality
  • APRF example 2: MNIST
  • CodeChallenge: MNIST with unequal groups
  • Computation time
  • Better performance in test than train?

FFN milestone projects

  • Project 1: A gratuitously complex adding machine
  • Project 1: My solution
  • Project 2: Predicting heart disease
  • Project 2: Solution
  • Project 3: FFN for missing data interpolation
  • Project 3: My solution

Weight inits and investigations

  • Explanation of weight matrix sizes
  • A surprising demo of weight initializations
  • Theory: Why and how to initialize weights
  • CodeChallenge: Weight variance inits
  • Xavier and Kaiming initializations
  • CodeChallenge: Xavier vs. Kaiming
  • CodeChallenge: Identically random weights
  • Freezing weights during learning
  • Learning-related changes in weights
  • Use default inits or apply your own?

Autoencoders

  • What are autoencoders and what do they do?
  • Denoising MNIST
  • CodeChallenge: How many units?
  • AEs for occlusion
  • The latent code of MNIST
  • Autoencoder with tied weights
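
As a flavour of the material above, below is a minimal sketch of a fully-connected autoencoder for flattened 28x28 images (e.g., MNIST); the layer sizes are illustrative choices, not the course's.

    import torch.nn as nn

    # Compress 784-dim inputs to a small latent code, then reconstruct.
    autoencoder = nn.Sequential(
        nn.Linear(784, 64), nn.ReLU(),    # encoder
        nn.Linear(64, 16),  nn.ReLU(),    # bottleneck / latent code
        nn.Linear(16, 64),  nn.ReLU(),    # decoder
        nn.Linear(64, 784), nn.Sigmoid(), # reconstruction scaled to [0, 1]
    )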

Running models on a GPU

  • What is a GPU and why use it?
  • Implementation
  • CodeChallenge: Run an experiment on the GPU
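
A minimal sketch of the standard PyTorch device pattern, which this section's implementation presumably resembles: detect a GPU and move both the model and the data to it.

    import torch

    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

    model = torch.nn.Linear(4, 1).to(device)  # weights moved to the GPU, if present
    x = torch.randn(8, 4).to(device)          # data must live on the same device
    print(model(x).device)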

Convolution and transformations

  • Convolution: concepts
  • Feature maps and convolution kernels
  • Convolution in code
  • Convolution parameters (stride, padding)
  • The Conv2d class in PyTorch
  • CodeChallenge: Choose the parameters
  • Transpose convolution
  • Max/mean pooling
  • Pooling in PyTorch
  • To pool or to stride?
  • Image transforms
  • Creating and using custom DataLoaders
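
To make the stride, padding, and pooling ideas above concrete, here is a minimal PyTorch sketch of one convolution layer followed by max pooling; channel counts and image size are illustrative assumptions.

    import torch
    import torch.nn as nn

    # A single 2D convolution with explicit stride and padding
    conv = nn.Conv2d(in_channels=1, out_channels=8,
                     kernel_size=3, stride=1, padding=1)

    img = torch.randn(1, 1, 28, 28)   # one 28x28 grayscale image
    fmap = conv(img)                  # eight feature maps, spatial size preserved
    print(fmap.shape)                 # torch.Size([1, 8, 28, 28])

    pooled = nn.MaxPool2d(2)(fmap)    # max pooling halves each spatial dimension
    print(pooled.shape)               # torch.Size([1, 8, 14, 14])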

Understand and design CNNs

  • The canonical CNN architecture
  • CNN to classify MNIST digits
  • CNN on shifted MNIST
  • Classify Gaussian blurs
  • Examine feature map activations
  • CodeChallenge: Softcode internal parameters
  • CodeChallenge: How wide the FC?
  • Do autoencoders clean Gaussians?
  • CodeChallenge: AEs and occluded Gaussians
  • CodeChallenge: Custom loss functions
  • Discover the Gaussian parameters
  • The EMNIST dataset (letter recognition)
  • Dropout in CNNs
  • CodeChallenge: How low can you go?
  • CodeChallenge: Varying number of channels
  • So many possibilities! How to create a CNN?

CNN milestone projects

  • Project 1: Import and classify CIFAR10
  • Project 1: My solution
  • Project 2: CIFAR-autoencoder
  • Project 3: FMNIST
  • Project 4: Psychometric functions in CNNs

Transfer Learning

  • Transfer learning: What, why, and when?
  • Transfer learning: MNIST -> FMNIST
  • CodeChallenge: letters to numbers
  • Famous CNN architectures
  • Transfer learning with ResNet-18
  • CodeChallenge: VGG-16
  • Pretraining with autoencoders
  • CIFAR10 with autoencoder-pretrained model

Style transfer

  • What is style transfer and how does it work?
  • The Gram matrix (feature activation covariance)
  • The style transfer algorithm
  • Transferring the screaming bathtub
  • CodeChallenge: Style transfer with AlexNet

Generative adversarial networks

  • GAN: What, why, and how
  • Linear GAN with MNIST
  • CodeChallenge: Linear GAN with FMNIST
  • CNN GAN with Gaussians
  • CodeChallenge: Gaussians with fewer layers
  • CNN GAN with FMNIST
  • CodeChallenge: CNN GAN with CIFAR

RNNs (Recurrent Neural Networks) (and GRU/LSTM)

  • Leveraging sequences in deep learning
  • How RNNs work
  • The RNN class in PyTorch
  • Predicting alternating sequences
  • CodeChallenge: sine wave extrapolation
  • More on RNNs: Hidden states, embeddings
  • GRU and LSTM
  • The LSTM and GRU classes
  • Lorem ipsum

Ethics of deep learning

  • Will AI save us or destroy us?
  • Example case studies
  • Some other possible ethical scenarios
  • Will deep learning take our jobs?
  • Accountability and making ethical AI

Where to go from here?

  • How to learn topic _X_ in deep learning?
  • How to read academic DL papers

Python intro: Data types

  • How to learn from the Python tutorial
  • Variables
  • Math and printing
  • Lists (1 of 2)
  • Lists (2 of 2)
  • Tuples
  • Booleans
  • Dictionaries

Python intro: Indexing, slicing

  • Indexing
  • Slicing

Python intro: Functions

  • Inputs and outputs
  • Python libraries (numpy)
  • Python libraries (pandas)
  • Getting help on functions
  • Creating functions
  • Global and local variable scopes
  • Copies and referents of variables
  • Classes and object-oriented programming

Python intro: Flow control

  • If-else statements
  • If-else statements, part 2
  • For loops
  • Enumerate and Zip
  • Continue
  • Initializing variables
  • Single-line loops (list comprehension)
  • while loops
  • Broadcasting in numpy
  • Function error checking and handling
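
A minimal sketch pulling together a few of the flow-control patterns listed above (enumerate, zip, and single-line loops); the data are toy examples.

    letters = ['a', 'b', 'c']
    numbers = [1, 2, 3]

    # enumerate and zip together: an index plus paired elements
    for i, (letter, number) in enumerate(zip(letters, numbers)):
        print(i, letter, number)

    # single-line loop (list comprehension)
    squares = [n**2 for n in numbers]
    print(squares)   # [1, 4, 9]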

Python intro: Text and plots

  • Printing and string interpolation
  • Plotting dots and lines
  • Subplot geometry
  • Making the graphs look nicer
  • Seaborn
  • Images
  • Export plots in low and high resolution

Bonus Section

  • Bonus content

Instructors

Mr Mike X Cohen
Associate Professor
Freelancer
