
Quick Facts

Medium of Instruction: English
Mode of Learning: Self Study
Mode of Delivery: Video and Text Based

Course Overview

Machine Learning: Regression is a certification programme offered completely online. In this course from the University of Washington, students are introduced to machine learning with a particular focus on regression. The Machine Learning: Regression Certification syllabus covers regression, a predictive-modelling technique in machine learning used to predict outcomes from data.

In the Machine Learning: Regression Certification Course, offered on Coursera, candidates learn regression through the 'Predicting Housing Prices' case study, in which students build a model that predicts a continuous value (the price) from input features such as the number of bathrooms and bedrooms. Machine Learning: Regression by Coursera is the second of the four courses in the Machine Learning Specialization.
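For illustration only (this sketch is not part of the official course material), a housing-price model of the kind described above can be fit in a few lines of Python with NumPy's least-squares solver; the feature values and prices below are made up.

```python
import numpy as np

# Hypothetical training data: [bedrooms, bathrooms] for a few houses, with their sale prices.
features = np.array([
    [3, 2],
    [4, 3],
    [2, 1],
    [5, 4],
], dtype=float)
prices = np.array([310_000, 420_000, 199_000, 540_000], dtype=float)

# Add an intercept column and fit ordinary least squares: price is approximated by X @ w.
X = np.hstack([np.ones((features.shape[0], 1)), features])
w, *_ = np.linalg.lstsq(X, prices, rcond=None)

# Predict a continuous price for a new house with 3 bedrooms and 1 bathroom.
new_house = np.array([1.0, 3.0, 1.0])
print("Predicted price:", new_house @ w)
```

The fitted weights here play the same role as the regression coefficients discussed throughout the course modules.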

The Highlights

  • Provided by Coursera
  • Approximately 22 hours to complete
  • Offered by the University of Washington
  • Flexible Deadlines
  • Self-Paced Learning Option
  • Shareable Certificate
  • Financial Aid Available
  • 100% Online Course

Programme Offerings

  • English videos with multiple subtitles
  • Practice quizzes
  • Graded assignments with peer feedback
  • Graded quizzes with feedback
  • Graded programming assignments
  • Course videos and readings
  • EMI payment options
  • 14-day refund period

Courses and Certificate Fees

Certificate Availability: Yes
Certificate Providing Authority: Coursera

The Machine Learning: Regression Certification fee is determined by how many months the student takes to complete the programme. Refer to the table below for the month-wise fee structure:

Description             Total Fee in INR
Course Fee, 1 month     Rs. 4,117
Course Fee, 3 months    Rs. 8,234
Course Fee, 6 months    Rs. 12,352


Eligibility Criteria

Certification Qualifying Details

To be awarded the Machine Learning: Regression Certification, students must complete the entire programme, including the course materials, quizzes, and assignments.

What you will learn

Machine learning

By the end of the Machine Learning: Regression Training, students will have learned the following concepts (an illustrative code sketch follows the list):

  • Linear Regression
  • Ridge Regression
  • Lasso (Statistics)
  • Regression Analysis
  • Input and output of a regression model
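
As a rough sketch under the assumption that scikit-learn is available (again, not course material), the regression variants listed above differ mainly in the penalty added to the least-squares objective; the data, penalty strengths, and labels below are purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

# Synthetic data: 100 observations, 5 input features, noisy linear target.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
true_w = np.array([4.0, 0.0, -2.0, 0.0, 1.5])
y = X @ true_w + rng.normal(scale=0.5, size=100)

models = {
    "linear regression (no penalty)": LinearRegression(),
    "ridge regression (L2 penalty)": Ridge(alpha=1.0),
    "lasso (L1 penalty, can zero out coefficients)": Lasso(alpha=0.1),
}

# Input: the feature matrix X; output: a continuous prediction for each row.
for name, model in models.items():
    model.fit(X, y)
    print(name, np.round(model.coef_, 2))
```

The lasso's L1 penalty tends to drive some coefficients exactly to zero, which is why it also appears later in the syllabus as a feature-selection tool.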

Who it is for

Machine Learning: Regression classes are a good option for the following professionals:

  • Data Scientist
  • Big Data Analyst
  • Analytics Manager
  • Business Analyst
  • Developer

Admission Details

Step 1 - First, learners must register and sign up on https://www.coursera.org/ to get access to the programmes offered by Coursera.

Step 2 - After activating the Coursera account, the candidate can sign in.

Step 3 - The candidate can then search for 'University of Washington' in the search bar, and the programmes offered by the University of Washington will be shown on the screen.

Step 4 - Find the course 'Machine Learning: Regression' in the list and click on it.

Step 5 - The course page will appear on the screen; click the 'Enroll' option. Candidates can enroll in the programme either through the 'Audit Only' option or by purchasing the course.

The Syllabus

Videos
  • Welcome!
  • What is the course about?
  • Outlining the first half of the course
  • Outlining the second half of the course
  • Assumed background
Readings
  • Important Update regarding the Machine Learning Specialization
  • Slides presented in this module
  • Reading: Software tools you'll need

Videos
  • A case study in predicting house prices
  • Regression fundamentals: data & model
  • Regression fundamentals: the task
  • Regression ML block diagram
  • The simple linear regression model
  • The cost of using a given line
  • Using the fitted line
  • Interpreting the fitted line
  • Defining our least squares optimization objective
  • Finding maxima or minima analytically
  • Maximizing a 1d function: a worked example
  • Finding the max via hill climbing
  • Finding the min via hill descent
  • Choosing step size and convergence criteria
  • Gradients: derivatives in multiple dimensions
  • Gradient descent: multidimensional hill descent
  • Computing the gradient of RSS
  • Approach 1: closed-form solution
  • Approach 2: gradient descent
  • Comparing the approaches
  • Influence of high leverage points: exploring the data
  • Influence of high leverage points: removing Center City
  • Influence of high leverage points: removing high-end towns
  • Asymmetric cost functions
  • A brief recap
Readings
  • Slides presented in this module
  • Optional reading: worked-out example for closed-form solution
  • Optional reading: worked-out example for gradient descent
  • Download notebooks to follow along
  • Fitting a simple linear regression model on housing data
Quizzes
  • Simple Linear Regression
  • Fitting a simple linear regression model on housing data

Videos
  • Multiple regression intro
  • Polynomial regression
  • Modeling seasonality
  • Where we see seasonality
  • Regression with general features of 1 input
  • Motivating the use of multiple inputs
  • Defining notation
  • Regression with features of multiple inputs
  • Interpreting the multiple regression fit
  • Rewriting the single observation model in vector notation
  • Rewriting the model for all observations in matrix notation
  • Computing the cost of a D-dimensional curve
  • Computing the gradient of RSS
  • Approach 1: closed-form solution
  • Discussing the closed-form solution
  • Approach 2: gradient descent
  • Feature-by-feature update
  • Algorithmic summary of gradient descent approach
  • A brief recap
Readings
  • Slides presented in this module
  • Optional reading: a review of matrix algebra
  • Exploring different multiple regression models for house price prediction
  • Numpy tutorial
  • Implementing gradient descent for multiple regression
Quizzes
  • Multiple Regression
  • Exploring different multiple regression models for house price prediction
  • Implementing gradient descent for multiple regression

Videos
  • Assessing performance intro
  • What do we mean by "loss"?
  • Training error: assessing loss on the training set
  • Generalization error: what we really want
  • Test error: what we can actually compute
  • Defining overfitting
  • Training/test split
  • Irreducible error and bias
  • Variance and the bias-variance tradeoff
  • Error vs. amount of data
  • Formally defining the 3 sources of error
  • Formally deriving why 3 sources of error
  • Training/validation/test split for model selection, fitting, and assessment
  • A brief recap
Readings
  • Slides presented in this module
  • Polynomial Regression
Quizzes
  • Assessing Performance
  • Exploring the bias-variance tradeoff

Videos
  • Symptoms of overfitting in polynomial regression
  • Overfitting demo
  • Overfitting for more general multiple regression models
  • Balancing fit and magnitude of coefficients
  • The resulting ridge objective and its extreme solutions
  • How ridge regression balances bias and variance
  • Ridge regression demo
  • The ridge coefficient path
  • Computing the gradient of the ridge objective
  • Approach 1: closed-form solution
  • Discussing the closed-form solution
  • Approach 2: gradient descent
  • Selecting tuning parameters via cross validation
  • K-fold cross-validation
  • How to handle the intercept
  • A brief recap
Readings
  • Slides presented in this module
  • Download the notebook and follow along
  • Download the notebook and follow along
  • Observing effects of L2 penalty in polynomial regression
  • Implementing ridge regression via gradient descent
Quizzes
  • Ridge Regression
  • Observing effects of L2 penalty in polynomial regression
  • Implementing ridge regression via gradient descent

Videos
  • The feature selection task
  • All subsets
  • Complexity of all subsets
  • Greedy algorithms
  • Complexity of the greedy forward stepwise algorithm
  • Can we use regularization for feature selection?
  • Thresholding ridge coefficients?
  • The lasso objective and its coefficient path
  • Visualizing the ridge cost
  • Visualizing the ridge solution
  • Visualizing the lasso cost and solution
  • Lasso demo
  • What makes the lasso objective different
  • Coordinate descent
  • Normalizing features
  • Coordinate descent for least squares regression (normalized features)
  • Coordinate descent for lasso (normalized features)
  • Assessing convergence and other lasso solvers
  • Coordinate descent for lasso (unnormalized features)
  • Deriving the lasso coordinate descent update
  • Choosing the penalty strength and other practical issues with a lasso
  • A brief recap
Readings
  • Slides presented in this module
  • Download the notebook and follow along
  • Using LASSO to select features
  • Implementing LASSO using coordinate descent
Quizzes
  • Feature Selection and Lasso
  • Using LASSO to select features
  • Implementing LASSO using coordinate descent

Videos
  • Limitations of parametric regression
  • 1-Nearest neighbor regression approach
  • Distance metrics
  • 1-Nearest neighbor algorithm
  • k-Nearest neighbors regression
  • k-Nearest neighbors in practice
  • Weighted k-nearest neighbors
  • From weighted k-NN to kernel regression
  • Global fits of parametric models vs. local fits of kernel regression
  • Performance of NN as the amount of data grows
  • Issues with high dimensions, data scarcity, and computational complexity
  • k-NN for classification
  • A brief recap
Readings
  • Slides presented in this module
  • Predicting house prices using k-nearest neighbors regression
Quizzes
  • Nearest Neighbors & Kernel Regression
  • Predicting house prices using k-nearest neighbors regression

Videos
  • Simple and multiple regression
  • Assessing performance and ridge regression
  • Feature selection, lasso, and nearest neighbor regression
  • What we covered and what we didn't cover
  • Thank you!
Readings
  • Slides presented in this module


University of Washington Frequently Asked Questions (FAQs)

1: What is the completion time of the Machine Learning: Regression Online Certification?

The course can be completed within about 22 hours. 

2: Will students receive a certificate after the Machine Learning: Regression Online Course?

Yes, students receive a shareable certificate on completing the programme.

3: In which languages are subtitles available for the course videos?

Subtitles are available in Arabic, French, Portuguese (European), Italian, Vietnamese, Korean, German, Russian, English, and Spanish.

4: Which institution is offering the programme?

The programme is offered by the University of Washington. 

5: Does Coursera provide placement support?

No, Coursera does not provide placement support to students.
