Linear & Non-Linear Regression, Lasso & Ridge Regression, SHAP, LIME, Yellowbrick, Feature Selection & Outliers Removal

What you will learn

Analyze and visualize data using Linear Regression

Plot the results of Linear Regression to analyze them visually

Learn how to interpret and explain machine learning models

Do in-depth analysis of various forms of Linear and Non-Linear Regression

Use Yellowbrick, SHAP, and LIME to interpret the predictions of machine learning models

Do feature selection and transformations to fine-tune machine learning models

The course contains result-oriented algorithms and data exploration techniques

Description

This course gives you an in-depth treatment of Linear Regression. We cover the theory and the coding together for better understanding, and you will learn how to do an exhaustive analysis of machine learning models. We will show you result-oriented techniques to boost the accuracy of your models. In short, this course teaches you everything you need to create an accurate Linear Regression model in Python.

You should have introductory knowledge of Python before enrolling; otherwise, please do not enroll in this course.

After completing this course you will be able to:

  • Interpret and explain machine learning models that are usually treated as a black box
  • Create an accurate Linear Regression model in Python and visually analyze it
  • Select the best features for a business problem
  • Remove outliers and apply variable transformations for better performance
  • Confidently solve and explain regression problems

What is covered in this course?

This course teaches you, step by step, how to code Linear Regression in Python. The Linear Regression model is one of the most widely used in machine learning and one of the simplest, yet there is so much depth to explore across 14+ hours of videos.

Below are the contents of this course:


  • Section 1- Introduction: This section gets you started with the setup. Download the resource files to code along.
  • Section 2- Python Crash Course: This section introduces you to the basics of Python programming.
  • Section 3- Numpy Introduction: This section is optional; you may skip it, but I recommend watching it if you are not comfortable with NumPy.
  • Section 4- Pandas Introduction: This section introduces you to the basic concepts of Pandas. It will help you follow the coding later in the course.
  • Section 5- Matplotlib Introduction: Do not skip this section. We will use matplotlib plots extensively in the coming sections. It builds a foundation for strong visualization of linear regression results.
  • Section 6- Linear Regression Introduction: We kick-start our Linear Regression learning. You will learn the basics of linear regression and work through examples so that you understand how Linear Regression works and how to analyze the results (a minimal workflow sketch appears after this list).
  • Section 7- Data Preprocessing for Linear Regression: This is the most important section. DO NOT SKIP IT. It builds the foundation of data preprocessing for linear regression and other linear machine learning models. You will learn which techniques can improve model performance and how to check whether your data satisfies the linear model assumptions (see the transformation sketch after this list).
  • Section 8- Machine Learning Models Interpretability and Explainer: This section teaches you how to open up any machine learning model. You no longer need to treat a model as a black box; you will learn how to open the box and analyze each and every component of the model (see the SHAP sketch after this list).
  • Section 9- Linear Regression Model Optimization: This section builds heavily on the previous sections, so don’t skip them. You will learn various techniques to improve model performance, including outlier removal and feature transformations.
  • Section 10- Feature Selection for Linear Regression: This section teaches you some of the best feature selection techniques. Feature selection reduces model complexity and the risk of overfitting; it can also speed up training, depending mostly on how many features are selected and the type of machine learning model (see the RFE sketch after this list).
  • Section 11- Ridge & Lasso Regression, ElasticNet, and Nonlinear Regression: This section covers various regularized and nonlinear regression techniques. You will see how to achieve the best accuracy by using them (see the regularization sketch after this list).
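
To give a flavor of the Section 6 workflow, here is a minimal sketch of training and evaluating a linear regression model with scikit-learn. A synthetic dataset from make_regression stands in for the course's housing data, so the numbers are illustrative only.

```python
# Minimal sketch: train/test split, fit a LinearRegression model, evaluate with R^2 and RMSE.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in data; the course uses a real housing dataset instead.
X, y = make_regression(n_samples=500, n_features=8, noise=15.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LinearRegression()
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
print("R^2 :", r2_score(y_test, y_pred))
print("RMSE:", np.sqrt(mean_squared_error(y_test, y_pred)))
```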
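For Section 7, this is a rough sketch of one of the transformation checks covered there: comparing the skewness of a right-skewed variable before and after a log and a Yeo-Johnson transform. The `income` column and the simulated data are invented for illustration.

```python
# Sketch: measure skewness before and after log and Yeo-Johnson transformations.
import numpy as np
import pandas as pd
from scipy import stats
from sklearn.preprocessing import PowerTransformer

rng = np.random.default_rng(0)
# A heavily right-skewed toy variable (hypothetical "income" column).
df = pd.DataFrame({"income": rng.lognormal(mean=3.0, sigma=0.8, size=1000)})

log_t = np.log(df["income"])
yeo_t = PowerTransformer(method="yeo-johnson").fit_transform(df[["income"]]).ravel()

print("raw skew        :", stats.skew(df["income"]))
print("log skew        :", stats.skew(log_t))
print("yeo-johnson skew:", stats.skew(yeo_t))
```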
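For Section 8, a hedged sketch of explaining a linear model with SHAP (requires `pip install shap`). The feature names and data are made up for the example; the course's force, waterfall, and heatmap plots follow the same explainer-then-plot pattern.

```python
# Sketch: per-feature SHAP contributions for a fitted linear regression model.
import numpy as np
import pandas as pd
import shap
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Hypothetical feature names, used only for this illustration.
X = pd.DataFrame(rng.normal(size=(400, 4)), columns=["rooms", "age", "tax", "distance"])
y = 5 * X["rooms"] - 2 * X["tax"] + rng.normal(scale=0.5, size=400)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)

explainer = shap.LinearExplainer(model, X_train)   # SHAP explainer for linear models
shap_values = explainer.shap_values(X_test)        # per-feature contribution for each row
shap.summary_plot(shap_values, X_test)             # global feature-importance summary plot
```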
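For Section 10, a small sketch of Recursive Feature Elimination (RFE) with scikit-learn on synthetic data; the course applies the same idea to the housing features.

```python
# Sketch: keep the 4 most useful features according to RFE with a linear estimator.
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression

X, y = make_regression(n_samples=300, n_features=10, n_informative=4, random_state=0)

selector = RFE(estimator=LinearRegression(), n_features_to_select=4)
selector.fit(X, y)

print("selected feature mask:", selector.support_)
print("feature ranking      :", selector.ranking_)
```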
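For Section 11, a sketch comparing plain Linear Regression with Ridge, Lasso, and ElasticNet via cross-validation. The alpha values are placeholders; the course tunes them properly.

```python
# Sketch: cross-validated R^2 for plain and regularized linear models.
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet, Lasso, LinearRegression, Ridge
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=400, n_features=20, n_informative=5, noise=10.0, random_state=0)

models = {
    "Linear":     LinearRegression(),
    "Ridge":      Ridge(alpha=1.0),          # illustrative regularization strengths
    "Lasso":      Lasso(alpha=0.1),
    "ElasticNet": ElasticNet(alpha=0.1, l1_ratio=0.5),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name:<10} mean R^2: {scores.mean():.3f}")
```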

By the end of this course, you will be confident in creating and analyzing Linear Regression models in Python. You’ll have a thorough understanding of how to use regression modeling to create predictive models and solve real-world business problems.

How will this course help you?

This course will give you a very solid foundation in machine learning, and you will be able to apply its concepts to other machine learning models. If you are a business manager, an executive, or a student who wants to learn and excel at machine learning, this is the perfect course for you.

What makes us qualified to teach you?

I hold a Ph.D. in Machine Learning and have taught tens of thousands of students over the years through my classes at IIT and the KGP Talkie YouTube channel. A few of my courses are part of Udemy’s top 5000 courses collection and are curated for Udemy Business. I promise you will not regret it.

Language: English

Content

Introduction

Introduction
Resources Folder | DO NOT SKIP
Install Anaconda and Python 3 on Windows 10
Install Anaconda and Python 3 on Ubuntu Machine
Jupyter Notebook Shortcuts

Python Crash Course

Introduction
Data Types
Variable Assignment
String Assignment
List
Set
Tuple
Dictionary
Boolean and Comparison Operator
Logical Operator
If, Else, Elif
Loops in Python
Methods and Lambda Function

Numpy Introduction [Optional]

Introduction
Array
NaN and INF
Statistical Operations
Shape, Reshape, Ravel, Flatten
Sequence, Repetitions, and Random Numbers
Where(), ArgMax(), ArgMin()
File Read and Write
Concatenate and Sorting
Working with Dates

Pandas Introduction

Introduction
DataFrame and Series
File Reading and Writing
Info, Shape, Duplicated and Drop
Columns
NaN and Null Values
Imputation
Lambda Functions

Matplotlib Introduction

Introduction
Line Plot
Label for X-Axis and Y-Axis
Scatter Plot, Bar Plot, and Hist Plot
Box Plot
Subplot
xlim, ylim, xticks, and yticks
Pie Plot
Pie Plot Text Color
Nested Pie Plot
Labeling a Pie Plot
Bar Chart on Polar Axis
Line Plot on a Polar Axis
Scatter Plot on a Polar Axis
Integral in Calculus: Plot as Area Under the Curve

Linear Regression Introduction

Linear Regression Introduction
Regression Examples
Types of Linear Regression
Assessing the performance of the model
Bias-Variance tradeoff
What is sklearn and train-test-split
Python Package Upgrade and Import
Load Boston Housing Dataset
Dataset Analysis
Exploratory Data Analysis- Pair Plot
Exploratory Data Analysis- Hist Plot
Exploratory Data Analysis- Heatmap
Train Test Split and Model Training
How to Evaluate the Regression Model Performance
Plot True House Price vs Predicted Price
Plotting Learning Curves Part 1
Plotting Learning Curves Part 2
Machine Learning Model Interpretability- Residuals Plot
Machine Learning Model Interpretability- Prediction Error Plot

Data Preprocessing for Linear Regression

Linear Model Assumption for Linear Regression
Definitions of Linear Model Assumptions
Load Boston Dataset
Create Reference Data
Check Linear Assumption for Boston Dataset Part 1
Check Linear Assumption for Boston Dataset Part 2
Log Transformation of Variables
Types of Variable Transformations
Reciprocal Transformation
sqrt and exp Transformation
Box-Cox Transformation
Yeo-Johnson Transformation
Check Variables Normality with Histogram
Check Variables Normality with Q-Q Plot
Variable Transformation for Normality
Check Variables Homoscedasticity
Variable Transformation for Homoscedasticity Part 1
Variable Transformation for Homoscedasticity Part 2
How to Check Multicollinearity
Normalization and Standardization Introduction
Normalization and Standardization Coding

Machine Learning Models Interpretability and Explainer

Machine Learning Models Interpretability
Recap
Prediction Error Plot
Residuals Plot
Explain Machine Learning Models with LIME Part 1
Explain Machine Learning Models with LIME Part 2
Explain Machine Learning Models with LIME Part 3
Machine Learning Models Explainer Summary with SHAP
Explain Machine Learning Models with Dependence Plot
Explain Machine Learning Models with The Individual Force Plot
Explain Machine Learning Models with The Collective Force Plot
Explain Machine Learning Models with Shap Heatmap
Explain Machine Learning Models with SHAP Waterfall Plots
Explain Feature Importance in Machine Learning Models
Explain Feature Selection with SHAP

Linear Regression Model Optimization

Recap
Check Linear Model Assumptions on Selected Features
Check Linear Model Assumptions on Selected Features Part 2
Detect Outliers in Machine Learning Datasets
Outliers Visualization Plot
Outlier Detection for Normal Variables
Outlier Detection for Skewed Variables
Types of Outliers Removal Techniques
Outliers Removal by Using Feature-Engine
Model Evaluation After Removing the Outliers
Model Evaluation After Feature Transformations and Outliers Removal

Feature Selection for Linear Regression

Introduction to Feature Selection
Introduction to Lux, a Python API for Intelligent Visual Discovery
Recap
Visualization of Data with LUX API
Select Most Correlated Features with House Price Part 1
Select Most Correlated Features with House Price Part 2
Model Performance Evaluation
Remove Correlated Input Features (Multicollinearity)
Recursive Feature Elimination (RFE) Introduction
Recursive Feature Elimination (RFE) Coding
Incremental RFE
Exhaustive Feature Selection (EFS)
Feature Selection by Linear Regression Coefficients

Ridge & Lasso Regression, ElasticNet, and Nonlinear Regression

Introduction to Regularization
Recap
Ridge Regression
Lasso Regression
Elastic Net
Polynomial Regression
Polynomial Regression with Variable Transformations
Polynomial Regression with Feature Selection