



Learn Machine Learning, Deep Learning, Bayesian Learning and Model Deployment in Python.

What you will learn

• Deep Learning with TensorFlow
• Deep Learning with PyTorch (yes, both TensorFlow and PyTorch!)
• Bayesian learning with PyMC3
• Data Analysis with Pandas
• Algorithms from scratch using NumPy
• Using Scikit-learn to its full effect
• Model Deployment
• Model Diagnostics
• Natural Language Processing
• Unsupervised Learning
• Natural Language Processing with spaCy
• Time series modelling with FB Prophet
• Python

Description




This is a course on Machine Learning, Deep Learning (TensorFlow + PyTorch) and Bayesian Learning (yes, all three topics in one place!). And yes, we cover BOTH PyTorch and TensorFlow for Deep Learning.

We start off by analysing data using pandas, and then implement some algorithms from scratch using NumPy. These include linear regression, Classification and Regression Trees (CART), Random Forests and Gradient Boosted Trees.
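To give a taste of the from-scratch style, here is a minimal sketch (not the course's own notebook) of fitting a linear regression with plain NumPy gradient descent; the synthetic data, learning rate and iteration count are illustrative:

```python
import numpy as np

# Synthetic data: y = 3x + 2 plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3 * x + 2 + rng.normal(0, 0.1, size=100)

# Gradient descent on the mean squared error
w, b = 0.0, 0.0
lr = 0.1
for _ in range(500):
    error = (w * x + b) - y
    w -= lr * 2 * np.mean(error * x)  # dMSE/dw
    b -= lr * 2 * np.mean(error)      # dMSE/db

print(w, b)  # should be close to the true slope 3 and intercept 2
```

The same update rule, written once by hand like this, makes it much easier to understand what libraries such as scikit-learn do under the hood.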

Our Deep Learning lessons begin with TensorFlow, covering Feed Forward Networks, Convolutional Neural Nets (CNNs) and Recurrent Neural Nets (RNNs). For the more advanced Deep Learning lessons we use PyTorch with PyTorch Lightning.
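For a flavour of the TensorFlow side, here is a minimal feed-forward network sketched with the Keras Sequential API; the layer sizes and the 784-feature input (a flattened 28x28 image, as in MNIST) are illustrative, not taken from the course material:

```python
import tensorflow as tf

# A small feed-forward net for 10-class classification
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),                   # flattened 28x28 image
    tf.keras.layers.Dense(128, activation="relu"),  # hidden layer
    tf.keras.layers.Dense(10, activation="softmax") # class probabilities
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

From here, training is a single `model.fit(X_train, y_train, ...)` call; the CNN and RNN lessons swap in convolutional and recurrent layers within the same workflow.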

We focus on both the programming and the mathematical/statistical aspects of the material. This ensures that you are ready for theoretical questions at interviews, while also being able to put Machine Learning into solid practice.

Some of the other key areas in Machine Learning that we discuss include unsupervised learning, time series analysis and Natural Language Processing. Scikit-learn is an essential tool that we use throughout the entire course.

We spend quite a bit of time on feature engineering and making sure our models don’t overfit. Diagnosing Machine Learning (and Deep Learning) models by splitting data into training and testing sets, and by looking at the correct metric, can make a world of difference.
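That diagnostic workflow can be sketched in a few lines of scikit-learn; the synthetic dataset and the choice of a random forest here are illustrative, the pattern (hold-out split, then a metric such as AUC on unseen data) is the point:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary classification problem
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Hold out a test set the model never sees during fitting
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Score on held-out data only; AUC handles class-probability quality
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"test AUC: {auc:.3f}")
```

Evaluating on the training set instead would report an optimistically inflated score, which is exactly the overfitting trap the course warns about.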

I would also like to highlight that we cover Machine Learning deployment, a topic that is rarely taught. The key to being a good data scientist is having a model that doesn’t decay in production.
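The first step of any deployment pipeline is persisting a trained model so a serving process (the course uses FastAPI for serving) can load it without retraining. A minimal sketch of that step with joblib and a scikit-learn model; the dataset and model are illustrative:

```python
import os
import tempfile

import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a model once, "offline"
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

# Save it to disk; a serving app would load this same file at startup
path = os.path.join(tempfile.mkdtemp(), "model.joblib")
joblib.dump(model, path)

# Reload and check predictions survive the round trip
restored = joblib.load(path)
assert (restored.predict(X) == model.predict(X)).all()
```

In a FastAPI app, `joblib.load(path)` would run once at startup and the endpoint would simply call `restored.predict(...)` on incoming requests.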

I hope you enjoy this course and please don’t hesitate to contact me for further information.

 

Language: English

Content

Introduction

Introduction
How to tackle this course
Installations and sign ups
Jupyter Notebooks
Course Material

Basic Python

Intro
Basic Data Structures
Dictionaries
Python functions (methods)
Numpy functions
Conditional statements
For loops
Dictionaries again

Pandas

Intro
Pandas simple functions
Subsetting
loc and iloc
loc and iloc 2
map and apply
groupby

Numpy

Gradient Descent
K-means part 1
K-means part 2
Broadcasting

Scikit-Learn

Intro
Linear Regression Part 1
Linear Regression Part 2
Classification and Regression Trees
CART part 2
Random Forest theory
Random Forest Code
Gradient Boosted Machines

Plotting

Plotting resources (notebooks)
Line plot
Plot multiple lines
Histograms
Scatter Plots
Subplots
Seaborn + pair plots

Classification

Kaggle part 1
Kaggle part 2
Theory part 1
Theory part 2 + code
Titanic dataset
Sklearn classification prelude
Sklearn classification
Dealing with missing values

Time Series

Intro
Loss functions
FB Prophet part 1
FB Prophet part 2
Theory behind FB Prophet

Model Diagnostics

Overfitting
Cross Validation
Stratified K Fold
Area Under Curve (AUC) Part 1
Area Under Curve (AUC) Part 2

Unsupervised Learning

Principal Component Analysis (PCA) theory
Fashion MNIST PCA
K-means
Other clustering methods
DBSCAN theory
Gaussian Mixture Models (GMM) theory

Natural Language Processing

Intro
Stop words and Term Frequency
Term Frequency – Inverse Document Frequency (TF-IDF) theory
Financial News Sentiment Classifier
NLTK + Stemming
N-grams
Word (feature) importance

NLP Part 2 (spaCy)

spaCy intro
Feature Extraction with spaCy (using Pandas)
Classification Example
Over-sampling

Regularization

Introduction
MSE recap
L2 Loss / Ridge Regression intro
Ridge regression (L2 penalised regression)
S&P500 data preparation for L1 loss
L1 Penalised Regression (Lasso)
L1/L2 Penalty theory: why it works

Deep Learning

Intro
DL theory part 1
DL theory part 2
Tensorflow + Keras demo problem 1
Activation functions
First example with ReLU
MNIST and Softmax
Deep Learning Input Normalisation
Softmax theory
Batch Norm
Batch Norm Theory

Convolutional Neural Nets

Intro
Fashion MNIST feed forward net for benchmarking
Keras Conv2D layer
Model fitting and discussion of results
Dropout theory and code
MaxPool (and comparison to stride)

Model Deployment

Intro
Saving Models
FastAPI intro
FastAPI serving model

Final Thoughts

Some advice on your journey