

Build up your intuition of the fundamental building blocks of Neural Networks.

What you will learn

Understand the intuition behind Artificial Neural Networks

Understand the intuition behind Convolutional Neural Networks

Understand the intuition behind Recurrent Neural Networks

Apply Artificial Neural Networks in practice

Apply Convolutional Neural Networks in practice

Apply Recurrent Neural Networks in practice

Description

Deep learning (also known as deep structured learning) is part of a broader family of machine learning methods based on artificial neural networks with representation learning. Learning can be supervised, semi-supervised or unsupervised. Deep-learning architectures such as deep neural networks, deep belief networks, deep reinforcement learning, recurrent neural networks and convolutional neural networks have been applied to fields including computer vision, speech recognition, natural language processing, machine translation, bioinformatics, drug design, medical image analysis, material inspection and board game programs, where they have produced results comparable to and in some cases surpassing human expert performance.

This course covers the following three sections: (1) Neural Networks, (2) Convolutional Neural Networks, and (3) Recurrent Neural Networks. You will receive around 4 hours of material covering detailed discussions, mathematical descriptions, and code walkthroughs of these three common families of neural networks. Each section is summarized below.

Section 1 – Neural Network

1.1 Linear Regression

1.2 Logistic Regression

1.3 Purpose of Neural Network

1.4 Forward Propagation

1.5 Backward Propagation

1.6 Activation Function (ReLU, Sigmoid, Softmax)

1.7 Cross-entropy Loss Function

1.8 Gradient Descent
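
As a rough illustration of these building blocks (not the course's own lab code), the sketch below wires up a small feed-forward network in TensorFlow/Keras: forward propagation through a hidden ReLU layer and a softmax output, a cross-entropy loss, and plain gradient descent. The toy data and layer sizes are made up for the example.

import numpy as np
import tensorflow as tf

# Toy data: 100 samples, 4 features, 3 classes (invented for illustration).
X = np.random.rand(100, 4).astype("float32")
y = np.random.randint(0, 3, size=(100,))

# Forward propagation: a hidden Dense layer with ReLU, then a softmax output.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])

# Cross-entropy loss, minimized with plain (stochastic) gradient descent.
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Each training step runs forward propagation, then backward propagation
# to compute gradients, then a gradient-descent update of the weights.
model.fit(X, y, epochs=5, batch_size=16, verbose=0)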

Section 2 – Convolutional Neural Network

2.1 Image Data

2.2 Tensor and Matrix

2.3 Convolution Operation


2.4 Padding

2.5 Stride

2.6 Convolution in 2D and 3D

2.7 VGG16

2.8 Residual Network
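
For a rough sense of how these pieces fit together (again, not the course's lab code), the sketch below stacks two Conv2D layers in TensorFlow/Keras, showing how padding and stride control the output size, and then loads the pre-trained VGG16 model that Keras ships with. The 64x64x3 input size is an arbitrary choice for the example.

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),   # assumed image size for the example
    # 32 filters, 3x3 kernel, stride 1; "same" padding keeps the 64x64 size.
    tf.keras.layers.Conv2D(32, kernel_size=3, strides=1, padding="same",
                           activation="relu"),
    # Stride 2 halves the spatial dimensions: 64x64 -> 32x32.
    tf.keras.layers.Conv2D(64, kernel_size=3, strides=2, padding="same",
                           activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.summary()

# VGG16 pre-trained on ImageNet, a common starting point for transfer learning.
vgg16 = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                    input_shape=(64, 64, 3))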

Section 3 – Recurrent Neural Network

3.1 Welcome

3.2 Why Use RNN

3.3 Language Processing

3.4 Forward Propagation in RNN

3.5 Backpropagation through Time

3.6 Gated Recurrent Unit (GRU)

3.7 Long Short-Term Memory (LSTM)

3.8 Bidirectional RNN (bi-RNN)
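
As a rough illustration of these recurrent architectures (not the course's lab code), the sketch below builds a small LSTM text classifier in TensorFlow/Keras plus a bidirectional GRU variant. The vocabulary size, embedding width, and binary output are arbitrary choices for the example.

import tensorflow as tf

vocab_size = 10000   # assumed vocabulary size for the example

# An LSTM reads the token sequence step by step (forward propagation in time);
# training unrolls those steps for backpropagation through time.
lstm_model = tf.keras.Sequential([
    tf.keras.Input(shape=(None,), dtype="int32"),     # variable-length token ids
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # e.g. binary text classification
])

# Bidirectional variant: a GRU reads the sequence forwards and backwards.
bi_gru_model = tf.keras.Sequential([
    tf.keras.Input(shape=(None,), dtype="int32"),
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.Bidirectional(tf.keras.layers.GRU(32)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])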

Language: English

Content

Welcome!

Welcome Message!
Course Outline

Artificial Neural Networks

Linear Regression
Logistic Regression
Purpose of Neural Networks
Forward Propagation
Backward Propagation
Activation Function
Cross-entropy Loss Function
Gradient Descent
Lab 1 – Intro to Python
Lab 2 – Intro to TensorFlow
Lab 3 – Intro to Neural Network
Lab 4 – Functional API
Lab 5 – Building Deeper and Wider Model

Convolutional Neural Network

Image data
Tensor and Matrix
Convolution Operation
Padding
Stride
Convolution in 2D and 3D
VGG16
Residual Network
Lab 1 – Intro to Conv1D
Lab 2 – Intro to CNN
Lab 3 – Deep CNN
Lab 4 – Transfer Learning

Recurrent Neural Network

Welcome to RNN
Why Use RNN
Language Processing
Forward Propagation in RNN
Backward Propagation Through Time
Gated Recurrent Unit (GRU)
Long Short-Term Memory (LSTM)
Bi-directional RNN
Lab 1 – RNN in Text Classification
Lab 2 – Sequence to Sequence Stock Candlestick Forecast