

Neural machine translation (NMT), Text summarization, Question Answering, Chatbot

What you will learn

Advanced knowledge of modern NLP

Understand modern NLP techniques

Advanced knowledge of modern DL

Understand modern DL techniques

Description

You will learn the newest state-of-the-art deep-learning approaches to natural language processing (NLP).

You will




  1. Get state-of-the-art knowledge regarding
    1. NMT
    2. Text summarization
    3. QA
    4. Chatbot
  2. Validate your knowledge with a short, very easy three-question quiz after each lecture
  3. Be able to complete the course in about two hours

Syllabus

  1. Neural machine translation (NMT)
    1. Seq2seq
      A family of machine learning approaches used for natural language processing.
    2. Attention
      A technique that mimics cognitive attention.
    3. NMT
      An approach to machine translation that uses an artificial neural network to predict the likelihood of a sequence of words, typically modelling entire sentences in a single integrated model.
    4. Teacher-forcing
      An algorithm for training the weights of recurrent neural networks (RNNs).
    5. BLEU
      An algorithm for evaluating the quality of text which has been machine-translated from one natural language to another.
    6. Beam search
      A heuristic search algorithm that explores a graph by expanding the most promising node in a limited set.
  2. Text summarization
    1. Transformer
      A deep learning model that adopts the mechanism of self-attention, differentially weighting the significance of each part of the input data.
  3. Question Answering
    1. GPT-3
      An autoregressive language model that uses deep learning to produce human-like text.
    2. BERT
      A transformer-based machine learning technique for natural language processing (NLP) pre-training developed by Google.
  4. Chatbot
    1. LSH
      An algorithmic technique that hashes similar input items into the same “buckets” with high probability.
    2. RevNet
      A variant of ResNets where each layer’s activations can be reconstructed exactly from the next layer’s.
    3. Reformer
      Introduces two techniques to improve the efficiency of Transformers.
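The attention entry above (a technique that mimics cognitive attention, and the self-attention mechanism at the heart of the Transformer) can be made concrete with a small sketch. This is a minimal scaled dot-product attention for a single query, written in plain Python for clarity; the function names are illustrative and not part of the course materials.

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention(query, keys, values):
    """Scaled dot-product attention for one query vector (illustrative sketch).

    Scores each key against the query, scales by sqrt(d) to keep the
    dot products in a reasonable range, converts scores to weights with
    softmax, and returns the weighted sum of the value vectors.
    """
    d = len(query)
    scores = [dot(query, k) / math.sqrt(d) for k in keys]
    weights = softmax(scores)
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]
```

Because the weights sum to one, the output is always a convex combination of the value vectors; a key that matches the query more closely pulls the output toward its value.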
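Beam search, as defined above, expands only the most promising partial hypotheses at each decoding step instead of all of them. A toy sketch, assuming a fixed per-step token distribution as a hypothetical stand-in for a real decoder's context-dependent predictions:

```python
import math

def beam_search(step_log_probs, beam_width=3, max_len=4, eos=0):
    """Toy beam search over a fixed per-step distribution (illustrative only).

    step_log_probs: dict mapping token id -> log-probability; in a real NMT
    decoder these would come from the model and change at every step.
    Returns the beam_width best (sequence, cumulative log-prob) pairs.
    """
    beams = [([], 0.0)]  # (token sequence so far, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq and seq[-1] == eos:
                # Finished hypothesis: carry it forward unchanged.
                candidates.append((seq, score))
                continue
            for tok, lp in step_log_probs.items():
                candidates.append((seq + [tok], score + lp))
        # Keep only the beam_width highest-scoring hypotheses.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams
```

With `beam_width=1` this reduces to greedy decoding; widening the beam trades compute for a better chance of finding a high-probability sequence.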

Resources

  • Wikipedia
  • Coursera
Language: English

Content

Neural machine translation (NMT)

Seq2seq
Attention
Neural machine translation (NMT)
BLEU
Beam search

Text summarization

Transformer

Question Answering

GPT-3
BERT

Chatbot

LSH
RevNet
Reformer

Bonus

GPT-3
DALL-E
CLIP