DEEP LEARNING AND NLP : HOW TO CREATE A NEURAL MACHINE TRANSLATOR
Machine Learning
Sep 09, 2018

This course is a crash course for anyone who wishes to dive into deep learning with TensorFlow. You will learn the basics of TensorFlow, machine learning, and natural language processing.

The course is divided into sections, and at the end you will build a sequence-to-sequence (Seq2Seq) neural machine translator.

SECTION 1: Welcome to the Course

  • Get excited!

  • Applications of NLP in real life / understanding NLP

  • Tools for this course

  • How to succeed in this course


SECTION 2: Understanding TensorFlow

  • TensorFlow introduction

  • Inputs, variables, outputs and operations

  • Loss, optimizers and Training

  • Build our first Neural Network - MNIST CNN classifier

  • Source code
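Before diving in, it helps to see the workflow this section teaches — inputs feed a model, a loss measures error, an optimizer nudges the parameters — stripped of any framework. The toy one-parameter model below is purely illustrative (it is not the course's MNIST classifier, and the function names are mine):

```python
def model(w, x):
    """A one-parameter 'network': predict y = w * x."""
    return w * x

def loss(w, data):
    """Mean squared error over (x, y) pairs."""
    return sum((model(w, x) - y) ** 2 for x, y in data) / len(data)

def grad(w, data):
    """Gradient of the loss with respect to w."""
    return sum(2 * (model(w, x) - y) * x for x, y in data) / len(data)

def train(data, lr=0.1, steps=100):
    """The training loop: repeatedly step w against the gradient."""
    w = 0.0
    for _ in range(steps):
        w -= lr * grad(w, data)  # the 'optimizer' update
    return w

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # y = 2x
w = train(data)  # converges toward w = 2
```

In TensorFlow, the same three roles are played by placeholders/inputs, a loss tensor, and an optimizer's minimize step.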


SECTION 3: Word2vec - Learning Word Embeddings

  • Understanding classical word embedding

  1. Skip-gram algorithm with TensorFlow

  • Download our dataset

  • Read data and preprocessing with NLTK

  • Build the dictionaries

  • Generating batches of data for skip-gram

  • Defining hyperparameters for skip-gram algorithm

  • Defining inputs and outputs

  • Defining model parameters and other variables

  • Calculating word similarities

  • Model parameter optimizer

  • Running the skip-gram algorithm

  • Visualizing the learned skip-gram embeddings

  • Source codes
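As a preview of the batch-generation step above: skip-gram trains on (center, context) word pairs drawn from a sliding window. Here is a minimal pure-Python sketch (the function name is mine; the course builds this as a TensorFlow data pipeline):

```python
def skipgram_pairs(tokens, window=2):
    """Turn a token stream into (center, context) pairs — the raw
    material for skip-gram training batches."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # every neighbor within the window, except the center
                pairs.append((center, tokens[j]))
    return pairs
```

Each pair becomes one training example: the model learns to predict the context word from the center word.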

  2. Continuous Bag-of-Words (CBOW)

  • Changing the data generation process

  • Defining hyperparameters

  • Defining inputs and outputs

  • Defining model parameters and other variables

  • Defining the model computation

  • Model parameter optimizer

  • Calculating word similarities

  • Running the CBOW algorithm

  • Source codes
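The key change from skip-gram is the data generation: CBOW inverts the task and predicts the center word from its surrounding context. An illustrative sketch (names are mine, not the course code):

```python
def cbow_pairs(tokens, window=2):
    """Turn a token stream into (context, center) pairs: CBOW predicts
    the center word from the words around it."""
    pairs = []
    # Only positions with a full window on both sides, for simplicity.
    for i in range(window, len(tokens) - window):
        context = tokens[i - window:i] + tokens[i + 1:i + window + 1]
        pairs.append((context, tokens[i]))
    return pairs
```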


SECTION 4: Understanding Recurrent Neural Networks

  • Introduction to RNN

  • Backpropagation explained

  • Applications of RNNs - Generating Texts

  • Downloading our dataset

  • Reading data and preprocessing

  • Building the dictionaries (bigrams)

  • Generating Batches of Data

  • Defining hyperparameters

  • Defining inputs and outputs

  • Defining model parameters and other variables

  • Defining inference of the RNN

  • Calculating RNN loss

  • Defining learning rate and optimizer with Gradient Clipping

  • Operations for resetting hidden states

  • Prediction sampling

  • Running the RNN to Generate Text
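Two of the pieces listed above — the RNN's recurrence and gradient clipping — fit in a few lines of NumPy. This is a sketch of the standard formulations, not the course's TensorFlow implementation:

```python
import numpy as np

def rnn_step(x, h_prev, Wxh, Whh, bh):
    """One vanilla RNN step: h_t = tanh(Wxh @ x_t + Whh @ h_{t-1} + b)."""
    return np.tanh(Wxh @ x + Whh @ h_prev + bh)

def clip_gradients(grads, max_norm=5.0):
    """Scale all gradients down together if their global norm exceeds
    max_norm — the clipping that keeps RNN training from exploding."""
    norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if norm > max_norm:
        grads = [g * (max_norm / norm) for g in grads]
    return grads
```

TensorFlow provides the same clipping behavior via `tf.clip_by_global_norm`.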


SECTION 5: Long Short-Term Memory - Improving Text Generation with LSTMs

  • Understanding LSTM - the Maths behind LSTM

  • How LSTMs solve the vanishing gradient problem

  • Understanding our data

  • Defining the LSTM

  • Defining hyperparameters

  • Defining inputs and outputs

  • Defining Model parameters

  • LSTM Computations

  • Calculating RNN loss

  • Defining learning rate and optimizer with Gradient Clipping

  • Operations for resetting hidden states

  • Greedy Sampling to Break the Repetition

  • Running Training, validation and Generation
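The LSTM computations the section walks through reduce to one gated update per time step. A minimal NumPy sketch of the standard LSTM cell (no peephole connections; variable names are mine):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W, U, b hold the four gate blocks stacked:
    input (i), forget (f), output (o), and candidate (g)."""
    z = W @ x + U @ h + b
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c_new = f * c + i * g        # additive cell-state update:
                                 # this is what eases vanishing gradients
    h_new = o * np.tanh(c_new)   # gated hidden state
    return h_new, c_new
```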


SECTION 6: Improving LSTM

  • Peephole Connections - improving LSTM

  • Using Gated Recurrent Units (GRUs) to improve our Text Generation

  • Comparing LSTMs to LSTMs with peephole connections and GRUs

  • Improving LSTM - beam search

  • Improving LSTM - using words instead of n-grams


SECTION 7: Machine Translation - German to English

  • Introduction to Neural Machine Translation

  • Preparing data for NMT system

  • Preprocessing text

  • Defining hyperparameters

  • Defining input/output placeholders

  • Defining the Encoder Model

  • Defining the Decoder Model

  • Defining LSTM Computations

  • Calculating the loss / Optimizer

  • Resetting Train and Test States

  • Running the NMT - step 1 - print and save the prediction result for training data

  • Running the NMT - step 2 - print and save the prediction result for test data

  • Running the NMT - step 3 - obtain the candidate and reference data to calculate BLEU score

  • Running the NMT - step 4 - Defining a single step of training

  • Running the NMT - step 5 - Defining Data Generators

  • Running Training and testing for NMT
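Step 3 above evaluates translations with BLEU. In practice you would use a library implementation such as NLTK's, but the metric itself is simple enough to sketch (a simplified, unsmoothed sentence-level version):

```python
from collections import Counter
from math import exp, log

def bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of n-gram
    precisions times a brevity penalty (single reference, no smoothing)."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(tuple(candidate[i:i + n]) for i in range(len(candidate) - n + 1))
        ref = Counter(tuple(reference[i:i + n]) for i in range(len(reference) - n + 1))
        # Clipped counts: a candidate n-gram only scores up to its reference count.
        overlap = sum(min(count, ref[gram]) for gram, count in cand.items())
        precisions.append(overlap / max(sum(cand.values()), 1))
    if min(precisions) == 0:
        return 0.0  # any missing n-gram order zeroes the unsmoothed score
    brevity = min(1.0, exp(1 - len(reference) / len(candidate)))
    return brevity * exp(sum(log(p) for p in precisions) / max_n)
```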


SECTION 8: Improving NMT

  • Understanding deep LSTMs and teacher forcing

  • Attention-based Machine Translation
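At its core, attention is a weighted average of encoder states, with weights scored against the current decoder state. A dot-product attention sketch in NumPy (illustrative only — not necessarily the scoring variant the course implements):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

def attention(decoder_state, encoder_states):
    """Dot-product attention: score every encoder state against the
    current decoder state, normalize with softmax, and return the
    attention-weighted context vector."""
    scores = encoder_states @ decoder_state   # one score per source position
    weights = softmax(scores)                 # attention distribution
    context = weights @ encoder_states        # weighted sum of encoder states
    return context, weights
```

The context vector is then fed to the decoder, letting it focus on the most relevant source words at each output step.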


SECTION 9: Other applications of NLP in Deep learning

  • Chatbots

  • Image Caption Generation









---------------------------------------------------------

Done By: Constantine N. Mbufung

Master's Research Student: AI, Deep Learning, NLP, and Computer Vision

Skype: constantine.mbufung

Data Scientist at YooMee Cameroun
