Neural Machine Translation

NLP with Attention Models

This week we dive deep into Neural Machine Translation. We’ll learn about the encoder-decoder architecture and explore attention mechanisms that let the model focus on different parts of the input sequence during translation. In the hands-on exercises, we’ll implement an attention model for English-to-German translation, train it on a dataset of sentence pairs, and evaluate its performance (a minimal sketch of the attention operation follows the notes metadata below).
Categories

Attention, Beam search, BLEU, ROUGE, Coursera, NLP with Attention Models, Notes, Machine translation task, MBR, NLP, Positional encoding, Seq2Seq, Transformer, Teacher forcing, Translation task, Word alignment
Author

Oren Bochman

Published

Saturday, March 20, 2021

Keywords

seq2seq models, dot-product attention, deep learning algorithms, machine translation, neural machine translation, translation evaluation metrics, encoder-decoder architecture, minimum Bayes risk (MBR)
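
As a preview of the mechanism the week builds on, here is a minimal NumPy sketch of scaled dot-product attention: each decoder state scores every encoder state and takes a softmax-weighted average of them. The function name, shapes, and toy data are illustrative assumptions of mine, not the course's actual assignment code.

```python
import numpy as np

def dot_product_attention(queries, keys, values):
    """Scaled dot-product attention (illustrative sketch, not course code).

    queries: (m, d) decoder states; keys/values: (n, d) encoder states.
    Returns (m, d) context vectors, one weighted sum of values per query.
    """
    d = queries.shape[-1]
    # Similarity of every query with every key, scaled to keep the softmax stable.
    scores = queries @ keys.T / np.sqrt(d)
    # Softmax over keys: each row is a distribution over input positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each context vector is a weighted average of the encoder states.
    return weights @ values

# Toy example: 2 decoder steps attending over 4 encoder positions.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(dot_product_attention(Q, K, V).shape)  # (2, 8)
```

The softmax weights are exactly the "focus" mentioned above: for each output position they say how much of each input position contributes to the translation at that step.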