Week 2 - Text Summarization

NLP with Attention Models

This week we unlock the secrets of neural text summarization. We build a powerful Transformer model that extracts crucial information and produces concise summaries. Through hands-on exercises in JAX, we train the model on a dataset of articles and learn techniques like beam search and length normalization to enhance the quality of our summaries (see the sketch below).
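The length-normalization idea can be captured in a few lines of JAX. The sketch below is a generic illustration, not the course notebook's code: the helper name, the example scores, and the alpha = 0.6 default are my assumptions.

```python
import jax.numpy as jnp

def length_normalized_score(total_log_prob, length, alpha=0.6):
    """Length-normalized score for a beam hypothesis.

    Dividing the summed log-probability by length**alpha stops beam
    search from systematically preferring shorter summaries
    (alpha = 0.6 is a common default, assumed here).
    """
    return total_log_prob / (length ** alpha)

# Hypothetical beam: (token_ids, summed log-probability) pairs.
beams = [
    ([12, 5, 2], -2.4),          # short hypothesis
    ([12, 5, 88, 41, 2], -3.1),  # longer hypothesis
]
scores = jnp.asarray(
    [length_normalized_score(lp, len(ids)) for ids, lp in beams]
)
best_ids, _ = beams[int(jnp.argmax(scores))]
```

Note that by raw log-probability the short hypothesis wins (-2.4 > -3.1), but after normalization the longer, more informative one is ranked first.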
Categories

NLP with Attention Models
Neural Machine Translation
Coursera
Notes
Deep Learning Algorithms
Transformer
Teacher forcing
Positional encoding
GPT2
Transformer decoder
Attention
Dot product attention
Self attention
Causal attention
Multi-head attention
Summarization task
Author

Oren Bochman

Published

Thursday, April 1, 2021