Question Answering

NLP with Attention Models

This week we dive into neural question answering. We build transformer-based models such as BERT and T5 that answer questions from a given context, fine-tune them to improve their performance, and gain hands-on experience building question-answering systems.
Categories

Attention, Coursera, Deep Learning Algorithms, NLP, Notes, NLP with Attention Models, Neural Machine Translation, Transformer, Teacher forcing, Positional encoding, Question answering task
Author

Oren Bochman

Published

Saturday, April 10, 2021

Keywords

BERT, Causal attention, Dot product attention, Multi-head attention, Self attention, Transformer decoder, T5
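
To make the week's goal concrete, here is a minimal sketch of extractive question answering with a pretrained BERT-style model through the Hugging Face transformers pipeline. The checkpoint name and the example text are illustrative assumptions, not the specific models or data used in the course.

```python
from transformers import pipeline

# Load an extractive QA pipeline backed by a BERT-style model fine-tuned on SQuAD.
# The checkpoint below is an illustrative assumption, not the course's model.
qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",
)

# The context supplies the passage the model extracts its answer span from.
context = (
    "The Transformer architecture relies on self-attention to relate all "
    "positions of a sequence, and was introduced in the paper "
    "'Attention Is All You Need'."
)

result = qa(
    question="What mechanism does the Transformer rely on?",
    context=context,
)

# The pipeline returns the extracted answer span plus a confidence score
# and the character offsets of the span within the context.
print(result["answer"], result["score"])
```

Note the contrast with T5, which treats question answering as text-to-text generation: instead of extracting a span, it generates the answer string from a prompt that concatenates the question and the context.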