Classification & Vector Spaces - Word Embeddings and Vector Spaces

Course 1 of NLP Specialization

Concepts, code snippets, and slide commentary for this week's lesson. Course notes from the deeplearning.ai Natural Language Processing specialization.
Categories: NLP, Coursera, notes, deeplearning.ai, course notes, Word Embeddings, PCA, Laplace smoothing, Log-likelihood, classification, sentiment analysis task, bibliography
Author: Oren Bochman

Published: Friday, October 23, 2020

Vector Space Models

Word-by-Word Design

Word-by-Document Design
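
The two designs above differ only in what the columns count: co-occurrences with neighbouring words inside a window of size k, versus counts per document. A minimal sketch of the word-by-word variant, assuming a pre-tokenized toy corpus and window size k=2 (the helper name `word_by_word_counts` is mine, not from the course labs):

```python
from collections import defaultdict

def word_by_word_counts(sentences, k=2):
    """Count how often each pair of words co-occurs within a window of k tokens."""
    counts = defaultdict(lambda: defaultdict(int))
    for tokens in sentences:
        for i, word in enumerate(tokens):
            # look at the up-to-k tokens on each side of position i
            for j in range(max(0, i - k), min(len(tokens), i + k + 1)):
                if j != i:
                    counts[word][tokens[j]] += 1
    return counts

corpus = [["I", "like", "simple", "data"],
          ["I", "prefer", "simple", "raw", "data"]]
cooc = word_by_word_counts(corpus, k=2)
print(cooc["data"]["simple"])  # 2 -- "simple" appears within 2 words of "data" twice
```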

Euclidean Distance
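
A minimal sketch, assuming numpy word vectors, of the Euclidean distance computed as the norm of the difference vector:

```python
import numpy as np

# toy word vectors; real vectors would come from co-occurrence counts or pretrained embeddings
v = np.array([1.0, 6.0, 8.0])
w = np.array([-4.0, 2.0, 3.0])

# Euclidean distance is the norm of the difference vector
d = np.linalg.norm(v - w)
print(d)  # sqrt(5**2 + 4**2 + 5**2) ≈ 8.124
```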

Cosine Similarity

Intuition

Background

Norm of a Vector

Dot-product of Two Vectors
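
For reference, the definitions these subsections cover, in my notation (vectors v, w of dimension n, angle β between them):

$$
\|\vec{v}\| = \sqrt{\sum_{i=1}^{n} v_i^2},
\qquad
\vec{v} \cdot \vec{w} = \sum_{i=1}^{n} v_i w_i,
\qquad
\cos(\beta) = \frac{\vec{v} \cdot \vec{w}}{\|\vec{v}\|\,\|\vec{w}\|}
$$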

Implementation
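
A hedged numpy sketch of that implementation; the toy vectors are placeholders for real word embeddings:

```python
import numpy as np

def cosine_similarity(v, w):
    """Cosine of the angle between v and w: dot product over the product of the norms."""
    return np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))

v = np.array([1.0, 6.0, 8.0])
w = np.array([-4.0, 2.0, 3.0])
print(cosine_similarity(v, w))  # close to 1 → similar direction, close to -1 → opposite
```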

Word Manipulation in Vector Spaces
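
The standard example under this heading is analogy arithmetic, e.g. predicting a country from its capital via Russia - Moscow + Ankara ≈ Turkey. A minimal sketch with made-up two-dimensional vectors chosen only so the analogy resolves; real embeddings would be much higher-dimensional:

```python
import numpy as np

def cosine_similarity(v, w):
    return np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))

# toy 2-d "embeddings", illustrative only
embeddings = {
    "Moscow": np.array([2.0, 5.0]),
    "Russia": np.array([5.0, 6.0]),
    "Ankara": np.array([3.0, 1.0]),
    "Turkey": np.array([6.0, 2.0]),
    "Japan":  np.array([8.0, 9.0]),
}

# capital -> country offset taken from a known pair, applied to a new capital
query = embeddings["Russia"] - embeddings["Moscow"] + embeddings["Ankara"]

# pick the nearest word by cosine similarity, excluding the words used in the query
candidates = {k: v for k, v in embeddings.items() if k not in {"Russia", "Moscow", "Ankara"}}
best = max(candidates, key=lambda word: cosine_similarity(query, candidates[word]))
print(best)  # expected: "Turkey"
```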

PCA

Visualization of Word Vectors

Implementation
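
A hedged from-scratch PCA sketch following the usual recipe (mean-center, covariance, eigendecomposition, project onto the leading components); the function name and the random stand-in data are mine:

```python
import numpy as np

def compute_pca(X, n_components=2):
    """Project the rows of X onto the top n_components principal directions."""
    X_centered = X - X.mean(axis=0)             # 1. de-mean each feature
    cov = np.cov(X_centered, rowvar=False)      # 2. covariance matrix of the features
    eigvals, eigvecs = np.linalg.eigh(cov)      # 3. eigen-decomposition (symmetric matrix)
    order = np.argsort(eigvals)[::-1]           # 4. sort components by decreasing variance
    components = eigvecs[:, order[:n_components]]
    return X_centered @ components              # 5. project the data

rng = np.random.default_rng(0)
word_vectors = rng.normal(size=(10, 300))       # stand-in for 10 embeddings of dimension 300
coords_2d = compute_pca(word_vectors, n_components=2)
print(coords_2d.shape)  # (10, 2) -> ready for a 2-d scatter plot of the words
```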

Putting It Together with Code

Document As a Vector
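
One common reading of "document as a vector" is summing the embeddings of the document's in-vocabulary words, which places the document in the same space as the words. A minimal sketch under that assumption, with a toy embeddings dictionary:

```python
import numpy as np

def document_vector(doc, embeddings, dim=300):
    """Sum the embeddings of the in-vocabulary words of a document."""
    vec = np.zeros(dim)
    for word in doc.lower().split():
        if word in embeddings:       # out-of-vocabulary words are simply skipped
            vec += embeddings[word]
    return vec

# toy embeddings for illustration only
embeddings = {"machine": np.ones(300), "learning": 2 * np.ones(300)}
print(document_vector("machine learning rocks", embeddings)[:3])  # [3. 3. 3.]
```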

Reuse

CC SA BY-NC-ND

Citation

BibTeX citation:
@online{bochman2020,
  author = {Bochman, Oren},
  title = {Classification \& {Vector} {Spaces} - {Word} {Embeddings} and
    {Vector} {Spaces}},
  date = {2020-10-23},
  url = {https://orenbochman.github.io/notes/deeplearning.ai-nlp-c1/l3-pca/l3.html},
  langid = {en}
}
For attribution, please cite this work as:
Bochman, Oren. 2020. “Classification & Vector Spaces - Word Embeddings and Vector Spaces.” October 23, 2020. https://orenbochman.github.io/notes/deeplearning.ai-nlp-c1/l3-pca/l3.html.