Last updated on 1/28/21
Check Your Understanding of Embedding Techniques
- Vectorize Text For Exploration Using Word Embeddings
What is the main advantage of word embeddings compared to tf-idf?
The word vectors capture the relative meaning of the words.
The distance between words depends on the size of the vectors.
Word embeddings take less space because the document-term matrix is not sparse.
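To see the contrast behind these options, here is a minimal sketch comparing a tf-idf document-term matrix with a dense embedding vector. It assumes scikit-learn and NumPy are installed; the toy corpus and the 8-dimensional "embedding" are illustrative stand-ins, not a trained model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
import numpy as np

docs = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "a bird flew over the house",
]

# tf-idf: one column per vocabulary term, so the width grows with the corpus,
# and most entries are zero (each document uses only a few terms)
tfidf = TfidfVectorizer().fit_transform(docs)
sparsity = 1.0 - tfidf.nnz / (tfidf.shape[0] * tfidf.shape[1])
print(tfidf.shape, f"sparsity={sparsity:.2f}")

# a word embedding, by contrast, is a fixed-size dense vector
# (toy 8-dimensional example; trained models typically use 50-300 dimensions)
embedding = np.random.default_rng(0).normal(size=8)
print(np.count_nonzero(embedding))  # every component carries signal
```

Note that the tf-idf matrix width depends on the corpus vocabulary, while the embedding dimension is fixed in advance regardless of how many documents are processed.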
Which of the following assertions are true? Careful, there are several correct answers.
Embeddings have fixed size vectors, whereas tf-idf vectors depend on the number of documents.
Embeddings have dense vectors with few zero values.
Embeddings have a mean of zero.
Which assertions are true? Careful, there are several correct answers.
Training GloVe is faster than word2vec.
fastText tokenizes on parts of words.
fastText can handle OOV (out-of-vocabulary) words.
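The fastText options above hinge on subword tokenization: fastText wraps each word in boundary markers and extracts character n-grams, so an unseen word can still be represented by summing the vectors of its n-grams. A minimal sketch of that n-gram extraction (the `char_ngrams` helper is hypothetical, written here for illustration; real fastText defaults to n-grams of length 3 to 6):

```python
def char_ngrams(word, n_min=3, n_max=6):
    """Extract the character n-grams fastText would use for a word."""
    # fastText adds '<' and '>' boundary markers before slicing
    wrapped = f"<{word}>"
    grams = set()
    for n in range(n_min, n_max + 1):
        for i in range(len(wrapped) - n + 1):
            grams.add(wrapped[i:i + n])
    return grams

print(sorted(char_ngrams("where", 3, 3)))
# → ['<wh', 'ere', 'her', 're>', 'whe']
```

Because an out-of-vocabulary word like "whereabouts" shares many n-grams with known words, fastText can still assemble a vector for it, which is exactly what word2vec and GloVe cannot do.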