• 10 hours
  • Hard




Last updated on 1/28/21

Check Your Understanding of Embedding Techniques


Evaluated skills

  • Vectorize Text For Exploration Using Word Embeddings
  • Question 1

    What is the main advantage of word embeddings compared to tf-idf?

    • The word vectors capture the relative meaning of the words.

    • The distance between words depends on the size of the vectors.

    • Word embeddings take less space because the document-term matrix is not sparse.
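
To help reason about Question 1, here is a minimal sketch (assuming gensim is installed; it downloads a small pretrained GloVe model on first run) of how embedding vectors capture the relative meaning of words: semantically related words end up close together in vector space, which tf-idf vectors do not guarantee.

```python
# A sketch, not part of the course: load pretrained GloVe vectors and compare
# cosine similarities. Related words score high, unrelated words score low,
# which is what "capturing the relative meaning of words" refers to.
import gensim.downloader as api

wv = api.load("glove-wiki-gigaword-50")   # downloaded on first run

print(wv.similarity("king", "queen"))     # high: semantically related
print(wv.similarity("king", "carrot"))    # low: unrelated
print(wv.most_similar("paris", topn=3))   # nearest neighbours share meaning
```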

  • Question 2

    Which of the following assertions are true?

    Careful, there are several correct answers.
    • Embeddings have fixed-size vectors, whereas tf-idf vectors depend on the number of documents.

    • Embeddings have dense vectors with few zero values.

    • Embeddings have a mean of zero.
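
To make the shape and sparsity assertions in Question 2 concrete, here is a minimal sketch (assuming scikit-learn and gensim are installed; the corpus and parameters are illustrative) comparing a tf-idf document-term matrix with word2vec embedding vectors.

```python
# A sketch comparing vector shapes: the tf-idf matrix width equals the
# vocabulary size and is mostly zeros, while each embedding vector has a
# fixed, dense dimensionality chosen up front (here, 50).
from sklearn.feature_extraction.text import TfidfVectorizer
from gensim.models import Word2Vec

docs = ["the cat sat on the mat",
        "the dog lay on the rug",
        "cats and dogs are pets"]

tfidf = TfidfVectorizer().fit_transform(docs)
print(tfidf.shape)   # (3, vocabulary size): width grows with the corpus
print(tfidf.nnz)     # number of non-zero entries: the matrix is sparse

w2v = Word2Vec([d.split() for d in docs], vector_size=50, min_count=1, epochs=50)
print(w2v.wv["cat"].shape)   # (50,): fixed size, dense
```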

  • Question 3

    Which of the following assertions are true?

    Careful, there are several correct answers.
    • Training GloVe is faster than training word2vec.

    • fastText tokenizes on parts of words.

    • fastText can handle out-of-vocabulary (OOV) words.
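
To illustrate the fastText assertions in Question 3, here is a minimal sketch (assuming gensim is installed; the corpus is a toy example) showing how fastText builds word vectors from character n-grams, which lets it produce a vector for a word it never saw during training.

```python
# A sketch of fastText's subword approach: vectors are built from character
# n-grams, so an out-of-vocabulary (OOV) word still gets a vector composed
# from the n-grams it shares with known words.
from gensim.models import FastText

sentences = [["word", "embeddings", "capture", "meaning"],
             ["subword", "ngrams", "handle", "rare", "words"]]

model = FastText(sentences, vector_size=50, min_count=1, epochs=50, min_n=3, max_n=5)

print("embedding" in model.wv.key_to_index)   # False: never seen in training
print(model.wv["embedding"][:5])              # still gets a vector via its n-grams
```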