Introduction to Natural Language Processing

Quiz: Check Your Understanding of Embedding Techniques

Evaluated skills: Vectorize Text For Exploration Using Word Embeddings

Question 1
What is the main advantage of word embeddings compared to tf-idf?
- The word vectors capture the relative meaning of the words.
- The distance between words depends on the size of the vectors.
- Word embeddings take less space because the document-term matrix is not sparse.

Question 2
Which of the following assertions are true? Careful, there are several correct answers.
- Embeddings have fixed-size vectors, whereas tf-idf vectors depend on the number of documents.
- Embeddings have dense vectors with few zero values.
- Embeddings have a mean of zero.

Question 3
Which of the following assertions are true? Careful, there are several correct answers.
- Training GloVe is faster than training word2vec.
- fastText tokenizes on parts of words.
- fastText can handle out-of-vocabulary (OOV) words.
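If you want to check the properties the quiz asks about empirically, here is a minimal sketch, assuming gensim and scikit-learn are installed. The toy corpus, the probe words "cat" and "catlike", and vector_size=50 are illustrative choices, not part of the course material. It contrasts the corpus-dependent, mostly-zero tf-idf document-term matrix with fixed-size dense word2vec vectors, and shows fastText returning a vector for a word it never saw during training.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from gensim.models import Word2Vec, FastText

# Toy corpus (illustrative only)
docs = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "dogs and cats make good pets",
]
tokenized = [doc.split() for doc in docs]

# tf-idf: one row per document, one column per vocabulary term.
# The vector length grows with the corpus vocabulary, and most entries are zero.
tfidf = TfidfVectorizer()
X = tfidf.fit_transform(docs)
print("tf-idf matrix shape:", X.shape)                      # (n_documents, n_terms)
print("non-zero entries:", X.nnz, "of", X.shape[0] * X.shape[1])

# word2vec: every word gets a dense vector of a fixed size chosen up front,
# independent of how many documents the corpus contains.
w2v = Word2Vec(sentences=tokenized, vector_size=50, min_count=1, epochs=50)
print("word2vec vector for 'cat':", w2v.wv["cat"].shape)    # (50,)

# fastText builds word vectors from character n-grams (parts of words),
# so it can still produce a vector for an out-of-vocabulary word.
ft = FastText(sentences=tokenized, vector_size=50, min_count=1, epochs=50)
print("fastText vector for unseen 'catlike':", ft.wv["catlike"].shape)  # (50,)
```

Adding documents changes the tf-idf matrix shape, while the word2vec and fastText vectors keep the size set by vector_size; asking the plain word2vec model for the unseen word "catlike" would instead raise a KeyError, which is exactly the OOV limitation fastText avoids.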