How does Word2Vec work?

Asked by LarryHuffer in Data Science on Dec 23, 2019
Answered by Larry Huffer

Word2Vec is a popular word embedding technique that uses a shallow, two-layer neural network to learn vector representations of words. It takes a corpus of text as input and produces a set of vectors as output, one per word, which makes language easier for a machine to process numerically. Word2Vec has two main training algorithms:

  1. Continuous Bag of Words (CBOW)
  2. Skip-gram

Both are prediction-based approaches. CBOW uses the surrounding context words to predict the target word, while Skip-gram does the opposite, using the target word to predict its context. A minimal training example is sketched below.
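As a rough illustration, here is a minimal sketch of training Word2Vec with the gensim library (this assumes gensim 4.x is installed; the toy corpus and parameter values are made up for demonstration only):

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

# sg=0 selects CBOW (context -> word); sg=1 selects Skip-gram (word -> context).
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned word vectors
    window=2,         # number of context words on each side of the target
    min_count=1,      # keep even rare words in this toy example
    sg=0,             # 0 = CBOW, 1 = Skip-gram
)

# Every word in the vocabulary now maps to a dense vector.
vector = model.wv["cat"]                      # numpy array of shape (50,)
similar = model.wv.most_similar("cat", topn=3)  # nearest words by cosine similarity
print(vector.shape, similar)
```

With a real corpus (and larger vector_size, window, and min_count values), words that appear in similar contexts end up with similar vectors, which is what makes the embeddings useful for downstream tasks.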


