Explain how Word2Vec works.
Word2Vec is a popular word embedding technique that uses a shallow, two-layer neural network to represent words. It takes a corpus of text as input and produces a set of vectors as output, one per word. By converting words into dense vectors, it turns language into a form a machine can compute with, so that words used in similar contexts end up with similar vectors. Word2Vec has two main training algorithms:
- Continuous Bag of Words (CBOW)
- Skip-gram
Both are prediction-based approaches: CBOW uses the surrounding context words to predict the center word, while skip-gram does the opposite, using the center word to predict each of its context words.
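The two prediction directions can be sketched by showing how each algorithm turns a sentence into training examples. This is a minimal illustration, not Word2Vec's actual training loop; the corpus, the window size, and the function names are illustrative assumptions.

```python
def skipgram_pairs(tokens, window=2):
    """Skip-gram direction: each (center word -> one context word) pair."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def cbow_examples(tokens, window=2):
    """CBOW direction: the whole surrounding context predicts the center word."""
    examples = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        context = [tokens[j] for j in range(lo, hi) if j != i]
        if context:
            examples.append((context, center))
    return examples

sentence = "the quick brown fox".split()
print(skipgram_pairs(sentence, window=1))
# [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#  ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
print(cbow_examples(sentence, window=1))
# [(['quick'], 'the'), (['the', 'brown'], 'quick'),
#  (['quick', 'fox'], 'brown'), (['brown'], 'fox')]
```

In the real algorithm these examples feed a two-layer network whose hidden-layer weights become the word vectors; in practice one rarely implements this by hand and instead uses a library such as Gensim's `Word2Vec` class, whose `sg` parameter switches between CBOW and skip-gram.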