A user types the sentence "Zoni I want to find a pencil, an eraser and a sharpener" and spaCy finds no entities, but for the sentence 'European authorities fined Google a record $5.1 billion on Wednesday for abusing its power in the mobile phone market and ordered the company to alter its practices' it returns several entities. Why did spaCy fail to recognize any entities in the first sentence?
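The short explanation is that spaCy's pretrained NER models are trained on news-style corpora, so they recognize proper names of organizations, money amounts, dates, and so on; common nouns like "pencil" or "eraser" are not named entities, and an unseen token like "Zoni" gives the model nothing to anchor on. A minimal sketch to compare the two sentences (assuming the `en_core_web_sm` model; if it is not downloaded, the code falls back to a blank pipeline, which has no NER component at all):

```python
import spacy

text1 = "Zoni I want to find a pencil, an eraser and a sharpener"
text2 = ("European authorities fined Google a record $5.1 billion on "
         "Wednesday for abusing its power in the mobile phone market "
         "and ordered the company to alter its practices")

try:
    # Standard small English pipeline with a trained NER component.
    nlp = spacy.load("en_core_web_sm")
except OSError:
    # Model not installed: fall back to a blank pipeline (no NER).
    nlp = spacy.blank("en")

for text in (text1, text2):
    doc = nlp(text)
    # Print each recognized entity with its label (ORG, MONEY, DATE, ...).
    print([(ent.text, ent.label_) for ent in doc.ents])
```

With the trained model, the second sentence yields entities such as Google and $5.1 billion, while the first yields none.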
Word2Vec uses a shallow neural network whose hidden layer encodes each word as a vector. FastText, by contrast, breaks each word into character n-grams and trains on those. For instance, the tri-grams for the word apple are app, ppl, and ple (ignoring the boundary symbols at the start and end of the word). The embedding vector for apple is then the sum of the vectors of these n-grams. FastText takes longer to train than Word2Vec, but its embeddings generally perform better, particularly for rare and out-of-vocabulary words, which can be built from their n-grams.
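The n-gram decomposition described above can be sketched in a few lines of plain Python (the `boundaries` flag is an illustrative assumption, showing that FastText actually wraps each word in `<` and `>` markers before extracting n-grams):

```python
def char_ngrams(word, n=3, boundaries=False):
    """Return the character n-grams of a word, FastText-style."""
    if boundaries:
        # FastText marks word boundaries with < and > before slicing.
        word = f"<{word}>"
    return [word[i:i + n] for i in range(len(word) - n + 1)]

print(char_ngrams("apple"))                   # ['app', 'ppl', 'ple']
print(char_ngrams("apple", boundaries=True))  # ['<ap', 'app', 'ppl', 'ple', 'le>']
```

With boundary markers, the n-grams `<ap` and `le>` let the model distinguish a prefix or suffix from the same letters in the middle of a word.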