 What is word2vec? Words need to be represented as numbers so that computers can work with them, and for each word those numbers take the form of a vector. Word2vec is a technique for converting a word into such a vector. It can be trained with one of two shallow models: a continuous bag-of-words (CBOW) model, which predicts a word from its surrounding context, or a skip-gram model, which predicts the surrounding context from a word. Either way, the model learns vectors such that words with similar meanings end up close together in the vector space. However, word2vec assigns each word a single static vector, so it cannot capture context-dependent meaning; that is why LSTMs and, later, attention-based methods have largely taken over. Check the channel for more info on NLP.
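
To make the idea concrete, here is a minimal toy sketch of skip-gram training in plain NumPy. The corpus, vocabulary size, embedding dimension, learning rate, and window size are all illustrative assumptions, not part of the original word2vec setup (real implementations use tricks like negative sampling and much larger corpora):

```python
import numpy as np

# Assumption: a tiny toy corpus, just for illustration.
corpus = "the king rules the land the queen rules the land".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocabulary size, embedding dimension (assumed)

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # center-word vectors
W_out = rng.normal(scale=0.1, size=(V, D))  # context-word vectors

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Skip-gram objective: for each position, predict every word
# inside the window from the center word.
window, lr = 2, 0.05
for epoch in range(200):
    for i in range(len(corpus)):
        c = w2i[corpus[i]]
        for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
            if j == i:
                continue
            o = w2i[corpus[j]]
            scores = W_out @ W_in[c]   # score of every vocab word as context
            probs = softmax(scores)
            grad = probs
            grad[o] -= 1.0             # gradient of cross-entropy w.r.t. scores
            W_out -= lr * np.outer(grad, W_in[c])
            W_in[c] -= lr * (W_out.T @ grad)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# "king" and "queen" appear in near-identical contexts in this toy
# corpus, so their vectors should drift toward each other.
sim = cosine(W_in[w2i["king"]], W_in[w2i["queen"]])
```

After training, `W_in` holds one vector per word; comparing vectors with cosine similarity is how "closer in meaning means closer in space" is usually measured in practice.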