Introduction to Word2Vec
Word2vec is a popular group of related models used to produce word embeddings. Word embeddings are vector representations of words that capture semantic meaning, allowing words with similar meanings to have similar representations. Word2vec models are shallow, two-layer neural networks trained on large amounts of text data to reconstruct the linguistic contexts of words.
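To make the idea of "similar representations" concrete, here is a minimal sketch that compares toy embedding vectors with cosine similarity, the measure typically used with word2vec. The vectors and words below are made-up illustrative values, not output from a trained model.

```python
import math

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors: 1.0 means identical direction.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical 3-dimensional "embeddings" (real word2vec vectors typically
# have 100-300 dimensions and are learned from text).
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

# Semantically related words get a higher cosine similarity score.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # near 1.0
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower
```

In a trained model the geometry arises automatically: words that appear in similar contexts are pushed toward nearby points in the vector space, which is what makes cosine similarity a meaningful proxy for semantic relatedness.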