Year: 2013
Journal: International Conference on Learning Representations
Languages: All Languages
Programming languages: Shell
Input data:

Output data: words as vectors

Word2vec is a technique for natural language processing. The word2vec algorithm uses a neural network model to learn word associations from a large corpus of text. Once trained, such a model can detect synonymous words or suggest additional words for a partial sentence. As the name implies, word2vec represents each distinct word with a particular list of numbers called a vector. The vectors are chosen carefully such that a simple mathematical function (the cosine similarity between the vectors) indicates the level of semantic similarity between the words represented by those vectors.
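The cosine-similarity measure mentioned above can be sketched with a few toy vectors. The example below is illustrative only: the 3-dimensional vectors and the words attached to them are made-up values, not trained word2vec embeddings, which typically have hundreds of dimensions.

```python
import math

def cosine_similarity(u, v):
    # cos(u, v) = (u . v) / (|u| * |v|)
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy "word vectors" (hypothetical values for illustration)
vectors = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

# Semantically similar words should score closer to 1
print(cosine_similarity(vectors["king"], vectors["queen"]))
print(cosine_similarity(vectors["king"], vectors["apple"]))
```

With real word2vec embeddings the same function is applied unchanged; only the vectors come from the trained model rather than being hand-written.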
