Fig. 11 | Applied Network Science

From: Characterizing financial markets from the event driven perspective

Graphical representation of word2vec. Note: the flowchart is reproduced from the original paper of Mikolov et al. (2013). It depicts the mechanism underpinning the two sub-algorithms of word2vec, CBOW and Skip-gram. In CBOW, the network predicts \(w_n\), the word at position n, from its predecessors \(w_{n-2}\) and \(w_{n-1}\) as well as its successors \(w_{n+1}\) and \(w_{n+2}\). Skip-gram reverses the process in CBOW: with \(w_n\) given, the neural network predicts the surrounding words \((w_{n-2},w_{n-1},w_{n+1},w_{n+2})\).
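The difference between the two sub-algorithms can be made concrete by looking at the training pairs each one generates. The sketch below (not from the paper; function names are illustrative) builds, for a window of size 2 as in the figure, (context, target) pairs for CBOW and the reversed (target, context) pairs for Skip-gram:

```python
# Illustrative sketch of word2vec training-pair construction (window = 2).
# CBOW: the context (w_{n-2}, w_{n-1}, w_{n+1}, w_{n+2}) predicts w_n.
def cbow_pairs(tokens, window=2):
    pairs = []
    for n, target in enumerate(tokens):
        lo, hi = max(0, n - window), min(len(tokens), n + window + 1)
        context = [tokens[i] for i in range(lo, hi) if i != n]
        pairs.append((context, target))
    return pairs

# Skip-gram reverses CBOW: the centre word w_n predicts each context word.
def skipgram_pairs(tokens, window=2):
    return [(target, c) for context, target in cbow_pairs(tokens, window)
            for c in context]
```

For the sentence fragment `["w1", "w2", "w3", "w4", "w5"]`, CBOW produces the pair `(["w1", "w2", "w4", "w5"], "w3")` for the centre position, while Skip-gram turns that into four pairs `("w3", "w1")`, `("w3", "w2")`, `("w3", "w4")`, `("w3", "w5")`. The actual word2vec model then trains a shallow neural network on such pairs.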
