Abstract
We develop an asymptotic theory for the partition function of the word embedding model word2vec. The proof involves a study of the properties of matrices, their determinants, and the distributions of random normal vectors as their dimension tends to infinity. The conditions imposed are mild enough to cover practically important situations. The implication is that for any word i from a vocabulary W, the context vector ci is a reflection of the word vector wi in approximately half of the dimensions. This allows us to halve the number of trainable parameters in static word embedding models.
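The parameter-halving claim can be illustrated with a minimal sketch (not the paper's construction): if the context vector ci is a reflection of the word vector wi in roughly half of the dimensions, then ci can be derived from wi and a single shared sign mask instead of being stored as a second trainable vector. The dimension, mask layout, and variable names below are illustrative assumptions.

```python
import numpy as np

# Toy illustration of tying context vectors to word vectors via a
# sign-flip ("reflection") in half of the dimensions. This is an
# assumption-laden sketch, not the paper's implementation.

d = 8                                  # embedding dimension (toy size)
rng = np.random.default_rng(0)

# Hypothetical shared mask: reflect (negate) the first half of the axes.
mask = np.ones(d)
mask[: d // 2] = -1.0

w = rng.standard_normal(d)             # trainable word vector w_i
c = mask * w                           # context vector c_i, derived rather than stored

# With tying, each word needs one d-dimensional vector instead of two.
params_tied = w.size
params_untied = w.size + c.size
assert params_tied * 2 == params_untied
```

Because the mask is fixed and shared across the vocabulary, only the word vectors remain trainable, which is the source of the factor-of-two reduction stated in the abstract.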
Original language | English |
---|---|
Pages (from-to) | 70-81 |
Number of pages | 12 |
Journal | Eurasian Mathematical Journal |
Volume | 13 |
Issue number | 4 |
DOIs | |
Publication status | Published - 2022 |
Keywords
- asymptotic distribution
- neural networks
- partition function
- word embeddings
- word2vec
ASJC Scopus subject areas
- General Mathematics