Algebra with Words
Word embeddings are language modelling techniques that, through mathematical operations of counting and ordering, plot words into a multi-dimensional vector space. Once embedded, words transform from distinct symbols into mathematical objects that can be added, subtracted, multiplied or divided.
Distributed along the many diagonals of the multi-dimensional vector space, the words' new geometrical placements become impossible for humans to perceive directly. What is gained, however, are multiple, simultaneous ways of ordering. Algebraic operations make the relations between the vectors graspable again.
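As a minimal sketch of this algebra (the words and three-dimensional vectors below are invented for illustration; real embeddings typically have hundreds of dimensions):

    import numpy as np

    # Toy "embeddings", invented only to make the arithmetic visible.
    vectors = {
        "king":  np.array([0.8, 0.9, 0.1]),
        "queen": np.array([0.8, 0.1, 0.9]),
        "man":   np.array([0.2, 0.9, 0.1]),
        "woman": np.array([0.2, 0.1, 0.9]),
    }

    def cosine(a, b):
        # Cosine similarity: how closely two vectors point in the same direction.
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    # Algebra with words: king - man + woman lands near queen.
    target = vectors["king"] - vectors["man"] + vectors["woman"]

    # The nearest word to the resulting vector is the "answer" to the analogy.
    nearest = max(vectors, key=lambda w: cosine(vectors[w], target))
    print(nearest)  # queen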
This installation uses gensim, an open-source vector space and topic modelling toolkit implemented in Python, to manipulate text through the mathematical relationships that emerge between words once they have been plotted in a vector space.
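A sketch of the same operations in gensim, assuming a small publicly available pre-trained model as a stand-in (the model and corpus actually used in the installation are not specified here):

    import gensim.downloader as api

    # Load pre-trained vectors; this returns a gensim KeyedVectors object.
    model = api.load("glove-wiki-gigaword-50")

    # Additive analogy: which word is to "woman" as "king" is to "man"?
    print(model.most_similar(positive=["king", "woman"], negative=["man"], topn=3))

    # The embeddings are plain numpy arrays, so the arithmetic is direct:
    vec = model["king"] - model["man"] + model["woman"]
    print(model.similar_by_vector(vec, topn=3))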
Concept & interface: Cristina Cochior
Technique: word embeddings, word2vec
Original model: Radim Rehurek and Petr Sojka