Reverse Algebra

Type: Algoliterary exploration
Technique: Word embeddings
Developed by: Radim Rehurek and Petr Sojka & Algolit

Word embeddings are language modelling techniques that, through multiple mathematical operations of counting and ordering, plot words into a multi-dimensional vector space. Once embedded, words are transformed from distinct symbols into mathematical objects that can be operated on.
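A minimal sketch of this transformation, assuming gensim 4.x (where the dimension parameter is called vector_size) and a small made-up corpus:

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
sentences = [
    ["the", "queen", "rules", "the", "country"],
    ["the", "king", "rules", "the", "country"],
    ["a", "woman", "reads", "a", "book"],
    ["a", "man", "reads", "a", "book"],
]

# Train a small embedding model: every word is plotted as a point
# in a 10-dimensional vector space.
model = Word2Vec(sentences, vector_size=10, min_count=1, seed=1)

# The word "queen" is no longer a distinct symbol but an array of numbers.
print(model.wv["queen"])
```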

Algebra is generally defined as a generalisation of arithmetic in which letters representing numbers are combined according to the rules of arithmetic. In natural language processing, the opposite happens: words are represented by numbers, which are generated by the circumstances of the text and of the algorithm that processes them.

This exploration uses gensim, an open source vector space and topic modelling toolkit implemented in Python, to manipulate text according to the mathematical relationships that emerge between words once they have been plotted in a vector space.
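A possible sketch of such a manipulation, assuming gensim's downloader module and the pre-trained glove-wiki-gigaword-50 vectors (fetched on first use), rather than the exploration's own corpus: subtracting the vector for "man" from "king" and adding "woman" should land near "queen".

```python
import gensim.downloader as api

# Load pre-trained word vectors (downloaded the first time this runs).
vectors = api.load("glove-wiki-gigaword-50")

# Arithmetic on words: king - man + woman ~ queen
result = vectors.most_similar(positive=["king", "woman"],
                              negative=["man"], topn=3)
for word, similarity in result:
    print(word, round(similarity, 3))
```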