Word Embeddings & Vector Space Models - embed+this=wherever



 But isn’t midnight intermittent
Or was that just a whispered nine
A snap of blown light low against the flank of a cow
A likeness of something numberless that only I not knowing the sound
            might know
It may have been howled by a circling dog being chastised — threatened —
            by multiples of itself in pursuit of the consolation of knowing that
            everything is real
It was real
I don’t mean midnight — despite horizon, nipple, and fissure
I don’t mean
And yet I do — mean, I mean
 (Lyn Hejinian, [But isn’t midnight intermittent])

Language Game: embed+this=wherever

[Image: embed+this=wherever sentence constellation, a painting of a word constellation]

Early sketches towards the description: embed+this=wherever is a computationally mediated diagram-poem cycle confronting the geometrization of language by neural networks and other algorithmic models, in search of the gaps around algorithm-fabricated spaces. The quirks and failures of such mathematical spaces, calculated from our collective language use in Wikipedia and similar tracts of the web, now surround, act upon, and watch us through systems for translation, sentiment analysis, and network surveillance, and through the corporate characters for synthetic conversation. embed+this=wherever writes within these models hoping to reveal, interrupt, or alter, as well as use, them; there is nothing neutral about these geometries. The cycle consists of short texts successively modified using GloVe word embeddings and the METAFONT algebraic typography system. In the draft sketch, the words in a sentence are diagrammed or constellated by their distance in the model, as words and character shapes change down lines of computed closeness.
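One way to read that constellating step, as a minimal sketch in Python: given a downloaded GloVe vector file (glove.6B.100d.txt is a placeholder path) and a short sentence, chain each word to its nearest remaining neighbor in the model. The greedy ordering here is this illustration's assumption, not necessarily the piece's actual procedure.

import numpy as np

def load_glove(path):
    # Parse GloVe's plain-text format: a word, then its coordinates, per line.
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            word, *coords = line.rstrip().split(" ")
            vectors[word] = np.array(coords, dtype=np.float32)
    return vectors

def distance(u, v):
    # Euclidean distance between two word vectors.
    return float(np.linalg.norm(u - v))

# Path and sentence are placeholders; any GloVe release and text would do.
vectors = load_glove("glove.6B.100d.txt")
words = "but midnight is intermittent".split()

# Greedily chain each word to its nearest unvisited neighbor,
# one way of drawing a line of computed closeness through a sentence.
chain, remaining = [words[0]], set(words[1:])
while remaining:
    here = vectors[chain[-1]]
    nearest = min(remaining, key=lambda w: distance(vectors[w], here))
    chain.append(nearest)
    remaining.remove(nearest)
print(" -> ".join(chain))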

Theory: Word Embeddings & Vector Space Models

Vector-space models encode meanings (e.g. of words) as vectors (lists of numbers). On its own, that does not do very much, but vectors come with notions of linearity (lying along a line, adding and scaling) and of distance that can be made useful. Conventionally, similarity is associated with distance: words whose vectors sit close together are treated as close in meaning. This involves turning the linear (one-dimensional) relationships of a text into high-dimensional geometric information (a word becomes associated with a vector with hundreds of dimensions).
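A minimal sketch of that convention, with made-up three-dimensional vectors (real models use hundreds of coordinates; these numbers are purely illustrative): similarity can be read off as the cosine of the angle between two vectors, or as the Euclidean distance between them.

import numpy as np

# Toy 3-d "embeddings"; real models assign hundreds of coordinates per word.
midnight = np.array([0.9, 0.1, 0.3])
nine     = np.array([0.8, 0.2, 0.4])
cow      = np.array([0.1, 0.9, 0.2])

def cosine_similarity(u, v):
    # 1.0 means pointing the same way; near 0 means unrelated directions.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(midnight, nine))  # high: the vectors nearly align
print(cosine_similarity(midnight, cow))   # lower: a wider angle between them
print(np.linalg.norm(midnight - nine))    # Euclidean distance, another closeness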

This is a class of models rather than a single theory. Older examples simply constructed term-document matrices, associating with each word a vector that records how often it shows up in each of a collection of documents. Modern vectors are instead produced by artificial neural networks trained to predict a word from its context, or, as with GloVe, by fitting vectors to word co-occurrence statistics.
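A minimal sketch of the older construction, with three invented miniature "documents": each word's vector is simply its row of per-document counts, so words that frequent the same documents end up with similar vectors.

from collections import Counter

# Three toy documents; the vocabulary and counts below are all the model is.
documents = [
    "midnight dog midnight",
    "dog cow dog",
    "midnight nine",
]

# One Counter of word frequencies per document.
counts = [Counter(doc.split()) for doc in documents]
vocabulary = sorted(set(word for c in counts for word in c))

# Term-document matrix: each word's vector says how often it
# appears in each document (columns = documents).
for word in vocabulary:
    vector = [c[word] for c in counts]
    print(f"{word:10} {vector}")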

Titled, "Untitled" - Kavi Duvvoori - The Committee Made in Charge of Such Matters - Please Reuse or Distribute Further Only With a Measure of Generosity, Care, and Sense