This installation makes poetry using MIT’s ConceptNet


Definitions is a meditation on digital ontologies and commodification.


The brainchild of Parsons School of Design student Bryan Ma, Definitions is a poetic computational installation composed of 15 small, networked LCDs that serves as a metaphor for the use of natural language processing (NLP) in commoditizing human activity on the web.


One at a time, the LCDs display seemingly random English words or phrases in order from left to right. After all of the screens have shown a word, the system restarts and reveals a new sequence. As Ma explains, the project uses computers’ ability to comprehend semantic meaning via “common-sense networks” to metaphorically represent the socio-cultural effects of natural language processing.

Each word or phrase on an LCD is connected by some semantic relationship to those adjacent to it. For example: “computer” -> “keyboard” -> “music,” or “cat” -> “water” -> “plant.” Computers have keyboards, keyboards play music; cats don’t like water, and water is required by plants. These terms are sourced in real time by an algorithm searching through MIT’s massive ConceptNet semantic network, which contains the kinds of common-sense facts computers need in order to understand text written by people. As a result, the possible combinations are practically infinite, and occasionally highly amusing or unexpected.
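For a rough sense of how such a chain could be assembled, here is a minimal Python sketch that walks outward from a starting concept one relation at a time using the public ConceptNet API. It is only an illustration: the installation itself was built in Processing, its pathways are constrained to start at “person” and end at “money,” and the endpoint and JSON field names below reflect the current public API rather than Ma’s actual code.

```python
# Illustrative sketch only -- not the installation's code. Endpoint and JSON
# field names are assumptions based on the public ConceptNet 5 API.
import random
import requests

API_URL = "http://api.conceptnet.io/c/en/{}"

def neighbors(concept):
    """Return English terms linked to `concept` by any ConceptNet relation."""
    data = requests.get(API_URL.format(concept), params={"limit": 50}).json()
    label = concept.replace("_", " ")
    terms = set()
    for edge in data.get("edges", []):
        # Each edge connects a start node and an end node; keep whichever
        # side is not the concept we queried.
        for node in (edge["start"], edge["end"]):
            if node.get("language") == "en" and node["label"].lower() != label:
                terms.add(node["label"].lower())
    return sorted(terms)

def random_walk(start, steps):
    """Step outward from `start`, one semantic relation at a time."""
    chain = [start]
    for _ in range(steps):
        options = neighbors(chain[-1].replace(" ", "_"))
        if not options:
            break
        chain.append(random.choice(options))
    return chain

print(" -> ".join(random_walk("computer", 4)))
# e.g. computer -> keyboard -> music -> ...
```

A simple random walk like this wanders freely; pinning both endpoints of a 15-word pathway, as the installation does, is a much harder search problem.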

“With a degree of scrutiny it becomes apparent that the first and last LCDs always have the same two words: ‘person’ on the first, and ‘money’ on the last. The semantic pathway between them, though innumerably varied, begins and ends identically each time,” the Maker notes.


Definitions was created with the help of Arduino, Processing and the ConceptNet API. Since searching through 15 levels of connections is not computationally trivial, it took some tricks to keep a diverse set of results coming through; the resulting chains were then archived and sent over serial to the LCDs, all driven by a single Arduino Mega (ATmega2560).
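To picture the hand-off, here is a hypothetical host-side sketch in Python using pyserial. In the actual piece a Processing program feeds the Arduino Mega; the port name and the simple “index:word” framing below are assumptions made for illustration, not the installation’s real protocol.

```python
# Hypothetical host-side sketch: the installation used Processing to feed the
# Arduino Mega. Port name and "index:word" framing are assumptions.
import time
import serial  # pyserial

NUM_LCDS = 15

def send_chain(chain, port="/dev/ttyACM0", baud=9600):
    """Send one word per LCD, in order, over a single serial link."""
    with serial.Serial(port, baud, timeout=1) as link:
        time.sleep(2)  # give the Arduino time to reset after the port opens
        for index, word in enumerate(chain[:NUM_LCDS]):
            link.write(f"{index}:{word}\n".encode("ascii", "ignore"))
            time.sleep(0.5)  # reveal the words one at a time, left to right

send_chain(["person", "hand", "keyboard", "music", "money"])
```

On the Arduino side, a single Mega has enough pins and serial buffer headroom to parse each framed word and route it to the matching LCD.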

“Many tools have been developed to give computers better ways to understand people – via analysis of facial expressions, written language, speech, and internet activity – allowing for the prediction of intent and future action,” Ma shares. “How do the resulting digital ontologies and software representation express, mutate, or influence qualities of human experience? Is there a tension between the hypothetical outcomes of these tools and their practical, quotidian applications?”

Intrigued? Watch the installation in action below, and then head over to Definitions’ project page here.
