Rob Kleiman

Giving a Face to the Internet as a Living, All-Embracing Organism

Sentiment analysis of millions of tweets reveals how the Internet really feels

When you post your thoughts on the Web, you may be sharing personal information that can be stored and exploited, often without your knowledge. Increasingly, the way we communicate and express ourselves runs through digital and social channels and our networks. Social shares are just data, and that data can be listened to and interpreted by anyone. AMYGDALA is an art installation that attempts to hold a mirror up to this reality and to examine both the opportunities and dangers of living in the information age.

AMYGDALA listens to and interprets the content users share on the Internet to generate an elaborate audiovisual work. It uses sentiment analysis to read shared thoughts, interpret states of mind and translate social media data into a visual representation of the collective emotional state of the Internet. It can track shifts in mood and changes in public opinion in real time, as events occur around the world.

As news and users' thoughts spread across social networks, events with worldwide implications can immediately lead millions of people to share their own opinions and emotions: happiness, anger, sadness, disgust, amazement or fear.

AMYGDALA, then, imagines the Internet as a living organism.

The creators of the exhibit propose that this organism's emotional state can be read from the overall emotions users are sharing at any given time.

Where did it get the name? The amygdala is considered the part of the brain where emotions develop and memory resides; it compares stimuli experienced in the present with those of past experiences. The exhibit seeks to reproduce these mechanisms artificially. Its algorithm splits emotions into six types: happiness, sadness, fear, anger, disgust and amazement (as noted above). It then performs a text analysis on the contents of each individual tweet, at a rate of around 30 tweets per second.

When the analysis is complete, the machine selects the 'strongest emotion' in that tweet. The process then repeats, over and over.
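To make that step concrete, here is a minimal illustrative sketch in Python of this kind of per-tweet classification: score a message against six emotion categories and keep the strongest. It is not the installation's actual code or the Synesketch API; the keyword lexicon, weights and function names are hypothetical placeholders.

```python
# Illustrative sketch only: a toy classifier that scores a tweet against six
# emotion categories and keeps the strongest one. The lexicon and weights are
# invented placeholders, not the installation's real model.

EMOTIONS = ("happiness", "sadness", "fear", "anger", "disgust", "amazement")

# Tiny hand-made lexicon mapping words to (emotion, weight) pairs.
LEXICON = {
    "love": ("happiness", 0.9),
    "happy": ("happiness", 0.8),
    "cry": ("sadness", 0.7),
    "scared": ("fear", 0.8),
    "furious": ("anger", 0.9),
    "gross": ("disgust", 0.7),
    "wow": ("amazement", 0.6),
}

def strongest_emotion(tweet: str) -> tuple[str, float]:
    """Score a tweet against each emotion and return the strongest one."""
    scores = dict.fromkeys(EMOTIONS, 0.0)
    for word in tweet.lower().split():
        word = word.strip(".,!?#@")
        if word in LEXICON:
            emotion, weight = LEXICON[word]
            scores[emotion] += weight
    # Pick the emotion with the highest accumulated score.
    return max(scores.items(), key=lambda item: item[1])

if __name__ == "__main__":
    print(strongest_emotion("Wow, I love this, so happy right now!"))
    # -> ('happiness', 1.7)
```

A real system would replace the toy lexicon with a trained lexical model and handle negation, emoji and slang, but the shape of the loop is the same: one tweet in, one dominant emotion out.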

The project is massive and complicated: an internal cooling system keeps the temperature constant, and a flow of air is blown onto the front glass panel to prevent misting, allowing the columns to work 24 hours a day, 365 days a year. Each column is made up of 3,072 LEDs, for a total of 125,952 LEDs across the installation's 41 columns.

The project's sentiment analysis algorithm is built on the open-source library Synesketch, described in the paper "Synesketch: An Open Source Library for Sentence-Based Emotion Recognition." From the moment AMYGDALA is activated, and over the three months of the exhibition, millions of tweets will be listened to and interpreted, compiling an emotional archive of the net:

“Big Data may in fact be used to monitor the spread of an epidemic in real time, or to prevent a crime and improve the safety of a city; likewise, it may also be exploited by companies and institutions to store — often unknown to us — infinite quantities of information on our own private lives.”
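As a rough sketch of what such an emotional archive might look like in code, the snippet below groups a stream of per-tweet results into hourly emotion counts. It is an assumption-laden illustration: the data, time bucketing and function name are invented for demonstration and do not reflect how the installation actually stores its archive.

```python
# Illustrative sketch only: aggregating per-tweet (timestamp, emotion) results
# into a time-bucketed "emotional archive". All data here is invented.

from collections import Counter, defaultdict
from datetime import datetime

def build_archive(labelled_tweets):
    """Group (timestamp, emotion) pairs into per-hour emotion counts."""
    archive = defaultdict(Counter)
    for timestamp, emotion in labelled_tweets:
        bucket = timestamp.replace(minute=0, second=0, microsecond=0)
        archive[bucket][emotion] += 1
    return archive

if __name__ == "__main__":
    sample = [
        (datetime(2016, 2, 18, 14, 5), "happiness"),
        (datetime(2016, 2, 18, 14, 40), "anger"),
        (datetime(2016, 2, 18, 15, 10), "happiness"),
    ]
    for hour, counts in sorted(build_archive(sample).items()):
        print(hour, dict(counts))
```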

The designers of the project believe that awareness of these mechanisms at work in our society may help protect individual and collective free speech.



Originally published at www.psfk.com on February 18, 2016.