Creating a live emotional contagion tool.

Dana Center Event, Science Museum London, February 2008.

Chameleon Project, Prototype 02 is a performance tool that slows down a display of live social interaction, decoding the unspoken, subtle elements of mimicry that come into play in social situations. It allows the audience and participants to view the micro-expressions and fleeting moments of cohesion and incongruence that we often miss consciously.

In Prototype 02, the team aimed to extend what was learnt in Prototype 01. Rather than observing emotional contagion through watching a film (which uses post-production and aesthetic effects to capture how we transfer emotions between social groups), Prototype 02 attempts to capture emotional contagion while it is happening. The team created a system to visually capture the communication (focusing on the face) and then reduce its speed to five frames a second, to isolate the interaction. This was an important tool for three reasons: 1) to further understand which emotional moments are the most potent in creating contagion; 2) to make participants consciously aware of the constant transference of emotions between each other; and 3) to allow stooges to be used to promote emotional contagion and potentially challenge emotional contagion theories. For example, many agree that imitation is key to language learning, empathy and many other cultural advances that separate us from the apes. Science has revealed that timing is important: if people consciously mimic each other, responding like a mirror, people easily become aware of it, and it arouses suspicion. If the mimicry is slightly out of sync, the mimicking goes unnoticed and social bonds are built. With Prototype 02, we could test these timings and effects.

Prototype 02 was built for an experiment at the Dana Center Event at the Science Museum in London. People were separated into pairs: three of the couples knew each other, three couples did not know each other, and three couples were paired with stooges. Each couple sat down at a table, and a web camera monitored each face, feeding live video to the computer. Software was built that would slow down the live footage in real time and display it on screens surrounding the participants, reducing the imagery to ten per cent of its original speed. The neuroscientists and the audience attempted to analyze the relationship of each couple.
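The slowed-playback system described above can be thought of as a frame buffer that fills at the camera's capture rate and drains at a tenth of it, so the on-screen image lags further and further behind the live conversation. The original software is not documented, so the following is only a minimal sketch under assumed parameters (25 fps capture, 10% playback speed); the class and method names are hypothetical:

```python
from collections import deque


class SlowMirror:
    """Hypothetical sketch of slowed live playback: frames arrive at
    capture_fps and are shown at capture_fps * speed, so the buffer
    (and the delay between a gesture and its on-screen echo) grows
    steadily over the session."""

    def __init__(self, capture_fps=25, speed=0.10):
        # Seconds between incoming camera frames, e.g. 0.04 s at 25 fps.
        self.capture_interval = 1.0 / capture_fps
        # Seconds each frame stays on screen, e.g. 0.4 s at 10% speed.
        self.display_interval = self.capture_interval / speed
        self.buffer = deque()

    def on_capture(self, frame):
        """Called once per camera frame: queue it for later display."""
        self.buffer.append(frame)

    def next_display_frame(self):
        """Called once per display_interval: show the oldest frame."""
        return self.buffer.popleft() if self.buffer else None
```

At these assumed settings, only 2.5 of every 25 captured frames reach the screen per second, which is what lets viewers dwell on micro-expressions that pass too quickly at normal speed.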

The project is a collaboration with UK-based neuroscientists Prof Hugo Critchley and Prof Chris Frith, and affective computing scientists Prof Rosalind Picard and Dr Rana el Kaliouby at the MIT Media Lab, Cambridge, and is curated by Helen Sloan of SCAN. Gonsalves would like to acknowledge the in-kind support of the MIT Media Lab, the Banff New Media Institute, SCAN and the Institute of Neurology at UCL. The project is funded by the Wellcome Trust Large Arts Award, the Australian Network for Art and Technology Synapse Residency, Arts Council England, the Australia Arts Council Inter-Arts Board and the Australia Arts Council Visual Arts Board.
At the Dana Center, we sent two participants at a time into a private room to chat; most participants did not know each other. Two web cameras were positioned, one on each participant's face, and a live video capture of the communication was displayed in the main bar area. The captured footage was slowed down to ten per cent so that all micro-expressions could be monitored: all of the things we may catch subconsciously when we interrelate.