CHAMELEON PROJECT, prototype 03: Mimicking emotional contagion

Shown at the Banff Centre for the Arts, Canada, March 2008.

ICA, London, April 2008

Lighthouse, Brighton, March 2009

Download video example (QuickTime, 5.8 MB).

Chameleon Prototype 03 is a two-screen video installation exploring the intimate communication between two people. The screens display the profiles of a man and a woman staring at each other. The two video portraits interact via an embedded model of emotional intelligence.

The audience is privy to watch and listen to a couple caught in a never-ending, uncontrollable cycle of arousing, disturbing and confusing emotional dramas and habitual responses. They sporadically attempt to resolve their clashes, arriving at a fragile calm. So how do we catch our partner’s moods? The brain’s “mirror neurons” are to blame, says Prof John T. Cacioppo, director of the Center for Cognitive and Social Neuroscience at the University of Chicago and co-author of “Emotional Contagion”. If we see someone upset, neurons in our brains fire in response to other people’s actions and intentions, as if it were happening to us. The more we care about the person, the more strongly these neurons fire. So if you see that your partner is anxious, hurting or depressed, you literally feel their pain.


The emotional intelligence of Prototype 02 was built with the neuroscientist Chris Frith. Frith put forward a hypothesis of how we read and respond to one another’s emotions: for example, if A is angry, B’s response is most likely to be sad. But if A does not recognise B’s sadness, how would B feel? We modelled these mappings of interactions into code, producing an ‘emotional algorithm’ that informed Prototype 03’s video engine. The algorithms were then evaluated in the lab.
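As an illustration, a mapping of this kind could be encoded as a weighted lookup table. The sketch below is a minimal, assumption-laden Python example: the emotion names, weights, escalation state and the respond helper are all hypothetical, not the project’s actual parameters.

```python
import random

# Hypothetical response mapping: observed emotion -> weighted candidate
# responses. Names and weights are illustrative placeholders only.
RESPONSE_MAP = {
    "angry":   [("sad", 0.6), ("fearful", 0.3), ("neutral", 0.1)],
    "sad":     [("compassionate", 0.7), ("sad", 0.2), ("neutral", 0.1)],
    "happy":   [("happy", 0.8), ("neutral", 0.2)],
    "fearful": [("fearful", 0.5), ("compassionate", 0.3), ("neutral", 0.2)],
}

def respond(observed, recognised=True):
    """Pick B's likely response to A's observed emotion.

    If the other's state goes unrecognised (recognised=False), this
    sketch escalates rather than resolves, e.g. unacknowledged sadness
    deepens into withdrawal.
    """
    if not recognised:
        return "withdrawn"  # illustrative escalation state
    candidates = RESPONSE_MAP.get(observed, [("neutral", 1.0)])
    emotions, weights = zip(*candidates)
    return random.choices(emotions, weights=weights)[0]

# Per the hypothesis: if A is angry, B's response is most likely sad.
print(respond("angry"))
```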

The video portraits were created using actors and artists speaking in the present tense about emotional states. They were shot at the Banff Centre during the Liminal Screen Residency. We also verified in the lab the videos’ effectiveness at creating emotional contagion in the audience.

Prototype 03 was built to work in various environments and sizes (see pictures). The piece emits sound, and requires two screens, two speakers and one computer.

The Emotional Algorithms

Chris Frith wrote about the emotional algorithms in email correspondence in June 2008:

“We know that people tend to covertly (and unconsciously) mirror the actions of others and this also applies to facial expressions. Observing a happy expression elicits happiness, fear elicits fear and disgust elicits disgust. (I can give references for all these claims if you need them.)

However, this mirroring is not simply a copying of the motor behaviour we observe. For example, we mirror the eye gaze of others, but what we copy is not the action but the goal. That is, we look at the same place as the person we are observing. We want to know what they are looking at. This will usually involve a very different eye movement, since we will have a different line of sight.

We therefore need to consider the function of the behaviour of the observer. Seeing a fearful face is a sign that there is something to be afraid of, so our fearful response is appropriate. Unless, of course, the person is afraid of us, in which case a different response would be appropriate. An example of the function of these exchanges of expression is the case of embarrassment defusing anger.

A person commits a social faux pas. This elicits an expression of surprise and then anger. The person then displays embarrassment. This elicits compassion (for the distress of the person) in the observer. This expression of compassion indicates that the person is forgiven and everyone is happy again.

I used these ideas to make a best guess about the parameters for the emotional algorithms.”
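Frith’s faux pas example reads as a short scripted exchange. A purely illustrative trace of it, assuming alternating turns between the two parties, might look like this:

```python
# Frith's faux pas exchange as a sequence of (actor, expression) turns.
# Purely illustrative; not the installation's actual sequencing code.
exchange = [
    ("A", "faux pas"),       # A commits a social faux pas
    ("B", "surprise"),       # B registers it
    ("B", "anger"),          # ...and reacts
    ("A", "embarrassment"),  # A displays embarrassment
    ("B", "compassion"),     # embarrassment defuses B's anger
    ("A", "happy"),          # A is forgiven
    ("B", "happy"),          # everyone is happy again
]

for actor, expression in exchange:
    print(f"{actor}: {expression}")
```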

Frith's hypothesis of emotional exchanges was then evaluated in the lab by Matt Iacibini, supervised by Nadia Berthouze of UCL's Interaction Centre. Iacibini verified the effectiveness of the stimulus video material to be used in the human–machine emotional interaction, and tested the current prototype of the emotional algorithm that controls the machine side of the interaction's dynamic. The experimental prototype uses a technique from HCI called Wizard of Oz (Maulsby 1993), in which a person substitutes for the parts of the system that are not yet ready. A human rater watches the live video of the observers and classifies their emotional expressions.
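In a Wizard-of-Oz setup of this kind, keyboard input from the human rater stands in for the not-yet-built emotion classifier. The loop below is an assumed shape for such a stand-in, not the study’s actual code; the key-to-label scheme is hypothetical.

```python
# Wizard-of-Oz stand-in for an automatic emotion classifier:
# a human rater watches the live video and keys in what they see.
LABELS = {"h": "happy", "s": "sad", "a": "angry", "f": "fearful", "n": "neutral"}

def rate_observer():
    """Ask the human 'wizard' to classify the observer's expression."""
    while True:
        key = input("Observer's expression [h/s/a/f/n]: ").strip().lower()
        if key in LABELS:
            return LABELS[key]
        print("Unrecognised key, try again.")

if __name__ == "__main__":
    observed = rate_observer()
    print(f"Rated expression: {observed}")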

The algorithm was then translated into a learning algorithm that informs the video engine, which plays appropriate video portraits to build empathy across a social group.
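One way to picture that video engine step: the rated audience emotion is mapped to a response emotion, and a portrait clip performing that response is selected. The clip filenames, the mapping and the next_clip helper below are hypothetical illustrations, not the project’s assets.

```python
import random

# Hypothetical pool of video portraits, indexed by the emotion they perform.
CLIPS = {
    "sad":           ["sad_01.mov", "sad_02.mov"],
    "compassionate": ["comp_01.mov"],
    "happy":         ["happy_01.mov", "happy_02.mov"],
    "neutral":       ["neutral_01.mov"],
}

def next_clip(audience_emotion, response_map):
    """Map the rated audience emotion to a response, then pick a clip."""
    response = response_map.get(audience_emotion, "neutral")
    return random.choice(CLIPS.get(response, CLIPS["neutral"]))

# e.g. a sad audience is answered with a compassionate portrait
print(next_clip("sad", {"sad": "compassionate"}))
```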

The project is a collaboration with UK-based neuroscientists Prof Hugo Critchley and Prof Chris Frith, and with affective computing scientists Prof Rosalind Picard and Dr Rana el Kaliouby of the MIT Media Lab, Cambridge; it is curated by Helen Sloan of SCAN. Gonsalves would like to acknowledge in-kind support from the MIT Media Lab, the Banff New Media Institute, SCAN and the Institute of Neurology at UCL. The project is funded by a Wellcome Trust Large Arts Award, an Australian Network for Art and Technology Synapse Residency, Arts Council England, and the Australia Council's Inter-Arts and Visual Arts boards.
Example of Prototype 03 using small LCD screens, framed and hinged.
Lighthouse, Brighton, March 2009.
Example of Prototype 03 using large projections.