CHAMELEON PROJECT, prototype 07: Integration of mind reading technology

Chameleon 07, Superhuman, RMIT Gallery, Melbourne, 2009

Chameleon 07, Dana Centre, Science Museum, London, 2009

Chameleon 07, Lighthouse, Brighton, 2009 (see blog)

Chameleon 07, Sharjah Art Gallery, Cairo, 2009

Chameleon 07, Residency and commission for evaluation of interaction (visual journey of residency)


CHAMELEON project, prototype 07 is an emotionally interactive video installation. This prototype integrates the audience into the interaction scenario: the audience's facial expressions drive the video portraits to empathize with them.

To show empathy is to identify with another's feelings: to emotionally put yourself in the place of another. The ability to empathize depends directly on your ability to feel your own feelings and identify them. In scientific studies, high-empathy people were found to show a higher degree of mimicking behavior than low-empathy people.

To build the empathic intelligence of prototype 07, the system needs to assess and understand the emotions of the audience. It does this via a discreet yet highly sensitive facial emotion analyzer that assesses multiple axes of emotional expression. The mind reading technology is being developed with Rana El Kaliouby, senior research fellow in the Affective Computing Group at the MIT Media Lab in Cambridge, USA. We spent much time developing the technology to work in darker lighting scenarios, to track people's facial expressions while they move, and to pick up the six key expressions of the Chameleon Project (disgust, happiness, anger, neutrality, sadness and surprise) from a distance. Much testing takes place to integrate the mind reading technology with the existing algorithmic code and video engine.
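The project's actual analyzer code is not published here, but a minimal sketch may help picture the kind of step involved in turning noisy per-frame readings into one of the six key expressions. Everything below is an assumption for illustration (the per-frame score format, the smoothing window and the confidence threshold are invented, not the MIT system's interface):

```python
from collections import deque

# The six key expressions tracked by the Chameleon Project.
EXPRESSIONS = ["disgust", "happiness", "anger", "neutrality", "sadness", "surprise"]

def dominant_expression(frame_scores, history, window=15, threshold=0.4):
    """Smooth noisy per-frame scores and return the dominant expression.

    frame_scores: a dict mapping each expression to a confidence in [0, 1],
    as a per-frame analyzer might emit. Averaging over a short window of
    recent frames reduces flicker from momentary misreads; 'neutrality' is
    the fallback when no expression is confident enough.
    """
    history.append(frame_scores)
    if len(history) > window:
        history.popleft()
    averaged = {e: sum(f[e] for f in history) / len(history) for e in EXPRESSIONS}
    best = max(averaged, key=averaged.get)
    return best if averaged[best] >= threshold else "neutrality"

# Example: two frames in which happiness builds on the viewer's face.
history = deque()
frames = [
    {"disgust": 0.05, "happiness": 0.3, "anger": 0.05,
     "neutrality": 0.5, "sadness": 0.05, "surprise": 0.05},
    {"disgust": 0.05, "happiness": 0.6, "anger": 0.05,
     "neutrality": 0.2, "sadness": 0.05, "surprise": 0.05},
]
for f in frames:
    print(dominant_expression(f, history))  # "neutrality", then "happiness"
```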

In an empty room, there is a video of a person looking bored. When an audience member walks into the room, the person in the video attempts to attract their attention by provoking them into an emotionally intense verbal dialogue. The person in the video talks to the participant in such a way that it implicates the audience member in their emotional narrative. The audience member's face is unobtrusively monitored and fed to the facial analysis technology developed by the MIT Media Lab. The participant's facial expression triggers the selection of emotional video sequences: the video engine is built on emotional algorithms informed by social neuroscience, and it matches the detected expression against these algorithms. The work attempts to build an empathic relationship with the audience member.
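One way to picture the selection step, given the mimicry finding cited above, is a video engine that mostly mirrors the detected expression and occasionally provokes with a contrasting one. The sketch below is a rough illustration under that assumption, not the project's actual emotional algorithms; the clip names and the mimic probability are invented:

```python
import random

# Hypothetical pool of video portrait clips, tagged by the emotion they perform.
CLIP_POOL = {
    "happiness": ["smile_01.mp4", "laugh_02.mp4"],
    "sadness": ["sigh_01.mp4", "tearful_02.mp4"],
    "anger": ["frown_01.mp4"],
    "disgust": ["recoil_01.mp4"],
    "surprise": ["startle_01.mp4"],
    "neutrality": ["bored_01.mp4", "idle_02.mp4"],
}

def next_clip(detected_expression, mimic_probability=0.8):
    """Choose the next video sequence in response to the audience.

    Empathic behaviour is modelled here as mimicry: most of the time the
    portrait mirrors the detected expression; occasionally it answers with
    a contrasting emotion to keep the dialogue emotionally intense.
    """
    if random.random() < mimic_probability:
        emotion = detected_expression
    else:
        emotion = random.choice([e for e in CLIP_POOL if e != detected_expression])
    return random.choice(CLIP_POOL[emotion])

print(next_clip("sadness"))  # most often a sadness clip, mirroring the viewer
```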

Chameleon, Prototype 07, Lighthouse, March 2009

The project is a collaboration with UK-based neuroscientists Prof. Hugo Critchley and Prof. Chris Frith and affective computing scientists Prof. Rosalind Picard and Dr Rana El Kaliouby at the MIT Media Lab, Cambridge, and is curated by Helen Sloan of SCAN. Gonsalves would like to acknowledge the in-kind support of the MIT Media Lab, Banff New Media Institute, SCAN and the Institute of Neurology at UCL. The project is funded by the Wellcome Trust Large Arts Award, the Australian Network for Art and Technology Synapse Residency, Arts Council England, the Australia Arts Council Inter-Arts Board and the Australia Arts Council Visual Arts Board.

Chameleon Prototype 07, Superhuman Exhibition, 5 Nov - 5 Dec 2009 :: RMIT Gallery.
Photographer Mark Ashkanasy - installation view.
Copyright RMIT Gallery.
Chameleon Prototype 07 mock-up with back projections onto rapid-prototyped body scans
Chameleon Prototype 07 mock-up with back-projected screens
Chameleon Prototype 07 mock-up with back-projected screens down a long hallway
Mind reading technology developed by the MIT Media Lab assesses facial emotion expression
Dana Centre, Science Museum, London, 2009