Tina Gonsalves, The "Feel" series: An Overview:

The “Feel” series (2005–2007) is an interconnected progression of short films and interactive sketches that aim to sense, translate and provoke the psycho-physiology of the audience. Darren Tofts writes “(with the “Feel” Series), Gonsalves’ artistic sensibility absorbs scientific hypothesis and technological possibility into an interface, a psycho-somatic stage, at once theatre of cruelty, emotional catharsis and critical insight”.[i] The series forms the initial investigations of artist Tina Gonsalves and affective neuroscientist Dr. Hugo Critchley. The collaboration extends research into the naturalistic embodiment of emotion, exploring the ways art, science and technology can converge to become agents that allow us a more intimate relationship with our own bodies: more embodied interaction, tools that cross over between art and wellness, tools that interplay between the external and internal.


Each of the works used varying collaborative methods to create and strengthen empathic interaction techniques and emotionally provocative audiovisual content. Each prototype was built in a chronological progression, exploiting the achievements of the last prototype and answering its setbacks.

TINA GONSALVES, FEEL:PERSPIRE: responsive biofeedback installation, 2007


Art & Direction: Tina Gonsalves
Neuroscience: Dr Hugo Critchley
Computer Science: Tina Gonsalves, David Muth

Synopsis of “Feel Perspire”: Feel:Perspire is a psycho-physiologically responsive video installation, using sweat to trigger footage. The participant’s sweat is monitored using a bio-sensor attached to their finger. When the participant becomes calm, images of clouds verge into abstraction, becoming blurry and reminiscent of Rothko’s paintings. If the participant becomes more nervous, the clouds grow stormier and more violent, enveloping the participant in their fury.

Building of “Feel Perspire”: From the feedback gathered, I came to the conclusion that “Feel Insula” achieved the naturalistic interaction and engaging content I was searching for. However, I felt the interaction mode of reading movement was too generic and didn’t allow for the more ‘personal’ sensing mode I initially envisaged. This led to “Feel Perspire”, a psycho-physiologically responsive video installation, using galvanic skin response (sweat) to trigger footage.

We used Galvanic Skin Response (GSR) as the sensing mode to trigger video sequences. GSR provides a continuous and immediate response, giving participants a sense of control and creating a biofeedback loop. Biofeedback is a technique in which people are trained to improve their health by learning to control certain internal bodily processes. While using the technology, subconscious experiences of heart rate, breathing and nervous system activity shift to a level of cognitive awareness. Therefore, through reflection, participants slowly learn to identify, sense and eventually coordinate the physiological behaviours being monitored.
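The biofeedback loop just described can be sketched as a simple control cycle. This is a hypothetical Python illustration, not the installation’s actual Max/MSP patch: the threshold value and the two-state classification are assumptions made for the sketch.

```python
# Minimal sketch of a GSR biofeedback loop (illustrative only).
# The real installation read the sensor through a Max/MSP patch.

def classify_arousal(conductance, threshold=5.0):
    """Collapse a skin-conductance reading (hypothetical units) into
    the two states the feedback could distinguish."""
    return "aroused" if conductance > threshold else "calm"

def biofeedback_loop(samples, threshold=5.0):
    """Feed each reading back as a visible state, mirroring how a
    participant gradually learns to recognise their own physiology."""
    return [classify_arousal(s, threshold) for s in samples]

# A participant who slowly relaxes while watching the feedback:
states = biofeedback_loop([7.2, 6.1, 5.4, 4.8, 4.2])
print(states)  # ['aroused', 'aroused', 'aroused', 'calm', 'calm']
```

Seeing the classified state reflected back on screen is what closes the loop: the participant notices the display change, connects it to a bodily sensation, and can begin to steer it.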

I developed and tested a few video databases. The initial video footage for the project was taken from a helicopter. I shot out the window of the moving craft. While shooting, I focused on a horizon. Originally, I wanted to use this in a simple narrative. For example, if the participant’s GSR reading was calm, the footage continues to fly out to a horizon. If the GSR reading rose, the footage would crash into the sea. After testing, this scenario was deemed too potent for the viewer, often enticing them to become more nervous than relaxed.

I finally arrived at time-lapsed cloud footage. If the participant relaxed, the footage would blur and become ‘Rothko-esque’. If the participant became stressed, storms would roll in, enveloping the participant in their fury.

At the core of Feel_Perspire was a Max/MSP patch that controlled the computation for the system. We ran the patch on a G5 Apple Macintosh computer under Mac OS X. The output of the GSR was read into the patch through the analog audio input. This signal was used to control the experience of the participants, directly triggering the video narrative. The programming was initially created by Max/MSP expert David Muth. It was difficult to attain smoothness of video with the constant triggering from the GSR data. The video playback could not be made smooth with this programming approach, leading to a conundrum of how to take the project further.
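One common remedy for jitter of this kind is to low-pass filter the raw sensor stream before it drives the video, so the footage responds to the trend rather than to every fluctuation. This hedged Python sketch shows an exponential moving average; it is an illustration of the general technique, not the code used in the patch.

```python
# Illustrative smoothing of a jittery sensor stream (not the
# original Max/MSP implementation).

def smooth(samples, alpha=0.2):
    """Exponential moving average: each output mixes a little of the
    newest reading into a running estimate, damping sensor jitter.
    `alpha` (an assumed value) sets how quickly the estimate reacts."""
    out, estimate = [], samples[0]
    for s in samples:
        estimate = alpha * s + (1 - alpha) * estimate
        out.append(round(estimate, 3))
    return out

# A raw stream that bounces between 5.0 and 9.0 drifts gently upward
# instead of flickering, which is what smooth video triggering needs:
print(smooth([5.0, 9.0, 5.0, 9.0, 5.0]))
```

A larger `alpha` makes the video react faster but re-introduces flicker; a smaller one gives smoother footage at the cost of a perceptible lag between the body and the screen.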

After three months’ reflection, I began working with programmer Evan Raskob, a Max/MSP expert based in London. I altered the concept, working with some stock cloud footage, real-time effects and multiple video channels. I wanted to investigate the use of hue and saturation as an expressive modality, so it was essential that these effects could be achieved with fluid live effects. Itten, in his book The Art of Colour, expresses how the mixing of pure colour with either white, black or grey to form gradients gives ‘expressive’ power. From the perceptual literature, Valdez and Mehrabian, in their paper Effects of Color on Emotions, found that a significant amount of the variance in subjects’ emotional responses to colour came not from the colour’s hue, but from the brightness and saturation levels of the colour.[i]

Enabling real-time speed, colour and scale effects was an important step in the research. This created a very smooth and responsive video narrative when triggered by GSR.
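The mapping from arousal to those real-time parameters can be illustrated with a small sketch. All the ranges below are invented for the illustration (the installation’s actual values are not documented here); the design follows Valdez and Mehrabian’s finding that brightness and saturation, more than hue, carry the emotional charge.

```python
# Hypothetical mapping from a normalised arousal level to the
# real-time effect parameters; ranges are assumptions, not the
# values used in Feel:Perspire.

def arousal_to_effects(arousal):
    """Map arousal (0.0 calm .. 1.0 stressed) to playback speed,
    saturation and brightness. Calm drifts toward a slow, pale,
    'Rothko-esque' blur; stress drives a fast, vivid, dark storm."""
    arousal = min(max(arousal, 0.0), 1.0)   # clamp to [0, 1]
    return {
        "speed": 0.5 + 1.5 * arousal,       # 0.5x .. 2.0x playback
        "saturation": 0.2 + 0.8 * arousal,  # washed out .. vivid
        "brightness": 1.0 - 0.6 * arousal,  # bright haze .. stormy dark
    }

print(arousal_to_effects(0.0))  # calm: slow, pale, bright
print(arousal_to_effects(1.0))  # stressed: fast, vivid, dark
```

Because every parameter is a continuous function of arousal rather than a hard switch, the footage glides between states instead of cutting, which is what made the biofeedback feel fluid.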

Conclusion of “Feel Perspire”: We tested the work three times. The initial consumer GSR monitor we implemented in “Feel Perspire” proved troublesome, often dropping out and not giving sufficiently sensitive readings, creating frustration in the participant more than anything else. The replacement GSR reader (usually used for experiments in laboratory settings) was very large, old and bulky, though extremely sensitive to subtle changes in sweat.

The sensor was attached to the user’s fingers, and dressing the sensor was quite a delicate process. This also hindered the natural interaction scenario, as the user had to remain still and sit in an assigned chair. Furthermore, the GSR monitor needed to be tuned manually 30 seconds after the participant was hooked up, making it difficult to work with in public exhibition contexts. As with “Feel_Trace”, the act of attaching the sensor catalysed an arousal in the body. Adjusting the interface on the monitor was difficult and time-consuming. Although the readings proved sensitive, the machine only took a reading every three seconds. Another limitation of GSR is that it primarily monitors arousal and does not differentiate emotions. Movement of the body and the variability of GSR data sets across multiple participants caused much difficulty. To tackle this, the GSR monitor needed to be calibrated for each participant.
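The per-participant calibration can be sketched as a simple baseline normalisation: record a short rest period, then express every later reading relative to that participant’s own range. This is an illustrative assumption about how such calibration can work, not the lab monitor’s actual procedure.

```python
# Illustrative per-participant calibration (not the actual
# procedure of the laboratory GSR monitor).

def calibrate(baseline_samples):
    """Derive a personal range from a short rest recording, so later
    readings become comparable across different participants."""
    lo, hi = min(baseline_samples), max(baseline_samples)
    span = (hi - lo) or 1.0  # guard against a perfectly flat baseline
    def normalise(reading):
        """Map a raw reading into 0..1 relative to this participant,
        clipping values outside the observed rest range."""
        return min(max((reading - lo) / span, 0.0), 1.0)
    return normalise

norm = calibrate([4.0, 4.5, 5.0])  # simulated rest recording
print(norm(4.5))  # mid-range for this participant -> 0.5
print(norm(6.0))  # above their observed range -> clipped to 1.0
```

A participant with naturally sweaty hands and one with dry hands then drive the same 0–1 scale, which is exactly the cross-participant variability problem the calibration step was meant to solve.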

The simple imagery, using real-time effects that responded to the GSR reading, provided a fluid biofeedback interaction scenario. Using GSR as a sensor seemed to result in engagement: “…I wanted to do it for a long time. Feeding back physiological signal on a big screen is really great – one feels good involvement in biofeedback process. This is achievable with a small computer screen but the difference may be like watching movie on the TV or in the cinema”. Initial observations demonstrated that participants felt the video work was analogous to their psychophysiological state, giving them a sense of control. Past research has shown that when the participant has a modality of control in the environment, they experience a greater sense of presence.[ii] “The most unsettling part of the artwork is that after some time, I started to feel like I could control the content of the feedback, while remaining unable to explain how. Falling into the skies accompanied by stormy winds or quietly floating with resting noise, the artwork transported me throughout neatly intermingled settings related to my feelings”.

The interaction design assumed that when the GSR level rose, the participant was ‘stressed’, triggering video of storm clouds and loud noise to fill the exhibition space. When discussing the project with Picard, she stated that a higher GSR reading could mean you were stressed or happy, and I had not created a narrative that catered for happiness. The limited sensing modality didn’t allow us to differentiate the data to denote an emotional feeling, only ‘aroused’ or ‘calm’.

[i] Valdez, P. & Mehrabian, A. (1994). Effects of Color on Emotions. Journal of Experimental Psychology: General. 123, 394-409.

[ii] Sadowski Jr., W., & Stanney, K. Measuring and Managing Presence in Virtual Environments. In Virtual Environment Handbook, Lawrence Erlbaum Assoc., Chapter 45.

Feel Perspire, responsive video installation using sweat to trigger footage