We subconsciously perceive all information from the environment, but only certain things become conscious apperception, and these differ between people according to background, education and culture. We were interested in how the environment affects us subconsciously, so we tested ourselves with a pulse sensor on Tottenham Court Road and in Regent's Park. On Tottenham Court Road our heart rate reached 125 bpm, far higher than the roughly 50 bpm measured in Regent's Park. So the environment does affect us even when we are not consciously aware of it, and our body's response tells the truth.

In this project, we want to create wearables at the individual scale that critique the gap between subconscious perception and conscious awareness: "How do I truly feel?" versus "How do I feel?". When you post an emoji or a feeling on your Facebook page, you have already processed that feeling, and it is not always true; it is another you that you want to share with the world. We are creating a new kind of communication wearable that represents you with colours while, at the same time, letting you see your true bodily response to people and the environment.

 

Figure 1: Design Concept Mechanism -The relationship between two scales

[Function of Emotive Wearable]

Our project is a biometric interactive wearable with two parts: it detects the participant's facial expressions with a muscle sensor while simultaneously collecting bio-data with a Galvanic Skin Response (GSR) sensor.
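The two-channel reading described above can be sketched as a small function. This is a minimal illustration, not the project's actual firmware: the function name, the `dict` format and the `SMILE_THRESHOLD` value are all assumptions, and a real threshold would be calibrated per wearer.

```python
SMILE_THRESHOLD = 512  # hypothetical ADC threshold for the cheek muscle sensor

def read_emotive_state(muscle_value: int, gsr_value: int) -> dict:
    """Combine one muscle-sensor reading (facial expression) with one
    GSR reading (bio-data) into a single state record."""
    return {
        "smiling": muscle_value > SMILE_THRESHOLD,  # expression channel
        "gsr": gsr_value,                           # raw bio-data channel
    }

# Example: a strong cheek-muscle signal with a moderate GSR reading.
state = read_emotive_state(700, 340)
```

In practice the two values would arrive from the wearable's analogue inputs on each sampling tick; the record above is just the point where the expression channel and the bio-data channel meet.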

 

Figure 2: The function of EMO

We also explore a particular concept of human affect (James A. Russell). The relationship between arousal and valence can be derived from the Galvanic Skin Response readings together with the facial-expression feedback. The two visual outputs were designed to investigate emotional feeling as a mapping tool on a two-dimensional scale.
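The circumplex mapping can be sketched as follows: GSR gives arousal (vertical axis) and the facial expression gives valence (horizontal axis), and the pair lands in one of four quadrants of Russell's model. The normalisation to [-1, 1] and the quadrant labels here are illustrative choices, not the project's exact mapping.

```python
def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair onto a quadrant of Russell's
    circumplex model. Both values are assumed normalised to [-1, 1]."""
    if arousal >= 0:
        # High arousal: pleasant -> excitement, unpleasant -> tension
        return "excited/happy" if valence >= 0 else "tense/upset"
    # Low arousal: pleasant -> calm, unpleasant -> boredom
    return "calm/content" if valence >= 0 else "bored/depressed"

# Example: a smile (positive valence) with a high GSR reading.
quadrant = circumplex_quadrant(0.6, 0.8)
```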

 

Figure 3: The Circumplex Model of Affect, James A. Russell

 

Facial expression is a medium through which we show our feelings in public, so we attached the muscle sensor to the cheek; the colour changes when a smiling expression is detected. The 'colour' on the wearable therefore expresses the participant's feeling. The second visual output, infrared light, was designed to be invisible to the human eye, since not everyone is willing to show their biometric data. When the participant's bio-data reaches an aroused state, the infrared light turns on so they can check their inner state.
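The two output rules above can be condensed into one decision step. This is a hedged sketch: the colour names, the arousal threshold and the return format are placeholders, and on the actual wearable these decisions would drive visible and infrared LEDs directly.

```python
AROUSAL_THRESHOLD = 0.6  # hypothetical normalised GSR arousal level

def update_outputs(smiling: bool, arousal: float) -> dict:
    """Decide the wearable's two outputs: a visible colour for the
    public expression channel, and an infrared LED (invisible to the
    human eye) for the private bio-data channel."""
    return {
        "visible_colour": "warm" if smiling else "neutral",
        "infrared_on": arousal > AROUSAL_THRESHOLD,
    }

# Example: the wearer smiles while their GSR indicates arousal.
outputs = update_outputs(True, 0.8)
```

The split mirrors the concept: the visible colour is the feeling shown to others, while the infrared channel lets only the wearer (or an infrared camera) see the body's true state.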

 

Figure 4: The lights visible to the human eye, Quora

[Future Projection]

In the future, we would like to explore and amplify the project from the individual scale to the urban scale. We will collect and visualise three different conditions of a place: bio-data, social media and weather. Perhaps more importantly, this lets us triangulate the physical, psychological and biological perspectives.

Figure 5: Project Mechanism in Urban scale

Future project proposal

An installation that represents the conditions of a place.

Collective Data
  • Bio-data (GSR): collect data from the Galvanic Skin Response sensor installed on the wearable, using an XBee board to send the data to the computer.
  • Social media: collect tweets that hashtag a particular place and analyse their words using sentiment analysis (positive, negative and neutral).
  • Weather: collect real-time data from the OpenWeatherMap website through its API.
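The three feeds above could be merged into one record per place along these lines. This is a minimal sketch with stand-in data: in practice the GSR values would arrive over the XBee serial link, the sentiment scores from a Twitter sentiment-analysis step, and the temperature from the OpenWeatherMap API; all function names here are assumptions.

```python
def sentiment_label(score: float) -> str:
    """Collapse a sentiment score in [-1, 1] into the three classes
    used above (positive, negative, neutral)."""
    if score > 0.05:
        return "positive"
    if score < -0.05:
        return "negative"
    return "neutral"

def place_condition(gsr_values, tweet_scores, temperature_c):
    """Triangulate one place's condition from the three feeds:
    averaged bio-data, majority tweet sentiment, and current weather."""
    smoothed_gsr = sum(gsr_values) / len(gsr_values)
    labels = [sentiment_label(s) for s in tweet_scores]
    majority = max(set(labels), key=labels.count)  # most common class
    return {
        "bio": smoothed_gsr,
        "social": majority,
        "weather_c": temperature_c,
    }

# Example: a few GSR samples, three scored tweets, and 18 °C outside.
record = place_condition([0.4, 0.5, 0.6], [0.3, -0.1, 0.4], 18.0)
```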
Visual and Audio representation

Figure 6: Collective Data Representation

 

 

The project can be an alternative way to represent the condition of a city or place. Instead of representing the city only from a visual or physical perspective, it can illustrate and critique the gap between how people feel and how people truly feel in an area.

