Supparoos - Thomas

Intended Concept & Experience

Brief

During the early weeks of this course we were presented with instructions to create a physical prototype using the following statement:

“Design for playful and open-ended interaction in everyday life: human values in physical computing”

We were divided into teams, each focusing on a specific research area. All prototypes had to have a clear relevance to the studio theme and research area, involve a physical interaction, be playful and novel, be open-ended, and fit within the user's everyday life.

Emotional Intelligence

Human emotions are, as we all know, layered and complex. Emotional Intelligence is our ability to work with our own and other people's emotions. According to PsychCentral, Emotional Intelligence can be separated into five categories.

Self-Awareness

Our ability to recognize and identify our emotions at the moment they occur, as well as our skills and self-worth.

Self-Regulation

The emphasis here is that this is not about controlling our emotions but about dealing with them as they occur, through a selection of healthy techniques.

Motivation

Our ability to self-motivate, having the drive and commitment to achieve our goals with a positive attitude.

Empathy

The same skill as Self-Awareness, but focused outward rather than inward: recognizing and identifying the emotions of those around us and of those we interact with.

Social Skills

Also referred to as “people skills”, this is about using the aforementioned skills in everyday social settings. It is about communicating clearly, collaborating, leading, and negotiating.

Derived from: https://psychcentral.com/lib/what-is-emotional-intelligence-eq/

Intended Experience

Using novel physical interactions within the emotional intelligence space, we aimed to create a way to:

“encourage others to share positive emotions with close ones remotely”

We intended to address this by creating physical artifacts that allow users to share personal messages, emotions, and memories with each other. Our concept focuses on making this experience feel more personal than conventional, distracting, and overused social media, making it more meaningful to both sender and receiver.

Image of Concept Map, drawn by Tuva, designed by Team Supaproos

Image of Intended Experience, drawn by Tuva, designed by Team Supaproos

The media above shows the basic form of our intended experience, what we planned to achieve. However, the team had several additions in mind to elevate the concept further.

Filming & Projection

Image of Memory in Ball, extracted from Disney's Inside Out Image of Memory Projected, extracted from Disney's Inside Out

Inspired by the Pixar movie Inside Out, which also tackles complex emotional themes, we ultimately wanted to offer users visual elements beyond simple colours. Including a camera inside the ball would allow users to film and record messages and memories. The receiver could then either watch the video inside the ball or have the image projected on a surface around them.

Saving Memories

Image of Memories on Wall, extracted from Disney's Inside Out Image of Memory Hug, extracted from Disney's Inside Out

Again inspired by Inside Out, and now having more complex and detailed messages and memories, we wanted to create the option to save sent and received memories. To do this we envisioned a wall with shelves where users could “store” balls holding the desired memories. To encourage users to revisit their saved memories, the balls would fade over time and eventually disappear. To avoid this, users can interact with memories, revisiting them by simply holding them or re-watching them. The intention is that upon revisiting happy memories the user becomes happy themselves and potentially shares the happy memory with others, to share and reminisce together.

Electro Tactile Feedback and other Futuristic Technology

The team loved the idea of including the newly developed technology called Electro Tactile Feedback. This technology lets a user touch a surface and have it feel like any material, by stimulating the nerves in the user's fingers in a specific way to trick the brain into believing the surface feels like something else. Using this, the sender would be able to simulate touching and holding the receiver's end, reinforcing the feeling of human contact and helping to console close ones during difficult times, no matter how far away they are. In addition, some non-existing technologies were considered, such as sending scents, where the sender could “record” a smell and send it to other users, and a full-body VR experience, where users could simulate being part of an experience with all senses.

Prototype



Team Prototype

Together as a team we designed the concept, intended use and experience, and overall design. However, due to the restrictions of COVID-19 we split the concept into different sections, each member building and testing theirs separately, to be put together for the final presentation at the exhibition.

The team concept was to “encourage others to share positive emotions with close ones remotely”.

Image of Concept Map

The concept image above shows the short version of the interaction plan. We aim, through the use of physical interactions and haptic feedback, to simulate a feeling of closeness between the sender and receiver. Squeezing the ball represents holding and squeezing the other's hand, while the haptic feedback mimics the other's heartbeat. Below is an image showing the entire interaction plan with the intended emotions, and a group video presenting the created prototype. The video is from an earlier delivery and some small functionalities have since changed, but the overall experience and intention remain the same.

The ball starts in a passive state, waiting for instructions. From here the interaction with the ball can go two ways: sending a message or receiving one. To send a message the user picks up the ball and squeezes it, starting the recording.

To indicate that it is recording, the ball pulsates red while softly vibrating in the same interval.

Squeeze to record GIF

When the user is finished with their recording, they can stop it by squeezing the ball again. The ball then goes through a saving sequence, first pulsating yellow, then green twice, signalling to the user that it's ready for the next step.

Save Sequence GIF

The user can now choose the colour to represent their message by rotating the ball, squeezing when they have decided on a colour.

Choosing Colour with Squeeze

Now the message is ready to be sent, and the user can either throw the ball up to send the message or drop the ball to delete it. The ball then returns to its passive state.

Throw GIF

When a user receives a message the ball on the receiving end will pulsate in the same colour that the sender chose.

Notification GIF

To listen to the message the user can pick up and squeeze the ball to start the audio playback.

Squeeze listen GIF

To stop the playback the user can squeeze the ball again, after which the ball returns to its passive state.

Squeeze stop listen

Both the recording and the playback of sound are simulated.

Individual Part

I focused on creating the first step in sending a message: recording. In addition, together with Sigurd, I conducted a material study to find the optimal form and material for the object; the results are presented in the process section.

Functionality

The functionality that I created for this final prototype is as follows:

Squeeze to start recording

gif of squeeze record

Squeeze to stop recording

gif of stop squeezing to stop record

The idea behind these interactions is to simulate closeness between users, mimicking the holding of hands or general physical contact. The vibration during the recording also aims to evoke closeness by mimicking the heartbeat of the receiver. Although not an interaction, I also took part in the development of the sending function and the server-side code.

Simulated Components

The recording of sound was simulated during testing and for the final prototype. The reason was that I received a faulty SD card, which I only discovered after two weeks of trying to make it work. At that point I was no longer willing to spend more time on it and opted to focus on the other interactions instead. For the final prototype we collectively decided that recording sound was not essential to the success of the prototype and chose to simulate it. It is not essential to have recording working inside the ball because simulating it produces an almost identical experience, only requiring some effort on our end during testing.

Technical

Arduino

Arduino Image Arduino Image Arduino Image Arduino Image

To build the prototype I used the following components:

  • Bend Sensor
  • This sensor senses when the ball is being squeezed. When a bend sensor is bent, the resistance it creates increases. By fastening it to the inside of the ball it gives a constant value when the ball lies still and a different value when squeezed. By setting a threshold which the reading must cross, I set the appropriate amount of squeeze a user needs to activate certain interactions (more detail in the code section)

  • LED-Strip
  • All visual feedback that the user receives from the ball is through the use of different strengths and coloured lights.

  • Vibrating Motor
  • This component provides haptic feedback in the form of vibration at certain moments during the interaction plan. (more detail in code section)

  • Accelerometer
  • This measures both movement and orientation/rotation. It is used both for choosing a colour and for detecting when a message should be sent or deleted (more detail on this in Marie's and Tuva's portfolios)
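As a rough illustration of the bend-sensor thresholding described above, the squeeze check boils down to comparing the sensor reading against a tuned constant. This is a hypothetical sketch, not the project's code; the threshold value is made up and would differ per ball depending on how the sensor is mounted:

```cpp
// Hypothetical sketch of the squeeze-detection logic. An Arduino's
// analogRead() returns 0..1023; squeezing bends the sensor, raising its
// resistance and pushing the reading past the threshold.
const int SQUEEZE_THRESHOLD = 700; // illustrative value, tuned per ball

bool isSqueezed(int bendReading) {
    return bendReading > SQUEEZE_THRESHOLD;
}
```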

Ball Construction

Plaster Moulds

Plaster Moulds

To create the silicone ball, I first created a mould using plaster. Since participants felt the size of the previous ball was appropriate, I created the mould by pressing the previously used ball into the freshly mixed plaster.

Silicone Shell

Silicone in Plaster Silicone in Plaster

Using the plaster mould, I cast a shell from building silicone mixed with corn flour to assist the drying process. Although this worked well, and the silicone ball set entirely within 24 hours, it had two significant downsides. First, the ball was not as translucent as I wanted; although coloured light still penetrates the shell, it could have been clearer. Second, nothing would adhere to the silicone shell, not even silicone, which meant I needed to sew components onto the inside of the ball, using tape so that the thread had something to penetrate safely. The tape obscured areas of the ball even more, blocking light and making the ball's glow uneven.

Construction

Image of bend sensor Other Sewing Image Other Sewing Image

Sewing components inside the ball was, for the most part, simple and straightforward. However, the bend sensor was challenging. The most reliable squeeze detection came from placing the sensor as shown in the image. In practice, however, it was difficult to fasten it in a way that did not make the bend sensor fold and damage it. Instead, I ended up placing the bend sensor above the seam in the upper half of the ball.

Code

Below I present all the code that I helped write for the final prototype and its functionality.

Arduino

Main Loop Code

First, we check whether we have received a message through the function “readIncommingMessage”. If there is a message, the variable “messageReceived” becomes true, and if the user then squeezes the ball past the set threshold, the “listenToMessage” function runs, playing the message (more detail on this in Sigurd's portfolio). If no message has been received and the ball is squeezed past the threshold, the ball starts recording by running the function “Record”.
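The loop's branching can be sketched as a small decision function. This is an illustrative C++ sketch under my own naming, not the project's actual sketch code:

```cpp
// Illustrative sketch of the main loop's branching. The enum and function
// names are hypothetical; messageReceived is set by the receive check and
// squeezed is the thresholded bend-sensor reading.
enum BallAction { IDLE, LISTEN, RECORD };

BallAction nextAction(bool messageReceived, bool squeezed) {
    if (!squeezed) return IDLE;             // ball stays passive until squeezed
    return messageReceived ? LISTEN : RECORD; // play a waiting message, else record
}
```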

Record Code

While the Boolean “isRecording” is true, the bend sensor is read and its value saved, the vibration motor and lights pulsate, and lastly the bend value is checked against the threshold to see whether the ball is squeezed again; the recording stops if the minimal recording length has been reached. The reason for the minimal length is to avoid accidentally aborting the recording if the user holds the initial squeeze a little too long. When the user stops the recording, all the lights in the LED strip are turned off and the function “StopRecord” runs.
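The minimum-length guard can be sketched as follows. The 2000 ms value is hypothetical; the point is that a second squeeze only counts as “stop” once enough time has passed since recording started, so a held initial squeeze cannot abort the recording:

```cpp
// Hypothetical sketch of the minimum-recording-length guard described above.
const unsigned long MIN_RECORD_MS = 2000; // illustrative minimum length

// A squeeze only stops the recording once the minimum length has elapsed.
bool shouldStopRecording(bool squeezed, unsigned long startedAtMs,
                         unsigned long nowMs) {
    return squeezed && (nowMs - startedAtMs) >= MIN_RECORD_MS;
}
```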

Stop Record Code

Here the code simply goes through a saving sequence, pulsating yellow for a short period and then green twice. This code was originally written because the hardware and libraries used needed some time to save the file without issues. The sequence no longer has a technical function, but it does clearly signal to the user that they are entering a new part of the interaction process. When finished, the code runs the function “selectColor” (more detail on this in Tuva's portfolio).

LED Code

Whenever I needed to pulsate light I ran this function. It simply runs through the LEDs in the strip, lighting them up in succession in the colour that was passed into the function.

Sending Data Code

Since Sigurd and I set up the backend interactions for our first prototype after Marie set up the shell, we knew what kind of data needed to be sent for it to work. Some alterations to the server were needed, as shown in the next section. The function “sendData” is run from Tuva's function and was created mostly by Marie. Sigurd and I altered it to take the colour that is passed in, convert it to an appropriate form, and print it to the Serial port so that the server can send it to the other balls. To differentiate messages from other prints to the Serial port we prefix them with “message:” so the server can identify them. The rest is simply the RGB values separated by “,”.
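The framing described above amounts to a one-line format. A hypothetical helper (sketched here in C++; the real code printed directly to the Serial port on the Arduino) might look like:

```cpp
#include <string>

// Hypothetical sketch of the serial framing described above: prefix the
// chosen colour with "message:" so the server can tell colour payloads
// apart from ordinary debug prints on the same Serial port.
std::string formatColourMessage(int r, int g, int b) {
    return "message:" + std::to_string(r) + "," +
           std::to_string(g) + "," + std::to_string(b);
}
```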

Server

Sending Data GIF

When data is sent to the Serial port the server checks whether it contains “message:”. If it does, it finds the position of “:” and extracts everything after it as a string. It then sends the colours, with the non-functioning audio, to another ball via “socket.emit(‘audio’, payload)”.

Receiving Data

When the server receives a message from another ball it runs the function “covertRGB” to convert the received string into integers so that the Arduino can use the values to light up the LED strip in the correct colours.

Extract RGB Data Code

This code, written by Sigurd, simply splits the received String into the separate RGB values, converting them to integers and putting them into an array so they can be sent over the Serial port.
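The split-and-convert step can be sketched as follows. The team's server-side code was JavaScript, so this C++ version is purely illustrative; the function name is my own:

```cpp
#include <array>
#include <sstream>
#include <string>

// Illustrative sketch of the split-and-convert step: "255,100,0" becomes
// the integer array {255, 100, 0} so the LEDs can be driven directly.
std::array<int, 3> parseRGB(const std::string& payload) {
    std::array<int, 3> rgb{0, 0, 0};
    std::stringstream ss(payload);
    std::string part;
    // Split on commas and convert each piece to an integer.
    for (int i = 0; i < 3 && std::getline(ss, part, ','); ++i) {
        rgb[i] = std::stoi(part);
    }
    return rgb;
}
```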

Process

The following section details the process that the team, and I for my individual work, followed from the start of the project all the way up to the exhibition.

Initial Interviews

After initially pitching the idea in class we made some modifications to the concept to make it better fit the brief's everyday aspect, moving it away from an installation towards an in-home experience. Following this, wanting more information on how and with whom people currently share emotions and stories, and on what they would want in a more personal emotion-sharing platform, we conducted interviews.

Each team member conducted one interview, lasting approximately half an hour. The interviews aimed to find out the following:

  • How users currently share emotions and with whom
  • How comfortable they are sharing their emotions
  • Sharing digital or face-to-face
  • How they would visually represent different emotions

We found that all participants preferred sharing and showing negative emotions only with their closest friends and family, whilst most were happy sharing and showing happy emotions with anyone. Although there were some other suggestions, all participants mentioned that colour would be a useful visual tool to show and represent emotions. Although some participants mentioned the same colours when asked what they would choose to represent emotions, colour choice is still likely highly personal, and we decided we would have to do more research on this after delivering the project proposal.

Proposal

In preparation for the proposal, besides the initial research, we researched the emotional intelligence space: what research has been done and what researched solutions already exist.

We looked at research conducted on colours and how they relate to emotions. An article by Gilbert, Fridlund and Lucchina showed an interesting collection of colours people associate with emotions in the form of diagrams. These have been very useful as a starting point for our own emotions as colours research, conducted by Tuva.

Furthermore, sharing emotions digitally is an important part of our concept and thus we looked at some work done there as well, specifically the works of Vermeulen, Vendebosch and Heirman. These authors looked at how adolescents approached the sharing of emotions over social media. Although adolescents are not our target audience it still provided useful insights into the newer practices used.

Our concept relies on the fact that experiences and emotions can effectively be shared, even when not together. Although Boothby, Clark and Bargh do not focus on the effects of sharing remotely they have researched the effects of sharing experiences. They found in their research that when sharing an experience it becomes more powerful, Sigurd has looked into this further in his individual research.

Lastly, we found that it was important to understand the motivations behind sharing emotions and what people want to get out of the experience. Duprez, Christophe, Rimé, Congard and Antoine found that sharing emotions, especially negative ones can strengthen relationships. Something that is backed up by our initial interviews where people stated they only share the more intimate emotions with their closest friends and family.

Concept

The concept was straightforward, as the image below shows.

Concept Map, drawn by Tuva, designed by Team Supaproos

We also created a series of storyboards to show different scenarios and contexts of use. The two main functions we see are, first, someone feeling happy and wanting to share the positive experience with friends, spreading joy; second, someone feeling down and leaning on their friends to self-regulate and feel better.

Storyboard, sharing nice experience, sending. Drawn by Tuva, designed by Team Supaproos Storyboard, sharing nice experience, receiving. Drawn by Tuva, designed by Team Supaproos Storyboard, asking for emotional support, sending. Drawn by Tuva, designed by Team Supaproos Storyboard, giving emotional support, sending. Drawn by Tuva, designed by Team Supaproos

Individual Focuses

In response to the COVID-19 restrictions we had some choices on how to proceed. As a team, we felt we would get the most out of this project if we continued to work together but split the concept into separate sections, with each team member developing and testing their part, and all components being put together towards the end. This way we could do in-depth testing on all interactions of the concept and develop them with care. We divided the concept as follows:

  • Thomas
    • Audio Recording Interface
    • People Perception and Relationship with Emotional Intelligence
  • Tuva
    • Colour Choosing Interface
    • Colours as Emotions
  • Marie
    • Sending Interface
    • Error Tolerance
  • Sigurd
    • Incoming Messages and Audio Playback
    • The Effects of Sharing Experiences Remotely

First Prototype and Test

To create this first prototype I used three main components: the Arduino and its components, the ball, and the code, both on the Arduino and on the server.

Arduino

Originally the idea was to include the following components in the first prototype: bend sensor, LED light, microphone, and SD card. However, due to a fault in the SD card reader I was unable to include recording in this prototype. After two weeks of trying to make the ball record sound, and finding that that particular model of SD card reader was faulty, I did not want to spend more time on it at this stage and opted to simulate it instead. Since the SD card did not work there was no point including the microphone either; in hindsight, working with sound recording and playback through an Arduino was a lot harder than expected. After receiving the auxiliary kit with additional Arduino components I swapped the single LED for the LED strip and included a vibrating motor for haptic feedback. The final components for the Arduino are as follows:

  • Bend Sensor (to sense when the user squeezes the ball)
  • LED-strip
  • Vibrating Motor (haptic feedback)
arduino component - microphone arduino component - sd card

Ball Construction

We, as a team, had difficulty finding a suitable ball: most squishy balls in stores were either too small to hold the necessary components or coloured, meaning the ball would glow a different colour than the LEDs. I have to point out that this was before we decided not to include the SD card reader, which was by far the largest component. In the end we chose to use a DIY Christmas bauble; cutting it open and sanding it diffused the light effectively, however, as was confirmed during testing, the squeezing sensation was not satisfying.

Images of Construction Images of Construction Images of Construction Images of Construction

Code

The code changed little for me from this version to the final version; for the most part only the other team members' sections were added. However, I cleaned up my code, removing unnecessary variables and renaming variables to the most appropriate names agreed with team members before merging it with theirs. Furthermore, I changed the “Record” function to change the record interaction from squeeze-and-hold to squeeze to start and squeeze to stop.

code - arduino main loop

When the bend sensor is squeezed hard enough and the ball is not already recording, the ball starts the “Record” function.

code - arduino record function

When starting to record, “1” is printed over Serial so the server knows to start recording. While the ball is squeezed hard enough it keeps recording, and the lights and vibrating motor pulsate. When the ball is no longer squeezed, the “StopRecord” function is run.

code - stop record function

The vibrating motor is forced off, and the ball goes through the saving sequence for the recording. By the end of this prototype the sequence was no longer needed, but I wanted to test whether it was a good way of indicating that the ball was busy with another task and that the user had to wait. After the sequence is complete the ball flashes green twice and “0” is printed over Serial to signal to the server that the recording is finished and ready to be sent. Although sending messages was not part of my prototype, Sigurd and I worked together on the prototype testing for this round, and to show context and cause-and-effect we opted to show that once something was sent from my ball it was received by another (Sigurd's).

code - light up LEDs function

In succession, all lights are lit up using the colour received from the function call.

code - server for sending data

Video and Feedback

Together as a team we filmed our individual parts and edited them together to show the entire user journey, all interactions, and how they relate to each other. This way we could get feedback both on how each individual interaction is perceived and on the concept as a whole.

Each team member created a section for their own individual part to get feedback on specific interactions to further their own design. For my part that meant getting feedback on the squeezing interaction and whether it is appropriate or should be changed, whether the colour of the lights made sense, and what material would be most appropriate.

After testing the prototype with participants and receiving feedback from my peers it became clear that at this stage few changes needed to be made to the interactions. The only thing that seemed to worry participants was that the squeeze interaction could be uncomfortable and cause fatigue; they suggested changing the squeeze-and-hold interaction to squeeze to start and squeeze to stop, which I did. When it came to material, nearly all feedback pointed to using some sort of silicone for the ball. Naturally, I chose this as the material for the final prototype and, together with Sigurd, conducted a material study.

Material Study

After creating the silicone ball it was time to test whether it was better than the other alternatives.

Together with Sigurd I conducted a material test. Using my silicone ball and his own, we asked participants to try both and give their opinions, to find out whether we had gotten closer to an optimal material and form. All participants had tried the previous prototype, so they had an additional frame of reference. Users stated that they preferred the smaller ball since it fit better in their hands and was easier to squeeze. However, they also stated that they preferred the feel of the bigger silicone ball, though it was much too firm and thereby difficult to squeeze. Also, due to the thickness and opaqueness of that ball, it did not let enough light through, making it difficult to see colours when the ball was in a brightly lit room. So although we still have not landed on the perfect form, we have made progress: the optimal form and material seems to be a ball slightly smaller than a tennis ball, made from a silicone-like material requiring little effort to squeeze.

Final Prototype

This brings us to the end of the process, with the final prototype created. For details of the prototype, see the Prototype section of this page.

Reflection

One of the biggest challenges we have had as a team has been working together remotely. Although we've had few issues with each other, the process was still very challenging. Although we worked on separate parts of the prototype, we still aimed to put all components together at the end. With that in mind, we should have coordinated more during some parts of the process. Although we decided on what components to use and, for the first prototype, what material the ball was to be made of, we did not coordinate the actual construction of the ball. This meant that when putting all the code together, issues popped up, especially around the bend sensor and accelerometer. Since all team members had, by the end of the project, different types of balls, the bend sensors and accelerometers were placed differently in each ball, and all the values differed as well. This could have been avoided to some degree by writing more open-ended code with fewer, or preferably no, hard-coded variables and values. Paired with better planning, this could likely have saved us some time, but it was simply something we did not consider before it was already too late.
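For example, the hard-coded squeeze thresholds could have been replaced with a short per-ball calibration at startup. This is a retrospective sketch under my own assumptions, not code from the project:

```cpp
// Retrospective, hypothetical sketch: instead of baking a threshold into
// each member's sketch, each ball could sample its own resting bend value
// on boot and derive the squeeze threshold from it.
struct SqueezeCalibration {
    int restingValue; // bend reading with the ball at rest
    int margin;       // how far past resting counts as a squeeze

    bool isSqueezed(int reading) const {
        return reading > restingValue + margin;
    }
};

SqueezeCalibration calibrate(int restingSample) {
    return SqueezeCalibration{restingSample, 100}; // margin is illustrative
}
```

With this, the same code would run unchanged regardless of where the sensor ended up in each ball.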

Some components in the final prototype are still simulated, mainly the recording and playback of sound. Although having this actually work is not critical to the proof of concept, it would have made for a much better and fuller experience during the exhibition and for testers.

The tests we have done, although valid, are still, in my opinion, inconclusive. I say this because although we have received both positive and negative feedback, we've been unable to test the long-term effects of our concept and how users' perception of it might change over time, for good or bad. I am aware that this has simply not been possible over the relatively short period we have worked, but that does not stop me from wondering to what degree our concept would actually have been successful in its aims.

Naturally, like most projects requiring physical testing this semester, we have also experienced the effects of COVID-19. Although we've been lucky, since most team members live with someone and have been able to do some physical testing, we've still only been able to do very little. Although we could have done some remote testing, we would still have had the issue that our project depends heavily on touching and feeling to truly experience the concept, meaning that remote testing would have given us fairly little.

Finally, I've found working with emotional intelligence interesting and challenging. Understanding how and why people share their feelings, and creating a safe, novel way of doing this, has been challenging. I do believe that our concept could be successful, however, not in its current state. By that I don't mean whatever bugs and simulated components still remain; rather, I believe more mediums for message sending should be added, along with the option to save positive memories and messages. It's my belief that doing this would result in a more lasting experience, where you could always be reminded of good times and where the physical objects encourage you to do so.