Design Process

The process of how Spud was built and a reflection on project outcomes.

Research

Our team’s main concept was a hand that uses different gestures and signals to convey different emotions. However, when we separated to work on the project individually, I wanted to look at other unique ways to use different parts of the body to express emotion. I first thought of using facial expressions, but I did not want to use eyes and a mouth in my prototype as they are a very common way to express emotions (e.g. emojis). With this in mind, I researched the facial expressions of cartoon characters and noticed that they were very expressive with their eyebrows. In the process, I also observed that ears and antennas can be expressive in various positions, and that arms were used for more intense emotions like excitement and anger.

Inside Out
Mr. Potato
Stitch
Grasshopper

Sketches

Drawing on this research, I sketched three different types of robots incorporating four aspects of my findings: eyebrows, ears, antennas and arms, shown in various emotions. Eyebrows were used in all of the sketches as they were the most uncommon way of expressing emotion.

Sketch 1
Sketch 2
Sketch 3

User Testing

Testing was then done with users to determine the best form of the robot to convey different emotions. This testing revealed that a simple robot with only eyebrows, without ears or antennas, was the best choice. This is because the ears and antennas were distracting and redundant as the eyebrows were expressive enough. Therefore, the eyebrows and arms were chosen as the robot’s way of expressing emotion.

Chosen Sketch 3
User Testing

A second round of user testing was done to determine the eyebrow and arm positions for the robot's movements. I decided on six different emotions and actions: happy, angry, sad, excited, greet and stop. The participants were given a figurine and asked to move its eyebrows and arms according to each emotion and action. The data obtained from this user testing was then used as inspiration for the positioning of Spud's eyebrows, head and arms in its six current emotions and actions: warning, stop, scared, wave, fist bump and dance.

Spud v.1

The first form of Spud followed the final sketch of the robot from the user testing. Its head and body were the same size, which emphasised the eyebrows on its face. It was small, with the ultrasonic sensor as its eyes and the microphone as its nose.

However, after trying to fit five servos into its body, I decided that a bigger and sturdier form was needed. Interviews with users about Spud's appearance revealed that the arms had to be thicker to make movement more obvious, as they were too thin for any motion to be seen easily. Users also said that the ultrasonic sensor as eyes made Spud creepy and unpleasant to look at.

Spud v1

Spud v.2

Phase 1

Two forms of Spud were then built from pieces of thick cardboard joined with a hot glue gun. A form big enough to fit the servos was used to showcase the functionality of Spud's movements, while a smaller one showed the actual size of Spud. The smaller form fits the intended experience of being portable and able to sit on the user's shoulder.

For this first phase, all seven movements and positions of Spud were controlled and simulated using two buttons: one to cycle through the alert and friendly actions and one to return Spud to a neutral position.

Two spuds

This look of Spud was much better than the previous one, and the testing participants said that it looked cute and more friendly. Spud's movements were also bigger and easier to understand. However, participants said that Spud's movement was too slow, so it was made quicker. There was also feedback that the servos were too loud when moving, which unfortunately could not be changed.

Phase 2

This next phase was to implement the ultrasonic sensor and microphone functions. The distance at which someone could approach Spud before it reacts was tested with participants. From the results, the distances were set to 1.3 metres for the warning expression in alert mode and 0.8 metres for the stop and wave gestures.

The ultrasonic sensor and microphone are used by Spud to detect when to move to a certain body position or facial expression. Spud senses when something comes close to it and recognises voice commands. The only thing simulated is the volume detection for the scared expression.

Wave

Reflection

Actual vs Intended

Overall, I feel that the final product has followed the intended concept very closely and I am really happy with how it turned out. The things that did not follow the intended concept were the portability features and the noise volume detection, which I had to simulate. I would have wanted Spud's body to be 3D printed so it could be sturdier and smaller, with all the wiring and parts hidden inside rather than connected to the laptop. Undoubtedly, it would have been built better by a team of four. However, as it became an individual project, I had to limit how many features I could add to Spud, what I could implement and what I had to simulate.

Other Work in the Domain

Robots normally talk or use different sounds to convey something, and sometimes use a screen for facial expressions, but I wanted to do something different with Spud. The use of eyebrows and the movement of the head and arms turned out to be a good choice for conveying emotion. The eyebrows in particular were novel, as they are rarely seen in robots, and surprisingly expressive.

Team Domain & Studio Theme

I believe Spud has accurately upheld the team domain "Sassy Tech" and the studio theme of "designing for playful and open-ended interaction approaches". Spud is sassy and playful in its own way, helping its user externalise their mood through its eyebrows and body movement. People can also interact with it when they get close to the user or ask it to do tricks with voice commands.

Project Outcomes

Did Spud meet the project's desired outcomes?

1. Does Spud help the user to keep others away and stop others from being loud in the Alert mode?

Partially. From the testing done with participants and the reactions of visitors at the exhibition, Spud's warning and stop expressions make it amusing and cute to look at, so they do not work as intended. However, the scared expression does work, as the test participants felt bad when Spud was scared. The warning expression can be kept the same, but the stop movement should be completely changed to something more aggressive, like the Helping Hand.

2. Does Spud help the user to socialize with others in the friendly mode?

Yes. From the testing done and the reactions of visitors at the exhibition, Spud is very likeable and popular, which would help the user in socialising with others.

3. Does Spud react correctly in different situations?

Partially. The distance sensor worked well for detecting people walking near, but the voice recognition for tricks was not always accurate, and the noise volume detection was simulated. A more accurate voice recognition tool should be used, and the volume detection should be implemented.

4. Is Spud portable and able to be brought around by the user?

No. The final product of Spud is big, with a wire connected to the laptop, so it is not portable. However, a smaller version of Spud was built at the ideal size to easily show people how Spud should actually look.

5. Are the different facial expressions and body positions of Spud understandable by everyone?

Yes. From the user testing done, all the participants could clearly identify what Spud was trying to convey with its different movements.

Links

Journal

Click Here. Weekly journals written throughout the semester for my thoughts and progress on Spud.

Proposal

Click Here. This report shows the concept, context and problem space of both Spud and the Helping Hand with detailed explanations of the individual projects.

Prototype Document

Click Here. This document gives an overview of the design process, user research, intended interaction plan and project objectives of the prototype.

Prototype Demonstration

Click Here. A video to demonstrate the prototype.