This week I finished my prototype demonstration video and critiqued other groups' prototype videos. In my video I demonstrated how the SassBot talks to the user with its various phrases. Walking through a standard interaction, I showed when each personality mode is triggered and the corresponding lines the SassBot says.
I enjoyed the process of critiquing other groups since there were some very interesting prototypes. Team Triangle was a standout for me. I loved their highly original concept of recording and mixing sounds with lab equipment to relieve stress. The smaller details of their prototype, such as how the test tubes light up when they contain a sound and how shaking a test tube plays the sound it holds, were the subtle features that made the concept so quirky and novel.
Ryan's robotic hand-gesturing prototype was also really cool. His use of Arduino motors to make the fingers move was very impressive. I'm looking forward to seeing how this prototype develops to the point where the hand gestures become more distinct.
Looking towards the final prototype delivery, I've put some thought into refining the existing features and adding new ones. Since the main aspect of my prototype is how the robot conveys emotion, I'd like to add more detail to the robot's facial expressions. Currently, the robot's face is a tissue box with drawn-on eyes, which is not the most convincing interface. One way to give the face more expressiveness would be to incorporate an esp20 digital display that can show various, changing expressions. Alternatively, I could implement some analog indicators of emotion, such as lights for eyes: red eyes to indicate that the robot is angry, yellow eyes to indicate that it's annoyed, and green eyes to show that it's pleased. Another option is to create mechanically moving facial features like eyebrows and a mouth. I'm going to start work on the eyes since they're the easiest and can convey obvious feelings.
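To get a feel for the "lights for eyes" idea, here is a minimal C++ sketch of how the mood-to-colour mapping might look, assuming one common RGB LED per eye driven by PWM pins. The mood names, pin numbers, and exact colour values are my own illustrative assumptions, not part of the prototype as built.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical personality modes for the SassBot (names are assumptions).
enum class Mood { Angry, Annoyed, Pleased };

// One RGB channel value per colour component (0-255, as for analogWrite).
struct Rgb { uint8_t r, g, b; };

// Map a mood to an eye colour: red = angry, yellow = annoyed,
// green = pleased, matching the scheme described above.
Rgb eyeColour(Mood mood) {
    switch (mood) {
        case Mood::Angry:   return {255, 0,   0};
        case Mood::Annoyed: return {255, 200, 0};  // warm yellow
        case Mood::Pleased: return {0,   255, 0};
    }
    return {0, 0, 0};  // eyes off for any unhandled state
}

// On an Arduino the colour would then be pushed to the LED, e.g.:
//   Rgb c = eyeColour(currentMood);
//   analogWrite(RED_PIN, c.r);
//   analogWrite(GREEN_PIN, c.g);
//   analogWrite(BLUE_PIN, c.b);
// (RED_PIN etc. are placeholder pin names for whichever PWM pins are used.)
```

Keeping the mapping in one pure function like this would make it easy to swap the LED eyes out later for the digital display or mechanical features without touching the personality logic.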