Between the Contact and the Workshop, all team members conducted one user interview each to help us refine our idea further. We designed the protocol together in our meeting after Tuesday's Contact and analysed the results together during Thursday's workshop. I feel that everyone did a great job on the interviews and that we gathered some really valuable data from them. I hope that we can keep gathering good data throughout the semester with the new restrictions in mind. We are fairly lucky in this respect, since all members have roommates in one form or another, so physical testing is still possible. That said, remote testing is still very much needed, since the people you live with tend to adopt some of the same views as you and have likely had some insight into the project you've been working on. Although they'll still provide valuable data, we'd likely get better data by interviewing people who have no previous knowledge of our concept. This means that, just like in my Master's thesis, we'll have to get a little creative when it comes to testing our physical prototype. Although this can be cumbersome, it's also interesting to see what creative solutions we can come up with for testing our work.
During the workshop, we got some help from Steven. He helped us decide how to focus our concept and what to include and what to leave out, and he suggested that we create a sentence that our project should be based on, almost like a research question. Based on that, we created the following sentence: “How can we encourage others to share positive emotions with close ones remotely”. Armed with this, it should be much easier to decide whether a function or interaction is valuable to the project by considering whether it serves the concept's primary purpose, which is this sentence.
Before we examined the results of our user interviews, we were still considering using automatic emotion detection in our project. Because of this, I did a lot of research on the topic, looking at different solutions. I landed on Google's Firebase ML Kit for facial recognition, which can also detect emotions based on facial expressions. After several hours I was still not able to get it to work and put it aside. However, we have now decided not to include automatic emotion detection. I see this as a bittersweet moment: the research I did is no longer needed and was essentially wasted time, but the good part is that I no longer need to make it work and can focus on the bits that will be useful.
Since Thursday, all I've really done is write different sections of the proposal. I was partially responsible for the Theme and Relevance to Theme sections, and in addition everyone had to write about at least one peer-reviewed article relevant to the assignment. The Theme sections were relatively interesting, since they allowed me to delve a little deeper into what experts include in emotional intelligence and how our concept fits within it. I was delighted to see that our concept covers almost all categories of emotional intelligence, which suggests that we have interpreted emotional intelligence well and thereby have a concept that fits within it. For my peer-reviewed article, I was very interested in how humans connect colours with emotions. Although it is likely that people from different cultures view colours differently, and I'm sure we'll have to do our own testing with colours, it's nonetheless useful to have a baseline to work from and to see how other researchers have explored this space.