Entries - Tag = week11

[Week 11] - Working on the next Prototype

Sigurd Soerensen - Tue 26 May 2020, 6:53 pm

Feedback

On Monday we had a meeting to discuss the feedback we had received and our path going forward. Both as a team and for my individual project, we received some useful data; however, some of the points raised had already been answered in the video and document, so they provided little value. Most of what we received was helpful, though, and correlates with the data gathered from user testing and interviews.

As for my own project, feedback and user testing data suggested that I should look into the material and into how the device could become a more personal artefact. Other than this, most of the feedback I received only requires minor fixes in the codebase, such as removing the need to hold the squeeze until the audio is done playing, and smoothing out the quick flash at the end of the notification cycle for a more pleasant experience.
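The squeeze fix could be as small as latching the playback state on the initial squeeze rather than gating the audio on the sensor every loop. A minimal sketch of that idea in plain C++, where the threshold value and function names are made up for illustration:

```cpp
#include <cassert>

// Hypothetical threshold for a 10-bit analog squeeze reading; the real
// value would come from calibrating the pressure sensor.
const int SQUEEZE_THRESHOLD = 600;

enum class PlayState { Idle, Playing };

// Latch into Playing on the initial squeeze and stay there until the
// audio reports it has finished, regardless of whether the squeeze is
// still held. This removes the need to hold the ball the whole time.
PlayState update(PlayState state, int squeezeReading, bool audioFinished) {
    if (state == PlayState::Idle && squeezeReading > SQUEEZE_THRESHOLD)
        return PlayState::Playing;   // squeeze starts playback
    if (state == PlayState::Playing && audioFinished)
        return PlayState::Idle;      // only the audio ending stops it
    return state;                    // releasing the squeeze changes nothing
}
```

Run each loop iteration, this keeps playing after the hand relaxes, which is exactly the behaviour the feedback asked for.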

We decided during the meeting to focus on putting all our code together into one codebase so that we could better showcase our concept at the tradeshow. We also set up another meeting for Friday to start merging the code. We chose to focus on merging before continuing to work on other features in our individual projects, as more code would mean more refactoring. Given that all of us had to focus on our theses in the coming days, this did not cause any issues for us.

Midweek

As for Tuesday and Thursday, we had our regular stand-ups. I liked that we had all agreed to say one positive thing, given that a lot of stress, with COVID on top, quickly makes for a negative pattern. Apart from Monday's meeting and classes, I spent the whole week up to Friday working on the conference paper for my master's thesis, which was due on Thursday, as I had mostly been focusing on PhysComp until then.

Friday's Meeting

On Friday our group met at uni to start merging our code. Whereas Thomas and I had an easy time merging our code, Tuva and Marie had to start from scratch with a new library for their MPU6050s. Since we finished putting our code together sooner, we put in place a couple of functions so that Marie and Tuva could easily merge their code with ours without having to read through and understand it all.

Weekend

During the weekend, inspired by Thomas' solution of casting a ball from silicone, I chose to try the same, only exploring a different shape. I went to Indooroopilly to purchase some clear silicone and then headed back home to make a mould for my shape. I decided to try a cube, as it is easier to mould than most other shapes, and Thomas and I would then be able to test two different variations to see which one felt better. My thought was also that different shapes could be a way of making the artefact more personal: people could pick their own shape, or each pair of E-mories devices could share a shape to distinguish them from others. However, after two attempts, one with only a small amount of corn starch to retain some translucency and another with a lot of corn starch, the silicone still would not cure, so I ended up scrapping the idea of making my own cube. My plan B would have to wait until Monday, as I had previously seen some clear balls lying around at the Kmart in Toowong that I could work with.

[Images: silicone moulding attempts]

week11 prototype codemerge

Reflection (Week 11)

Shao Tan - Sun 24 May 2020, 9:56 pm
Modified: Sat 20 June 2020, 5:21 am

Work Done

This week I started looking at ways to implement the ultrasonic sensor and the microphone into Spud.

I started watching videos on how to use ultrasonic sensors and tried one myself. It was quite straightforward and easy to work with. For voice recognition, I found a way of using the microphone on the laptop instead of the Arduino microphone module using C# (they used the Visual Studio IDE), from this website tutorial here. Hopefully it will not be too hard to implement these in Spud, though it might be tricky with the two different modes, alert and friendly, and with having to send information from Visual Studio to the Arduino.
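One way to send information from the C# side to the Arduino is to write short text commands over the serial port and parse them on the device. A hedged sketch of the parsing half in plain C++; the command words "alert" and "friendly" are my own assumption, not an actual protocol:

```cpp
#include <cassert>
#include <string>

enum class Mode { Alert, Friendly, Unknown };

// Parse a newline-terminated command that the C# recogniser could
// write to the serial port, e.g. "alert" or "friendly". Unrecognised
// input is ignored rather than treated as an error.
Mode parseCommand(const std::string& line) {
    if (line == "alert")    return Mode::Alert;
    if (line == "friendly") return Mode::Friendly;
    return Mode::Unknown;
}
```

Keeping the vocabulary tiny like this would make the alert/friendly mode switch a one-line check on the Arduino side.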

I also did user testing to determine how far away a person walking towards the user should be before Spud reacts. Results:

  • >1.3m away from user = normal
  • <1.3m away from user = Spud turns angry as a warning
  • <0.8m away from user = Spud shakes its head/waves at the person.
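The echo-time-to-distance conversion and the thresholds above can be sketched together in plain C++. The function names are hypothetical; only the 1.3 m and 0.8 m cut-offs come from the testing:

```cpp
#include <cassert>
#include <cmath>

enum class SpudReaction { Normal, AngryWarning, ShakeHeadWave };

// An HC-SR04-style sensor reports the echo time in microseconds.
// Sound travels at roughly 343 m/s and the pulse goes out and back,
// so distance (m) = echoMicros * 343 / 2 / 1,000,000.
float echoToMetres(unsigned long echoMicros) {
    return echoMicros * 343.0f / 2.0f / 1000000.0f;
}

// Map a measured distance to Spud's reaction using the thresholds
// found in the user testing above.
SpudReaction reactionFor(float metres) {
    if (metres < 0.8f) return SpudReaction::ShakeHeadWave;
    if (metres < 1.3f) return SpudReaction::AngryWarning;
    return SpudReaction::Normal;
}
```

A reading of about 5831 µs works out to roughly one metre, which would put the approaching person in the "angry warning" band.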

Work to be done

For Spud, I have to start implementing the ultrasonic sensor and the voice recognition as fast as possible. At the same time, I also have to work on my website, as it might take a long time to make it nice and presentable. I'll first build the structure of the website and set up the CSS and JavaScript. Then I will write content about my work on Spud and decide how to display it without making it seem like just a document.

week11 #spud

Week 11

Marie Thoresen - Sun 24 May 2020, 6:12 pm

Team meeting

This week my team and I started with a meeting to go through the feedback we had received and discuss what the next phase of our development would be. The feedback was mainly positive and gave us confidence that our concept, and how we displayed it in the video, was good and that people found it exciting.

Each of us had asked questions in our videos, in addition to a series of team questions about the overall concept. For my prototype I got confirmation that the throwing metaphor was appropriate for its functionality. In addition, many expressed that a way for users to replay and/or delete the message was something we should include, so this will be taken into further consideration.

At the team meeting we decided to have a physical meeting on Friday to see whether we could put the prototypes together as one.

Friday Meeting

The team concluded that we would try to assemble all of the prototypes into one so that each and every ball could perform the entire interaction flow. While Thomas and Sigurd managed to put their prototypes together fairly easily, Tuva and I ran into some additional issues. Firstly, we hadn't used the same libraries in our prototypes, so we had to decide which one to use. Tuva tried to install the one I used, but for some reason this didn't work, so together with a tutor we decided to scrap it altogether. We found a new library that was available to both me and Tuva and decided to use that one. Since most of my code had been based on the old library, I had to rewrite most of it. It took me a while, but I finally made it work. Secondly, I was tasked with merging my and Tuva's code. However, as we expected, the accelerometer couldn't distinguish between a shake and a throw, so we decided to add a squeeze in between to "lock" the colour and start the throwing state of the ball. This worked perfectly, and so now the interaction is as follows:

[Image: the interaction flow]
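The squeeze-in-between idea can be sketched as a small state machine in plain C++; the state and function names are mine, not the actual code:

```cpp
#include <cassert>

enum class BallState { Mixing, Armed, Sent };

// A squeeze locks the colour and arms throw detection; only once
// armed does a throw send the message. Keeping the two stages apart
// is what stops a vigorous shake from being read as a throw.
BallState step(BallState s, bool squeezed, bool thrown) {
    switch (s) {
        case BallState::Mixing: return squeezed ? BallState::Armed : s;
        case BallState::Armed:  return thrown   ? BallState::Sent  : s;
        default:                return s;
    }
}
```

While still in the mixing state, even accelerometer readings that look like a throw do nothing, which is exactly why the squeeze step fixed the shake/throw confusion.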

The inside of the ball currently looks like a hot mess, but everything works perfectly. However, because of the new pressure sensor, the schematic was even harder to make look decent. Hopefully people can make something out of it.

[Images: the inside of the ball and the schematic]

Going forward

Based on the previous user testing, the team also discussed some additional features that we could add or try separately. For my part, most of my users wanted a way to delete a message instead of sending it. I was thinking of adding some code that registers when a user drops the ball instead of throwing it upwards; this could be a good metaphor for deleting and resetting the ball. This is, however, something I will have to look into more closely at a later point.
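One way the drop-versus-throw distinction could work is by looking at the sign of the vertical acceleration once gravity is subtracted: a throw starts with a strong upward spike, while a dropped ball goes briefly into free fall. A hedged sketch in plain C++; the thresholds are guesses for illustration and would need tuning on the real MPU6050:

```cpp
#include <cassert>

enum class Gesture { None, ThrowUp, Drop };

// Classify a vertical acceleration sample in g with gravity removed:
// a strong upward spike reads as an upward throw, while a reading
// near -1 g (free fall) reads as a drop. Both thresholds are
// assumptions, not measured values.
Gesture classify(float upwardAccelG) {
    if (upwardAccelG > 1.5f)  return Gesture::ThrowUp;  // launch spike
    if (upwardAccelG < -0.8f) return Gesture::Drop;     // falling freely
    return Gesture::None;
}
```

In practice this would probably need to hold over a few consecutive samples to filter out noise, but it shows why the two gestures are separable at all.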

week11
