
The Last Weeks

Timothy Harper - Mon 22 June 2020, 5:47 pm

I missed a few of the recent weekly journals, so I'm catching up in this longer journal post.

In the lead-up to the final exhibition, we managed to meet up on campus a few times as a team. This was excellent, as we hadn't been able to previously, and since each of us was working on the same Sassmobile, it was imperative that we caught up.

We quickly discovered that the project was more complex than expected and that bringing together all of the different parts of the bot would be tricky.


The first problem we encountered was connecting the ESP32 to the Arduino Uno. Both systems need to talk to each other for the bot to function, so this was a real challenge. The best way to do this would be to connect the RX/TX of the ESP32 to that of the Arduino Uno; sadly, those pins were already in use by the speaker. Luckily we could use pins 16 and 17 for the same purpose.
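For reference, a minimal sketch of what the ESP32 side of this can look like, assuming the standard Arduino-ESP32 core (the baud rate and message handling here are illustrative, not taken from our actual code):

```cpp
#include <HardwareSerial.h>

void setup() {
  Serial.begin(115200);                    // USB serial stays free for debugging
  // UART2 on the ESP32 can be remapped to GPIO 16 (RX2) and GPIO 17 (TX2),
  // leaving the default RX/TX pair available for the speaker.
  Serial2.begin(9600, SERIAL_8N1, 16, 17); // baud, frame format, RX pin, TX pin
}

void loop() {
  if (Serial2.available()) {
    int b = Serial2.read();                // byte sent over from the Uno
    Serial.printf("got %d from the Uno\n", b);
  }
}
```

On the wiring side, the ESP32's TX2 goes to the Uno's RX and vice versa; keep in mind the Uno runs 5V logic while the ESP32 is 3.3V, so a voltage divider or level shifter on the Uno's TX line is advisable.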

An individual problem I had was getting the robot to follow a face. It was a cool feature, and one that I desperately wanted to get working, though not completely core to the project. The problem was that the robot didn't move fast enough to follow a face.

We then tweaked the code so that the bot would turn when being looked at. One of the problems we suspected was that serial output was using up too much time, so we deleted any serial code. We also tried using a switch statement to clean the code up. Sadly I didn't capture any footage of the bot moving when being looked at, as the camera was being used for facial recognition.

switch (BluetoothData) {

  case 49:  // face detected on the left of the screen
    irsend.sendNEC(0x4FB58A7, 32);
    break;

  case 50:
    irsend.sendNEC(0x4FBF20D, 32);
    break;

  case 51:
    irsend.sendNEC(0x4FBC23D, 32);
    break;

  case 52:
    irsend.sendNEC(0x4FB42BD, 32);
    break;
}





As shown in the code, the received data is handled with different cases (four of the five are shown above). The data received from the Arduino phone app comes through the Bluetooth serial as 49, 50, 51, 52 or 53, and I implemented the robot movement codes for each case. In case 49, the face recognition app has detected a face on the left side of the screen, meaning the robot needs to turn left to centre itself on the person's face. For the bot to turn, the ESP32 sends the corresponding IR command to the vacuum, causing it to turn.

In terms of the design of the bot, we decided to custom-paint a cardboard box, to which Anshuman glued the various lighting strips. We finished up with this.


Due to health reasons, we couldn't meet up with Ben until the exhibit day, so Anshuman and I met up again without him. We continued to work on the bot, where we encountered issues with sound. Lo and behold, Ben was our sound engineer. Anshuman and I tried to understand his code and solve the problem, but we were unsuccessful. We later discovered the problem was most likely a power issue: the one Arduino Uno powered all the lights (over 40 LEDs) plus the potentiometer, touch sensor and sound, and it simply couldn't handle the load. We didn't try powering the setup separately at the time, as we didn't think that was the problem.

The Exhibition

On exhibition day, we all came into uni. I brought in my TV so that we could demonstrate channel changes, along with my ESP32 and the vacuum, and Anshuman brought in the bot. We then spent the next few hours trying to finalise everything, but quickly encountered issues with Arduino. As luck would have it, an update had been pushed which affected the loading of the IDE, meaning we couldn't edit our code. We were able to find a fix, which meant deleting all of the IDE's temporary storage, including packages. For Anshuman this wasn't a big problem, as he was using the Arduino Uno; however, as I was using the ESP32 and had previously installed an array of libraries to run the various IR and WiFi code, all of these were deleted and the bug meant I wasn't able to redownload them. All of a sudden our bot had lost half its capability.

Presenting to you the Sassmobile, simulated edition.

As we were unable to fix the power issue, we had Ben run the DFRobot sound separately and on cue, Anshuman was our model man who sat watching the TV and controlling the lighting on the bot, and I moved the robot around using the remote control. Sadly we couldn't get it all together in time, but it has been a unique learning experience. Perhaps with more time, access to 3D printing resources to enable a more solid build, and a better understanding of our problems, we could have built a better final product. I am still proud of the boys. #teambatsqwad it has been an honour.

Here is our team video of the exhibit on the night.

week15 #sassmobile

Week 12 - 13 Build

Ryan O'Shea - Mon 22 June 2020, 9:42 am

Building and Materials

After collecting resources from the workshop room, including the wooden hand, wires to move it, servos and pulleys, I started building the final product. Working off the feedback from the prototype, the goal is to make a sturdier hand whose fingers can move freely and be positioned any way I wish. The wooden hand has stiffer joints which let the fingers stay in almost any position, which should come in handy. With all the wires attached to the fingers, however, it is clear that they will not pull themselves back up the way the elasticity in the cardboard hands did, so wires will also be needed at the back to pull the fingers back up. Additionally, wires will be attached to the wrist and back to the servos to move the hand, and these need space to rotate and move around in order to properly pull the strings.

As you can see below, I made a huge mess with all the materials and building, most of which was done by trying to solve problems through trial and error. Building with odds and ends I found at home, I used a plastic container for the arm, which turned out to be quite sturdy, but I needed to drill holes into the top and sides to attach the servos so they could rotate it. A servo was attached to the wrist, and the hand was pulled off of the arm it was attached to in order to attach the wrist servo to the base of the hand.


Putting it together

The foam base from the prototype worked very well, and the servo was placed inside the plastic container for the next stage, where all the servos were in place. Another box was used as the container for all the wires and the Arduino back end, keeping it all together and out of sight. With the physical components set up, the next stage was to connect all the wires on the hand to the servos, code the Arduino components, then attach it all to the box and battery to make sure the hand stays in place while being powered to move and perform gestures.


With everything in place, the wires were left loose so they wouldn't stretch themselves out of shape before the exhibition while all the other parts were being fitted. Glue and lots of tape were used to keep the hand in place, with all its weight resting on the arm servo, as the hand has to move independently to wave and thus can't rest its weight on the base of the arm at all. The two distance sensors were placed outside the box facing forwards, offering a simple way to see from which direction people approach the concept; the arm then rotates to face the side from which they approach. This was done simply by comparing the two distances and using the one from the closer side, then performing the action relevant to the distance detected.
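The side-picking step described above can be sketched as a small pure function (the names and the struct are hypothetical, just to illustrate the comparison):

```cpp
enum Side { LEFT, RIGHT };

struct Approach {
  Side side;        // which sensor reported the shorter distance
  int distanceCm;   // that sensor's reading, used to pick the action
};

// Compare the two ultrasonic readings and keep only the closer side,
// mirroring the "use the distance of the side which was closer" step.
Approach pickApproach(int leftCm, int rightCm) {
  if (leftCm <= rightCm) {
    return {LEFT, leftCm};
  }
  return {RIGHT, rightCm};
}
```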


All the time and effort paid off: the final product was finished and worked decently well. However, the battery runs out very quickly, and once I attached the hand it turned out to be very difficult for the servos in the wrist and arm to rotate it. I am quite happy with the look of the final product and believe that it works quite well for the intended concept; if it weren't so heavy, it could have been a successful design.

week12 final

Week 13

Marie Thoresen - Mon 22 June 2020, 7:49 am

Server issues

This week the team and I needed to make sure that the server was up and running before the exhibition. Sigurd and Thomas had already made the server work for their prototypes, but it was important that it worked for all of our team members. However, as we tried to send the colour values over the server, it became apparent that something was wrong and it just refused to cooperate with us. After several hours of trying to make it work, we simply had to give up, since other tasks were more pressing.

Later this week, while Tuva was trying to make it work, it suddenly just did. What was initially wrong and how we fixed it remains a bit of a mystery, but thankfully we managed to make it work.

I also learned this week that Sigurd's prototype is broken, so for the exhibition we will only have 3 functioning prototypes. This shouldn't be a problem, but it would have been ideal if everyone's prototype could be displayed at the exhibition.


This week I mostly finished the website's design; the only thing that remains is writing the content, which I find to be the boring part of building a website. However, I must say I am very pleased with the header image I created for the website. I was inspired by the sketching tools for Arduino and created it to look like a breadboard.


In the end, I thought this project was fun, and I learned a lot about the theme and about creating prototypes using Arduino.

week13 exhibition

Week 12

Marie Thoresen - Mon 22 June 2020, 7:40 am
Modified: Mon 22 June 2020, 7:40 am

Additional feature

This week I added the final interaction to the prototype. Based on the feedback from the prototype demonstration, I decided to add a way for users to delete the message instead of sending it, if that is something they wish. During the prototype demo I also got confirmation that the throwing interaction was an appropriate metaphor; based on this, I decided that the delete metaphor should be the opposite, namely dropping the ball instead of throwing it upwards. This, however, turned out to be a little more difficult than anticipated, since it required the installed accelerometer to know the difference between a throw and a drop. By studying the output values registered when performing these various tasks, it became apparent that they behaved differently from each other: a throw slows down at the top before the ball falls down again, while a drop has only a quick stop in the acceleration. In the end, I managed to make it work.
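The distinction can be sketched roughly like this, using the magnitude of the accelerometer readings over the gesture. The threshold and the sample window here are made-up illustration values, not the ones from my actual code:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

enum Gesture { THROW, DROP };

// In free fall the accelerometer reads close to zero. In a throw the hand
// first accelerates the ball upwards (large readings) before the free-fall
// phase begins near the top; in a drop the free-fall phase starts almost
// immediately after release.
Gesture classify(const std::vector<double>& magnitudes) {
  const double nearZero = 1.0;  // m/s^2, illustrative threshold
  for (std::size_t i = 0; i < magnitudes.size(); ++i) {
    if (std::fabs(magnitudes[i]) < nearZero) {
      // Free fall right from the first samples -> the ball was dropped.
      return i < 3 ? DROP : THROW;
    }
  }
  return THROW;  // no free-fall phase seen; default to a throw
}
```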

Because of the time restrictions, I won't be able to conduct user testing to confirm whether this last interaction is the best way to delete the message. A different metaphor might be even better than the one I have chosen, but hopefully it will be easy for users to understand and interact with.

Next week

Since the prototype is almost finished, I will start making the website for the exhibition. I already have some ideas for the design I want to create, and hopefully it will turn out as good as I have imagined. Building a website is just fun in my opinion, so this is something I look forward to.

week12 prototype


Jay Sehmbey - Sun 21 June 2020, 8:27 pm

Reflection (my submission):

After I completed my first prototype, I recorded the different parts of the video using my phone. I asked my flatmate to be a part of it and do the actual demonstration; I thought that would be a better way to present it than demonstrating my own product myself. I also asked the user a few questions about my product.

Overall I think the video was of very good quality, with both good audio and video.

During the studio, we were meant to give detailed critique to 3 other assigned teams. We decided to watch the videos one by one per team, reading each document and giving critique before moving on to the next person's video and document. I think this was a good and quick method: as we all waited for each other to finish writing our critiques, we came up with more detailed ideas and improvements for the projects we were looking at. Something I took from previous courses is that the best way to give feedback is, when pointing out something I didn't find the best, to suggest a better or different way. That way, the person is able to think about other ideas as well.

After we went through all of them, we thought it best to divide up all the individual projects' critiques that we had written and summarise the points. In the end, we each got to summarise 3 to 4 appraisals. This didn't take too much time, as we had all written good, valid points for the appraisals.

week10 #teamappraisal

Project Adjustment for Exhibition

Ryan O'Shea - Sun 14 June 2020, 6:55 pm
Modified: Sun 14 June 2020, 6:55 pm

Gesture Based Research

For the gestures that I want to use in my robotic hand, I first need to make sure that none of them offend people. Therefore, to quickly gather some data, I asked several people in the target market of 18-30 year olds which gestures offended them, and in what context. Many of these gestures were deemed offensive but require more of the human body than just the one hand my robot has; with no body or other hand, many of them are impossible, or if attempted don't look the same and aren't offensive. Of those gestures that can be done, only two were an issue: flipping the bird and using two fingers to represent 'up yours' were deemed offensive by some. However, the offending context was when the gesture was directed at them by others, especially strangers, or done with intent during an argument. After asking whether these gestures done by a disembodied robotic arm would be as offensive, most participants responded no; rather, it would be humorous or novel to see. Nobody in my research group wanted the robot to avoid these basic gestures on the grounds of offence.

Use of Research

Overall, this showed me that it would be okay to use seemingly offensive gestures for my robot, as they don't come from a harmful place, are rather a novel method of interaction, and shouldn't offend anyone in the user group. The only potential issue is with second-hand viewers like children; however, the nature of the concept doesn't mix well with kids, as the moving parts and many wires make the product fragile and susceptible to being broken. Therefore, in my project these gestures will be used to convey aggressive intent by the robot.

Update to Research

It is now week 13, and Lorna has just notified me that using the middle finger emoji in my 'hand signs' branding for my portfolio could be unsuitable, as school kids are attending the online exhibition. While the kids themselves might revel in the included 'vulgar' gestures, their parents would be less than enthused. Rather than risk it, these gestures will be censored and not used in the robot, and simpler emoji notices will be used in the design (from middle finger to thumbs down on the site) in order to reduce the risk of possible offence to parents or guardians of the attending kids.

censorship week13 gestures

[Week 13] - Portfolio and Prototype

Sigurd Soerensen - Mon 8 June 2020, 5:52 pm

At the end of last week, I mentioned some issues I had with my prototype. Sadly, both the accelerometer and the bend sensor are broken, rendering my prototype unusable. I have spent a lot of time on this subject this semester, but most of it has gone into solving technical issues, which in the end have stolen countless hours from building a prototype we could use to gather data. Given that I've dedicated most of my time to PhysComp, I don't really have any more time to spend rebuilding my prototype if I ever wish to finish my master's thesis. It's a sad way to end the semester, having your prototype break down. Luckily I have a couple of pictures, a short video and my test results to show for it; the test results are, in the end, why we made the prototype in the first place.


After Tuesday's stand-up, our team met to discuss how we should work on the team reflection. We settled on a divide-and-conquer method, splitting the parts amongst ourselves. Some of us started writing a draft of the team reflection that same day, before we iterated on the text later in the week. We then proceeded to check whether our combined code base worked after everyone had updated their code. This didn't turn out as we expected: we found some new issues that hadn't been there before, which meant we could no longer receive data from each other. Moreover, the overall experience and functionality were rather janky and unreliable, as the system jumped back and forth between the different states, seemingly at will. Thomas, Tuva and I spent many hours that day trying to figure out what the issue was, but were unable to fix it. The root cause is probably a lack of communication and of planning out the system ahead of time, which likely would not have been an issue if the semester had run as normal and we could meet up and discuss things in person. We did discuss a plan B in case we did not get it up and running again, as Thomas and I have our previous combined codebase, where sending and receiving worked on both ends, and Tuva and Marie have their individual working code. However, if possible we would like to present one working codebase and functional prototype. For the rest of the week, given that Thomas and I had previously spent the most time on the combined codebase, Tuva volunteered to spend some time fixing the issue, as Thomas and I were falling a bit behind on the thesis.


I did spend some time finishing up my portfolio this week. I've been pushing ahead a bit to finish it early, as I need to spend as much time as possible in the coming weeks on my thesis. Most of what I did on the portfolio this week was writing more content and making sure the website was fully responsive and accessible. Working on the portfolio took significantly longer than I expected, and it all feels a bit repetitive, given that we are re-writing the very same things we have already written here in the journals, the proposal and the prototype delivery. Also, building the site took more time than expected, as I'm not used to working without a framework. I could really see the difference in all the heavy lifting a framework does for you and the time it saves, but then again, the content is what took the most time.

week13 portfolio prototype

Week 10-11, Building on Up

Ryan O'Shea - Sun 7 June 2020, 4:52 pm

After Critiques

After all the responses from the critiques on the Miro board, and lots of good feedback and inspiration from seeing everyone else's designs, I went into the studio on Friday and picked up the new and improved wooden hand that I want to use in the final version of this concept. In the workshop I also wired the strings through the finger joints, in the hope that this will better pull them down and back up from behind, operating just like the tendons in our real meat hands.

Along with the hand, I got stronger and larger servo motors that should rotate a full 360 degrees this time and properly pull the strings so the fingers move better than in the prototype. These came with disks on top, through which I will hopefully be able to run the threads on tracks in order to pull the strings and move the fingers a larger amount.


Sadly this was all the physical work I could do this week, as I was otherwise very busy finalising work for my other INFS subjects. Next week I hope to complete more of this final build: making the hand look better with some other design work, hooking all the strings and wires up to the servos, and building a sturdy base for the whole build so it is secure and can be fitted on a desk in an easy casing.

week10 iteration finaldesign

[Week 12] - Building the Second Prototype

Sigurd Soerensen - Mon 1 June 2020, 12:27 pm
Modified: Mon 1 June 2020, 6:39 pm

I spent most of my time last week working on the next prototype and my annotated portfolio.

Studio & Workshop

In the studio, we had our regular stand-up, with this week's focus on having a one-line pitch for our concept, showing what we have been working on, our priorities for finishing the project for the exhibition, and questions regarding the portfolio. Although we have slight variations on the pitch and thoughts on the ideal product, given that we are still exploring different aspects, our current one-liner is "E-mories, a distraction-free physical platform to remotely share personal emotions with close friends and family". As for my progress, I showed the state of the ball, which at that point was the new ball with a bend sensor attached to the inside with silicone. As for priorities in finishing the project, we had already fixed it so that all devices could communicate over the server, so we mostly just had to keep making sure our individual prototypes work and conduct user testing for the last prototype.



As for the prototype, I found a nice transparent ball at K-mart which I could use. The ball had a nice pattern which I believed could reflect the colours in a neat way, and it also contained some glitter water. At first I didn't think much of the glitter water, as I mostly wanted the ball itself. However, looking back at the additional features suggested in our proposal, one of them was to add water to the E-mories device. Given that I am building this prototype to test materials and how to make the device more of a personal artefact, I decided to test whether the glitter water could make for a unique look and feel and whether it made the device feel more personal.


I drained the water from the ball and started placing the various Arduino components and sensors inside, making sure they were all fully covered in silicone so no water could touch the electronics. I covered everything I could in clear plastic wrap and black tape before covering it in silicone, for extra protection. I let the silicone dry before carefully putting some water inside to see if it still leaked. Three times I had to add more silicone to stop the ball from leaking, which was strange, as by the end I had covered the entire bottom half of the ball with silicone.


When I had finally made sure it did not leak, I tested whether everything still worked, which it did, although the accelerometer has seemed quite unreliable ever since I first tested it, as it seems to mix up angles. The working parts can be seen in the images and video below. At this stage, everything from recording to picking a colour, sending data and getting an incoming message notification worked as intended. However, when I picked the prototype up the next day, the bend sensor values were all over the place, which made nothing work. I inspected the ball for water leakage, but there was none. I knew from when I received the bend sensor that its connection was somewhat loose, which I had taped earlier to avoid these issues and later covered in silicone to hold in place. Despite this, I seem to have some issues with the bend sensor. Having tried to fix the issue for a couple of hours, I decided to drain the ball of water in case that had any effect, so I'm going to let it dry off before trying again. If that fails, I might have to simulate the squeeze interaction, as I would have to pull everything apart to access and fix the bend sensor at this point, which would basically mean purchasing a new ball and starting over from scratch.


Web Portfolio

As for the rest of the week, not counting the time I spent working on my thesis and getting ready for the prototype demonstration there, I worked on the portfolio. My current progress can be found here: portfolio

Most of the time I've spent on the portfolio has gone into rewriting and condensing what we have already written in the proposal, journal and first prototype delivery. Re-writing this content feels rather repetitive and, as a result, my motivation has taken a hard hit. I'm still struggling with motivation in both courses, as there is a lot of repetitive work and every day feels the same, not having had a social life for the entire semester. Still, I believe I'm on track for the exhibit and portfolio in PhysComp, while I still need to catch up on the thesis, as PhysComp requires most of my time each week.

week12 building prototype portfolio

Reflection (Week 12)

Shao Tan - Sun 31 May 2020, 11:58 pm
Modified: Sat 20 June 2020, 5:42 am

Work Done


I have been working on implementing the ultrasonic sensor and the voice recognition, and have had difficulty making them work together without confusing the Spud program. As the ultrasonic sensor and the voice recognition software are always on, I have to give one of them priority when choosing an action; otherwise, as I'm experiencing now, when Spud detects someone nearby and hears a voice command at the same time, it performs the actions one after another without stopping. I have therefore decided to give voice commands priority. I also have to think of a way to loop the angry expression and the stop motion, where Spud shakes its head, for as long as someone is detected at that distance. Otherwise it would be weird if Spud just went back to a neutral position after a while, even with the person still standing there not taking the hint, when Spud is supposed to be trying to make that person go away.
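The arbitration I settled on can be sketched as a tiny pure function (the names are illustrative, not from the actual Spud code):

```cpp
enum Action { NONE, VOICE_ACTION, PROXIMITY_ACTION };

// Both inputs are always on, so when a voice command and a proximity
// event arrive in the same loop cycle, the voice command wins and the
// proximity reaction is skipped for that cycle.
Action chooseAction(bool heardVoiceCommand, bool personTooClose) {
  if (heardVoiceCommand) return VOICE_ACTION;     // voice has priority
  if (personTooClose)    return PROXIMITY_ACTION;
  return NONE;
}
```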

In the meantime, I tested Spud's interactions with participants. I tested whether the participants understood the meaning of all the different movements and how they felt about each of them. A few commented that the movement was too slow and that the dancing trick made Spud look angry instead of playful. In response, I will increase the speed and try to make Spud look a bit more silly instead of angry.


I first planned what my portfolio will look like, and have now completed the overall frame of the website and chosen its colour scheme. I also drew some illustrations and icons of Spud so I can add them to the website.


I will complete all of Spud's functions and features as soon as possible and conduct a final user testing observation and interview, while adding details and information to the website.

week12 #spud

[Week 11] - Working on the next Prototype

Sigurd Soerensen - Tue 26 May 2020, 6:53 pm


On Monday we had a meeting to discuss the feedback we had received and our path forward. Both as a team and for my individual project, we received some useful data; however, some of what was mentioned had already been answered in the video and document and provided little value. Most of what we received was helpful, though, and correlates with the data gathered from user testing and interviews.

As for my own project, the feedback and user testing data suggested that I should look into the material and into how the device could become a more personal artefact. Other than this, most of the feedback I received only requires minor fixes in the codebase, such as not having to hold the ball after squeezing until the audio is done playing, and smoothing out the quick flash at the end of the notification cycle for a more pleasant experience.

During the meeting we decided to focus on putting all our code together into one codebase, to better showcase our concept at the tradeshow. We also set up another meeting for Friday to start merging the codebase. We chose to focus on merging our code before continuing with other features on our individual projects, since more code would mean more refactoring. Given that all of us had to focus on our theses for the coming days, this didn't cause any issues.


On Tuesday and Thursday we had our regular stand-ups. I liked that we were all going to say one positive thing, given that a lot of stress, with covid on top, quickly makes for a negative pattern. All week up to Friday, except for Monday's meeting and classes, I had to work on the conference paper for my master's thesis, as I had mostly been focusing on PhysComp and had the paper due on Thursday.

Friday's Meeting

On Friday our group met at uni to start merging our code. Whereas Thomas and I had an easy time merging ours, Tuva and Marie had to start from scratch using a new library for their MPU6050s. Given that we had an easier time putting our code together, we put in place a couple of functions so that Marie and Tuva could easily merge their code with ours without having to read through and understand it all.


During the weekend, inspired by Thomas' solution of casting a ball from silicone, I chose to try the same, only exploring a different shape. I went to Indooroopilly to purchase some clear silicone and then headed back home to make a mould for my shape. I decided to try a cube, as it is easier to make than most other shapes, and Thomas and I would then be able to test two different variations to see which one felt better. My thought was also that using different shapes could make the artefact more personal, as people could pick their own shapes, or a pair of E-mories devices could share the same shape to distinguish them from others. However, after two attempts, one with only a small amount of corn starch to retain some translucency and another with a lot of corn starch, it still would not dry, so I ended up scrapping the attempt to make my own silicone cube. My plan B would have to wait until Monday, as I had previously seen some clear balls lying around at K-Mart in Toowong that I could work with.


week11 prototype codemerge

Reflection (Week 11)

Shao Tan - Sun 24 May 2020, 9:56 pm
Modified: Sat 20 June 2020, 5:21 am

Work Done

This week I started looking at ways to implement the ultrasonic sensor and the microphone into Spud.

I started watching videos on how to use ultrasonic sensors and tried it myself; it was quite straightforward and easy to work with. For voice recognition, I found a way to use the laptop's microphone instead of the Arduino microphone module, using C# (they used the Visual Studio IDE), from this website tutorial here. Hopefully it won't be too hard to implement these in Spud, as it might be tricky to do with two different modes, the alert and friendly modes, while having to send information from the Visual Studio IDE to the Arduino.

I also did user testing to determine how close a person walking towards the user should get before Spud reacts. Results:

  • >1.3m away from user = normal
  • <1.3m away from user = Spud turns angry as a warning
  • <0.8m away from user = Spud shakes its head/ waves to the person.
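As a sketch, these thresholds map to reactions like this (the enum names are my own illustration, not the actual Spud code):

```cpp
enum Reaction { NORMAL, ANGRY_WARNING, SHAKE_HEAD };

// Distances are in metres, matching the test results above. Exactly
// 1.3 m is treated as "normal" here; the boundary choice is arbitrary.
Reaction reactionFor(double distanceM) {
  if (distanceM < 0.8) return SHAKE_HEAD;     // very close: shake head / wave
  if (distanceM < 1.3) return ANGRY_WARNING;  // closing in: angry warning
  return NORMAL;                              // far enough away
}
```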

Work to be done

For Spud, I have to start implementing the ultrasonic sensor and the voice recognition as fast as possible. At the same time, I also have to work on my website, as that might take a long time to make nice and presentable. I'll first build the skeleton of the website and set up the CSS and JavaScript code. Then I will write content about my work with Spud and decide how to display it without making it seem like just a document.

week11 #spud

Week 11

Marie Thoresen - Sun 24 May 2020, 6:12 pm

Team meeting

This week my team and I started with a meeting to go through the feedback we had gotten and discuss the next phase of our development. The feedback we received was mainly positive and gave us confidence that our concept, and how we displayed it in the video, was good and that people found it exciting.

Every one of us had asked questions in our videos, in addition to a series of team questions about the overall concept. For my prototype, I got confirmation that the throwing metaphor was appropriate for its functionality. In addition, many expressed that a way for users to replay and/or delete a message was something we should include, so this will be taken into further consideration.

At the team meeting we decided to have a physical meeting on Friday to see if we could put the prototypes together as one.

Friday Meeting

The team concluded that we would try to assemble all of the prototypes into one so that each and every ball could perform the entire interaction flow. While Thomas and Sigurd managed to put their prototypes together fairly easily, Tuva and I ran into some additional issues. Firstly, we hadn't used the same libraries in our prototypes, so we had to decide which one to use. Tuva tried to install the one I used, but for some reason this didn't work, so together with a tutor we decided to scrap it altogether. We found a new library that was available to both of us and decided to use that. Since most of my code had been based on the old library, I had to rewrite most of it. It took me a while, but I finally made it work. Secondly, I was tasked with merging my and Tuva's code. However, as we expected, the accelerometer couldn't distinguish between a shake and a throw, so we decided to add a squeeze in between to "lock" the colour and start the throwing state of the ball. This worked perfectly, and the interaction is now as follows:


The inside of the ball currently looks like a hot mess, but everything works perfectly. However, because of the new pressure sensor, it was even more difficult to make the schematic look decent. Hopefully people can make something out of it.

Imgur Imgur
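The squeeze step we added to separate shaking from throwing can be sketched as a small state machine. This is my own sketch in plain C++, not our actual merged code; the state names and threshold values are hypothetical and would need tuning on the device:

```cpp
// Interaction states for the ball: shake to mix a colour, squeeze to lock
// it, then throw upwards to send. The squeeze in between is what lets the
// accelerometer-based throw detection ignore ordinary shaking.
enum BallState { CHOOSING_COLOUR, COLOUR_LOCKED, MESSAGE_SENT };

// Hypothetical thresholds; real values would be tuned on the device.
const int   SQUEEZE_THRESHOLD = 600;    // raw pressure-sensor reading
const float THROW_THRESHOLD   = 25.0f;  // upward acceleration, m/s^2

// Advance the state machine one step from the latest sensor readings.
BallState update(BallState state, int pressure, float accelUp) {
    switch (state) {
        case CHOOSING_COLOUR:
            // Shaking alone never sends; only a squeeze locks the colour.
            if (pressure > SQUEEZE_THRESHOLD) return COLOUR_LOCKED;
            break;
        case COLOUR_LOCKED:
            // After locking, a sharp upward acceleration counts as a throw.
            if (accelUp > THROW_THRESHOLD) return MESSAGE_SENT;
            break;
        case MESSAGE_SENT:
            break;  // terminal until the ball is reset
    }
    return state;
}
```

Because a throw is only checked in the COLOUR_LOCKED state, the accelerometer never has to tell a shake and a throw apart on its own.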

Going forward

Based on the previous user testing, the team also discussed some additional features that we could add or try separately. For my part, most of my users wanted a way to delete the message instead of sending it. I was thinking of adding some code to register when a user drops the ball instead of throwing it upwards; this could be a good metaphor for deleting and resetting the ball. This is, however, something I will have to look into more closely at a later point.
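One way the drop could be detected, which I would need to verify on the actual hardware, is that an accelerometer in free fall reads close to zero total acceleration, while a ball at rest or being thrown reads around gravity or more. A hedged sketch of that idea in plain C++ (the function name and the threshold are my own guesses):

```cpp
#include <cmath>

// In free fall, the total acceleration measured by the accelerometer drops
// close to zero, whereas at rest it reads around gravity (~9.8 m/s^2) and a
// throw begins with an even stronger push. That difference could separate
// "drop to delete" from "throw to send". The 2.0 m/s^2 threshold is a
// guess and would need tuning on the device.
bool isDropped(float ax, float ay, float az) {
    float magnitude = std::sqrt(ax * ax + ay * ay + az * az);
    return magnitude < 2.0f;
}
```

A real implementation would likely also require the low reading to persist for a few samples, so a brief sensor glitch isn't mistaken for a drop.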


[Week 10] - Prototype Delivery & Feedback

Sigurd Soerensen - Mon 18 May 2020, 3:15 pm

Before Submission

By Tuesday, we had all delivered our prototype documents and uploaded the videos. I did most of my video and document last week but made some last changes on Monday. Given that our team chose to build separate parts of the same concept, as explained in previous posts, we also found it most useful to create a team-based video and include that in our individual videos. Looking at the finished video, I believe it turned out quite well and that we did a good job on the team-based video. Everything revolving around the video and document took much longer than I expected, so for the last couple of weeks I have focused more on PhysComp than my thesis.

Tuesday and out

From Tuesday onwards, we focused solely on writing our feedback to the other teams, responding to the few questions we got, and starting to look over the feedback we received. After meeting up for the studio on Tuesday, we all gathered and started to write our feedback. First, we tried having one person share their screen so we could watch the videos together and play and pause along the way to comment. However, we quickly found this ineffective and instead watched the videos on our own, then met up between videos to read the documents, discuss them and come up with rough bullet points of feedback. For the first group, we wrote summaries of our bullet points before moving on to the next person's video. For the second team, we just wrote down bullet points from the video and documents, discussed them to come up with more, and then moved on. Before starting on the third group, some members of the team wanted a break and some wanted to go on. After discussing for a bit, we came up with an asynchronous solution where two of us, Thomas and I, continued to the last group straight away, and the other two would come back and do their review of the last group later. Moreover, Thomas and I were to summarise the comments for the second group, as everyone had already written their comments for them, and then Tuva and Marie could summarise the third group when they reviewed it later that day. In my opinion, this solution worked much better and was more effective. We decided to go through all the summaries after Thursday's workshop, before commenting on Miro.

For the rest of the week, I had to focus on my master thesis given that I had focused on PhysComp for a long time and had to pick up the slack on the thesis. Besides the feedback, this week was quite uneventful.

week10 prototypedelivery feedback

[Week1 - Post 3] - Card Ideation

Sigurd Soerensen - Mon 9 March 2020, 7:23 pm
Modified: Tue 31 March 2020, 3:35 pm

The card ideation method was a new and exciting approach to me. Our table came up with some fun ideas, although most of them were toilet-related due to the cards we drew.

However, I feel our creativity was limited by having to stick with the first cards until they had been exhausted. This issue became especially apparent when we were supposed to do the same task individually, where most of us had one or two weird words that didn't make sense in that context, or cards with words that didn't make much sense at all. Even though I am fond of ideation methods, I've noticed that in most subjects thus far, we have not used any of the ideas after we came up with them. I believe this is for the simple reason that they are either not captivating enough as a concept or not feasible, making the ideation process somewhat wasteful. I'm hoping that this will not be the case for this subject.

In the end, we did decide to present the idea of a mechanical plant that would die as you are wasteful with water in your home; a simple but intriguing concept. This concept came to be when we ideated on the sentence "Design for enlighten in a bathroom using ridged with quality of detached". The idea is enlightening in how it visually represents your water consumption with a familiar, easy-to-grasp metaphor, that of a plant dying, which equals bad. Although the plant isn't required to be in a bathroom, this is where the idea originated. The mechanical aspect of the plant came from the ridged requirement, and the plant itself can be picked up and moved around, hence fulfilling the detached requirement. We did continue to ideate on the concept with a new card reading semi-trailer instead of bathroom, as we felt like ideating on something other than bathroom ideas, but eventually, after voting, we ended up presenting the original idea to the class.

All images can be viewed here if not shown below.

Imgur Imgur imgur

week1 cardideation

[Week1 - Post 2] - Seven Grand Challenges

Sigurd Soerensen - Mon 9 March 2020, 6:46 pm
Modified: Tue 31 March 2020, 3:35 pm

The first week started with running through the seven grand challenges as presented in the International Journal of Human-Computer Interaction. I was tasked with reading the seventh challenge, on social organization and democracy, of which you can find my highlighted sections here.

As for the in-class discussion afterwards, I took the role of facilitating the conversation and trying to involve everyone at the table, while another person in the team took notes as we discussed. Facilitating the discussion proved difficult, as most people at the table sat silently by, not interested in giving their point of view on the matter. Hence, we ended up with three to four people keeping the conversation going. After discussing for a while, we chose one team member to present our discussion. The gist of what we discussed was to focus on a broader scope when creating technologies in the future, one that encompasses Glocal (global + local) thinking and how a product can affect people other than the user. Moreover, we looked at how technology can manage resources and its impact; how technology should be more inclusive, with active co-design with users; and how technology can give an equal voice to everybody, with cryptocurrencies as an example.

Although I understand that for the discussion it was helpful to place us in groups with those who had read the same chapter, I did not feel like I learned anything from listening to the presentations of the other groups. I also had a hard time hearing what was presented; without reading the other chapters, I would not even know what they were about from the presentations held. Moreover, I don't understand why we had this exercise in the first place, as we didn't use the challenges when generating ideas the next day.

week1 sevengrandchallenges


Kuan Liu - Mon 2 March 2020, 11:36 pm

A week has passed, and we got clarification about how this course will move forward and what will be expected of us. Though a new concern arose: with everyone working together towards the end of the course, how will the tools and spaces be managed? On the other hand, my curiosity increased while I was working on my poster. Thinking beyond current technology was fun and exciting.

That said, time management is key, something that has come up in all past projects. I don't like doing anything at the last minute. Sometimes the brainstorming and ideation stage took too much time or was rushed; as a result, the product we ended up with was not exciting or fun. Since we have more time in this class, I hope this will change despite the time constraints.

HCI: The Seven Grand Challenges

Our group was assigned Human-environment interactions. We started by breaking the chapter down and discussing it in small sections. Since we had a large group of people, we split into two groups. One of the classmates took notes while the rest of us shared what we thought and agreed on. Below are the notes from our table, and an image of the two groups joining at the end to share what we found.

Imgur Imgur

My ideas and thoughts:

In this section, one of the areas I was worried about (maybe a little exaggerated) was how our world will move forward with AR/VR technology. Will users be able to find the balance between the virtual world and the physical real world? For example, HoloLens helps users move toward real virtuality; however, users might become addicted to virtual worlds and over-reliant on virtual agents. It reminds me of a Korean TV show called Memories of the Alhambra that I saw over the break; it's about a VR video game in an AR setting. The physical reality is the game environment itself. It's similar to Pokemon GO, but users play the game once they put on the gaming contact lenses, merging playing and viewing the game in the real physical world. One odd thing: to people who don't play the game, a player can look like they have a mental problem, since once users start playing, they make big gestures in the air or jump around in open space. Without revealing too much of the drama for people who are interested in watching it, I will end by saying that the main character, in the end, doesn't need the contact lenses to play the game because he himself is already in the game. Nevertheless, people thought he had some severe mental problems, but only he knows what happened to him, and only the people who played the game would understand.

Maybe in the future we won't need lenses at all to view VR/AR. Miniaturisation is always an aim for designers and technology inventors; it's what makes a so-called sleek design.

One other thing I take away from our group discussion is the idea of going beyond the senses used in most current designs: smell and taste have not yet been developed. I am interested in how future designs will accomplish things in these areas. Since it's going to be a new challenge to pave the way towards involving, evolving, and evaluating methodologies and technologies, I would love to be part of it.

Ideation activity in the class

We had an exciting ideation activity in the second part of our class. I had never used this method before; we used poker cards as a design thinking tool. At the beginning of the group work, it was a bit confusing since none of us had played it before. Everyone was trying to figure out how to play, and we got a tough sentence that everyone in the group had a hard time rephrasing. I tried to open up the discussion so that everyone could share what they thought, but the conversation didn't get going. In the end, with the tutor's help, we started to break down each word to generate more ideas.


Next, we worked on some concepts alone. I don't know why, but somehow I got stuck with the words I had and couldn't think outside the box. I struggled a bit, thinking I could only write all the words into one sentence. I don't know why I thought that; it limited me from thinking beyond that "one sentence." Here are some of my notes.

Imgur Imgur

#week1 #reflection

Week 1 - Class Work

Ryan O'Shea - Mon 2 March 2020, 4:46 pm
Modified: Mon 2 March 2020, 4:46 pm

In the first week of classes we did some idea-generating exercises, the first being based on the seven challenges of Human Computer Interaction. The group I was with was assigned environment-based challenges, which cover the problems posed by individuals interacting with the increasingly intelligent environments we live in today. We did a number of exercises going through our individual interpretations of the article

Imgur

and then summarised the points we found into a more concise list of issues that should be thought about and addressed in future design.

Imgur

The second brainstorming activity involved using playing cards and randomly generated words to come up with interesting contextual scenarios in which an idea space was created, seen in the picture below with the cards and word key used to generate the sentences.

Imgur

This was rather enjoyable with the group at my table, as we had some good ideas along with many silly ones which still worked. Seen below are some of my interpretations of the sentence: "Design for Disconnecting in a cemetery using hopping and the argumentative quality".

Imgur

These exercises were quite useful in generating interesting ideas and concepts which will be needed for our presentations next week.

week1 ideas

Week 1 - Introduction

Jason Yang - Sun 1 March 2020, 8:22 pm
Modified: Sun 1 March 2020, 8:26 pm

About Me

Hi! My name is Jason Yang and I'm studying a Bachelor of Commerce and I.T. Innovation has been a huge passion of mine, especially after working for SAP in their Innovation Network Centre team. I am very passionate about innovation, graphic design and collaborating as a team to deliver the best results.

What I Hope to Learn/Achieve in PhysComp

Since this course has been designed for students in their final year, I would love to design and showcase, at the end of the semester, a product which we can all be proud to show potential employers.

I would particularly love to grow my experience in ideation, as this stage is usually the most sophisticated, given the various factors involved.

Furthermore, it is great that this course provides us with an opportunity to build a physical item with no material limitations per se.

I endeavour to achieve a high grade in this course through hard work, and to collectively develop an innovative masterpiece which I can proudly showcase in my portfolio at the end of the semester.

Looking forward to a solid semester and working with everyone in the studio! Let's make great happen!

#intro #week1


Sulaiman Ma - Sun 1 March 2020, 8:22 pm

About Me

Hello everyone! My name is Sulaiman Ma. I am a postgraduate ID student who is passionate about designing cool stuff relating to music and games. I am an imaginative person, which also makes me a good idea producer when brainstorming. I hope to have a good time and see some really cool stuff this semester. Good luck everyone!😊


As it is a 4-unit course, I expected we would learn more from it, such as programming for Arduino and some basic knowledge of using machines to produce medium-quality products. After attending the UQ Innovate induction, I feel stressed about putting my design into practice because there are so many machines and so many rules to memorise. I hope the tutors can give us some help when we produce our designs.

Finally, I hope everybody has a good time and gets an ideal final outcome!💪