Documentation & Reflection

The Last Weeks

Timothy Harper - Mon 22 June 2020, 5:47 pm

I missed a few of the recent weekly journals, so I'm catching up in this longer post.

In the lead-up to the final exhibition, we managed as a team to meet up on campus a few times. This was excellent, as we hadn't been able to previously, and since each of us was working on the same Sassmobile, it was imperative that we caught up.

As we quickly discovered, the project was more complex than expected, and bringing together all of the different parts of the bot would be tricky.


The first problem we encountered was connecting the ESP32 to the Arduino Uno. Both systems need to talk to each other for the bot to function, which proved a challenge. The natural way to do this would be to connect the RX/TX pins of the ESP32 to those of the Arduino Uno; sadly, they were already in use by the speaker. Luckily we could use pins 16 and 17 (the ESP32's second hardware serial port) for the same purpose.
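The link itself just carries raw bytes between the two boards. Below is a minimal sketch of how a command byte could be framed so the receiving board can spot corruption; the start byte, checksum scheme and function names are assumptions for illustration, not what we actually ran.

```cpp
#include <cassert>
#include <cstdint>
#include <optional>

// Illustrative frame layout: start byte 0xAA, command byte, checksum.
struct Frame { uint8_t bytes[3]; };

Frame encodeFrame(uint8_t cmd) {
    return Frame{{0xAA, cmd, static_cast<uint8_t>(cmd ^ 0xAA)}};
}

// Returns the command if the frame is intact, or nothing if it is corrupt.
std::optional<uint8_t> decodeFrame(const Frame& f) {
    if (f.bytes[0] != 0xAA) return std::nullopt;
    if (static_cast<uint8_t>(f.bytes[1] ^ 0xAA) != f.bytes[2]) return std::nullopt;
    return f.bytes[1];
}
```

On the ESP32 side, the three bytes would go out over the second serial port (the one on pins 16 and 17), and the Uno would read and validate them on its end.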

An individual problem I had was getting the robot to follow a face. It was a cool feature, one that I desperately wanted to get working, although not completely core to the project. The problem was that the robot didn't move fast enough to follow a face.

We then tweaked the code so that the bot would turn when being looked at. One problem we suspected was that serial printing was using up too much time, so we deleted any serial code. We also tried using a switch statement to clean the code up. Sadly I didn't capture any footage of the bot moving while being looked at, as the camera was being used for facial recognition.

switch (BluetoothData) {

  case 49:
    irsend.sendNEC(0x4FB58A7, 32);
    break;

  case 50:
    irsend.sendNEC(0x4FBF20D, 32);
    break;

  case 51:
    irsend.sendNEC(0x4FBC23D, 32);
    break;

  case 52:
    irsend.sendNEC(0x4FB42BD, 32);
    break;
}





As shown in the code, there is a case for each value the phone app can send. The face recognition app transmits 49, 50, 51, 52 or 53 through the Bluetooth serial, and I implemented the robot movement codes for each case. In case 49, the app has detected a face on the left side of the screen, meaning the robot needs to turn left in order to centre itself on the face. To make the bot turn, the ESP32 sends the IR code for that command to the vacuum, causing it to turn.
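The same dispatch can be written as a lookup that returns the NEC code to transmit; on the ESP32 the result would then be passed to irsend.sendNEC(code, 32). This is just a sketch, and necCodeFor is a hypothetical helper name; the fifth code (for 53) wasn't captured in the journal, so it falls through to 0 here.

```cpp
#include <cassert>
#include <cstdint>

// Map the byte received from the face-tracking app ('1'..'5' arrive as
// 49..53) to the NEC code forwarded to the vacuum. 0 means "nothing to send".
uint32_t necCodeFor(uint8_t bluetoothData) {
    switch (bluetoothData) {
        case 49: return 0x4FB58A7;  // face on the left: turn left
        case 50: return 0x4FBF20D;
        case 51: return 0x4FBC23D;
        case 52: return 0x4FB42BD;
        default: return 0;          // unrecorded or unknown byte
    }
}
```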

In terms of the design of the bot, we decided to custom paint a cardboard box, to which Anshuman glued the various lighting strips. We finished up with this.


Due to health reasons, we couldn't meet up with Ben until the exhibit day, so Anshuman and I met up again without him. We continued to work on the bot, until we encountered issues with sound. Lo and behold, Ben was our sound engineer. Anshuman and I tried to understand his code and solve the problem, but we were unsuccessful. We later discovered the problem was most likely power related: a single Arduino Uno was powering all the lights (over 40 LEDs) plus the potentiometer, touch sensor and sound, and it simply couldn't handle the load. We never got around to powering the setup separately, as at the time we didn't think this was the problem.
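A quick back-of-envelope calculation makes the power theory plausible. The figures below are assumptions for illustration, not measurements we took.

```cpp
#include <cassert>

// Rough current budget for the bot: ~20 mA per LED (assumed average), plus
// rough figures for sensors and the speaker. The Uno's 5 V rail can supply
// only on the order of 500 mA over USB, so the total comfortably exceeds it.
int totalDraw_mA(int leds) {
    const int mA_per_led = 20;   // assumed average per LED
    const int sensors_mA = 50;   // potentiometer, touch sensor, misc (assumed)
    const int audio_mA = 100;    // speaker peak draw (assumed)
    return leds * mA_per_led + sensors_mA + audio_mA;
}
```

With 40 LEDs that estimate lands around 950 mA, well above what the Uno alone can deliver, which would explain the sound dropping out first under load.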

The Exhibition

On exhibition day, we all came into uni. I brought in my TV so that we could demonstrate channel changes, along with my ESP32 and the vacuum, and Anshuman brought in the bot. We then spent the next few hours trying to finalise everything, but quickly encountered issues with Arduino. As luck would have it, an update had been pushed which broke the loading of the IDE, meaning we couldn't edit our code. We found a fix, which meant deleting all of the IDE's temporary storage, including packages. For Anshuman this wasn't a big problem, as he was using the Arduino Uno; however, as I was using the ESP32 and had previously installed an array of libraries to run the various IR and WiFi code, all of these were deleted and the bug meant I wasn't able to redownload them. All of a sudden our bot had lost half its capability.

Presenting to you the Sassmobile, simulated edition.

As we were unable to fix the power issue, we had Ben run the DFRobot sound separately and on cue, Anshuman was our model man who sat watching the TV and controlling the lighting on the bot, and I moved the robot around using the remote control. Sadly we couldn't bring it all together in time, but it has been a unique learning experience. Perhaps with more time, access to 3D printing resources to enable a more solid build, and a better understanding of our problems, we could have built a better final product. I am still proud of the boys. #teambatsqwad it has been an honour.

Here is our team video of the exhibit on the night.

#week15 #sassmobile

Week 11 Update

Timothy Harper - Fri 22 May 2020, 12:04 pm
Modified: Fri 22 May 2020, 12:04 pm

We have spent the past week implementing changes from the feedback received from Miro.

For me, this meant looking into using a phone to track faces and move the robot. I found this quite challenging, as most existing work involved a stationary base for the robot to sit upon, with the robot simply moving its head around with the help of a couple of servo motors.

In my case, however, I want to use the robot vacuum to move the whole robot around. An application is installed on the phone (Android) for face detection; it draws a green line around a face when one is detected.

Here is a link to the tutorial:


Here you can see a clear green line around my face. In the top corner are the co-ordinates of where my face is. These can be transmitted via Bluetooth to the ESP32 (the tutorial uses an HC-05), which records them and processes them into movements for the bot.
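A sketch of the phone-side logic: reduce the face's horizontal centre (in pixels) to a single byte for the Bluetooth link. The zone boundaries, byte values and function name here are assumptions for illustration, not the app's actual code.

```cpp
#include <cassert>
#include <cstdint>

// Split the frame into thirds and report which third the face centre is in.
uint8_t zoneByteForFace(int faceX, int frameWidth) {
    const int third = frameWidth / 3;
    if (faceX < third) return '1';      // face on the left: bot turns left
    if (faceX >= 2 * third) return '2'; // face on the right: bot turns right
    return '3';                         // roughly centred: no turn needed
}
```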

I'm still working on the movements, as the servo movements from the tutorial need to be replaced with the corresponding IR signals.

Week 10

Timothy Harper - Fri 15 May 2020, 4:59 pm

This week we submitted the video and prototype documents onto Miro. It was great seeing everyone else's work coming along, and good to be able to encourage them with the appraisals.

Making the video was a good chance to reacquaint myself with Premiere. I had forgotten most of the basics, so it took some work to bring together video footage, sound from my phone and stills.

The biggest challenge ahead is combining all of our work, as everyone in the team is working on the same project. A lot of the interactions are interlinked, so we may have to meet up on campus and work through the coding and physical build together.

I plan on doing some user testing with the bot to get more insight into how people react to it.

I am also looking into the possibility of using my phone camera as a way to track faces to improve movement of the robot.

Week 9

Timothy Harper - Fri 8 May 2020, 10:24 am

This week I have been focussed on getting my first prototype all wrapped up. My plan is to build a little hub on top of a robotic vacuum which can store all of the electronics and sensors without looking too shabby. I have managed to do this using a square piece of foam with a cylindrical cutout.

This design implementation only suits my functionality however, as the other designers in my group have different roles and thus different designs. Hopefully for the final we can all come together with a single design and all functionality.

This week we are also working on preparing videos and reports for the first prototype. I have begun a script for the video.

As the video is set to go for ten minutes we can put in lots of information.

To start the video I plan to have a trailer of sorts, showcasing all the parts of the robot with combined footage from all the teammates. It should run for about 30 seconds to 1 minute, getting the viewer thinking about the space in which our robot lies.

Then, switching to me, I will go through the end goal of the prototype: reducing users' screen time. I will go through some research on how much time people typically spend on their devices each day, and how sometimes this can't be avoided. However, for many of us stuck at home, we are losing major productivity. Hopefully this section runs from 1 to 2 minutes.

The section following will look at how I imagine people engaging with the product. I have a test subject in my house who will give live feedback on how they react, with further explanation where they don't understand. In essence, the user sits either watching TV or on their phone and begins getting harassed by the robot. For my section, I can show the robot progressively getting more and more sassy if they continue to sit there. For example, it might begin by turning the volume down and then up, then change the channel, and finally turn the TV off completely.

Following this section I will go through what features aren't implemented yet, and outline what these are for the next deliverable.

Week 8 Update

Timothy Harper - Fri 1 May 2020, 12:06 pm
Modified: Wed 20 May 2020, 12:14 pm

our concept

The team concept is a robot, sent back from the future, to warn you of the dangers of too much screen time and to try to reduce it.

The robot is sneaky and will try to get its way no matter what. It uses sass and responds differently depending on the situation. It may try to distract you if you're using your screen too much. This could include making noise and running around in circles, or changing the screen you're watching. Ultimately it is successful when it gets you up and moving.

We're still working on the form of the robot; at the moment it sits on top of a robotic vacuum. It is set to have the ability to speak, move around, and also 'hack' into your TV, phone or computer.

individual focus

My focus for the assessment is on the technical side. I have been looking into IR receivers and emitters, as well as using bluetooth and wifi technologies found in the ESP32.


The ESP32 can do a lot more than the little Arduino Uno: everything the Uno does and more, most importantly built-in WiFi and Bluetooth.


Here is an example setup from a tutorial I did. The setup process was quite a long one. I encountered a few problems, such as using a charging micro-USB cable instead of a data transfer one. (The difference is that a charging cable has only two internal wires, positive and negative, whereas a data transfer cable has four: positive, negative, data transmit and data receive. Externally, however, there is no clear way to tell them apart.)

This setup had multiple parts to it. First, a red LED, used to test the WiFi functionality. The ESP32 can create its own WiFi network which you can join, or it can connect to an existing network. By serving a simple page with two buttons, one for high and one for low, you can turn the light on or off wirelessly.
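A sketch of the request handling behind those two buttons. In the common ESP32 WiFi-server example, the buttons request paths like "/H" and "/L"; those path names and the helper below are assumptions for illustration, not our exact code.

```cpp
#include <cassert>
#include <string>

// Given the first line of an HTTP request, decide the LED state
// (true = on). Unrelated requests leave the state unchanged.
bool ledStateForRequest(const std::string& requestLine, bool currentState) {
    if (requestLine.find("GET /H") != std::string::npos) return true;   // turn LED on
    if (requestLine.find("GET /L") != std::string::npos) return false;  // turn LED off
    return currentState;  // e.g. favicon request: leave the LED alone
}
```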

Making my own network and connecting to it was easy; however, connecting to an existing network was hard due to a glitch in which my SSID name required a single space after it in order to be recognised. After working through that, I could enter the local address given to the ESP32 on any device on the same network and control the light.

Now that I know I can connect to devices, I need to come up with ways to reduce screen time such as turning off the phone.

The ESP32 also has Bluetooth built in. You can, for example, set up a chat between the serial monitor in the Arduino IDE and a phone connected via Bluetooth.

I also tried working with touch by connecting some copper tape to one of the touch pins. Touching the tape disturbs the electrical signal, which the ESP32 can detect.


This is the underside of the ESP32. It requires a photo for reference as the pin locations aren't on the top side.


A top shot of the brand and model ESP32.

Uno IR

This is the setup for the IR remote. By using the IR receiver setup shown last week, I could record codes sent by various TV remotes and then send out those codes again with this setup.

As I can easily replicate any control, such as turning off the TV or sound, or changing channels, we can easily infuriate the watcher if they've spent too much time on the telly.

One problem I faced: in one TV remote, a battery had leaked and caused the remote to short out. However, I downloaded a universal remote control app from the Play Store onto my mum's old Galaxy S4, which has an inbuilt IR emitter. I then found the same remote in the app and recorded the codes sent from the phone.


In this instance I turned the volume down on the screen. This also showcases my weird computer setup. I'm pretty much just using my surface as a desktop with an external mouse, keyboard, and cheap 32" TV.

The base the emitter will be placed on is an IR-controllable robotic vacuum, so I recorded all the manual override controls from the vacuum's remote, giving me full control over the robot's movements. I just need to do some design ideation as to where I place the kit on top of the robot. I will need multiple IR beams being sent: one facing the robot and one facing out to control the TVs.

I also need to copy this setup over to the ESP32, which should be simple; it's just a matter of connecting to the right pins.


This was an exercise in using a remote to control a light. Perhaps the robot could predict what the person is trying to do and stop them: for example, it recognises the code for the movie channel, then automatically changes the channel afterwards.

Project work

Timothy Harper - Fri 24 April 2020, 11:06 am

The project is well and truly underway. We have had a few weeks over the break to play around with Arduino and figure out the 'how' of our project.

So what is it we are trying to achieve?

Story line

Our idea is a robotic companion (sent from the future) to warn its family members of the troubles of technology. It wants to see you reduce your screen time, so that technology doesn't overtake your life (like it did in the future). From a health perspective, there are links between too much time looking at TVs and computers and detrimental health effects such as sore eyes, tiredness, and neck and back problems.

The Bat Sqwad is working together on the same project, each of us focusing on different aspects. My aspect is how to 'hack' into TVs and computers. To do this, we are going to use IR sensors. To control devices that aren't IR controlled, I am going to use either the HC-05 Bluetooth module or the ESP32, which is like the Arduino Uno but more powerful, with built-in Bluetooth and WiFi.

Work done so far

To get the required sensors, instead of buying my own, I repurposed a few from an old RC helicopter. RC helicopters are great, as they contain both an IR receiver and IR emitters.


Pulling apart the RC remote. The clear-looking LED is an IR emitter. This remote had three working in unison to connect with the helicopter, allowing a wide beam of data to be sent, unlike the narrow beam sent from your TV remote.


The IR emitters. I used a Gerber saw to cut them off.


The helicopter with its protective casing removed, and the IR receiver location. I had to fully disassemble the copter to get to the chip it was attached to.


The brains of the copter with the IR receiver. Clearer shot of the receiver - I was a bit worried because of rust.


Using this program, I was able to test the receiver by hooking it up to the Arduino and aiming a remote at it. The setup was quite simple: a power line, ground, and a signal line to a pin.


These are the hex codes shown when a button is pressed. I can then emit those same codes back at the TV to have the same effect. For example, the turn-off button has the code 4CD0391B; I can program an emitter to send that code through the Arduino and 'hack' the TV. That is the plan for future work.
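Recorded codes like this can be kept in a name-to-code table and looked up whenever the bot wants to send one. 0x4CD0391B is the power code captured above; the table layout and the codeFor helper are illustrative assumptions, not code from the project.

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>

// Table of recorded NEC codes, keyed by button name.
const std::map<std::string, uint32_t> kTvCodes = {
    {"power", 0x4CD0391B},
};

// Look up the code for a button; 0 means "no code recorded".
uint32_t codeFor(const std::string& button) {
    auto it = kTvCodes.find(button);
    return it == kTvCodes.end() ? 0 : it->second;
}
```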


Timothy Harper - Fri 24 April 2020, 10:32 am

We used Miro to explore ways in working out how to get data about things in the current climate. As we cannot go outside we need to make the most of technology to collect data.

Breaking down the project

Working out where to start

Mapping out the project

  1. work out the overall mass of the project
  2. identify areas of weakness / strength in your concept
  3. identify missing information
  4. scope the stages of the project
  5. doing this as a team ensures everyone is on the same page.

What do we know about the item's form?

  1. size
  2. shape
  3. material

What do we know about the item's function?

  1. processes (input / output)
  2. storage (data storage, where, form)

Context of use

  1. where in the world (eg. home - where in the home)
  2. fixed space or moveable (is it designed to move)
  3. can it be used in one place or many places
  4. characteristics of location (busy, quiet, small, spread out)
  5. who is going to use it?
  6. audience characteristics (is that group of people constant)
  7. is time important (use during day / night)

What we do know helps us figure out what we don't know. Setting a minimal list of requirements.

Week 5

Timothy Harper - Fri 3 April 2020, 5:31 pm

Report back

How was zoom?

Zoom has been OK. I've had internet and computer problems. On Wednesday it worked smoothly; however, Thursday was a nightmare, and we tried Discord but it wasn't much better. We settled on just streaming our voices through Zoom. We're using Facebook Messenger to chat, but it is hard to discuss big ideas and get your point across when typing as opposed to talking. I've also had some pretty sucky computer issues (my laptop battery doesn't hold a charge), however we can make do for the moment.

What did we pitch?

Last week we pitched the Maze robot: a sassy robot which helps you through a maze but can be untrustworthy, so you have to decide whether to trust it. The problem space was looking at whether technology can be trusted.

What have we done as a team?

We have been idea generators; from the feedback, it has been difficult to scope down the main ideas and decide where we want to go moving forward.

Achievements made moving forward

We need to figure out which idea we want to go with. The main report is also due in under a week's time.


If we build it ourselves are we all supposed to build different parts of it? How to determine whether we want to work on it individually or as a team?

Anshuman came up with a 'doggo idea' whereby users place 'to-do' cards on the robot and it alerts (barks at) them to finish the tasks. If the users haven't completed a task, they can tap the dog to get it to stop barking. Over time, as users get lazy, and since the dog is at foot level, users might mistreat the dog. The dog would then bite the user, take further action, etc.

This space could uncover how users would treat a robot dog differently from a plain robot: would they feel some human value attached to it and treat it with kindness? Typically dogs are friendly and happy. Giving tech its own feelings can change how we respect it.

Ben looked for a more specific reason to betray the robot as a means of progressing yourself further.

- getting lazy

- game (you wanna work with the companion)

- games have a defined structure (good set of rules)

- everyday activity where there is a reason to betray technology?


The Arduino kits have arrived! Very grateful to have received my kit, and really looking forward to making something awesome.

I've already had a look through the included book and it has some fun projects.

Finalised team idea

As the team, we 'boiled down' our ideas and went through this process a few times. We looked at what a maze is. A maze is a single solution problem, there is one clear way through it. A user has a sensation of feeling lost whilst being in a maze, and would rely entirely on the robot to help them through. You have to trust something you have no trust for. There is also a sense of not knowing the finish. It could be around the corner or many turns away.

We explored what it means to be lost. Steven (our Tutor) suggested looking at a car driver. When you are driving, you can be lost, and completely dependent on your GPS. Having a sassy robot assistant (like an enraged Italian taxi driver) could make it work.

We also thought of a personal assistant interaction. Having a robot which could control the amount of technology you are using. As a team we could explore the different ways it can look. Perhaps the assistant is like a small dog. It could yap and scream at you if you're watching too much television and turn it off. If you wanted to stop listening to the dog, you could 'zip' its mouth shut or power it down by removing its 'brain'. Obviously these aren't real things but a replica of sorts, to aid in an everyday interaction whilst keeping it physical.

As a team, we liked this idea the most. There is a fair bit of research out there on using technology too much and its negative effects on the body. From eating into sleep time (which negatively impacts mental health) to using up productivity time and putting pressure on yourself to get work done sooner, it can be hard to regulate screen time.

We are looking to explore different designs of what the assistant should look like. Perhaps a dog, or a human, or a ball. The assistant could also be given different personalities, such as angry or passive aggressive, kind and supportive or impatient and snappy.

As the current COVID-19 climate makes team collaboration hard especially on a physical project, applying these different personalities to the same overall concept could work.

Through this process, it was important to come up with core concepts that the robot would keep to. These are that it is sassy, it is in control, it offers a service and is needed, and there are consequences for disobeying it.

Week 4

Timothy Harper - Fri 27 March 2020, 2:17 pm

This week we presented our team ideas. For our team, we went with a sassy machine that is untrustworthy.

The domain of sass doesn't particularly solve any world problems however it can add something to life which is against the norm.

The other presenters throughout Tuesday and Wednesday gave some pretty interesting ideas. We are in an interesting space whereby we don't have to know practically how an object could physically come together but we can dream up some ideas.

The idea of a play ball that you smack around the house whilst doing chores sounded great. Practically speaking it could be dangerous, and everyone was quick to realise that, but it is good to fantasise about how cool it would be if these things came into real life.

Also, having fun with ideas, recognising they are flawed, but still getting great feedback was an excellent part of the experience. I actually enjoyed hearing from the tutors and teachers more than the presenters, seeing how they could turn an idea on its head and look at it from a different angle. It reminded me of the prototyping exercises in DECO2300, where we built on an idea, but in the last iteration could have a 'dark horse' iteration, flipping it on its head and asking questions like 'how would this look in a multiplayer scenario?' or 'would this work with AR rather than a physical object?'

I have also recognised the importance of sharing other ideas in the same domain, such as linking to YouTube videos of similar ideas.

As we know, there is no new idea under the sun, but we can build on the ideas of others, that's education!

Bat Sqwad

I have to personally thank Amraj for all his hard work with the illustrations and presenting. He has provided great chats and will be missed from our group. He is the hero we need.

Looking at the feedback from the presentation via slack and tutors, it has been really helpful seeing how an idea can be expanded on.

Things to think about include taking the 'Maze companion', which helps users decide which turn to take in a maze, and placing it into a more everyday situation.

One idea was to have it as a barrier between you and a sweet. Inside the 'box design' would be a piece of chocolate or your phone (something addictive). As you work, the phone that would typically distract you sits inside the box, and every 25 minutes you get a chance to pull a lever, either taking the machine's suggestion (trust) or your own (distrust). The machine could then (if it deems it OK) open the 'hatch', letting you access said piece of chocolate or your phone. Your previous responses, how much you trust the machine, and how long the item has been in storage would determine the sassy remarks you receive from the machine.

Bubbly Buddy

Timothy Harper - Mon 16 March 2020, 12:27 pm

A revised version of the Breakfast Bubby: introducing the Bubbly Buddy. This bubble-coated smart device helps you make better choices and have fun whilst doing so. Instead of sticking to just the breakfast-box shape, the new BB can be adapted to many different scenarios, such as doing homework or drinking water.

It uses a futuristic 'bubble touch' technology, whereby the 'flexible touch screen' is actually a type of reusable bubble wrap that can be popped. Behind each individual bubble is an RGB LED.

Users set up the mode and then use it for the specific scenario.

In a studying scenario, users might have trouble sticking to good study habits. They wrap the BB around the outside of their notebook, and the BB starts up a game to get their brains warmed up. The game involves trying to pop each bubble when it lights up; however, the time each bubble stays lit gradually gets shorter, making the game more difficult. When the user misses a light, it's game over. The BB then enters study mode, not disturbing the user until a 5-minute break after 25 minutes. This follows the 'Pomodoro' technique, which recommends studying in 25-minute bursts with a short break in between to maximise retention and focus. During the break, the user could choose to play a little game, or just let off some stress by popping bubbles.
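The Pomodoro schedule above can be sketched as a tiny bit of logic: 25 minutes of study followed by a 5-minute break, repeating. The function name and the minutes-based interface are hypothetical, just to illustrate the timing.

```cpp
#include <cassert>

// Given minutes elapsed since the session started, is the BB in break mode?
// Each 30-minute cycle is 25 minutes of study (quiet) then a 5-minute break.
bool inBreak(int minutesElapsed) {
    return minutesElapsed % 30 >= 25;  // minutes 25-29 of each half hour
}
```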

In another scenario, drinking water, the BB would wrap around your water bottle like a cooler, keeping your drink cold or hot. If you struggle to remember to stay hydrated, it would flash and annoy you until it detects, using inbuilt accelerometers, that you have picked it up and taken a drink. This could happen roughly once per hour, more or less depending on user preference.


Week 3 - Teams are formed

Timothy Harper - Fri 13 March 2020, 2:56 pm

Well, week 3 has been a fun one! From the world cafe, to meeting a real product designer (via stream), to getting into our teams, it has been a real whirlwind of a week.

What we did

On Tuesday, we had the world cafe session. This involved moving around tables with different themes, with a different group of people each time; a host stayed at each table and relayed the information from the group before. In the rounds, we generated ideas based on the context and then tried to make them more usable.

My favourite idea was on the music table, where we ended up making a game where aliens battle on top of musical keys, and thus making tunes to battle to.

Hearing from Bash

It was good to hear from a real-life designer. Points I took away were to think about who I want to be and what makes me unique, to take a gamble, to trust in a job finder, and to learn from my mistakes.

Team formation:

Based on our favourite themes from the world cafe, I chose to focus on sassy tech. I think it is a creative space, and I have now formed a team with some other fellas who are also interested in the theme.

As a team, we're eager to look into using music as part of our concept, but we are meeting up on Saturday to further refine our idea.

Week 2

Timothy Harper - Mon 9 March 2020, 1:20 am

What was done

Week 2 started off with soldering inductions. We started with breadboards and figured out the wiring of a simple light circuit, which uses a single resistor, LED, battery and switch.


I found it a bit challenging, as the soldering iron I was using was dirty, which made it hard to melt the solder. I still need to resolder some terminals and attach the battery, and then fit it into the acrylic case. It was a good experience to use one for the first time.

We had our project inspirations this week. The aim of the task was to clearly present a novel, playful and interactive physical computing idea. Everyone presented their ideas over the two days, and it was a really good experience hearing the variety of imaginative things people came up with.

I went with the idea of Breakfast Bubby, a breakfast buddy that seeks to overcome the issue of people skipping breakfast by reminding them to pour their cereal, using a touch lighting array.

How it relates

Reflecting on my presentation, I failed to reach the two-minute timer and gave the bare minimum of information, which serves as a good reminder to take up notes on what I'm going to say next time. I also made it difficult to convey the idea through the poster, as it was hand drawn.

Looking at feedback from others, I appreciate that some thought the idea was interesting, but it failed to show how it could help people eat breakfast. One problem: what if they don't eat cereal (the design focused on encasing a cereal box)? And even if they did eat cereal, how could fancy lighting promote healthy eating habits?

The Breakfast Bubby would, in an ideal scenario, be awoken by a single touch, letting it know that it's brekkie time. It would then turn rainbow. As cereal is poured into a bowl, it would slowly transition from red to green as it deemed enough cereal had been poured. This would use an accelerometer to determine angles, movement and how much cereal was poured. Once it turned green, you would place it back down and it would turn off.

Generally, without some sort of reward system or distraction mechanism to encourage you to pour, the tech doesn't really have a lot of appeal. Perhaps if it blared an alarm until you poured your cereal, like a combined get-out-of-bed and eat-your-breakfast alarm, a two-in-one technology to wake up and eat up, the user would find it useful.

Some of the feedback suggested that it could make the breakfast for us, as opposed to just telling us to eat. Making use of the bubbles could be interesting - apparently everyone likes bubbles!

Making people excited to eat food is tricky, incorporating a game element was suggested. Getting people to stop eating could also be a problem.

Another student suggested to add more interactive elements to the idea, such as somehow displaying the health of the cereal or the time. This could be interesting as it takes the focus off generally eating but more on what you are eating.

Using visual shapes on the digital array could be helpful (ie smiling or frowning).

The bubbly array was suggested to be a bowl or plate shape instead of a box, to overcome the problem of a specific food type.

After all the ideas were presented, we put them together into themes.


To-dos

I still need to come up with ideas for other uses of the bubbly technology / reiterate idea.

Exciting works

I enjoyed hearing about tech that put a funky twist on things. One example was the running hand, a simple game where you use two fingers to 'run' around anywhere; turning that into an actual AR game, using the body as a controller, sounds awesome.

I also liked the theme bothersome tech - which could be somehow incorporated into my idea with sound.

Week 1 Catchup

Timothy Harper - Mon 9 March 2020, 12:31 am

The work done

Week 1 we began classes by looking at our expectations, fears, aspirations, questions and rumours we'd heard about DECO3850. We wrote all of these down on different whiteboards around the class, and Lorna went through each of them.

For rumours, I had heard from DECO2300 that we are using ARDUINO! This scares me a little, as in that course I explored VR instead of Arduino, but I'll happily add to my aspirations that I'm keen to learn how to build with and code it. Speaking to other students, I have a better understanding that setting aside another 10 hours on top of the 10 contact hours is really important if we want to succeed in the course, so I've set up my timetable to accommodate this.

We went through the Studio outline. We have our weekly journals, the project inspiration in Week 2, team project from Week 4 - 13, Exhibition on May 22 and portfolio and critical reflection due in the weeks following. Looking forward to the final products.

After running through the course outline, we began our project inspiration. I looked for ideas using Instagram's hashtag feature, searching tags such as #physicalcomputing, #interactivedesign, #futuristicdesign and #tangibledesign.

I found some interesting things to do with touch and lighting, such as lighting which turns on and off with a tap, and graphical lighting that is generated when you play a note on a piano.

We then continued looking for examples of physical computing that either did or didn't fit the brief and posted them on Slack.

For Wednesday, our homework was to explore one of the seven HCI Grand Challenges. These are challenges for the future of Human-Computer Interaction design.

My challenge was #7: Social Organisation and Democracy.

This challenge explores how HCI can benefit 'smart societies' and tackle problems such as jobs, poverty, environment issues and equality.

With the issue of sustainability, HCI should seek to be environmentally friendly with regard to what the technology is used for, and what it is made of, in a world where natural resources might be limited.

With the issue of social justice, technology should be designed to be as inclusive as possible, and not just to benefit one social class.

This extends to active citizen participation, whereby all citizens can use the technology. Another challenge is how to engage citizens in impactful, long-term participation and inclusion without overloading them with too much information. People from minority populations and lower social classes are also at risk of being excluded.

Democracy is another issue explored, touching on the fact that the internet filters our ideas and shows us what we want to see, as opposed to views from the other side of the coin. We are prone to getting caught in a 'bubble'. If algorithms beyond Facebook's were doing this, we would be on a slippery slope to strong censorship and fake news. We want to maintain a diversity of views and ensure AI can facilitate this.

Awais' Experiment

Awais ran an experiment to help us come up with ideas for the project inspiration. Using a pack of playing cards and a cipher, we had to come up with some pretty wack ideas. Each card is associated with a slot in a sentence: a spade = Design to/for, a club = Place, a heart = Interaction Mode, a diamond = Interaction Quality.
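The cipher works like a fill-in-the-blanks generator: draw one card per suit, look up the word it maps to, and slot it into the template sentence. A minimal sketch of that idea in Python (the word lists here are stand-ins I made up from the prompts mentioned in this post, not Awais' actual cipher):

```python
import random

# Hypothetical version of the card cipher: each suit maps to one slot
# of the design-prompt sentence. Word lists are illustrative stand-ins.
CIPHER = {
    "spade":   ["DEBATE", "OBSCURING", "TRAINING"],    # Design to/for
    "club":    ["BICYCLE", "PARK", "KITCHEN"],         # Place
    "heart":   ["SING", "SOUND", "TOUCH"],             # Interaction Mode
    "diamond": ["FOREVER", "TRIBAL", "PLAYFUL"],       # Interaction Quality
}

def draw_prompt(rng=random):
    """Draw one word for each suit and assemble the prompt sentence."""
    design = rng.choice(CIPHER["spade"])
    place = rng.choice(CIPHER["club"])
    mode = rng.choice(CIPHER["heart"])
    quality = rng.choice(CIPHER["diamond"])
    return f"Design for {design} in a {place} using {mode} with the quality {quality}"

print(draw_prompt())
```

Swapping a drawn word for another in the same suit (as I later did with FOREVER and TRIBAL) is just re-rolling one slot while keeping the rest of the sentence fixed.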

Some ideas from the table we were on. You can see the sentences we made and the ideas we generated from them.


The cipher and cards and their corresponding meanings.


I actually found the experience quite fun and useful. I made up some things that I would have never even thought of before. We had a few rounds, the first few being individual, then coming together as a table and discussing our favourite ideas.

The first sentence I made was Design for DEBATE in a BICYCLE using SING with the quality FOREVER.

I came up with a musical synthesizer that turns your voice into song, powered by a dynamo as you ride your bike. The song is actually a debate you undertake with other riders, and you can play back the recordings forever.

A different take on it is to upload a set of 16 words / responses and debate using only those, satisfying the FOREVER quality as the responses can't be changed.

Another take is to have Top 40 music debated on a stage, powered once again by a bike dynamo. As sustainability is a FOREVER topic, and using a bike is sustainable, it satisfies the quality.

Changing the design from DEBATE to OBSCURING, I came up with this: you obscure your real voice by singing through bike power, and this becomes your new voice forever, like how Darth Vader speaks through his breathing mechanism.

For the sentence Design for OBSCURING in a PARK using SING with the quality FOREVER:

Hide treasures in the park under a water display (similar to what you would see outside a hotel in Las Vegas). The shooting jets of water are timed to the sound of song, and you throw a coin under the waterfall where it is forever hidden.

Obscure your identity in a game where others have to find out your heritage. Select some songs which represent your background and enter this contest in an open environment, the park. Get up and sing, challenge fellow passers-by and encourage them to sing. Meet new people and talk to them about their heritage and their FOREVER origins, whilst hearing some good music.

Design for OBSCURING in a PARK using SING with the quality TRIBAL

A public tribal performance that obscures the harsh realities of history to suit the needs of people with an agenda.

Round 3 - Multi

Some of the ideas from the table included:

Design for training in a kitchen using sound with the quality playful.

Pivoting on playful - each time the bin fills up, it plays a dramatic song. It could play 'I don't feel so good' when the bin throws up some 'bad rubbish' that was put in the wrong bin.

If compost was placed in the compost bin, it would play a happy sound. It can track who puts rubbish in the bin - whoever wasted the most has to empty it.

Another idea - a knife that talks to you. It has geometric and voice capabilities.

How it relates

I think this course extends our thought process on top of our work in DECO3500 (ideation), DECO2300 (where we did journalling) and DECO2500, which got us into the habit of posing the 'why' questions.

The work to do

Project inspiration

Work that inspired / interested me

I was inspired by the futuristic ideas and the cipher game that opened up our thinking completely. When doing the cipher, I found that being restricted to certain qualities made me think about which quality could generate more ideas in its place. So I ended up changing the quality from FOREVER to TRIBAL, making the sentence Design for OBSCURING in a PARK using SING with the quality TRIBAL. I felt TRIBAL and singing were a good match.

Breakfast Bubby

Timothy Harper - Tue 3 March 2020, 12:49 am

Are you a morning person? Do you have your ritual sorted out? A glass of water, shower, shave and stretch, dress and eat and you're good to go?

Arguably, which of these steps is the most important? Perhaps the shower, or the coffee to follow. Breakfast, I'm told by many, is the most important meal of the day, but a study reported by Australian Food News found that almost half of Aussies skip breakfast at least once a week, and a further third skip it up to three times a week.

Introducing the Breakfast Bubby, a smart fun way to start your day.

The Breakfast Bubby (BB) is a smart bubble wrap which you put your cereal box inside. You press the bubble buttons to wake up BB. It will then light up and flash colours until you shake out a recommended amount of cereal, at which point it will light up green and turn back off again.

This fun little interaction aims to encourage kids of all ages to eat breakfast and start the day right.
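The interaction described above is essentially a small state machine: asleep until a bubble press, flashing until enough shaking is detected, then green and off. A rough sketch of that logic, with made-up names and a stand-in shake threshold (a real build would poll a button and an accelerometer instead):

```python
# Hypothetical sketch of the Breakfast Bubby interaction loop.
# States and the shake threshold are illustrative assumptions.
ASLEEP, FLASHING, DONE = "asleep", "flashing", "done"
SHAKE_TARGET = 3  # shakes judged to equal one serving of cereal

class BreakfastBubby:
    def __init__(self):
        self.state = ASLEEP
        self.shakes = 0

    def press_bubble(self):
        # Pressing a bubble button wakes BB and starts the colour flashing.
        if self.state == ASLEEP:
            self.state = FLASHING

    def register_shake(self):
        # Each detected shake pours some cereal; enough shakes turn BB green.
        if self.state == FLASHING:
            self.shakes += 1
            if self.shakes >= SHAKE_TARGET:
                self.state = DONE  # light up green, then power back off

bb = BreakfastBubby()
bb.press_bubble()
for _ in range(SHAKE_TARGET):
    bb.register_shake()
print(bb.state)
```

Keeping the logic this simple would matter on a microcontroller: one state variable and a shake counter is all the device needs to remember between sensor readings.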

Inspirations / Links

• A study about Aussie eating habits

• An LED lighting array used to help grow food

• Touchable lighting which turns on and off when pressed, or can be programmed to make a fun light show

• Utilising accelerometers to add a new level of interactivity in turning on a light


Week 1 Intro

Timothy Harper - Mon 2 March 2020, 10:58 pm

Hey all, my name is Tim, and I enjoy shows like Altered Carbon, which portrays a futuristic model of the world, and Vikings, which offers insight into the way it used to be.

Design-wise, I like to come up with weird and funky ideas. I'm a bit of an opportunist and always think of the marketing side of it as well. I'm majoring in UX.

I'm a bit of a noob at programming but will grit through it if need be :D

This semester I'm most looking forward to the exhibition and seeing how far we come.