Documentation & Reflection

Week 13

Benjamin Williams - Fri 19 June 2020, 2:33 pm
Modified: Fri 19 June 2020, 2:33 pm

Construction #2

This week Bat Skwad met back up to finish constructing the SassMobile. Following on from last week, the DFMiniPlayer was still causing issues, randomly working and not working. We thought this was down to the wiring, but adjusting it didn't seem to fix anything. In the end we ignored the countless errors the IDE was throwing and somehow prevailed: the interaction and speech components started to work consistently despite the errors. This brought us onto Tim's component. Providing power was still causing issues, so we weren't able to fully integrate the components. We ended up going to the tutors the following day for guidance. I didn't have much to add to that conversation, but I still sat in and helped where I could.

I added a few positive feedback lines to the SassBot library, including 'Good job', 'Good for you' and 'Wise choice'. Steven was kind enough to record himself saying them again. I also finished sorting the audio files so that specific lines could be played on call.

Angry SassBot
Interaction

Website

Website development went ahead this week. I was keen to give my design a clean and simple aesthetic. The simple green and white colour scheme worked well with the Andale Mono font. I organised my website into logical sections: Concept, Final Product, Design and Reflection. Concept explains how sassy tech is used to break a bad habit of its users: screen time. In my interaction section I added some audio clips that can be played as the reader works through the interaction phases. The design section describes each aspect of the SassMobile: Hacking, Mobility, Communication, Personality, Appearance and User Interaction. Ultimately, I wanted my website to have clear sections that a reader could dive into at random, as I didn't expect people to read the whole website lol. I'm really happy with how it turned out :)

Exhibition week!

Benjamin Williams - Thu 18 June 2020, 11:13 pm
Modified: Thu 18 June 2020, 11:13 pm

Exhibition!

Uh oh.

Arduino IDE crashed on the day of the exhibition! An update caused the app to crash as soon as it opened, and it took hours to figure out that you had to delete some hidden files to resolve the issue. By that stage the exhibition was about to begin and the SassMobile was still in separate parts... Due to the ongoing power issues we still couldn't get all the components working together, so we simulated the functionality of the robot by showing each part separately. I had to manually play a sound when appropriate, while the SassMobile roamed around and flashed its lights. The IR emitter didn't work, so we just had to explain what was meant to happen. Despite these setbacks, most of the presentations went really well.

I ran Steven (DECO lecturer) through the interaction process and explained the concept. He found the idea quite interesting and especially enjoyed the audio :D His one query was "What if you're watching some important news and want the SassMobile to leave you alone?" The answer being that you should throw a towel over its eye to blind it and handicap its ability to hack. Although it can still yell at you...

Overall the exhibition was a success :) Seeing everyone's projects was really interesting. My favourites were the Sound Mixer Lab, Pose Elevator and the Memory Ball.


It was great to talk to Dimitri, Seamus, Alistair and Ansuan on Discord. We had some laughs about our projects and a good catch-up in general. I wish I'd seen more of them this semester :( The sad thing is that most of them graduate this semester. We definitely need to meet up outside of uni.

Week 12

Benjamin Williams - Thu 18 June 2020, 8:12 pm
Modified: Thu 18 June 2020, 8:40 pm

SassBot Construction

Great to see the boys again! It'd been a couple of months since we'd seen each other in person, so it was nice to catch up and have a closer look at the progress of each component. Anshuman had been working on the main body of the robot: his eye's light sensor, ear vibration and mouth lights were all working well, and his latest feature was a nose knob that could adjust the mouth lights. Tim showed off the IR emitter TV controller, which was cool. I showed the boys how specific sounds could be played and how output settings such as volume could be adjusted. Combining each part proved to be harder than anticipated.

We started by adding my audio component to Anshuman's Arduino board. Despite being wired up perfectly, the board had problems registering the DFMiniPlayer. We spent a while reinstalling libraries and adjusting the wiring until we eventually got it working. We never pinned down what the issue was, since the code would randomly throw errors, and decided that soldering these components together may be necessary to avoid faulty circuits. Nonetheless, while the DFMiniPlayer did work, I added some code to Anshuman's program to make a happy sound play when the robot smiles, and a sassy comment when it's angry. Here's an example of this working:
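That mood-to-sound trigger logic boils down to something like the following plain-C++ sketch. This is a simplified illustration, not our exact program, and the folder/track numbers are made up rather than our real SD card layout:

```cpp
#include <cassert>

// Simplified sketch of the mood-to-sound mapping added to Anshuman's
// program. Folder/track numbers are illustrative only.
enum Mood { HAPPY, ANGRY };

struct Sound {
    int folder;  // SD card folder passed to myDFPlayer.playFolder
    int track;   // track index within that folder
};

// Pick which clip the DFMiniPlayer should play for the robot's mood.
Sound soundFor(Mood m) {
    if (m == HAPPY) {
        return {1, 5};   // happy "yaaaayyyyy" clip
    }
    return {3, 18};      // a sassy/angry comment
}
```

On the real board, the smile/frown state from Anshuman's face code selects the mood, and the chosen folder/track pair goes straight into playFolder.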

Integrating Tim's component proved to be even harder, since he ran his program on an ESP32 board. We managed to connect the two boards, but we couldn't get any sensors or components to work. After seeking the help of the tutors, we narrowed this down to not providing enough power to the boards. This was as far as we got on the SassBot - solid progress.

To finish off the session, Tim showed us his progress on getting facial recognition working. The idea was to get the Roomba to move towards and face the user by connecting his phone camera to the Arduino using Bluetooth. It occasionally worked... but it was fun playing around with it anyway.

Reflection

Spending the day with Anshuman and Tim was a lot of fun. It was good to get our hands dirty trying to integrate our parts - debugging with friends is always a good time. We made some solid progress putting together the SassBot; it's finally taking shape. In the coming weeks we'll aim to get everything working cohesively so we can begin designing some interaction plans.

Week 11

Benjamin Williams - Thu 18 June 2020, 7:10 pm
Modified: Fri 19 June 2020, 1:57 pm

Prototype Preparation

I spent this week preparing my audio output component for the group's assembly of the full SassMobile next week. Since I have the parts available to build a second prototype, I made a replica to give to Anshuman or Steve so the audio component can be permanently integrated into the SassMobile. I also had to reorganise the audio files on the SD card into folders so that specific sounds can be targeted. I had issues doing this because hidden index files on the drive caused the code to play a flat tone even when a valid sound was targeted.

Here is a code snippet of the sound organisation:


  // phase 1
  myDFPlayer.volume(15);  // Set volume value. From 0 to 30

  myDFPlayer.playFolder(1, 1);  // how good is tv
  delay(5000); // 5s
  myDFPlayer.playFolder(1, 2);  // are you enjoying this show?
  delay(10000); // 10s
  myDFPlayer.playFolder(1, 3);  // I think you've watched enough now
  myDFPlayer.playFolder(1, 4);  // it's a beautiful day outside, not that you'd know

  myDFPlayer.volume(25);  // loud
  myDFPlayer.playFolder(1, 5);  // yaaaayyyyy

  // phase 2
  myDFPlayer.volume(15);
  // 1. *Robot changes channel*
  myDFPlayer.playFolder(2, 6);   //  i. "I'm bored of this channel"
  myDFPlayer.playFolder(2, 7);   //  ii. "Are you sitting on the remote?"

  // 2. *Robot turns down/mutes volume*
  myDFPlayer.playFolder(2, 8);   //  i. "I have sensitive hearing"
  myDFPlayer.playFolder(2, 9);   //  ii. "Was that a profanity?"

  myDFPlayer.volume(20);
  // 3. *Robot changes language*
  myDFPlayer.playFolder(2, 10);  // i. "You know Mandarin right?"
  myDFPlayer.playFolder(2, 11);  // ii. "Now it's more educational"

  myDFPlayer.volume(25);
  // 4. *User throws towel over robot* (light sensor)
  myDFPlayer.playFolder(2, 12);  // i. "Ahhh I'm blind!"
  myDFPlayer.playFolder(2, 13);  // ii. "this isn't funny"

  myDFPlayer.volume(20);
  // 5. *User unblocks view* (light sensor)
  myDFPlayer.playFolder(2, 14);  // i. "Thanks a lot"
  myDFPlayer.playFolder(2, 15);  // ii. "I missed seeing your pretty face"

  // phase 3
  myDFPlayer.volume(20);
  // 1. *Robot turns off TV*
  myDFPlayer.playFolder(3, 16);  // i. "Time to go do something else"
  myDFPlayer.playFolder(3, 17);  // ii. "I think your TV is broken"

  myDFPlayer.volume(25);
  // 2. *User throws something at it* (vibration sensor)
  myDFPlayer.playFolder(3, 18);  // i. "Ahh!"
  myDFPlayer.playFolder(3, 19);  // i. "Ouch!"
  myDFPlayer.playFolder(3, 20);  // ii. "stop hitting me you loafer"

  myDFPlayer.volume(25);
  // 3. *Pull ears*
  myDFPlayer.playFolder(3, 21);  // i. "Ow stop it!"
  myDFPlayer.playFolder(3, 22);  // ii. "I have sensitive ears"

  // 4. *Robot puts volume to max*
  myDFPlayer.playFolder(3, 23);  // i. "How do you like that?"
  myDFPlayer.playFolder(3, 24);  // ii. "Need subtitles now?"
  myDFPlayer.playFolder(3, 25);  // iii. "I lied about having sensitive hearing"

Reflection

It's been a couple of weeks since any changes were made to the concept; we've been focusing on preparation for building the SassMobile. I'm sure we'll run into trouble, so it's important that I do as much as possible on my component beforehand. My aim is to make the robot's voice and mannerisms as emotional as possible, and I've had a few thoughts about how to do more: making the volume dynamic in accordance with how angry the robot is, i.e. very loud when it's yelling. It's also occurred to me that the robot has no positive feedback lines to trigger when it's pleased with the user. I'll have to ask Steven to record some more audio... Looking forward to building the SassBot next week!
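The dynamic volume idea could be as simple as a linear curve from an anger level to the DFMiniPlayer's 0-30 volume range. This is a hypothetical mapping (the anger scale and the floor of 15 are my assumptions, not anything implemented yet):

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical anger-to-volume curve: zero anger speaks at our usual
// volume (15); full anger yells at the DFMiniPlayer's maximum (30).
int volumeForAnger(int anger) {
    anger = std::max(0, std::min(100, anger));  // clamp to the 0-100 scale
    return 15 + (anger * 15) / 100;             // linear scale from 15 up to 30
}
```

The result would just be fed to myDFPlayer.volume() before playing the line.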

Week 10

Benjamin Williams - Sun 17 May 2020, 12:46 pm
Modified: Thu 18 June 2020, 6:08 pm

Critiques

This week I finished my prototype demonstration video and critiqued the prototype videos of other groups. In my vid I demonstrated how the SassBot talks to the user with its various phrases. Describing a standard interaction, I showed when each personality mode is triggered and the corresponding lines the SassBot will say.

I enjoyed the process of critiquing other groups, since there were some very interesting prototypes. Team Triangle was a standout for me. I loved their super original concept of recording and mixing sounds with lab equipment to relieve stress. The smaller details of their prototype, such as the test tubes lighting up when they contain a sound and shaking a test tube playing the sound it contains, were the subtle features that made this concept so quirky and novel.

Ryan's robotic hand gesturing prototype was also really cool. His use of Arduino motors to make fingers move was very impressive. I'm looking forward to seeing how this prototype develops to where the hand gestures become more distinguished.

Concept Development

Looking towards the final prototype delivery, I've put some thought into refining the features and adding new ones. Since the main aspect of my prototype is how the robot conveys emotions, I'd like to add more detail to the robot's facial expressions. Currently, the robot's face is a tissue box with drawn-on eyes - not the most convincing interface. One way to give the face more expression would be to look at incorporating an esp20 digital interface to give the robot various, changing expressions. Alternatively, I could implement some analog indicators of emotion, such as lights for eyes: red eyes indicating that the robot is angry, yellow that it's annoyed and green that it's pleased. Another option is to create some mechanically moving facial features like eyebrows and a mouth. I'm going to start work on the eyes since they're the easiest and can convey obvious feelings.
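The LED-eye idea comes down to a small state-to-colour mapping. Sketched in plain C++ below, using the states and colours proposed above; on the real robot this would drive RGB LED pins, and the enum names are my own placeholders:

```cpp
#include <cassert>
#include <string>

// Proposed analog emotion indicator: one eye colour per emotional state.
enum Emotion { PLEASED, ANNOYED, ANGRY };

std::string eyeColour(Emotion e) {
    switch (e) {
        case ANGRY:   return "red";     // full sass engaged
        case ANNOYED: return "yellow";  // warning shots
        default:      return "green";   // PLEASED: happy with the user
    }
}
```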

Week 9

Benjamin Williams - Thu 7 May 2020, 4:18 pm
Modified: Fri 19 June 2020, 1:37 pm

Prototype Progress

This week I made the finishing touches to my prototype in preparation for the demonstration video. Last week I set up the DFMiniPlayer and speaker on my Arduino and got it to output some test sounds. This week I wrote up some lines for the robot to say along with its many actions. I had some fun thinking of cynical remarks and creepy comments fitting for a sassy robot. My favourite line plays when the robot turns the volume up to max (one of its more aggressive actions) and then asks the user if they want to turn on subtitles - implying that the user has been deafened. With the script written up, the next course of action was to record someone speaking the lines.

I was pretty keen not to record them myself, so I searched the web. I quickly gave up on this though, since it was such an effort to find each individual quote (and in some cases just a word) on YouTube and then mash all the different voices together into some weird multi-personality robot. Furthermore, these quotes needed to have Creative Commons rights. The fallback option was to use the text-to-speech function in Word to say each line, which wouldn't actually have been a bad option since it's a pretty robotty voice. Alas, when all seemed lost, Steven (the tutor) came to the rescue and offered to record the lines himself using his best creepy voice. The result was really good, since it captured a lot of personality that wouldn't have been possible with the other methods.

I used Ableton to add some effects like reverb and saturation to the recording to beef it up a bit. With this done, I was able to load the sounds onto the micro SD card and get them outputting from the speaker. To put some effort into the aesthetics of the prototype, I applied my master craft skills to build a casing for the speaker. After 5 minutes of hard work, I completed the casing by shoving the Arduino inside a tissue box. The result turned out pretty good:


Reflection

Notable progress points for this week were writing up a robot script for each phase of aggressiveness and having Steven lend us his best sassy voice to record it. Hearing the robot say these lines finally gives the SassBot a sense of personality, so it's awesome to see the concept take form. The tissue box design looks a little daggy, but I really like how the mouth turned out - it looks pretty funny, fitting for the SassBot. In the coming weeks this casing will probably be ditched anyway, as my audio output component becomes part of Anshuman's robo body component. Time to get started on the prototype video...

Week 8

Benjamin Williams - Mon 4 May 2020, 7:55 pm
Modified: Fri 19 June 2020, 1:29 pm

Prototype #1

This week I picked up the borrowed DFMiniPlayer and speaker parts from Clay. Now I could finally get started on my speaking robot.

I borrowed Mum's laptop to copy some test mp3 files to the micro sd card.

A helpful tutorial showed me how to connect a DFMiniPlayer to an Arduino. The circuitry was actually pretty straightforward, so I didn't have any trouble getting my code to use the speaker.


I used a DFMiniPlayer library in my Arduino code. The library provides a bunch of functions for manipulating the speaker settings and navigating the SD card files. The most helpful is the playFolder function, which lets me specify a folder and file index to easily select a sound to play. Better file organisation would make this more efficient, but doing so caused some issues: I can't get any output when I store audio files in folders, so for the time being I've just dumped everything in root. Further down the line I'd like to sort the files into phases of sassiness. Currently, I can only get it to play one sound.


void loop()
{
  static unsigned long timer = millis();

  if (millis() - timer > 10000000) {
    timer = millis();
    myDFPlayer.loop(1);  // Loop the first mp3
  }

  if (myDFPlayer.available()) {
    // Print the detail message from DFPlayer to handle different errors and states.
    printDetail(myDFPlayer.readType(), myDFPlayer.read());
  }
}

Here are its first words <3
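The folder problem may come down to the card layout. As I understand the DFMiniPlayer library, playFolder(folder, file) expects folders on the SD card named 01-99 containing tracks named 001.mp3-255.mp3. A tiny helper (my own, not part of the library) makes it easy to check that the card layout matches what the code asks for:

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Build the SD card path that playFolder(folder, file) should resolve,
// assuming the usual "/01/001.mp3" naming convention.
std::string trackPath(int folder, int file) {
    char buf[16];
    std::snprintf(buf, sizeof(buf), "/%02d/%03d.mp3", folder, file);
    return std::string(buf);
}
```

If a file on the card doesn't match the pattern its call produces, that track will silently fail to play - which would also explain why hidden index files mess things up.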

Reflection

This week saw the creation of my first SassBot prototype! I didn't manage to get the folders working, but the robot speaks nonetheless. Next week I'd like to get the folders working and write some code that runs a smooth demonstration of the robot speaking. I'm also going to build a simple casing to give the robot a face and, most importantly, a mouth.

Week 7

Benjamin Williams - Fri 24 April 2020, 3:43 pm
Modified: Thu 21 May 2020, 4:35 pm

Prototype Development

On Wednesday Bat Skwad had another great discussion about how we would go about building our individual prototypes.

  • Tim is working on hacking into a tv by sending remote signals. He's doing this by working with infrared LEDs.
  • Anshuman is focusing on how the user can interact directly with the robot to send it away and essentially shut it up. He's playing around with various sensors.
  • I am dealing with how the robot will communicate with the user. Initially I thought I could do this by hacking into a TV (with an IR LED) and putting subtitles on the screen, but I ruled this out since remote signals can't add subtitles. We joked that the robot would be able to quickly switch between channels and piece together a message from the various channels' audio - inspired by Transformers, where Bumblebee does this with the radio. So we talked about the potential of hacking a radio or speaker and trying something like this, but a radio doesn't have a screen, and it'd be awkward for the robot to have to go off and find a radio to speak through. We discussed attaching a speaker to the Arduino, but after doing some research this is apparently very difficult. Running out of ideas, we consulted Clay, who suggested the use of a DF mini player. The DF player takes an SD card filled with separate audio tracks, and the programming involved will look at making a certain message play at a certain time. If the DF player proves to have problems, the fallback option is to simply hide a phone in the robot to play messages.

Combining each part

Another topic of discussion was how we would combine our separate components. Since we can't physically meet, it's going to be impossible to physically combine each component. The problem stands: how would the robot know when to say "haha I turned off the TV" if the TV-hacking IR remote is 10 km away in Tim's bedroom? Clay cleared up this confusion by suggesting that we simulate component interactions by implementing buttons that act as triggers for when to do what.
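Clay's suggestion could be sketched like this: each physical interaction that can't happen across our separate bedrooms gets a stand-in button, and a press maps to the event the real component would have produced. Button numbers and event names here are entirely made up for illustration:

```cpp
#include <cassert>
#include <string>

// Stand-in buttons simulating the interactions between our separated
// components. Button numbers and event names are hypothetical.
std::string eventForButton(int button) {
    switch (button) {
        case 1: return "tv_turned_off";  // Tim's IR hack fired
        case 2: return "robot_blinded";  // towel thrown over the eye sensor
        case 3: return "robot_hit";      // vibration sensor triggered
        default: return "none";          // unmapped button
    }
}
```

My audio component would then only need to react to these events - e.g. play a gloating line on "tv_turned_off" - regardless of whether the event came from a button or the real IR hack.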

Reflection

I'm happy with the progress we've made this week. It took a good couple of hours of discussion to work out what approach we wanted to take in separating and building the prototype. The result is that we went with option 1: separating the prototype into components, where I deal with communication, Anshuman with physical interaction and Tim with screen hacking. Not only does this option allow us to efficiently build the prototype, it allows us to focus on and refine our own aspect of the concept. I'm especially satisfied with this outcome because communication lets me further explore the human-computer emotional interaction aspect that I've been interested in since the start. Moreover, I have some experience with audio technology and sound design, so I'm interested to see how I can apply these skills.

Mid Semester Break

Benjamin Williams - Sun 19 April 2020, 6:41 pm

I spent a couple of days during the break playing around with my Arduino kit and working through the activities in Ben and Steven's Google doc. While this was a new experience for me, having never used one before, I was surprised at how easily I was able to follow the tutorials. My favourite activity was the one that involved making an LED dim or brighten in response to surrounding light levels - so cool! I got carried away trying to add controls to the circuit and managed to implement a little on/off button as well. I found it really satisfying getting my code to communicate with the Arduino kit and do what I wanted. It's really opened my eyes to the endless possibilities these kits are capable of. Subsequently, I started thinking about the sort of things I need my project to be able to do.

My part of the group project focuses on how the robot audibly communicates with the user through devices with speakers, such as a TV or stereo. I found this post about how to send remote signals to a TV using infrared light: https://www.instructables.com/id/How-to-control-your-TV-with-an-Arduino/

The code doesn't look too complicated at all. The only missing piece is an infrared LED, which can be found in a TV remote. So in the coming sessions I'll try pulling apart an old remote and attempt to build my own Arduino version! The next steps in research concern how I can send more complicated signals, such as navigating the menu and changing settings like language. More to come...
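From what I've read, most TV remotes use the NEC protocol: an 8-bit address, its bitwise inverse, an 8-bit command and its inverse, packed into a 32-bit frame that an IR library then modulates at around 38 kHz. Here's a sketch of just the packing step (byte-order conventions vary between libraries, so treat this as illustrative):

```cpp
#include <cassert>
#include <cstdint>

// Pack an NEC-style IR frame: address, ~address, command, ~command.
// The inverted copies let the receiver detect transmission errors.
uint32_t necFrame(uint8_t address, uint8_t command) {
    return (uint32_t(address) << 24)
         | (uint32_t(uint8_t(~address)) << 16)
         | (uint32_t(command) << 8)
         | uint32_t(uint8_t(~command));
}
```

In practice an Arduino IR library would take care of this plus the actual LED modulation; the interesting part for menu navigation is just finding which command codes the TV listens for.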

Week 6

Benjamin Williams - Sun 12 April 2020, 10:18 pm
Modified: Thu 21 May 2020, 4:14 pm

Concept Development

With the project proposal report due this week, it was really time to finalise our concept.

We spent Wednesday collating our ideas to come up with a final concept. Continuing on from last week's progress, we had identified some core concepts and goals.

  • The tech is in power
  • There are decision points
  • The user is forced to trust or not trust the tech

In designing for everyday use, I came up with the cooking, origami and driving teacher ideas. While they were good ideas, they posed some obvious problems, the main one being that an untrustworthy teacher is simply annoying and impractical. So, to eliminate the teacher and guide aspect of these ideas, we thought about how the tech could be a more passive character. Thus, Anshuman thought of a self-aware robot that tries to reduce your screen time. Initially, the sassy robot would hassle you to turn off the TV by yelling at you. It would sympathise with household technology and become annoyed when you 'abuse' technology, such as by overusing it.

We brought this idea to Clay and Steve, who helped us improve the design. Clay informed us that an Arduino has the ability to hack TVs by sending remote signals. Perfect. Eventually the idea evolved into an Arduino that roams around the house on a Roomba, hacking into TVs and other screens. It could hijack the TV and start playing advertisements, turn up the volume or change the language. It would communicate with the user by speaking through the TV speakers or putting subtitles on the screen. The idea became pretty creepy as we tried to make it seem like the robot was self-aware.

Studio

In the studio we started using Miro, which was a success. It's really cool seeing everyone in the class working on a board at the same time. For the activity we looked at methods for collecting data during a pandemic (without going outside). I thought that Facebook groups could provide some great data; from my experience, posts and comment sections in Facebook groups bring about some quality content. I also thought that podcasts could be a good source, since podcasts are generally very informative and easy to understand.

Reflection

I'm glad team Bat Skwad was able to refine our ideas and come up with a great concept. I was honestly struggling to think of an effective application of sassy tech in a novel, household environment - lucky we have Anshuman <3 The Screen Time Reducing Robot is a perfect way to explore how humans respond to a 'sentient' household robot. Its sassy attitude will be an effective way to get through to people watching TV, since it's such an unconventional thing to have your Roomba yell at you. Furthermore, we'll be able to explore how users tend to treat a robot that is generally pretty unlikeable and annoying. It will be interesting to see how long it takes for people to become violent and whether that bothers them at all. Ultimately, the underlying experiment is about how people interact with a robot that has emotions and whether this makes them see technology in a different light.

Week 5

Benjamin Williams - Sun 5 April 2020, 12:30 pm
Modified: Thu 21 May 2020, 3:52 pm

Concept Development

This week our team worked on delving deeper into our problem space by building off our original concept idea.

Stemming from our theme, Sassy Tech, the maze idea explored how a user would cooperate with a dodgy robot guide to complete a maze. When the user comes to a decision point (left or right), the robot offers its advice and the user can choose to follow or ignore it. The more the user ignores the robot, the moodier and more devious the robot becomes. Alternatively, cooperating makes the robot friendlier and more inclined to offer good advice. Ultimately, the aim of this concept was for the user to experience cooperating with an untrustworthy robot. We boiled it down to the problem space of 'how humans put trust in robots'.

Steven the tutor helped us collect our thoughts and think more critically about how we could effectively develop a concept. Once we decided on a problem space, he advised that we identify some core concepts and a mantra. In identifying these points, we considered how each element relates back to the problem space.

Core Concepts:

  • Clear objective
  • Decision points with multiple answers
    • Robot offers advice
  • Guide in power
    • No clear path
    • Some idea of progress
    • Sensation of being lost
  • Trust robot or not
    • Penalties for not trusting
    • Benefit for ignoring robot

With these foundations, we then spent our time putting forward ideas and developing a concept.

Ideas

  • GPS assistant

This idea is a driving assistant that gives you directions on your mission to reach a destination. Similar to the maze, the robot recommends a path. However, if you follow the robot's directions you encounter obstacles and follow a dodgy route. Eventually, the user should begin to mistrust the robot and ignore its advice.

  • Cooking teacher

This concept involves a cooking instructor that offers methods and ingredients for baking a cake. The robot instructor offers some dodgy ingredients and cooking methods, ultimately leading you to second-guess the instructions. Since the user has a vague idea of the ingredients that belong in the cake, they are inclined to place their trust in their own instincts.

  • Origami teacher

The origami teacher offers folding directions in your task to create origami. Throughout the task, the robot teacher will offer some incorrect folds. The user should try to guess when these arise based on their idea of what the final origami should look like.

All three concepts involve making the user suspect that the guide is supplying false information and cannot be entirely trusted, again reinforcing the problem space: people shouldn't place their trust in a robot by default.

Research

When Bat Skwad struck a dead end, we decided it would be helpful if we each went off to collect some individual research and then came back with some new inspiration.

  • Overtrust of Robots in Emergency Evacuation Scenarios

I found an interesting article about a volunteer study done to test how people follow a robot to lead them to safety in the case of an emergency.

Trust in evacuation robots

The scenario was set up by placing a bunch of people in a mock building which begins to fill with smoke. Incidentally, an "Emergency Guide Robot" comes to the rescue. By following the guide, participants found themselves being offered LED-lit directions to an unmarked, previously unknown exit rather than the door they'd first come in through. The bot would attempt to take followers to a clearly blocked-off area, admit that it was malfunctioning, or otherwise behave unreliably. The bottom line of this study was that humans should consider establishing an "appropriate level of trust" with robots.

  • "People seem to believe that these robotic systems know more about the world than they really do, and that they would never make mistakes or have any kind of fault ... In our studies, test subjects followed the robot’s directions even to the point where it might have put them in danger had this been a real emergency"

Reflection

After this week, it's clear that our concept is progressing; I'm just not sure where we're headed. After essentially ditching the Maze Bot idea we were feeling a bit directionless. We knew we were exploring how humans put trust in robots; we just weren't sure of a good application. Then Steven came to the rescue, helping us organise our ideas into a list of core concepts and develop a mantra. With these foundations we were able to focus our ideas and not stray too far from the original goals. Consequently, we amassed a bunch of new ideas that all have the core concepts in common. Over the next couple of weeks we'll work on refining these ideas and choosing the best. Another consideration I want to bring up is that we will eventually have to build this thing, so while it's still the early stages of concept development, I'm trying to stay rational about what is actually achievable.

Week 4

Benjamin Williams - Mon 30 March 2020, 6:54 pm
Modified: Thu 21 May 2020, 3:33 pm

Update :(

With the coronavirus becoming a more serious matter in the last week, it has been difficult to focus on DECO3850. Amraj has decided to drop the course since it has gone fully online. It's a shame because he offered some great insight on our idea and is a really interesting guy; I would have enjoyed working with him more this semester. Despite these setbacks, the group has made some progress refining our idea.

Concept

Our original concept was a robot companion that helps you solve a maze. While working your way through the maze, you are faced with decisions about which way to go. The robot offers its suggestion and some reasoning behind it. If you choose to ignore the robot's advice, it will become increasingly less helpful and even try to sabotage your progress. These days, unhelpful and almost menacing behaviour from a robot is very rare. The point of this concept is to make users aware of the potential for any seemingly helpful robot companion to act against its human master for its own benefit. This concept took inspiration from the laws of robotics, as this behaviour is designed to be a violation of laws 1 and 2:

  1. don't harm humans
  2. obey humans
  3. protect yourself

It's interesting how the nature of robots is that they can only go as far as their programming lets them. Subsequently, they lack the human cognition to make rational decisions in abnormal situations. Despite this, humans will trust a robot to lead them out of a maze because of its friendly face and the assumption that the robot acts under the three laws of robotics. This maze concept would explore how humans react when the robot starts to show signs that it actually has no idea what it's doing, or is intentionally acting against the human. It is at this stage that the human should think 'hold up, this robot is actually just a computer, I shouldn't blindly follow it'. The robot could then start saying stuff like 'Trust me, you're hurting my feelings', insinuating that it is self-aware. At this stage the human user should be really confused and begin questioning everything. 'Do robots have feelings? Is there someone controlling the robot? Is this robot my friend? Should I make the robot like me?' And ultimately this is the goal of the concept: to make users aware that technology cannot be trusted.

Here's a clip from one of my favourite movies, I, Robot, where the first sentient robot, Sonny, is interrogated by the boi Will Smith. Sonny asks about human emotion and whether it is 'right' to do something just because someone asks. In this case Sonny kills his creator because he asks him to, breaking the first law of robotics by following the second.

Reflection

In response to feedback, we're trying to figure out if there's a better problem to solve than a maze, since a maze would be pretty difficult to build. We just need a task that involves decisions where a robot can attempt to 'help' the user. Overall I'm happy with where our concept is headed. I think it's one of the more interesting concepts since it explores such an abstract human experience, yet it's still one of the most relevant issues given the rapid rise of technology.

Week 3

Benjamin Williams - Tue 10 March 2020, 2:18 pm
Modified: Thu 21 May 2020, 3:30 pm

World Cafe

Participating in the World Cafe was my favourite activity so far. I enjoyed thinking creatively about such a range of different themes. Combining ideas to dig deeper into the themes was an exciting and rewarding exercise that gave me a more thorough understanding of what could be done with each theme.

Emotional Intelligence

My group broke this theme down into ideas that fall under the following categories: learning your own emotions, representing emotions, embracing and expressing emotions, and sharing emotions. The concept ideas on the table were designed either to help users learn and understand their emotions, or to embrace and share an emotion they know they have. For example, the Love Tower concept allowed the user to share their love with other people feeling the same. Light-up glass was about expressing party emotions and fuelling party vibes. On the other hand, Draw Texture was about representing and understanding your own emotions by translating them into texture art. There could be a lot done with this theme.

  • Draw texture
  • Love tower
  • Light-up shoes
  • Light-up glass

Sassy Technology

I looked at this theme by thinking about what the point of giving technology a personality is. The point is to humanise the technology for a more personal interaction. Moreover, the ability to please or disappoint the tech adds another element of incentive to complete the given task. That's why this theme is good for teaching the user something: by having a personal relationship with the technology, you're incentivised to please the machine by showing that you have learnt the thing. For example, we thought the handshake arm could be improved by giving it the ability to give you the finger if you stuff up the handshake. Alternatively, it would wave at you if you're a consistently good handshaker. Being able to teach and be taught handshakes by the arm adds to that human relationship element. This theme was my first preference. I thought the main constraint was what degree of AI the tech would need and how to actually implement it.

  • Sassy Knife, Handshake Arm, Climate plant

Digital to physical senses

This theme was interesting to think about. The basic application is to translate digital things into sound, smell and touch, like a 4D movie. I tried to take a different approach, such as how colour can be associated with sound, smell and taste (e.g. synesthesia). I thought it would be a cool idea to make a digital game that teaches synesthesia, where users are taught to match sounds to colours.

Everyday sustainability

This theme felt a bit boring and over-done to me. The ideas were either tools to organise your daily life to be more sustainable (wardrobe, fridge, climate cactus), or installations that taught awareness about sustainability (ocean pool, recycle game). I thought the best idea was the climate cactus since it sounded like an effective way to save energy and interact with a fun piece of tech.

  • Charging mouse
  • Ocean pool

Emotional Totems

I didn't really see the point of this theme. Most of the ideas were about reading your emotions and visualising them. The rose that lives or dies in response to surrounding sounds would only make a situation worse; seeing the rose die while your mum is yelling at you seems horrible. The cubes seemed a bit useless since they only tell you your emotion rather than doing anything about it. The best concepts in this theme were the stress-relief ones since they actually addressed the problem. The main constraint with this theme is how accurately the tech can detect emotion.

  • Emotion cube
  • Stress ball
  • Sound Rose

Reflection

I really enjoyed brainstorming and collaborating with other students to flesh out these themes. It was a highly productive and rewarding activity for getting a feel for the best kinds of concepts that could come out of each theme. By doing so I was able to grasp a better understanding of the themes and distinguish my favourites. Emotional Intelligence and Sassy Technology facilitated the most interesting discussions. It was cool to think about the many weird applications of these themes, such as the sassy handshaking arm. Despite my early interest in musical things, I found that theme a bit one-dimensional and difficult to make interesting.

Week 2

Benjamin Williams - Sun 8 March 2020, 6:36 pm
Modified: Thu 21 May 2020, 12:54 pm

Soldering:

I was really excited to learn how to solder and work on circuit boards since I had never really used similar equipment. It proved to be much more challenging than I anticipated. Working out how to wire the light circuit according to the picture was annoyingly difficult, despite it being a simple circuit. Nonetheless, I eventually got it working and the light is now sitting on my bedside table :) Apparently my soldering was a bit messy and could have had smoother joints. I think this was because I was being stingy with the solder, so I ended up trying to spread blobs too thin. In future, I'll aim for a fatter blob and avoid scraping solder all over the place... I also burnt myself :( Regardless, it was a very educational session and I'm keen to improve!

Imgur Imgur

Presentation:

On Wednesday I pitched my Music Production Paper idea to the class. My speech went pretty well and the critics endorsed the idea. I'm not great at public speaking, so I focused on speaking clearly, not rushing, and getting my idea across. Presenting without a written speech is definitely a better way to improve your public speaking skills. I was happy to see that the critics all stated that I was a clear speaker.

Regarding the idea, critics said it was interesting and novel, yet reasonable. Music production software is often pretty difficult to wrap your head around, so people were excited by a new way to make music. One of my hobbies is music production, so Prod Paper is something that I would love to actually use. So, moving forward with project ideas, I'd be keen to build something music-related. Many inspiration ideas were music-oriented, so it's surely going to be one of the themes. I'm looking forward to the class pooling our ideas together and coming up with some cool concepts.

Concept Theme Brainstorming

With the World Cafe coming up next week, the class identified a bunch of themes that came out of our concept presentations. There were around 15 themes in total, so I could already filter them down to the ones I'd be most interested in.

Everyday sustainability

There are a bunch of ideas I can think of for living more sustainably. Things like optimising energy usage, reducing waste and recycling are good areas. The presentation that interested me most was the climate-gauging little talking plant that dynamically changes your aircon temperature according to the outside temperature. Its cute face turns sad when you're wasting energy on unnecessary aircon, so it works as an emotional totem incentivising users to make it happy. Another good idea would be a sustainability-teaching robot that teaches a family how to be sustainable in a range of ways, including energy optimisation.
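The plant's core logic could be as simple as comparing the aircon setpoint to the outdoor temperature. A rough sketch of the idea, where the function name, threshold and mood labels are all my own assumptions:

```python
# Hypothetical sketch of the climate-plant idea: decide whether the aircon
# is wasting energy by fighting weather that is already comfortable.

def plant_mood(outside_temp: float, aircon_setpoint: float, threshold: float = 3.0) -> str:
    """Return the plant's face: sad when running the aircon is wasteful."""
    # If the setpoint is within a few degrees of the outdoor temperature,
    # the aircon is barely needed, so running it just wastes energy.
    if abs(outside_temp - aircon_setpoint) <= threshold:
        return "sad"   # wasteful: just open a window
    return "happy"     # aircon is doing meaningful work

print(plant_mood(24.0, 23.0))  # "sad" — outside is already comfortable
print(plant_mood(35.0, 23.0))  # "happy" — cooling is justified
```

In the real product this check would run on a temperature sensor reading and drive the plant's face display rather than return a string.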

Musical things / Musical metrics

This theme is a winner for me as it was my concept's theme. I'd be keen to combine it with another, like body-as-controller, to make some sort of novel instrument. I was especially inspired by one of the external uni projects where the user waves around a cloth riddled with sensors, creating sounds through its swishing and flapping movement. While it didn't sound especially good, it was cool.

Emotional totems / Emotional intelligence

The crossovers between technology and human emotion definitely interest me. I think there are a lot of unexplored areas in how technology can be used to harness and exploit emotion to create new forms of interaction, for example looking into how different demographics of people can be connected through common emotions. Another area I'm interested in is how technology can possess and use its own emotion to connect with people; for example, how people don't want to make a robot sad even when it's simply a program that pretends to be sad. You could do some cool experiments looking at how far people would go to trust or help a robot after making this emotional connection.

Sassy tech

This follows on from emotional intelligence in that sassy robots could connect emotionally with users by being annoying or mean. There could be some interesting outcomes from how users interact with a robot that they clearly don't like and don't trust. Moreover, a robot that they respect and are submissive to lol. This could get weird...

Creative Learning

I thought there could be a bunch of unconventional ways to teach children to read or learn a language, or to use games with VR/AR to teach people to cook. Stuff like that.

Project Inspiration

Benjamin Williams - Tue 3 March 2020, 12:59 am
Modified: Tue 3 March 2020, 1:16 am

Prod Paper - Make music with paper and pen!

These days, producing music is a pretty easy thing to do with the help of Digital Audio Workstations such as GarageBand, Ableton and more. This software allows users to harness MIDI technology to play, edit and record music without the need for physical instruments.

Prod Paper is a concept that builds on this technology by combining conductive paint and sensory paper. The idea is that users are able to draw melodies onto paper and have them played back. Like the lines and dots of a digital MIDI project, Prod Paper would use the same visual language to represent sounds. A secondary device, paired to the user's phone or headphones, would then clip onto the paper, read the sensors and play the music back to the user.

Features:

  • lines can be rubbed out and redrawn
  • the pen can draw in different colours = different instruments
  • paper can be joined together for a longer song
  • the paper would feature grid lines so that the user can draw in time
  • paper data can be copied off the paper and saved
  • book = album
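One way to picture the lines-and-dots language is as a mapping from drawn marks to MIDI-style note events. This is purely an illustrative sketch; the mark format, row-to-pitch table and colour-to-instrument table are all hypothetical:

```python
# Hypothetical sketch of how Prod Paper marks could map to MIDI-style notes.
# Each drawn mark is (colour, row, start_column, length): rows map to pitches,
# grid columns to beats, line length to duration, and colour to an instrument.

ROW_TO_PITCH = {0: 60, 1: 62, 2: 64, 3: 65, 4: 67}  # assumed MIDI note numbers
COLOUR_TO_INSTRUMENT = {"blue": "piano", "red": "drums", "green": "bass"}

def marks_to_notes(marks):
    """Convert drawn marks into note events: (pitch, start_beat, duration, instrument)."""
    notes = []
    for colour, row, start_col, length in marks:
        notes.append((
            ROW_TO_PITCH[row],               # vertical position = pitch
            start_col,                       # grid column = start beat
            length,                          # line length = duration in beats
            COLOUR_TO_INSTRUMENT[colour],    # pen colour = instrument
        ))
    return notes

drawing = [("blue", 0, 0, 2), ("blue", 2, 2, 1), ("red", 4, 0, 4)]
print(marks_to_notes(drawing))
```

The clip-on reader would produce something like the `marks` list from its sensors, and the playback device would schedule the resulting note events.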
Imgur

Inspirations:

https://www.eurekalert.org/pub_releases/2016-05/uow-pg051116.php

https://www.youtube.com/watch?v=JhOd2UIQjkc

Introduction

Benjamin Williams - Wed 26 February 2020, 8:22 pm
Modified: Wed 26 February 2020, 8:26 pm

Yoo, welcome to my first journal entry of DECO3850 :)

I'm in my last year of a BInfTech (UX/INFS) and I'm keen to finish it off by combining all the skills I've learnt to design and build a project. This hands-on course should teach me a lot about the process of physically building a project, rather than just designing and prototyping. Already, I've enjoyed being led around the workshops and shown how to use the machinery we'll have access to.

At this stage I'm not sure about any themes or the direction my project may go down, but I'm excited to find out what the group and myself will come up with. I have a range of interests from music to sport, so I'm keen to come up with an idea that I can really get behind. Otherwise, I'm going to struggle with the workload. I'm going to start research now...