Documentation & Reflection

Week 13 Cont.

Tuva Oedegaard - Thu 4 June 2020, 8:30 am

After Tuesday I worked a lot on my portfolio. I have hosted a temporary version here: https://tuvao.github.io, if anyone wants to have a look. I find it difficult not to work with any frameworks or plugins, so I don't know how I feel about the design so far.

Imgur

I read the article we were given about optimising images, which was very useful! I didn't know you could do so many specific things to optimise them.

Yesterday I also worked on the team critical reflection, which I feel is going well. I thought it would be a tougher process, but we are almost finished. I also started drafting what I wanted to include in the critical reflection, and I think I have a starting point.

Today I will work further on the portfolio and see if we can get the prototype working. The others have their thesis presentations this week, so I understand it is difficult to set aside a lot of time for PhysComp right now! But the problem is with sending data over the server, so I am not really able to test and troubleshoot this alone.

Week 13 Tuesday

Tuva Oedegaard - Thu 4 June 2020, 8:23 am

Today (written Tuesday) we had the report-back with the class and then met as a team to discuss the critical reflection. We ended up dividing the sections between us, with word restrictions, so that it was easier to work on at the times each of us found suitable.

Furthermore, we realised it was a good idea to test whether the full concept worked. First we had issues with connecting to the server, and then we had issues with different sensor sensitivities, so the code was difficult to execute for everyone. In addition, we had some different values and pins, which meant we had to change a lot in the code every time we pulled. Sigurd did not realise that I had worked on this yesterday, so he and Thomas ended up trying the same things I did and then spent some time fixing it again. I also had problems with my computer being super slow and my internet connection not being very good (I'm on mobile data now), so it was difficult to communicate properly.

Imgur

Unfortunately, we ended up spending a lot of time trying to merge everything together. It was a tiring process, and we weren't able to finish this today.

Image of me from yesterday, playing around with the sensor values.

Imgur

End of Week 12 and Week 13

Tuva Oedegaard - Mon 1 June 2020, 1:33 pm

Week 12

By the end of week 12, I tried playing around with the brightness a bit more. I asked Clay if he had any methods for converting from RGB to HSL, but he didn't have any good solutions either. We figured out that I might be able to use the setBrightness() function from the Adafruit NeoPixel library (https://adafruit.github.io/AdafruitNeoPixel/html/classadafruitneopixel.html).

Later on, I tried this. However, it came with a warning that it should not be used for gradients, because it is a "lossy" operation. So I tried using the "W" section in the colour selection instead, adjusting the colour using the selected colour values and a brightness value.

Imgur

The above shows how the colour could look; so far we have only used it with red, green and blue values. However, even this did not seem to work. It might be that the setColour function in Arduino did not work with the brightness.

Later, I found out that setting the brightness only worked with RGBW NeoPixel strips, which it seemed I didn't have (since it wasn't working).

Week 13

After that did not work, I tried playing around with the colour values by converting HSL to RGB; maybe I could adjust the brightness using only the RGB values? At first glance it looked like each value decreased by a set amount when adjusting the brightness, but looking closer (using the calculator) I saw that these were very small values. This meant that my plan of subtracting a value from the RGB values wouldn't work. Well, I tried, and it gave me completely wrong colours. I did not have the chance to test it last week, and at this stage I had a friend coming over to borrow my iPad, so I asked her to test the current state of my prototype.

Testing

From the test I found that she found it most natural to adjust the colour one way: along with the rotation of the wrist. I also found that the cable is quite restricting, but the battery isn't good enough to replace it. However, she said it would make sense to adjust the brightness the way I had intended (without me giving her that suggestion first), which was to rotate the same way.

Originally, I planned to test two different ways of finding a colour (apart from the shake), but I never ended up having time to make the second one. The other option was going to be that the user rotates the ball up to a certain degree, and it then starts browsing through a range of colours. This would be a randomised selection, and the user would rotate the ball back to select the current colour. However, I asked my test participant what she would think of an interaction like this, and she said it was better to be in control of the colour. This was my assumption as well, and why I moved away from the shake, but it was good to have it confirmed.

More work

After this, I tried working further on the brightness adjustment. My test participant suggested hard-coding a few nuances for each colour, and that would be my last resort. However, I found a formula to adjust it (https://stackoverflow.com/questions/50937550/formula-to-increase-brightness-of-rgb), which was very simple. The concept is just to multiply every channel by a number; *1.5 would increase the brightness by 50%. I had aimed for something similar when I intended to subtract a value, but this could work.

I played around with the values of the accelerometer and tried finding an average of the three dimensions. The difference from the colour adjustment was that this time I was keeping the same RGB values and wanted the same adjustment on all of them. With the colour adjustment, I mapped each dimension to an RGB value, but this time I couldn't do that.

After going back and forth with the values for some time and getting some weird 0-values, I found out that

  1. To get the factor I could multiply the RGB values with, I divided a number I had calculated to be between 1 and 19 by 10. The problem was that 10 is an int, so the division only ever gave me 1 or 0 as answers. When I changed it to 10.0, a float, I got the right values.
  2. Once the factor was 0, I was multiplying the RGB values by 0, and they ended up stuck in an infinite loop of 0. I changed the RGB values to never be able to be 0, which solved a problem of flashing, dancing lights.
  3. I was originally overwriting the RGB values with the new ones. This permanently changed the values, which made it easy to get into a spiral of constantly decreasing brightness. I changed it to always multiply with the values of the originally selected RGB colour, since I only wanted to change the brightness, not the colour itself.
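Sketched in plain C++ (the struct and function names are mine, and the 1 to 19 reading range is just from my notes above), the three fixes look roughly like this:

```cpp
#include <algorithm>
#include <cstdint>

struct Rgb { uint8_t r, g, b; };

// Fix 1: divide by 10.0 (a float), not 10 (an int), so a reading between
// 1 and 19 becomes a factor between 0.1 and 1.9 instead of just 0 or 1.
float readingToFactor(int reading) {
    return reading / 10.0f;
}

// Fixes 2 and 3: always scale the originally selected colour (never
// overwrite it), and keep every channel at least 1 so a factor of 0
// can't lock the channels at 0 forever.
Rgb applyBrightness(const Rgb& selected, float factor) {
    auto scale = [&](uint8_t channel) {
        int v = static_cast<int>(channel * factor);
        return static_cast<uint8_t>(std::min(std::max(v, 1), 255));
    };
    return { scale(selected.r), scale(selected.g), scale(selected.b) };
}
```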

These things led me to a working solution. I then added an option to squeeze to lock in the colour once more. Further, I worked on putting my new code in with the team code.

Team code

We are all working with different boards and different pin setups. So far, we have just commented pins in and out so that the correct value is used. However, I thought it would be a good idea to simplify this by having a "user" string at the top, and then conditional statements to change the values depending on the user. When trying this, I realised that you are not allowed to write executable code outside of functions in Arduino. So, I first had to define a pin, and then refer to a conditional statement inside the setup function to change its value. But this caused a different problem: the Adafruit NeoPixel was not being set up properly. It seemed that this had to happen before setup, but then it wouldn't be initialised correctly.

As this was just a bonus thing I wanted to do to make things easier for us, I decided not to look into it any further. I touched on the topic of preprocessing as well, but it seemed like too much work for very little reward. I ended up just making clear comments in the code about what has to be changed every time.
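For the record, the preprocessor route would look something like this (the user names and pin numbers here are made up for illustration). Since #define is resolved at compile time, the pin constant exists before setup() ever runs, which sidesteps the "no executable code outside functions" problem:

```cpp
// Pick the active user here before uploading (hypothetical names and pins).
#define USER_TUVA   1
#define USER_SIGURD 2

#define CURRENT_USER USER_TUVA

#if CURRENT_USER == USER_TUVA
  #define NEOPIXEL_PIN 13   // assumed pin for my board
  #define FLEX_PIN     34
#elif CURRENT_USER == USER_SIGURD
  #define NEOPIXEL_PIN 27   // assumed pin for Sigurd's board
  #define FLEX_PIN     32
#endif

// The constants can then be used anywhere, including in global
// constructors such as the NeoPixel object's.
int neopixelPin() { return NEOPIXEL_PIN; }
int flexPin()     { return FLEX_PIN; }
```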

Work this week

I started looking at the setup for the portfolio on Sunday, and I'm going to work further on that today. I have worked further on the implementation of the colour selection, and although I feel it is not perfect, I want to draw a line under it now. I have to describe the technical details for the portfolio, and that is difficult to do before the implementation is done. So, this week I'll work on the portfolio, work on the critical reflection with my team and start the individual critical reflection.

Week 12 Continuation

Tuva Oedegaard - Fri 29 May 2020, 12:26 pm

This morning I woke up with a possible solution to the -360 issue: I added 720 to the value instead of setting it statically to the lower or higher bound, and then, if it was still outside the range, did it again.
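In code, the idea is roughly this (a plain C++ sketch of my approach, not the exact function from the project):

```cpp
// Wrap an angle back into [-360, 360] by repeatedly adding or
// subtracting 720 (the width of the range), instead of clamping it.
float wrapAngle(float angle) {
    while (angle < -360.0f) angle += 720.0f;
    while (angle >  360.0f) angle -= 720.0f;
    return angle;
}
```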

Imgur

Furthermore, I have looked at converting from RGB to HSL. The purpose is that after the user has chosen the colour they want, they can adjust the brightness and saturation. However, this turned out to be more complicated than I had hoped. I found a library for it, https://github.com/luisllamasbinaburo/Arduino-ColorConverter, but it didn't work for the ESP32. I tried using the functions from the library directly, but they didn't really give me the right values. I found the expected values by using different rgb->hsl / rgb->hsv converters online, and the answers weren't matching. So either the calculator was wrong, or the code didn't work properly. So far I've tried three different versions of an RGB converter and none have worked properly. I am starting to consider whether I technically need this; maybe I can find a way to adjust the colour otherwise?
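For reference, the standard RGB-to-HSL formula (the same maths the online converters use) can be sketched in plain C++ like this. This is my own sketch of the textbook formula, not the library code that failed on the ESP32:

```cpp
#include <algorithm>

struct Hsl { float h, s, l; };  // h in degrees [0, 360), s and l in [0, 1]

// Convert normalised RGB (each channel in [0, 1]) to HSL.
Hsl rgbToHsl(float r, float g, float b) {
    float maxC = std::max({r, g, b});
    float minC = std::min({r, g, b});
    float l = (maxC + minC) / 2.0f;
    if (maxC == minC) return {0.0f, 0.0f, l};  // achromatic (grey)
    float d = maxC - minC;
    float s = (l > 0.5f) ? d / (2.0f - maxC - minC) : d / (maxC + minC);
    float h;
    if (maxC == r)      h = (g - b) / d + (g < b ? 6.0f : 0.0f);
    else if (maxC == g) h = (b - r) / d + 2.0f;
    else                h = (r - g) / d + 4.0f;
    return {h * 60.0f, s, l};
}
```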

Week 12

Tuva Oedegaard - Fri 29 May 2020, 9:50 am

Throughout this week I have worked on further assisting the group in getting everything together. We have discussed how to make things work and how our boards work differently, and shared how we are each doing it. And it works!

Imgur

I got some help from Sigurd on how to upload to Git through my cmd line.

Individual work

On Thursday I worked on further improving the individual aspect of the ball. Firstly, I tried putting all the pieces inside the ball, which turned out to be somewhat challenging. I talked to Sigurd and Thomas, who have been working on an improved look and feel for the prototype, about whether they think we would be able to use their solution. They said that their solutions have been difficult and perhaps not that rewarding to complete, so I ended up sticking with my Christmas ball for now.

While working with the ball, I also realised that the vibration sensor Ben soldered for me two weeks ago did not work; it was falling apart. So although I have my thesis presentation today, I decided it was worth going into Uni to get it fixed, since this was the only opportunity this week. I got to Uni and Ben soldered it back together for me.

Before I went, I tried working on the next step of the prototype: the adjustment of the colours. I read about how to convert from RGB to HSL so that I could adjust the brightness and saturation. Sigurd provided me with this link, which was helpful for understanding the topic: https://www.niwa.nu/2013/05/math-behind-colorspace-conversions-rgb-hsl/. However, when playing with this, I realised that the solution I made last week might need some more work.

I used the map function to map the angle values from the accelerometer (using the MPU6050_tockn library) to values between 0 and 200 (255 is the maximum RGB value, but I didn't want the colours that bright). The issue was that I didn't know the range of the angle values. I had looked at these earlier and determined them to be roughly -180 to 150, but now I saw that the z-values were around -1000! The z-values were mapped to how much blue should be displayed, which meant this was always 0, and blue would be hard to achieve.
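Arduino's map() just does a linear rescale with integer maths; a plain C++ equivalent, shown with my assumed -180 to 150 input range, looks like this. Note that map() doesn't constrain out-of-range inputs, so z-values around -1000 land far outside 0 to 200:

```cpp
// Linearly rescale x from [inMin, inMax] to [outMin, outMax],
// the same integer maths Arduino's map() uses. Out-of-range inputs
// are extrapolated, not clamped.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}
```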

Imgur Imgur

I tried printing and playing with the values, as shown in the images.

I talked to Ben about how this could be fixed. He suggested trying different libraries with more accurate values, but that didn't seem to have a meaningful effect. He also tried finding the range of the angle values, but wasn't able to. In the end, we talked about restricting the sensor value to between -360 and 360, and then going up towards 360 if it went below -360, to sort of create a loop. The main thing for me is to try to understand the maths and logic behind it, but I played around with that. I had to pack up before I got a finished result, but I made a function to check if the value was below or above -360/360. It ended up with the z-value always being -360, which didn't help much. I have to fix that later.

I was going to visit some of my friends today, so I wanted to have a result, but I didn't have enough time this week. I will do this later.

Imgur Imgur

Week 11

Tuva Oedegaard - Mon 25 May 2020, 8:49 am

Team meeting

On Friday my team and I met and made an attempt to merge our code bases and functionality. We decided that Sigurd and Thomas, who had the receiving and sending ends, should try those, while Marie and I should put our parts together. Both Marie and I were using an accelerometer and gyroscope, and we figured it was a lot of work to merge these as we were using separate libraries. Marie was using a library called I2Cdev, which I wasn't able to install on my computer. We tried a range of different things and in the end asked Clay for help. He wasn't able to solve the issue either, and recommended we both use the Wire library, the one I had used. This meant that Marie had to rewrite most of her code.

Imgur

We discussed a lot about how we could differentiate the shake and the throw, but after a while I realised that after the user tests I will try something different! That interaction would not be similar to a throw, so the problem solved itself.

Working by myself

After the team members left, I worked on my own project. The new idea is that the user can tilt their wrist to run through different colours, so each new movement is a new colour. Then they can squeeze the ball to select the colour, and further adjust the brightness by tilting again. On Sigurd's recommendation I used a library, MPU6050_tockn, which allowed me to get a log of data from the accelerometer. One of the values was the angle, which seemed useful. While getting help from Clay with something else, I asked him how I could make sense of the angle numbers and translate them into colours. He mentioned the "map" function, and Nick overheard us. Nick happened to have worked with the map function before and was happy to explain it to me! That was really helpful, and a discussion with both Clay and Nick helped me go back to my computer and solve it myself. I guess this kind of experience was the intention of the course, but we haven't been able to utilise it because of the restrictions.

Imgur Imgur

Making the wrist tilt and colour thing work was really really exciting!

In addition to this, I tried making the flex sensor work, but I couldn't: my ESP32 board didn't have the analogue pins needed for the flex sensor. Clay helped me solder on some pins and figure out what resistor I needed for it to work. So, after some more soldering, I got that working too, and now I'm ready to work with the other team members' code as well.

Imgur

Clay using a voltage tool to work out how much resistance I needed.
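The reason the resistor matters: the flex sensor and the fixed resistor form a voltage divider, and the analogue pin reads the voltage between them. A small sketch of the maths (the 3.3 V supply matches the ESP32, but the resistor values below are made-up examples, not the ones Clay measured):

```cpp
// Voltage at the junction of a divider: Vout = Vin * Rfixed / (Rfixed + Rflex).
// As the flex sensor bends, its resistance changes and Vout moves with it,
// which is what the analogue pin picks up.
float dividerVoltage(float vin, float rFixed, float rFlex) {
    return vin * rFixed / (rFixed + rFlex);
}
```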

This week

This week I am going to further improve the tilt. This means ensuring that all the colours are accessible through the movement one is able to do with the wrist, and smoothing the transitions a bit. I will also work on another tilting functionality, suggested by Sigurd. The concept is that the user tilts the ball up to a certain point and it starts browsing through colours; when the user tilts back, it selects the colour it was last on. I want to test both of these functionalities and see which one people like the most.

Week 11

Tuva Oedegaard - Mon 18 May 2020, 4:21 pm
Modified: Mon 18 May 2020, 4:32 pm

Looking at feedback

New week! We started the Monday by looking at the feedback we got as a team. Prior to the meeting I also looked at my individual feedback and made notes on what I wanted to change or do before the final deliverable.

I think we, and I individually, got some very useful feedback, and some not so useful. From one of the teams, I only got my questions answered and nothing else. All in all, I got even more feedback on what I could do with the colour adjustments, but I'm actually still not sure exactly what I'm going to do; at least I have more to go on now. I definitely want to look at implementing the haptic feedback further, as everyone mentioned this as a good thing.

With the team, we decided that we were going to try to put everything together first, and after that work on some individual parts. To make sure it was still valuable to put it together, we had a round of discussion about whether it was necessary. In the end, we agreed that putting it all together would make it easier to display the full functionality of the prototype and strengthen the "proof of concept" feel, as opposed to having different parts doing different things. We agreed on having this as the main focus, as we didn't know how long it would take. If we did the opposite and worked on our individual improvements first, we might end up very cramped when putting it all together, which would be less than ideal.

The plan for the next weeks is to meet on Friday to put everything together, and depending on how long that takes (it might end up being a long and cumbersome process), we will plan the rest in more detail. Roughly, we agreed to spend around 1 1/2 weeks on putting it together and 1 1/2 weeks on our own individual implementations.

This week

This week I will be working on finishing the haptic feedback I started on Friday, and then meet with the group on Friday. If I have time, I'll start exploring the colour adjustments as well!

Week 10 part 2

Tuva Oedegaard - Thu 14 May 2020, 1:52 pm

Building and fixing

After class on Thursday, I ended up going to the lab, as I realised it would be a good opportunity to fix the small bugs in my prototype. Initially, this was just that the battery wasn't working properly, but when I got there I got another error: the sensor stopped working once I touched it.

Together with Ben, I did some debugging, and we found that the problem was probably just the connection to the breadboard; wiring everything together with jumper cables or a different board would make it more stable.

Imgur Imgur

Ben helped me create a breadboard small enough to fit inside the ball. After wiring everything together with male-to-female and female-to-female cables, nothing was working. Ben helped me debug using a voltmeter to see whether signal was getting through to the different places. This was very interesting; I've never seen it done before! And very helpful, as we eventually figured out that I had swapped two cables. Then everything worked! Fixing these issues automatically fixed everything with the battery as well.

Ben also gave me a tip to reduce the number of lights so that less energy is required!

Imgur Imgur

After this, I realised I might as well take advantage of being in the lab. I decided to ask Ben what I should use to create haptic feedback, as I believe this could significantly improve the feeling of the shake. I also realised I wanted to use the same as my teammates, which happened to be the vibration motor from the auxiliary kit we got. After trying a bit with this, I got it working surprisingly fast! It is such a great sense of accomplishment to complete something on the technical side, as that is not my strongest suit.

Imgur

Week 10

Tuva Oedegaard - Thu 14 May 2020, 9:22 am

This week was all about feedback! Our team got together on Discord on Tuesday and went through all the teams we were going to appraise. At first, we tried watching the videos together by sharing a screen, but the quality ended up not being good enough. We watched the videos, read the documents and kept a collaborative document on Google Drive (we tried Notion first, but it wasn't very good at dealing with multiple users). When we had a comment, we wrote it down and then walked through everything collectively afterwards. I think it ended up working very well! We collaborated well, and we were all able to provide feedback that contributed to the full appraisal.

Now the next step is to look at the feedback from our peers and see what kind of interaction method I should use to adjust the brightness.

I will also go into Uni today to see Ben and find out whether I can make the battery work a bit better. So far it works well when connected to the computer, but it gets patchy once I connect it to the battery!

Week 9 post 3

Tuva Oedegaard - Mon 11 May 2020, 9:12 am

The end of the week ended up being dedicated to finishing the supportive document and working on the video. This took a lot longer than I anticipated. I struggled to find good sources related to my problem, I found it hard to know what to include in the interaction plan, and I had no idea how to create the objectives and the success criteria. For the latter I had to google a lot and ask the tutors, and I'm still not entirely sure whether it is enough or specific enough.

The video making always takes longer than anticipated, but luckily I managed to get it approximately how I wanted. Today I'm just going to fix my fonts and backgrounds so that they fit the group theme, and add the group video to my own video. Then tomorrow it's time for appraisals!

Week 9 Post 2

Tuva Oedegaard - Wed 6 May 2020, 6:15 pm

Tuesday class

In the Tuesday class I reported back that one of my concerns was whether I had tested the solution enough, and I got the suggestion to use the appraisal and the video itself as a user test. I will definitely do this and design my video prototype with that in mind.

Recording video for the group

After this class, I tried recording a video of a shake. I had tried on Monday, asking my roommate to film me, but the prototype wasn't working as expected and it took longer than planned, and when I later found out that he had not used the angle I asked for, I didn't want to ask him again. I ended up using the webcam I borrowed from Uni instead and filmed myself. BUT, it still wasn't working properly!

I spent I think around an hour trying to make it work. After a lot of back and forth trying to tape it and all, I realised that one of the cables connected to the NeoPixel strip had fallen off. Luckily I had another version available, but I had to change the code to fit a different number of LEDs. Then, I realised the lights turned off whenever the battery was flipped. I had described it as "patchy" earlier, and this was probably because the connection to the battery was bad. Luckily, again, I had another battery lying around for the Photon, and things seemed to work when I used that.

A third issue was that the shake detection seemed to be a lot worse once I disconnected the cable from the computer. The lights were now lighting up, although a bit weaker than with the computer, but every time I tried filming, the shake wouldn't work. When I plugged the cable back in, the accelerometer values were just 0, and nothing other than taking the cable in and out seemed to fix it. In the end, the solution seemed to be to insert the battery AFTER the cable was removed, so that the values wouldn't null themselves out or something similar.

Imgur

Working with the report and video

Tuesday and Wednesday were spent working on the document and the video for the deliverable. I find it hard to know what the task is looking for and what is enough. I've tried asking Alison about it, but no answer yet, so I can't progress more than that. I am also unsure about the objectives: what format they should be in, how specific they should be, etc. ("tap functionality should be done by 1st of June" or "an appropriate functionality to solve specific issue should be completed").

I've started planning the video too, which is progressing well. Hopefully it won't take too long to make!

Imgur

Week 9

Tuva Oedegaard - Mon 4 May 2020, 10:26 am

The first thing I did in week 9 was to get the wiring of my project right! I used the cables and breadboard I got from Thomas and wired it up correctly. Last week I tried putting it together with tape, which was very unsuccessful. The next thing I tried was to use the black tips of the jumper cables as "containers" to hold the male ends together, as I had a lot of male ends that needed connecting. I started doing this because one of them fell off randomly, and it worked to a certain extent, but I later discovered that it was very difficult to actually hollow out the black tips.

After putting it all together, it was actually pretty hard to fit everything into the ball. I didn't want the LED to be directly against the wall of the ball, because then the light would be too strong, but I couldn't find a way around it in the end. It would also have been easier if I could just put everything in there, tape it and leave it, but I need access to the ESP32 both to take the battery in and out (when I'm not using it) and to insert the cable in case I have new code or want to charge the battery.

Imgur Imgur

Week 8

Tuva Oedegaard - Mon 4 May 2020, 9:16 am

Building

Last week I made a prototype including a vibration sensor and NeoPixel LED strips. One of my research questions was whether the shake was too sensitive, and I mentioned in class that I found it hard to test on people as they would have to come to my computer. With that, they suggested using an ESP32 or Photon to make it wireless, which I found intriguing. In addition, I knew before doing any tests that the sensor was too sensitive, so I wanted to improve it further by using an accelerometer/gyroscope to detect movement and shakes.

After talking with the tutors and planning on Tuesday, I went to the University on Thursday to pick up a bunch of sensors.

I then spent Friday playing around with the ESP32 and the Photon! In the beginning I found it hard to transfer Arduino code to the ESP32, as, for example, the ESP32 has 3.3V and the Arduino has 5V. I couldn't get my initial setup with a standard LED to work, but I tried once more after Ben told me the voltages should work the same, and it worked!

Imgur Imgur Imgur

The last one shows it working with a battery as well!

I tried playing with the Photon as well, but Ben mentioned that the Photon has a bit of its own ecosystem, and since I'm collaborating with my team, things would be easier to make work on the ESP32.

Imgur

I then started working with the accelerometer together with the ESP32. It was difficult to find tutorials for this, but in the end it was fairly simple to connect the wires (ground goes to ground, etc.). Once I got it working, I had a bunch of numbers on the screen telling me which way everything was going, but no idea how to make sense of them. I had tried some libraries earlier, but these seemed to only work with the Arduino, not the ESP32. Clay helped me make sense of the numbers by creating a formula for a sort of "total amount of movement", and I looked at the numbers and set my own threshold.
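I don't have Clay's exact formula written down, but a common way to collapse the three axes into one "amount of movement" number is the magnitude of the acceleration vector, compared against a hand-tuned threshold. A sketch of that idea (my reconstruction; the threshold is just a placeholder):

```cpp
#include <cmath>

// Collapse three accelerometer axes into a single movement value:
// the magnitude of the (ax, ay, az) vector.
float movementMagnitude(float ax, float ay, float az) {
    return std::sqrt(ax * ax + ay * ay + az * az);
}

// A shake is detected when the magnitude exceeds a tuned threshold.
bool isShake(float ax, float ay, float az, float threshold) {
    return movementMagnitude(ax, ay, az) > threshold;
}
```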

Imgur Imgur

Next step was to hook up the LED strips the same way as previously and try to merge the old code with the new one!

Imgur

Now I had a wireless version of what I had before! One issue, however, is that my breadboard is too big to fit back into the ball. I asked the tutors whether I could pick up female jumper cables and a smaller breadboard, but no answer so far.

Improving

Now that I had the technology working, I wanted to improve the shake from what it had been before. But what is a good sensitivity of a shake? Should it change after you've done the shake, or as you're doing it? How sensitive?

I ended up doing some research. The iPhone has a built-in feature where, if you shake it after you've written a text, it asks if you want to undo what you have done. I tried playing with how sensitive this was, but I think if you push cancel a few times in a row, it assumes you are just in a shaky place and don't really want to be prompted, so I couldn't test it for long. What I noticed, however, was that it had haptic feedback: a vibration that tells you that something has happened. I ended up searching for "shake" on the App Store and downloading the first apps that seemed to be based on a shake. Here is how they worked:

  • Truth or dare: Like a dice, get to see it spin and gradually go slower and present a result.
  • ShakeTips: Shake for tips, receive vibration and audio feedback. If you shake for too long, it won't do anything
  • Shake it: Pick a random thing from a list. Same, if you shake it too long, nothing happens. No haptic or audio feedback, changes when you stop shaking.
  • Shake to charge: Tracks while you shake, stops around half a second after you stop. Only visual feedback.
  • Baby Toys: Shake or touch. Audio feedback while shaking it. Stops when you don't shake it.
Imgur Imgur Imgur

I noticed that the ones with vibration feedback were a bit easier for me to use, so this might be worth exploring in a future version. Most of the apps gave feedback once you stopped shaking, which gave me a good indication of when the colour should be displayed. I played around with it for a bit, seeing what felt natural. With the end result, the ball won't change colour when you just pick it up, as it did previously. I even tried throwing the ball up in the air, which is Marie's interaction form, and it didn't count this as a shake! That is really good, as it was part of the aim.

Future

On Sunday I went to Thomas and Sigurd's place and used their sandpaper to make the surface of the balls matte. We ended up chatting about the problem of making everything fit in the ball, and it turned out they had both female jumper cables and a smaller breadboard! So I will spend today and this week putting it all together with the new parts so that it will fit in the ball!

Week 7

Tuva Oedegaard - Sun 26 April 2020, 8:32 pm

This week I worked with the prototype, analysed the diary study results and started planning the additional interviews.

Working with the prototype

After playing around with the light sensors over the break, I finally got the vibration sensor. I tried setting it up, and to my pleasant surprise it ended up being fairly easy to make it work. I gradually added more functionality: first just printing something when the vibration was triggered, and then merging the code for the light and the vibration together. I found ways of generating random numbers online and created a function that displays a random colour, triggered every time the vibration sensor fires.
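The random-colour step can be sketched like this in plain C++ (on the board itself I'd use Arduino's random(); std::rand() stands in here, and the struct name is my own):

```cpp
#include <cstdint>
#include <cstdlib>

struct Rgb { uint8_t r, g, b; };

// Pick a random value per channel; called whenever the vibration
// sensor triggers, so each shake shows a new colour.
Rgb randomColour() {
    return { static_cast<uint8_t>(std::rand() % 256),
             static_cast<uint8_t>(std::rand() % 256),
             static_cast<uint8_t>(std::rand() % 256) };
}
```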

I then cut open the see-through Christmas decoration balls I bought last week (a lot more effort than you would think, as I didn't have any sharp knives!!), put the sensor and lights in and taped it together. Initially, I thought I would have to solder to get longer wires, as the short wires made it hard to properly shake the ball without everything falling apart. I texted a few friends to see if they had a soldering kit, but that same day I received the auxiliary kits. They contained wire extenders, exactly what I needed! So I don't feel the need to solder in the near future.

Imgur Imgur

I also had a walk to the city and walked through the shopping mall to see if I could find something else that would work better as a ball. I know Lorna sent a few links to different balls, but I find it really hard to know the texture and "squishiness" of the balls without seeing them in real life. Most of the stores were closed, but I saw two balls that were sorta what I was looking for in terms of texture.

Imgur

This one is not round and is shaped like a panda, but it had good squishiness, and the white colour would make it possible to see light through it.

Imgur

This one would have been really good if it were plain! I didn't buy either of these, but at least now I know a few more things to look for (maybe dog toys?)


Analysing diary study

Last week I completed the diary studies and I spent this week gathering the data. I created an overview of all the answers and then mapped every story to an emotion.

Imgur Imgur

I found that

  • Bright green
  • Black/grey
  • Light blue
  • Bright Pink
  • Yellow
  • Dark blue
  • Light green
  • Orange
  • White
  • Red

were the most used colours, and that happy/excited and content were the most shared emotions. The results showed that the participants seemed to have understood the concept, and their stories were interesting. It was interesting how differently they chose colours, though; even a single participant could give different colours to stories with the same emotion (e.g. P1 was worried one day and said black, and then the next day was worried about a different thing and chose orange).

Planning interviews

In the proposal, I said I wanted interviews to find out

  1. How sensitive the sensor should be
  2. What other interaction forms I can use to make adjusting the colour easier.

I found it difficult to ask friends about how sensitive it was, because then I would have to make them come to my room and touch it... I will try to ask my roommate, but it is still a bit cumbersome, as the device has to be connected to my computer, and simply asking "do you think it is too sensitive/not sensitive enough?" might be a bit hard to answer while it is connected to wires and all. I might have to figure out a better way of doing it, or postpone it a bit. I have already realised it is probably a bit too sensitive, so I will try to fix this first.

With the second question, instead of having a complicated interview, I decided to go for quantity instead of quality. When I have been chatting to friends and family (and tutors) about the idea, I grab the opportunity to ask them as well: "Now that you know the concept, say that you want your message to be blue (or whatever colour the ball is if I'm holding it). How would you make this brighter?" It has been interesting to see people's reactions, because it seems to need some thinking; it's not very intuitive to them. But so far I have gotten

  • Bounce up
  • Throw in the air (and measure the length in the air to distinguish from the "send" throw)
  • Throw to the wall
  • Twist two different halves of the ball
  • Twist it to the side (holding the ball and twisting your wrist)
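
The "throw in the air" suggestion already hints at how it could be separated from the "send" throw: by measuring airtime. A hypothetical sketch of that classification (the threshold and names are my own inventions, nothing the team has decided):

```cpp
// Hypothetical: classify a throw by how long the ball was airborne,
// e.g. from an accelerometer's free-fall detection. A short toss
// brightens the colour; a long throw sends the message. The 600 ms
// threshold is an invented value for illustration only.
enum class ThrowKind { Brighten, Send };

ThrowKind classifyThrow(unsigned long airborneMs) {
    const unsigned long sendThresholdMs = 600;  // assumption
    return airborneMs >= sendThresholdMs ? ThrowKind::Send
                                         : ThrowKind::Brighten;
}
```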

I will continue to ask a few people next week to see if I can get a few more opinions, and then propose and maybe order parts if necessary so that I'm ready for the next iteration of the prototype.

The concept

Tuva Oedegaard - Tue 21 April 2020, 10:07 am

The concept

My team has gone with the strategy of working on the same concept. The concept works as follows:

Imgur

The idea is for people close to each other (friends, family) who in these times can't see each other to share thoughts. This is both to vent and to share positive emotions.

Because of the circumstances, we have divided the concept into four different parts. I will be working on the part of the prototype where the user shakes the ball to determine a colour, as many times as they like until they are happy with it. This is steps 3-7 in the drawing.

The ideal solution

Ideally, we would like to be able to put the individual parts of the prototype together as a coherent whole; in the end, a bowl of balls where each ball is able to send and receive data to and from other bowls of balls.

Midsemester break and Week 7

Tuva Oedegaard - Tue 21 April 2020, 9:28 am

Over the break

I wasn't initially going to work over the midsemester break, but at some point I ran out of things to do and ended up trying out some sensors. I played around with some ADAfruit Neopixel LED-strips I had from last semester and refreshed my memory on how to get them to light up.

Imgur

In addition, I bought some balloons. The team concept is built up of "balls" that light up in different colours. We hadn't been able to find an appropriate form for this yet, so I suggested balloons and wanted to see if they worked as I had hoped. The ball had to be both see-through (for the colour to shine through) and squishy, because one part of the concept is to detect when the user squeezes the ball. Some white balloons would fit that description.

However, this was the result:

Imgur Imgur

As the images tell, the "ball" didn't really end up looking like a ball. In the meantime, however, the boys in the group had been out shopping and found some other balls we could potentially use.

Imgur

Later in the week I went shopping for the same balls, and am yet to try them out this week!

Our group also ended up paying for express shipping (1-2 days) for the sensors we wanted, since we saw that the Auxiliary kits wouldn't arrive in time. Instead of 1-2 days, it took a full week, so that was a bit annoying. I went to Thomas and Sigurd's on Sunday to pick them up and then met Marie at the University to give her her part.

In addition to the balloon trial, I ended up planning and starting my diary studies as well. I asked 5 of my friends to write down, every day, 1-3 messages and a colour related to each message. This was after explaining the concept to them via email. Throughout the week I reminded them sporadically, but they all seemed to remember pretty well. I got the results back on Sunday and am planning to analyse them today; excited to see what they did! It will be interesting to see how they interpreted the task, what kind of messages they would send (they were asked to send mostly positive ones, but venting was also fine) and what colours they used. I will also use the results to see which colours are the most common, so that those can be the first ones to be displayed on the ball.

Unrelated to the course, I somehow got caught up in an amazing wave of productivity and creativity over the break; I learned how to juggle, completed an online course on UI, started making my own website (with the backend from scratch!), worked on my thesis and did a whole bunch of workout/stretching things.

Week 7

This week I will focus on the prototype; start playing with the vibration sensor, try to map that together with the LED strips, and finally put it all together in the ball.

I will also aim to plan interviews and to gather and analyse the data from the diary studies.

Week 6

Tuva Oedegaard - Fri 10 April 2020, 3:04 pm

Between last post and this week

After my last post in Week 5, the group divided tasks for the report and wrote over the weekend. My part of the team section was the "Response to feedback", as well as describing the initial inquiry we did. I found it interesting to connect the dots back to where we started and see how far we have come. After each team member had written their part, it became apparent that there were still some gaps in our concept to be determined. The next section describes how we solved that. I also sketched the storyboards for our report. Initially, we had only one story, with positive emotions, but I suggested a scenario where we could use positive emotions to lift up negative emotions too, which I felt gave a better picture of a broader use of the concept.

Tuesday Class activity

In the Tuesday class, we did report back and then some activities. Although it was much shorter, and somewhat nice to hear about what people are doing over Easter, the report back part of the lecture still feels quite unnecessary to me. It is difficult to hear what people are saying, and as we are trying to keep it short, it doesn't feel like there is much value in hearing 'that someone has started prototyping'. Hearing about some people's ideas and how they have developed could be interesting, but so far there are too many people for it to give enough insight to be of value to me.

The next activity, however, was quite interesting, I think. It really encouraged us to explore new ways of doing user research. The document had been sent to us before, but this was the first time I actually opened it and was encouraged/motivated to read through and explore it. It actually gave me inspiration for methods of exploration for my individual part of the project. The miro board was a bit messy and difficult to navigate in, and the different sizes made it hard for me (on my computer at least) to skim through the other answers, but it worked as a way of collaborating compared to just typing locally.

The last activity for the class was to fill in a miro board together with the team to define our concept thoroughly. We thought we already knew everything about our idea, but wanted to complete the miro board to ensure we 1) had everything defined and 2) all agreed on the details of the idea. We did discover that our target audience was rather vague (in the report it was described as "people who are close with other people"), and together we agreed on a user group. The miro board also encouraged us to agree on some specific details of the concept, like timeframe and materials. Below is a snippet of our miro board.

Imgur

Further teamwork

After this class, we had a meeting where we decided who was doing which part of the prototype. Initially, when trying to decide which parts each person could do, we got slightly nervous that we might not have enough work. Later, when we finished writing our individual parts, I saw that we were able to come up with a lot of interesting questions and research methods for each individual part. This means that even though the technical part of this initial prototyping round might (might) be small, we still have a lot to explore and research, which in the end will benefit the full prototype.

I found it difficult to create a full plan for the full prototype because so many of our parts and future plans are determined by user research. We have a lot of add-on ideas for the concept, but it is hard to tell now which ones to go for and which to drop.

Thursday workshop

The Thursday workshop was an Arduino tutorial. Although I have worked with Arduino before, it was good to get a more basic understanding of how to read the circuit sketches. We ordered some parts we might need for the prototype on this day as well, and hopefully they will be here soon. We paid extra for fast shipping, as we realised that the Auxiliary kits might not arrive in time.

General thoughts

This week and last week have been a lot of work on the report, and the journal has suffered a bit from that, unfortunately. I have done a lot of work, but there is no point in re-stating everything, as it will become apparent in the report nevertheless. I personally think we have done a good job of refining and asking questions about the new concept. I am a lot more confident with this concept than the last one, and hopefully it can become meaningful to someone.

Week 5 - More idea refinement

Tuva Oedegaard - Thu 2 April 2020, 9:00 pm

Tuesday class

The Tuesday class consisted of reporting back on what everyone had done and any concerns. I felt this class was somewhat useless; it was difficult to hear what people were saying, and there was a lot of repetition, since every team member was answering the same questions.

Team discussion

After that, however, my team and I had a discussion on what to do with our concept. We tried defining who was doing which part of the prototype, and we spent the time coming up with a template for user research. We wanted to know more about our users before deciding on an idea; specifically, we wanted to know what they would want the purpose of such a concept to be and what they would get out of having their emotions on display, as in the concept.

Our intended learning outcomes of the interviews were:

  • Emotional Input
  • Most Valuable way of Translating the Input
  • Most Valuable way of Displaying the Input
  • What do we want people to leave with?

Interview conduction

The interview I did (digitally) was very interesting. I got a lot of new feedback on how to do it and how not to. My participant mentioned how they were good at sharing feelings, in the form of discussion. They did not see the point of our concept because they couldn't imagine being in a situation where you lived with someone you couldn't communicate with. On the other hand, they had friends who never wanted to show emotions, and they could not imagine them ever being like "I'm angry" or wanting to display their emotions to anyone they live with, except for their very close ones.

This was very interesting because it had me questioning the purpose of the concept - my interviewee didn't seem to want to display emotions like we had imagined - why not just talk about it? The only way they could see a visual or audio display of their emotions being useful was if they were at the extremes of negative emotions and did not want to talk about it (yet).

This sparked me to think that we might focus less on the negative emotions (which people might not want to share - or are already sharing), and more on the positive emotions. I made the sketch below of my thoughts before the Thursday workshop.

Imgur

Thursday class

The Thursday class was good because we got time to work on the project while having tutors available. We went through the data from the interviews and noticed a trend of people talking about their emotions if they wanted to, or being comfortable with not talking about them if they didn't. We had a chat with Steven, who prompted us to figure out what goal we were trying to achieve. We had an interesting discussion about the different things we wanted to achieve - from lifting the spirit at home to removing negative emotions. We wrote it down each time we had a different wording of the sentence and ended up with "Encourage positive emotional sharing with close ones remotely" as our main goal. Then we had to find questions and a target audience to "solve" this. We took a break and developed one idea each; my idea ended up being the one below:

Imgur

This way we would incorporate novel ways of interacting; it would be playful and hopefully fit the goal. I added several different tweaks and add-ons that could be considered as well.

After discussing all the ideas, we ended up with a concept very similar to the one I presented. It can be an interesting and fun way of sharing between households and lifting the spirit. Throughout the week we are going to write in the proposal about what we are planning to do and do some research on the field(s).

General thought

I'm generally still excited about this course, even though it has changed a lot. I am curious about the testing (if it happens at all), but other than that I am excited about the development and seeing how it all evolves. I think my team has done a good job of communicating. We have a lot of different ideas, and no one really knows where we want this to end up in terms of concept, and I think we are doing a good job of asking constructive questions to help ideate and build on ideas, rather than being critical.

Week 4 Wednesday and Ideation

Tuva Oedegaard - Sat 28 March 2020, 1:23 pm

Wednesday class

Wednesday was very similar to Tuesday, except we didn't present anything. It was interesting to see more game-centred concepts in this session, and I was thinking about ways they could convert the game setting to a more everyday-life setting. I spent the breaks thinking about our concept in relation to the feedback and what we had to think about in the upcoming discussion. I made an overview, which is included below.

Imgur

Feedback

We got a total of 25 posts of feedback, which is not even half the class. That is a bit disappointing. In addition, it shines through that some people write feedback just because they have to; half or more of the comments weren't that useful. E.g., we asked for other input and output variants and got a bunch of comments like "You should have more interactive output" or "other input would be cool", but no specific examples. We did, however, get enough feedback to know where to go with our idea, so it wasn't all useless.

Team ideation

On Friday we had a Skype meeting to further develop the idea and look at the rest of the feedback from the class. After discussing all the feedback, we used my sketches and realised that purpose would be the most important thing to figure out. Collective awareness and an outward display of emotions ended up being the purposes we wanted to focus on most, decided through a group discussion we all agreed on.

Further, we found our user group to be share houses. A lot of the feedback mentioned that it would be more useful to move our original concept, which was more of an installation, to an everyday situation. In the discussion on Tuesday we talked about moving it to the home, and on Friday we agreed on this being the new concept. Within this, our focus will be students. We thought this would be very interesting these days, as this is a group that is probably not used to being home as much as they have to be nowadays.

We also wanted to determine what we wanted the users to leave with when using our concept. We found this a bit hard to pin down; suggestions ranged from encouraging discussion to raising awareness, or just emotional relief, to improving communication skills. We decided to determine this point further after talking to some of our users next week.

Input and output were the next discussion points, which is where we started refining the idea more. The final idea ended up being a cylinder filled with balls that represent the people in the household. Each ball will somehow give or show feedback to represent how the person is feeling. Whether this is automatically tracked (with pulse, temperature, muscle tension, etc.) or manually input, as well as how we will represent emotions as an output (colour, haptic feedback, sound, etc.), is not decided yet. We consider this a good opportunity for individual exploration.

Status

After this discussion, we have more or less completely changed our idea, although the core idea of tracking how people are feeling and showing it outwards still stands. I feel like we have good opportunities for individual focuses here, although I am curious to see how it will come together (or not) in the end. The Skype meeting worked mediocrely; bad wifi made it a bit hard, but we got through it.

Week 4 Tuesday

Tuva Oedegaard - Tue 24 March 2020, 4:37 pm

After presentation reflection

We presented our idea through a pre-recorded video in the first session on Tuesday. Our team got a lot of useful feedback. Lorna's initial comment questioned where it was supposed to be and whether we could transfer it into a more everyday-life situation. In the break, we walked through some of the comments on Slack, and we continued with a few of these after the class session as well.

Some feedback focused on the context of the concept – this is maybe more of an "installation" type, rather than something someone would use or engage with in their everyday life. I was also thinking that we have a perfect situation to test in these times – most people are working from home, and it would be interesting to move the installation home to people who now have to work online but are not used to it.

Some people suggested different inputs or outputs or even making this user. Some had very valid points of the fact that this idea is simplifying the complexity of emotions. Maybe we’d want to explore how we can include the complexity of emotions in this fairly simple concept.

We should also consider a more concise user group. So far we've been targeting most people who pass by this point, but if we move it to homes we might wanna look at separating into, for example, family homes, share houses, people living alone, etc. This could be a good opportunity for a split in the group too, where we look at different target groups.

Imgur

Day 1 of online teaching - how did it go?

I must say I'm impressed it went as well as it did with the online teaching today, although Zoom crashed. It was annoying that we were not able to use video, as this gives a better feel of interacting with real people than just seeing photos. The video-sharing web page worked well! I am noticing that it's gonna be hard to complete the semester like this; I actively did not sign up for online studies because I like to be there physically. But there is not much to do about that.

Week 3 Team formation

Tuva Oedegaard - Sun 15 March 2020, 11:03 am

Bash

In the Wednesday session, we had a call-in from a former UQ student based in London. I was surprised that his #1 tip was to be friends with the recruiter, not to work on your CV or portfolio, although it makes sense. I have previously landed a job through a recruiter, and I can see how that is a good tool. The exercise he ran was very interesting; I ended up with the word invested. I feel like this represents me in a good way, because I never half-do something; I am always curious and, well, invested. I will now try to use that word further in my job-searching process.

Team formation

In the Wednesday session we also got our teams, which was no huge surprise for me. I have worked with all the team members before, which can be both for the better and for the worse. For the better because I know these team members are motivated, communicative and ambitious; for the worse simply because it will be the same input as before, no new perspectives. But I think we will be able to ideate well and still come up with creative ideas.

Imgur

Team ideation

We met after class and ideated, with Marie joining us through Discord. We spent the time getting to know the topic, emotional intelligence. I shared what the discussion was about when I was at the table, and we used the butcher's paper and posters to generate more ideas. The way we did it was to come up with input (external), input (internal) and output, and then try to mix and match. We planned a meeting for Friday and agreed that we all had to bring 2 ideas to the table for that meeting.

Individual ideating

For the individual ideation (to come up with 2 ideas as a base for the Friday meeting), I found myself doing the same things as we did in the meeting, coming up with the same or similar ideas. I therefore asked my roomie, presented the task and mentioned the different ideas I had. He happened to know about a brain-activity-tracking headset and helped me find it! It would be very, very interesting to track actual brain activity to somehow find out about emotions, or how someone is feeling, and use this data to display some output. We saw that the headset itself was $100, which is completely doable. What might be the challenge is the software to control the headset, which might be a lot more expensive. Below is some of my ideation process:

Imgur

We ended up ideating on how it is actually possible to train a headset to trigger a certain activity, and how this could be used, for example, to be picked up by another headset. An idea we had earlier was to somehow display how others feel about you (throw emoji balls at someone), and we tried imagining how this could be done with brain-activity headsets instead. We also explored how we could use some sort of tracking software that tracks the tone/emotions of the things you are typing (like Grammarly), and, for example, track Twitter and display somewhere how the world is feeling at the moment.

I had a lot of fun ideating this topic, as a lot can be done. It is, however, easier to come up with "blue sky" ideas rather than realistic ideas. These were the ideas I ended up suggesting for the group:

Imgur

Idea decision

At the Friday meeting, we each presented our ideas to each other. I think we handled it well in terms of giving each idea enough attention, and we had a small ideation round for each idea. The process was all in all a group discussion where we built on each other's ideas and that way came up with new ones. I tried asking questions that I knew we couldn't always answer, like "Where would that be?" or "How can we make this fit into everyday life?", or even playing with ideas like "What if we changed this to a different output?". This worked to refine ideas and play around with concepts. We ended up with a lot of different outputs, inputs and variants of the final idea. In the future, I would love to see how we can make this idea more purposeful, but for now it is sufficient.

Week 3 - World Cafe

Tuva Oedegaard - Tue 10 March 2020, 2:31 pm

On Tuesday in week 3 we had a World Café in class. This was a session where I got to explore all the different ideas and think about how they could be developed with different contexts and different target groups, refining ideas and going from dreamy, futuristic ideas to how we could actually implement them in the real world.

We had a total of 8 rounds with three different topics. I forgot to take photos of all the tables I was at, and it is therefore difficult to remember all of the specific topics. I will, however, elaborate on the ones I chose, and the ones I remember as being particularly interesting OR difficult.

Emotional Intelligence

With this topic, we explored how emotions can be expressed or communicated outwards. We explored what form an emotion could be expressed through, like colour, temperature, the size of something, sounds/music, etc. We noted the fact that emotions are very, very complex, and that it is hard to, for example, display a device as red because you are angry, because you would never feel just anger. We explored how we could show different colour patterns or gradients to visualise different emotions or how you are feeling. That got us further to the controversial thought: what if everyone lit up (halo, clothing, etc.), displaying their emotions and feelings? There would be a lot of privacy and accuracy concerns with that idea, but it would be an interesting concept to explore, because we are in a world filled with mental health issues and a wide range of emotions. It would be interesting to see what would happen when you couldn't hide what you were feeling. Another aspect was to display the emotions of a whole building together, so cities would light up in vibrant colours.

I was at this table in round 2, so I never got to explore how this could be put to life, but I had a look at the table after the last round and saw they had focused more on the "expressing emotion" part, where you throw things or do physical things to express how you are feeling. This was my first choice of topic, as I thought it would be interesting to explore further.

Imgur

Change through discomfort

This was my second choice of topic. I was at this table in round 3, where we got to explore how we could make the ideas come to life. Previously it had been explored what kinds of situations you'd want to change, and I tried discussing whether we'd want to make something controversial that didn't necessarily aim at changing a bad habit, but at change in general. We couldn't think of many ideas for that except making friends on the bus, but that would be an interesting area to explore. The most recent previous group had put together ideas for 1) procrastinating, 2) getting out of bed and 3) compulsive swearers, so we tried finding something in that space that could be realised in the real world. It was slightly difficult for those ideas, as most were already invented in some way. We looked at a previous concept where the "change" was "avoid saying Ahh/Uhmm when speaking publicly". We thought this could be a fun concept to play around with, and ended up with an idea in the form of a light that lights up every time someone says "uhm" in a public-speaking setting. The light would be visible to everyone, including the speaker, to constantly remind them. This might throw them off, but it sure would be uncomfortable and encourage change.

Imgur

Guided Movement

This was the very first table I was at and also my third choice. We explored how this didn't have to mean reaching a physical destination; it could also be a mental state or a goal to reach. Many of the ideas provided were about exercising and turning it into a game, but some were about travelling and reaching a destination. We explored different ways of getting to a destination, for example shoes that vibrate to tell you which way to go. We explored different kinds of notifications, such as gradual vibration notifications or pressure wristbands.

Digital sensations

I wanted to mention this topic because I found it very difficult to work with. I was there for the last round, where we were supposed to come up with realisable concepts. The ideas that were there before built on one idea (control the weather through meditation and breathing), but we found this difficult to turn into something realistic. We tried twisting our brains towards sensations with the technology we have in our daily lives. Buttons were mentioned as a physical sensation that is removed in the mouse-and-touch world, so pressing physical buttons would be very interesting, but not particularly feasible. Another concept we came up with by the end was to make the shapes of the letters you are going to type, either by using your full body or by one- or two-handed sign language. This would be realisable because we do have technology that can recognise shapes through a camera, but we struggled to find any good purpose for it.

Summary

I thought this activity, the World Café, was very encouraging and inspiring. However, I thought it was a bit difficult to "land" in round 3. So far we've gone above and beyond with our ideas, not thinking about implementation, but now we had a short amount of time to make them realistic. It was difficult not just to think of new ideas but to think with a realistic perspective, and I experienced that we weren't able to come up with any very good ideas.

In addition, the hosts weren't always good at accommodating new people, so I found it hard to put my head into a new topic. A final challenge this activity faced was people who didn't want to participate. It left a lot of the communication to only a few people, which is a shame, because more minds means more ideas.

Week 2 - Idea feedback

Tuva Oedegaard - Sun 8 March 2020, 7:48 pm

Add on about the feedback

After the feedback written by my peers was posted, I had a look to see what comments were written about my idea. There was a limited number of comments, but everyone tagged the idea as both novel and clearly communicated. Some comments mentioned that this would be a cool idea and that it could be interesting to add several interaction types, not just holding on and that being it. One suggestion was to match the tunes/sounds with an existing song instead of composing a new one, which could be an interesting take on it. Another comment questioned whether music would make people happier in a hectic situation. It would be interesting to experiment with different sorts of emotions and purposes for the music - is it just to have a fiddle with while being bored on the bus? Maybe it could trigger some sort of emotion, either enhancing how people are already feeling or aiming to do the complete opposite? A controversial version would be to intentionally trigger sadness/anger.

Week 2 Progress

Tuva Oedegaard - Thu 5 March 2020, 8:44 am

Presentations

This week was dedicated to presenting initial concept ideas for the course. It was very interesting to see all the different ideas and inspirations people had, especially since we did not have a set topic to stick to. One of the ideas that excited me most used pressure pads to create different colours as a game. While the idea itself was fairly simple, creating variants of colours using pressure seemed very interesting. Perhaps one could extend the idea so that, instead of only using pressure, you could add gestures, or even pulling and expanding the material. I also noticed that some people were still stuck on screen-based interaction, which made me question why it had to be a screen. I saw many trends in the domains people were exploring, such as plants, the kitchen and mental health.

Regarding my own idea (hearing sounds through your hands and creating music through a wire on the bus), I still think it is interesting and has a lot of potential. I got feedback suggesting we could explore giving different sounds depending on where you are standing on the bus, which could be interesting. I would love to see the suggested technology in real life.

In class we were prompted to do a paper critique of each idea, which was very useful for staying focused and not zoning out!

Outside of class

This week I completed my last required sessions: soldering and the space induction. They were very useful, and I am excited to start using the spaces.

Imgur

Bus Composer - Ideation exercise Week 2

Tuva Oedegaard - Mon 2 March 2020, 2:55 pm
Modified: Mon 2 March 2020, 6:48 pm

Poster

Imgur

What is it?

This idea takes place on the bus. It consists of a number of wires installed along the ceiling, at the same level as the handles that standing passengers hold on to. The experience uses a new technology – allowing you to hear music through your hands! Everyone on the bus can grip the wires and hear the music they are playing. The music is based on temperature, pulse and other personally identifying factors, which are used to play a tune or beat. The sound is unique to every person, and the more people holding on to the wire, the more interesting the music gets!
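As an illustration only, here is a minimal sketch of how two readings (temperature and pulse) could deterministically produce a personal tune. The pentatonic scale, the seeding scheme and the function name are all my own assumptions for the sketch, not part of the concept itself:

```python
# Illustrative sketch: map per-person sensor readings to a short,
# repeatable note sequence. All choices here (scale, seed mixing,
# the simple linear congruential generator) are assumptions.

PENTATONIC = [60, 62, 64, 67, 69]  # MIDI note numbers, C major pentatonic

def personal_tune(temperature_c, pulse_bpm, length=8):
    """Derive a deterministic sequence of notes from two readings."""
    # Combine the readings into a single seed value.
    seed = int(temperature_c * 10) * 31 + int(pulse_bpm)
    notes = []
    for _ in range(length):
        # Advance a simple LCG so the same person gets the same tune.
        seed = (seed * 1103515245 + 12345) % (2 ** 31)
        notes.append(PENTATONIC[seed % len(PENTATONIC)])
    return notes

print(personal_tune(36.6, 72))
```

Because the mapping is deterministic, the same person (same readings) always hears the same tune, while different people get different ones.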

This idea is based on the fact that the bus is boring. I am building on human values like appreciation, and encouraging openness, acceptance, exploration and listening. With this, people can open up and perhaps start a conversation. Regardless, it will encourage a way of interacting they have never encountered before, and everyone joining will impact the installation differently.

The Bus Composer is for anyone who takes the bus, at any time, of any age or background. It could be interesting to see how people interact with the installation at different times of day; several people interacting with it during rush hour would produce some potentially hectic, or amazing, music.

Inspiration

For this idea, I was inspired by an exhibition I once went to in San Francisco, where you could hear music through your teeth. https://www.exploratorium.edu/exhibits/sound-bite

In addition, hearing glasses are now available, which made me think that hearing through your hands could be possible in the future. https://www.zungleinc.com/

publictransport collaborativemusic exploration bus

Week 1 Progress

Tuva Oedegaard - Sun 1 March 2020, 5:25 pm
Modified: Sun 1 March 2020, 5:59 pm

This week

This week we were introduced to the course, with assessment items, learning outcomes and more. I attended the UQ Innovate induction, which was very useful. In addition, I completed all the required online tutorials prior to the session. I understand that these provide important information regarding safety in the makerspace. On day 2 we spent time ideating and discussing the challenges, which was very interesting.

Imgur

I had the challenge of ethics, security and privacy, which is highly relevant for any technology (and most aspects of life, really). We found the challenge to be about the different concerns around these topics, specifically regarding IoT, online social networks, big data, healthcare technology, VR, intelligent environments and cybersecurity. An example could be the different ethical considerations and concerns that come with new technology such as smart homes. I found it very interesting that healthcare technology is considered its own category here, as it covers a broad spectrum of different technologies. We found that government regulation had not been covered in the paper, which we feel is a very relevant topic. We also discussed how there should, in general, be a bigger focus on privacy, and how you can actually design FOR privacy instead of considering it afterwards. An example of this is installing cameras to monitor a person with Alzheimer's. This can feel uncomfortable for visiting friends and family; instead, one could design for privacy by thinking about what the monitoring is actually needed for, and for example design a wearable device like a watch instead.

Imgur

For the ideation section we used Awais's method of different sets of cards with different factors. I really enjoyed this method, as it forced us to think of very specific scenarios that we would not have thought of otherwise. It was similar to a method he has introduced me to before, the twist cards. One thing I found slightly cumbersome/confusing was step 3: we were supposed to select an idea that we liked and build further on it. Selecting an idea was a bit difficult, as we didn't yet know everyone's ideas. The solution was that everyone quickly read out their ideas from the silent round, but I did not think this was sufficient to get our minds into the ideas. The method was really useful for sparking our creativity, but this step could use a little more work, e.g. a method for how to present the individual ideas.

Imgur Imgur Imgur Imgur

For the idea generation for next week's assessment, I chose to come up with specific interaction scenarios and think of different ideas for each. In addition, I tried to break down each word of the assessment description to understand what the task was asking for. I discussed this with a tutor to get some clarity, which was really helpful. My breakdown is in one of the photos below.

Imgur Imgur

Introduction

Tuva Oedegaard - Wed 26 February 2020, 1:25 pm

Who am I

My name is Tuva and I am in my third semester of a Master of Interaction Design. I am from Norway and am 23 years old. I have a Bachelor of Applied Computer Science from Oslo Metropolitan University, and my main interest is user experience. On a general basis, I would love to get better at designing user interfaces.

PhysComp

In Physical Computing I am really hoping to learn something I have never done before. I am mainly web-focused, working as a front-end developer, and that is probably what I'll work with in the future as well. Because of this, physical interaction is not something I'll get to do later in life! I hope to come up with an idea that can be changed, developed and tested, and to see a product I can be proud of in the end. It would be a lot of fun to try the different physical tools we have been introduced to, because it is likely I'll never get to work with them again.