Documentation & Reflection

Week 14 & Exhibition

Sicheng Yang - Fri 12 June 2020, 6:28 pm
Modified: Fri 12 June 2020, 6:28 pm


Finally it was time for the exhibition. Although the course has no fixed schedule from now on, there is actually still a lot of work to be done.

Live Demo

For the exhibition, I re-created a short live demo of about one minute, starring my dear roommate.

The storyboard is almost the same as in the previous video, but this version cuts between two different views, which turned out to work really well: the audience can see the actual effect of the small display in the helmet much more clearly. Although the footage is still a bit blurry due to shaking, I think the new UI is easier to recognize than the previous version.

Project code online

To make it easy for visitors to view the complete code of my project without damaging the structure of the website (the code is, in fact, quite long), I chose to host the complete code on GitHub and present only a few fragments in the portfolio. The link is here.

As for rendering the code, I found a useful plug-in, Rainbow, which provides some common editor themes for highlighting code on web pages. I ended up using the Monokai theme, one of my favorites, which also fits the dark theme of the portfolio.

Live stream equipment


I purchased a mobile phone holder that mounts on a tripod, which lets me demonstrate my prototype from more flexible angles, especially for showing my small display.

Unexpected IDE breakdown

Unfortunately, my Windows system installed an update at noon on the day of the exhibition. I strongly suspect this patch broke my Arduino IDE; even the web Arduino IDE cannot connect to the serial port. This made me very anxious, because it meant I might not be able to fix any problems that came up during the demo. In the end the exhibition finished without serious errors, so I was very lucky. But my IDE still cannot be opened even now. Come on, Microsoft.


Although the exhibition was online, the event was generally very lively and even a little busy. I didn't have much time to visit other people's projects besides giving my own live demo, which is a pity.

Thanks to Nick for taking this screenshot for me, because I totally forgot to.


Many visitors came to our channel; many of them were our former tutors or lecturers, and even my current bosses. It was actually quite nerve-racking to show them my work. Fortunately, the presentation went smoothly and there were no big bugs.

But I hadn't counted on the phone camera overheating. I had no previous experience using a mobile phone for a long live stream, and its heat problem was more serious than I thought. In the second hour of the stream, the phone almost froze and I couldn't even quit Discord. Strangely, other people could still see me on camera. As a result, when Clay arrived, I could only crouch in front of my laptop and nod my head up and down. But anyway, it kept working.

It seems that a thorough rehearsal is what guarantees a good exhibition. I hope I will have the opportunity to do such a rehearsal next time.


This was my last exhibition of my master's study. But anyway, the online exhibition brought us a very novel experience. It was nice to see many familiar faces gathered in a virtual space, talking to each other.

I spent some time noting the results of the exhibition in the portfolio. In fact, most of the visitors did not provide much feedback; after all, an online exhibition suits concept-related demonstrations, but visitors cannot actually experience the device. Still, I got one interesting suggestion: the helmet could be used not only for jogging but also as a meditation breathing device. If the device can eventually be made portable enough, this is a direction worth exploring, with different built-in modes supporting breathing training in different scenarios, since users could carry the device anywhere. Such a change could also further integrate our group's explorations of breathing training in different directions, including using AR effects to realize the concept of Paula's art installation.

It has been a remarkable experience over the whole semester; glad to have spent this hard time with you all 🎉 .

Week 13 Journal

Sicheng Yang - Fri 12 June 2020, 5:19 pm
Modified: Fri 12 June 2020, 5:21 pm

This week was the last week of class this semester. After dealing with other messy things, I put most of my energy back into this course, so I made a lot of new progress this week.


This week I made improvements to the prototype. The UI, which user testing found too information-dense to understand while jogging, was redesigned. In the previous design (images below), I presented the next four steps of breathing at a time, with the purpose of giving users a prediction. In fact, this increases the user's cognitive load, which has a negative effect especially during sports: the user cannot extract any effective information.

old UI design

But users generally thought that graphical visualization was a good idea. Synthesizing these conclusions, I decided to present only the current step on each screen, while using the pattern by which the graphics appear to give the user a prediction of the next step.

new UI design

The new UI design is shown above. It uses a circle as a metaphor for the mouth and breath: a filled disc indicates that the user should inhale, while a hollow circle indicates that the user should exhale. The two sizes show whether this is the first or second step of a phase, and I also placed large letters beside the graphics to help users interpret them in text. After I talked with my team members, they all thought it was a better solution, but I will still conduct user tests to verify its actual performance.


Making a portfolio is actually a challenge. I have to admit that the production cycle was too short; I spent most of my energy on writing content and allocated less time to design.

The design originates from a cool helmet photo I took against a door panel. I did not hesitate to use it as the banner, as I mentioned in a previous journal, and then decided to build my website in a cyberpunk style.

cyberpunk color scheme cyber-banner

I referred to a cyberpunk-style color scheme in an attempt to present the title in a neon way, but this did not actually give the website a good visual effect, so I gave up that plan and decided to use yellow and black as the main color scheme.

final style

But at least I tried hard to improve the visual effect of the homepage. I added a fluorescent glow effect to the title, which I think is quite cool. I hope some users will notice it.

banner animation

In addition, I wrote a simple jQuery function that automatically grabs the section headings to generate an in-page navigation index, so I don't need to worry about updating the index when I modify the content. The function takes accessibility into account: the index is hidden entirely when JavaScript is not enabled, so it will not bother the user. Unfortunately, I ended up using the function on only two pages, so it may not really have saved me time.

const createIndex = () => {
  const titleList = $('.main-content h1');
  const pageNav = $('#page-nav');
  const navTitle = $('<h1>');
  navTitle.text('Page Index');
  pageNav.append(navTitle);

  titleList.each(function (index) {
    const title = $(this).text();
    const titleId = $(this).attr('id');
    const titleIndex = $('<p>');

    titleIndex.attr({ 'data-link': `#${titleId}` });
    titleIndex.text(`${index + 1}. ${title}`);
    titleIndex.addClass('page-nav-item');
    pageNav.append(titleIndex);
  });

  $('.page-nav-item').click(function () {
    const link = $(this).attr('data-link');

    window.scrollTo({
      top: $(link).position().top - 30,
      left: 0,
      behavior: 'smooth',
    });
  });
};

Next week we usher in the exhibition. It feels a little unreal that the semester is over in an instant, but looking at the prototype in my hands, I think I have done a lot of good work this semester. Looking forward to the exhibition next week.

Week 12 Journal

Sicheng Yang - Sun 31 May 2020, 11:47 pm
Modified: Fri 12 June 2020, 5:33 pm

This week was relatively busy. The upcoming poster and demo for my thesis took up most of my time, so this week was mainly retrospective reflection and small improvements.

Work done


This week I went to buy a 9V battery connector. In testing it performs very well: it removes most of the weight from the helmet and improves the experience. I actually wanted to use a 9V battery early on, but when I searched Jaycar I assumed this was a connector specific to the Arduino, so I never found it. I recently realised that it is a universal type, and finally found it in the battery-holder section. Really unexpected.

battery connector

In the studio we discussed the final touches to make the prototype look complete. I realised it was time to cover the bare wires and the Arduino. In the end I got advice from my tutor to use a beanie (a new term for me). This is a good solution and adds a bit of vitality to the prototype. But considering that my prototype needs to be worn during exercise and the cap needs extra fixing, maybe I can try clips.

Beanie on the helmet


I have reworked the median filtering for audio and pace. At the beginning, I chose average sampling: taking a hundred samples and averaging all of them. This was a big improvement over direct sampling, so I always assumed the method was effective. After getting the idea of median filtering from Clay, I still insisted on averaging at the end, but narrowed the range to the 10 values around the median, as shown below.

  int arr[100];
  int sum = 0;
  for (int i = 0; i < 100; i++) // get sensor data 100 times
  {
    arr[i] = analogRead(soundPin); // data from analog in
  }
  sort(arr); // bubble sort, see below
  sum = 0;
  for (int i = 45; i < 55; i++) // get 10 values in middle
  {
    sum += arr[i];
  }
  sum /= 10;

As a result, the sampling values I obtained were still not very stable. Only recently did I suddenly realize that averaging around the median adds nothing to a median filter, so I removed that step and used the true median directly.

  int arr[100];
  int sum = 0;
  for (int i = 0; i < 100; i++)
  {
    arr[i] = analogRead(soundPin);
  }
  sort(arr);

  sum = (arr[49] + arr[50]) / 2;

In fact, using the median directly brought a huge improvement in sampling accuracy, especially for the pace calculation. This was very unexpected to me.

But I also learned an important lesson from this: sometimes giving up an inefficient solution brings more improvement to the project. Being unwilling to abandon something completely just because I have put effort into it only brings more restrictions.

But in any case, this is very exciting for me. It solves a long-standing problem, and the fix is so simple.

Work to do

Next week I will concentrate on making the portfolio, including the videos and content shown in it. I have now made a skeleton website, so I only need to fill in the content. Should be easy.

But I also want to try some new tricks, such as using multiple fixed backgrounds to create sliding effects. It depends on the time available.

Week 11 Journal

Sicheng Yang - Sun 24 May 2020, 10:11 pm
Modified: Sun 24 May 2020, 10:12 pm


This week I tried to improve the prototype. Based on previous user feedback, I planned to move the Arduino and battery compartment to the waist and connect them to the helmet with jumper wires to reduce the weight on the head. This was a difficult decision: the hardware part of my prototype was close to complete, and it is mentally difficult to disassemble and remake it. So I first tested how wires running to the waist feel, and whether the Arduino could be fixed at the waist.

My idea was to hold the Arduino in a sports belt. I first experimented with the battery compartment: it still shakes inside the belt, which is not very stable, and the many jumper wires on the Arduino make it hard to fix any further. Another problem is that with wires running from the head to the waist, the head actually feels constrained, which is not convenient for exercise. So I finally decided to abandon this change.

long wires

Another direction of improvement: the major source of the weight on the head is the battery pack of six AA batteries, so I bought a 9V battery as a replacement. I don't have a 9V battery connector at the moment, so I only used jumper wires to test it, and it has worked fine so far. I plan to buy a connector at Jaycar next week.


Name of concept

This week I finally thought of a gorgeous name for my project: Breath I/O.

I/O represents both breath in/out and input/output for interaction, which captures the concept of this project well. One task this week was to prepare a pitch of about one minute introducing the concept to an audience. So here it is:

People breathe every day, but nobody pays attention to it. But if you spend 10 seconds with me on your breathing… 1.. 2.. 10… have you felt your breathing rhythm? Did you know that keeping a good breathing rhythm benefits your exercise? Many beginners don't, or find it very hard to keep up because of distraction. But this wearable device can detect your breath and help you breathe at a good pace. All you need to do is put it on your head and enjoy your jogging; the small screen will notify you when your breath doesn't follow the rhythm.

Keeping it short but attractive.

Website

home page

The preliminary work on the portfolio started this week. At least we have a banner, hooray, and more content is still being added. Stay tuned ;).

Future work

In the next week, I will continue to improve the programming and UI. Based on the feedback, I'm trying to replace most of the text with graphics and to add a tutorial on first use, so that users can understand the interaction more clearly.

Week 10 Journal

Sicheng Yang - Mon 18 May 2020, 4:59 pm

Work done & Reflection

This week was mainly about the appraisal. Our group watched videos and documents from HiDistinction, Hand Gang and ZooKeeper, and some of them inspired me. For example, Kasey's use of 3D printing to make a shell is very pleasant: although the technology behind it is not complicated, the carefully designed result is very good. The hand made by Ryan is also very interesting: although its function is not yet complete, it is strongly playful. If classes were offline, we could experience these prototypes up close.

Kasey's work Matt's work

It also made me think further. My thesis is also a design project on environmental protection and energy saving, and some of the literature I have read points out that people easily feel guilty when given negative feedback about energy use, end up sacrificing their comfort, and quickly stop using such products. Therefore, we should try to give users positive feedback to encourage them to protect the environment. I found some connection between my project and ZooKeeper's work, so I fed this view back to them. But this is a rather conceptual suggestion; I wonder whether giving it in week 10 can really help.

In addition, I also saw Matt's decompression device, made from materials that are easy to find in daily life, such as paper rolls. It reminded me that I use a hanger to fix my display; everyone has been building simply this semester. But if there is a chance, I still want to make a good-looking shell for my device, to present it to users more carefully.

Finally, we discussed the final exhibition. It occurred to me that my device, as a wearable, has a small display that only the wearer can see. This can be a challenge when streaming online, because the audience on the other side of the screen may find it difficult to understand what is happening. One alternative is to connect my prototype to the computer with a cable and show a simulated display to the audience; another is to find a way to hang a camera behind my head. I personally prefer the former.

my small screen

Week 9 Journal

Sicheng Yang - Fri 15 May 2020, 9:30 pm
Modified: Fri 15 May 2020, 9:32 pm

Prototype so far

The first relatively complete prototype has now been made. In the workshop report I got suggestions on how to fix the display, and I ended up using a bent hanger fixed above the helmet. Because the hanger is thick and triangular, it is generally stable, and the display can finally be seen clearly. As for the pedometer (actually an accelerometer), after implementing the median filter the values have become relatively stable, but in some cases one step is still recorded as two. Since the 2:2 breathing method used in this prototype is driven by the step count, an inaccurate count has a large impact on the user experience. But for the time being this is an acceptable result.

So I took a nice photo of it; using the black door as the background makes it look cyber-ish.

Prototype Helmet

Code sharing

I always forget to include this in the journal: I'd like to share the median filter implementation.

bool getIsBreathing()
{
  int arr[100];
  int sum = 0;
  for (int i = 0; i < 100; i++) // get sensor data 100 times
  {
    arr[i] = analogRead(soundPin); // data from analog in
  }
  sort(arr); // bubble sort, see below
  sum = 0;
  for (int i = 45; i < 55; i++) // get 10 values in middle
  {
    sum += arr[i];
  }
  sum /= 10;

  // judge if over threshold
  if (sum > 100) { // baseline is 70
    return true;
  }
  return false;
}

Above is the function that reads the breathing data and filters it with a median filter. The values from the sensor can jump to random low and high values that hurt performance, but under most conditions the data is correct, so the median is usually the most stable value. That way I get more reliable data.

The accelerometer uses the same method, but it involves more complex data reading through the Wire library, so I think sharing the breathing filter is clearer.

void sort(int myArr[]) // bubble sort
{
  const int len = 100; // sample count; sizeof(myArr) / sizeof(myArr[0]) would not work here, because an array parameter decays to a pointer
  for (int i = 0; i < len - 1; i++)
  {
    for (int j = 0; j < len - i - 1; j++)
    {
      if (myArr[j] > myArr[j + 1])
      {
        int temp = myArr[j];
        myArr[j] = myArr[j + 1];
        myArr[j + 1] = temp;
      }
    }
  }
}

And here is a classic bubble sort to sort the array. Dealing with the C language for the first time, I'm not used to having no Array.sort() to help me out. But these solid algorithms can always save me from the mess.

Video designing

Video design is always an interesting part. I think if I can't find a job in interaction design, I might be able to be a director. I asked Nick to act out a story: after running, he checks his app and then scratches his head, finding it useless. I think this part works really well.

Because the display of my prototype can basically only be seen by the wearer, filming is more difficult. In short, I shot one video of the user running with the helmet from the front, and another of only the head and screen from diagonally behind, and edited them together. The effect is better than I expected.

I also tried Fritzing to draw the circuit diagram, which is very convenient. I should have used it to design my prototype; then it might not be as tangled in wires as it is now.

circuit diagram

Here is the video; take a look if you are interested.

User Testing & Reflection

When I was shooting the video, I also collected user feedback.

The current feedback focuses on the helmet being too heavy, mainly because I used a pack of six AA batteries. Also, for lack of time, I wasn't able to build a tutorial screen: the prototype enters training mode directly after booting, and the graphics do not appear until the user finishes the first step, which made it hard for users to get started. I may need to add more detailed instructions later to guide first-time users. But once they were familiar with the interface, they all said it was easy to understand.

The good news is that they all believe this prototype will help with breathing training for running. The bad news is that they did complain about the inaccuracy of the pedometer (yes, they can feel it) and felt confused when the training went wrong.

So next, I will mainly try to solve the pedometer problem and make it more accurate.

Week 8 - Progress

Sicheng Yang - Sun 3 May 2020, 6:48 pm
Modified: Sun 3 May 2020, 8:24 pm

Work Done

User Research

An additional user interview was conducted this week. The participant has jogging experience and introduced me to the “two-step, one-breath” breathing method, which provides new ideas for the project. He also said that in the sports app he uses, functions such as history and medals are very important to him; they give him the motivation to stick with it.



At the beginning I had the idea of linking running pace and breathing pace together. The original idea was to use a compass, but a compass is not good at capturing running movement, because it mainly collects rotation data. Luckily, I borrowed an accelerometer from Lorna. An accelerometer is sensitive, but it provides more complex data than the usual components. At Clay's suggestion, I tried a tilt switch as an alternative. The tilt switch is of course very simple to use, and it performed well in hand-shaking tests, but when fixed on the body, the swing from running seems insufficient to activate it. So in the next test I still want to use the accelerometer to detect exercise cadence. I hope it will succeed.

Based on the user data, I decided to use the “two-step, one-breath” training method and designed a simple UI accordingly. It is critical that users can predict the next breathing step during sports, so I adopted a line-like graph, similar to a music game: up means inhale, down means exhale, and a step is executed when the graph reaches the far left. However, because its movement is driven by the step count, there is currently no animation connecting the stages, which may make it hard for users to understand; this is a relatively high-priority problem to solve.


How to fix the prototype on the body puzzled me for a while. The initial ideal design of this project was similar to Google Glass, but due to the size of the Arduino, that cannot be achieved at the current stage. So I looked for a reasonable way to fix it near the head, and finally found a way to mount it on a helmet. This works well and also suits sports.


Work to do

As I mentioned earlier, the current prototype is fixed on the helmet in a relatively stable way. However, the battery compartment has not yet been fixed, and the actual effect needs testing. Another problem is fixing the display: the current method lets it shake easily, which is fatal for users in motion. I am still looking for a solution.

I am also exploring the possibility of completing the entire interaction without buttons, using only breath. This would bring the project closer to the body-as-controller theme, and at the same time make interaction more convenient for users during sports.



One of the inspirations for the final design of this project is Google Glass. It has always been one of my favorite projects; although it has not been successful, for ethical reasons, I think its design ideas are valuable. Using glasses directly as a display is very cool and intuitive.

At present, I fix the display farther out in front to help the eyes focus. But I am curious how Google Glass manages to display at such a close distance; this needs further investigation.


I also explored several methods for reading serial data on a PC using Python and Node. In the end, I found it very convenient to read the serial data with the SerialPort module in Node, use it to render the content, and update the front end in real time. This may be helpful for making and testing high-fidelity UI interfaces for this project, and I can also use this method to simulate an extended function that manages exercise history data in a mobile app.

Week 7 - Design concept

Sicheng Yang - Mon 27 April 2020, 10:19 pm


My concept is a wearable device that helps the user perform breathing training during jogging to enhance the effect of exercise. As mentioned in my last journal, desktop research points out that breathing training during exercise is helpful for wellbeing, and users also said that they have tried breathing training and found it effective, but had difficulties in the early stages. Therefore, I specified the target users as those who have just begun to try jogging breathing training.

The input of the device is a microphone. For output, taking into account users' listening habits during jogging, visual feedback is preferred. Therefore, the ideal form of this device is a glasses-like device with a microphone.

Ideal product


The preliminary ideal design is shown in the figure. It uses a microphone to detect breathing and an accelerometer to detect cadence, which helps determine the breathing frequency. A small display in the upper left corner of the glasses reminds the user of the current state: it shows a pulsing ring and a fixed ring, and when the pulsing ring enters the inner ring the user should exhale, and inhale when it leaves. Users get feedback if they deviate from the recommended frequency. Ideally it would be a smart device that synchronizes exercise records with the phone. Still, there will be two buttons on the temples of the glasses, a start/stop button and a reset button, so users can control it quickly.

Week 7 Journal

Sicheng Yang - Sun 26 April 2020, 6:22 pm

The direction of the project became clear this week. I mentioned in my report on Tuesday that although desktop research points out that breathing training in sports helps athletes, I was not sure whether this is what users need. After conducting more online interviews, I found a new direction. I also tried connecting the display and the microphone to the Arduino this week; both work now, but they need further development to provide more dynamic functions.

User research

Three online interviews were conducted this week; the participants were mainly people with regular jogging experience. I asked them about their use of sports apps and their understanding of, and attempts at, breathing training. In fact, all three participants with regular running habits had tried breath training. They all said they met many obstacles, such as divided attention, in the early stage, and mainly shifted to unconscious control later on, so breath-training support devices currently feel useless to them. But they also said that breathing training had a positive effect on their exercise.

Therefore, I want to amend my main target users to newcomers who have just started jogging, and use this project to help them perform breathing training more easily. I also want to add long-term functions for them, such as statistics to monitor the effectiveness of their breathing training; one participant mentioned that checking pace is one of the main reasons they use sports apps.

Prototype building

This week I further built my prototype. I successfully connected a microphone, which provides an analog signal input, although the change in its value is only obvious at a specific angle. The advantage is that the fluctuation caused by blowing is sufficiently obvious. On Tuesday Ryan mentioned analyzing the audio to give users more detailed data, but this brings more technical challenges. I found only one library that records on the Arduino using an SD card, and the audio analysis libraries work on audio files. I have not found a reasonable way to convert the electrical signal into an audio signal in real time and analyze it. After consulting a friend with a signal processing background, I decided it was too difficult and chose to use the electrical signal directly as a trigger. At least I have found a relatively stable breath-triggering algorithm so far.


I also connected a simple OLED display and tried drawing some simple patterns on it. In the end I think visual feedback is the more reasonable method; my current plan is to place it as a Google Glass-like display above the glasses to provide feedback. Audio output is less intuitive, and some joggers have the habit of listening to music while jogging, so frequently interrupting the music to provide feedback would be a negative experience.


Work to do

The next plan is to split my prototype into two parts, one for functional testing and the other for design testing. Because my device will be portable, its wearability is also an important part of the design. I will further develop the functional prototype, integrating the microphone and display and providing meaningful feedback. I am also considering adding sensors such as an accelerometer to provide additional data input.

In addition, I will also conduct more user research. I plan to find online resources to observe athletes' breathing frequency and speed while running, to gather more information.

Journal Week 6

Sicheng Yang - Sun 12 April 2020, 11:22 pm
Modified: Sun 21 June 2020, 3:02 pm

Work done

This week was relatively short: because of the public holiday, there was no Friday workshop. So the major work this week was proposal writing. I read several pieces of literature for the proposal. I found that there is already a mature product on the market for home breathing training, which costs USD 99. It uses a belt on the abdomen as the sensor, which would be very hard to adapt to my direction, but may be good for my teammates to look at. Another, more portable sensor is an innovative design, but the technology is too complicated.

I also did some exploration with the sensors from the Arduino kit. I tried a temperature sensor, guessing that the temperature near the sensor would change during breathing, but the effect was too weak to show anything at all. I reported this in a weekly report, and got the feedback of trying a microphone to analyse breathing sounds, which I think is cool and also reduces my costs. But I still don't know whether this can distinguish normal breathing from abdominal breathing, which requires further research.

During the Tuesday session, we tried Miro as our collaboration tool; I like how it looks. Also, from this week we finally have some online teaching activities, which works better for me than the previous weeks. This week we practised observation on the status quo of self-isolation, which is very interesting and may also be very helpful to us. We also have our board on Miro now, but it is not yet complete and requires further user research to provide detailed design principles.

Work to do

As mentioned earlier, next week's work includes more user research. One task is to investigate with users whether outdoor breathing training is viable; for now I only have secondary research to support it. The other is to understand the habits of outdoor sports users. At the same time, I will further investigate whether abdominal breathing can be detected through the microphone, while continuing to look for more suitable sensors.

After that, I will try to finalise the concept and complete my concept map on Miro.


This weekend I cleared my browser cache, which also logged me out of Miro. Now I can't remember which account I used to edit the DECO7385 board, and the system keeps telling me I have no access. I hope I remember to take screenshots earlier next time.

Week 5 - Proposal

Sicheng Yang - Sun 5 April 2020, 9:38 pm

Reflect from this week

This week was the second week of online learning, and also a week of struggle. As I mentioned in last week's journal, our group was not satisfied with the previous pitch concept and failed to reach a unified opinion on a new one. We reflected on this in Tuesday's report. The situation continued until Friday; although we did some housekeeping work, such as assigning the workload for the proposal, there was no actual progress.

We considered following the feedback suggesting we focus on education, but found education difficult to integrate with the body as controller. When our group discussed again on Friday, we reviewed the idea of looking to everyday life mentioned in the feedback and started refining our concept from the beginning. We found that Paula's original pitch actually offered a very interesting direction: using breath as a controller to bring well-being. As far as I understand, similar breathing training devices with video feedback already exist in psychological research, but they are very raw and not portable. We then settled on our concept, and each person diverged from it separately. I chose the scenario of using a breath controller in outdoor activities, so I will focus on people in motion and on portability.

At this point we have basically completed the team section of the proposal and started the individual sections.

Work to do

Next week, I will continue to flesh out my concept. Because our final team problem space was only determined on Friday, we did not get much time to think carefully about our individual ideas. I will focus on the specific implementation of my concept, especially the technical methods that may be used, while continuing to write the proposal.

This week I finally received my Arduino kit. Next week, I will begin exploring possible uses for it.


This week I explored some basic usage of Arduino and made a traffic light system based on the tutorial. In DECO7230 I used voice control, so I skipped the Arduino part then and had only tried it in the workshop tutorial. In general, I think it is a friendly device. The one strange thing about Arduino is that its preset constants HIGH and LOW turn out to be int values, when I expected them to be Boolean.


Week 4 Pitch Feedback

Sicheng Yang - Sun 29 March 2020, 9:12 pm
Modified: Sun 21 June 2020, 3:10 pm

Reflect of this week

This week was the first week of online courses. Online courses differ from on-campus ones, and it takes a while to adjust. On the one hand, commuting time is saved. On the other, Zoom classes lack interaction, especially among classmates. This may also be due to this week's pitch content, but the format did make me feel more tired.

Back to our group's pitch: we had planned to work together on a single project. We thought it was an interesting concept without thinking deeply, and it did overlap with some existing products, including games using Kinect and Leap Motion. We hadn't explored more possibilities for using the body to control other things. Feedback from Alison and Clay mentioned that we might use some unusual control methods, move away from gesture control, or even bring in sign language to make the control meaningful. Lorna's feedback suggested we might move away from the screen and couch, which we found very instructive. But I have to admit we still feel bottlenecked. We held two more group meetings this week after class, but progress was not smooth. Everyone took some blows after the pitch and wanted to turn to personal directions. In a way, this will make our job easier for the rest of the semester; but most of us have not yet found a clear direction to follow, which worries us.

What could have been done better

At this stage, each of us is still unsure what to do next. Ideation is the most difficult part of every project at the start of each semester. When we had just formed the team, we took it a little too much for granted that the gesture plane game we came up with was interesting. Perhaps we were too limited by the idea of wanting to make a game, so we did not explore existing products and papers. This led to frustration and lost direction during the pitch phase. Maybe, before presenting an idea to others, we should first get opinions from potential users or experts, rather than getting addicted to our own ideas; that might help us do better.

Work to do

Next week we want to confirm the feasibility of the direction we are currently proposing, because after the course structure changed we are not sure it meets the requirements. We will also start writing our proposal. The umbrella mode mentioned in the announcement email might suit our group best, but we don't know how to integrate this model into the proposal, since the proposal asks us to discuss project details such as the integrated experience and how to engage users.


One pitch idea I particularly liked is Hasaki, which uses sound to guide the player's movement. It made us rethink our project's output: it does not necessarily need a screen. Sound, for example, is a direction worth exploring, and there may be more. We are now considering giving users a healthier, more dynamic way of control that keeps them from using screens too much.

Week 3 Team

Sicheng Yang - Fri 13 March 2020, 9:38 pm
Modified: Fri 13 March 2020, 9:40 pm


We formed teams this week, and I'm glad I could join the theme of my choice, body as controller, and that I have good teammates! After a short debate, our team decided on the name MoBody, short for Move Body. It's a cool name.


We decided to start a game project using the hand as a controller: a flying shooting game. Mimicking the shape of an airplane with our hands is very intuitive, so it has almost no learning cost for players, and it sounds like a very cool idea to us. We also plan to keep the bracelets for feedback from my original pitch idea. We decided that the right hand makes the aircraft gesture, with the direction of the hand controlling the aircraft's movement, while the left hand, shaped like a gun, controls the weapon.


We also considered the collective experience. Two people could cooperate, one controlling the weapon and the other the aircraft. Two planes could also cooperate on separate screens, although the latter would cost more, because we would need more sensors.


We were inspired by my pitch, Running Hand; I'm glad my team members love this idea. Another inspiration is Raiden, a classic vertical-scrolling shooter game.


Week 3 World Cafe

Sicheng Yang - Fri 13 March 2020, 9:14 pm
Modified: Fri 13 March 2020, 9:16 pm

We had the world cafe on Tuesday, which was a great event, especially with so many different topics and ideas to talk about. We progressed slowly on some themes because the host did not fully understand the previous discussion and so could not continue from it; eventually everyone returned to the content of the poster and started again.


Sometimes I feel more relaxed with familiar people and can think of better ideas. I finally met a few familiar classmates on the theme Music of Matrix, and while joking around I thought of a funny idea: a bus seat that plays a fart sound when people sit down, to trick them. Although it is something I would really like to try, I did not choose this theme in the end, because I don't think the idea can find a suitable human value.

I voted for body as controller in the end, which is also the theme my pitch was categorised under. I'm really interested in this theme, especially in using vibration and other methods to provide feedback. I hope to find like-minded teammates to make things in this field.

Week 3 - Reflect

Sicheng Yang - Sat 7 March 2020, 9:55 pm
Modified: Sat 7 March 2020, 10:02 pm


This week we heard our classmates' pitches. I have to say that with such a long session it was a bit difficult to catch all of them, but I still heard a few interesting ideas.

I think Musician is a very interesting idea: expressing audio visually and then interacting with it. I think it could go deeper and become even more exciting. Secret Handshake Lock is also very interesting; it's a bit creepy, but in a good way. I just think it's cool, because it's something I had never thought of. Shadow Band also sounds like a fun game, though I hope it could have more physical feedback. Surprisingly, many people are interested in smart homes, but I think that direction is a bit too realistic; I prefer crazy ideas. I thought more people would come up with games, since that field has fewer boundaries and gives full play to the imagination, but not many actually chose to make games.


After the pitches, we also carried out theming. Most of the themes seemed very clear. As mentioned before, many people are focusing on mental health, smart homes and music, which made theming not so difficult.



Then I reviewed peers' comments on Running Hand. I'm glad many people agree with me, especially about the physical feedback. Although it was something I thought of suddenly while playing with my hand in the air, I think it really improves the interaction.

Alison came up with the idea of a player-vs-player match. I think it's very cool, and it would be cooler still if we could add something like defence or skills. This idea also matches another comment:

“Think about the collaborative way for group of people to play the game together.”

Thinking about how to enable person-to-person interaction through computers is also fascinating.

I am already looking forward to next week's world cafe, which will allow us to come up with better ideas and start teaming up.

Week 1 - Running Hand - Poster Explanation

Sicheng Yang - Sun 1 March 2020, 10:41 am
Modified: Sun 1 March 2020, 10:43 am

Running Hand


One of my inspirations came from the finger-man fights I had in class with my classmates in primary school. Walking with fingers as legs is a very common game in our lives. So why not make it into a video game: use my fingers as legs, run all the way up, and pass the levels. This is the original source of my inspiration.


I came up with the idea of using a gesture sensor to project the hand into the game and use it as the character in a 2D game. Since the fingers act as the character's legs, I thought of some very intuitive gestures to control it: the character can squat and then stand up to jump, or the two fingers can alternate to walk. We could also add kicks and other actions.


Directly using a gesture sensor lacks physical feedback, especially for events like landing or hitting, and that lack can confuse players. So I want to use small devices such as rings to give players vibration feedback.



Three inspirations helped me. One is the Xbox Kinect game FRU, which uses the body as a bridge for the character. Another is Leap Motion, a high-precision gesture detection sensor that makes my idea possible to some extent. Additionally, HD Rumble from the Nintendo Switch helped me see the possibility of vibration feedback.

gesture video game vibration

Week 1 - Reflection

Sicheng Yang - Fri 28 February 2020, 8:42 pm
Modified: Fri 28 February 2020, 8:48 pm

Work done

This is the first week of phys comp. We mainly explored the forms of physical computing and studied many existing innovation cases. I found several examples that touched me. One is the bar matrix made by MIT that I saw a few months ago, which can be used to send and receive information and set up scenes; it gives me a sense of the future.


The other I found while randomly browsing online: using barcodes for music performance, which I find mind-blowing. After all, barcodes are everywhere in our lives, but playing them with different sound effects is a very novel experience. I haven't yet thought of other things like this that we could create.


We also read the seven grand challenges of HCI in the present and future. I read about the issues of sustainability and democracy. Compared with the earlier discussion of the relationship between people and machines, this question is more about the human-to-human relationships caused by machines, which brings more complexity. Although the article raises topics such as social justice, it does not offer a good solution. This is a problem that has bothered me for years. I'm aware that large companies such as Facebook and Google have been collecting all kinds of user information and using it for commercial purposes. On the other hand, online content filtering is often considered to lead to information silos. There is still no good solution, and Internet companies insist on filtering because, as they say, it brings greater user stickiness. Of course, new decentralised explorations such as blockchain are very innovative, but there seems to be a long way to go before they become truly user-friendly and efficient.

Of course, good things are happening too. For example, SpaceX is trying to use satellites to build a global Internet, which opens up more possibilities for residents of underdeveloped areas to get online. Meanwhile, in citizen science, the collaboration between scientists and the game EVE, which placed real star maps in the game and invited players to help analyse and explore them, has achieved real academic results. These stories keep inspiring me.

We also did some brainstorming with cards. This kind of brainstorming always draws some terrible ideas out of me. I didn't learn that Worst Possible Idea is actually a serious ideation method until I took the design thinking course. These terrible ideas often lead me to better ones, so I am very fond of the method. Anyway, it helped me come up with the idea of hiding fitness equipment around the rooms of a gym and choosing the room to exercise in by smell, although the idea met with opposition from my table.


I also produced a rough idea poster this week. From the childhood game, I came up with the idea of making a human shape with my hand and playing a game with it. I will introduce it in detail in the following sections.

How relates

So, the major task of this week was basically ideation. This is an essential step at the start of a project. I don't like the process very much, because it is always tormenting, but I like the results it produces. Although this week's exploration seems random, it will definitely point us in the right direction in time.

I think the world cafe will be a very useful event, because the ideas I found in the world café in studio 1 last year helped my teammates and me complete a cool project, Brisbane River Adventure. So I'm very excited about the activities in the next two weeks.

Work to do

Next week I will pay attention to others' presentations; drawing inspiration from others' ideas is very efficient. At the same time, I will try my best to showcase my own idea; maybe I can find some like-minded teammates. I also look forward to the world café in week 3.

Week 1 - intro

Sicheng Yang - Fri 28 February 2020, 8:19 pm

Who am I

I am Sicheng, and I am in the last semester of my Interaction Design degree. I practised web development in my previous study and made some achievements: I know React well and a bit of Unity. I love playing games, especially those with novel mechanics, for example Baba Is You. So I will probably build a cool game in this course, unless a better idea hits me.


I want to learn design skills for making functional things with Arduino and maybe Raspberry Pi. I also want to learn how to build physical components, for example with a laser cutter, which has been on my to-do list for a year. It's pretty cool to be able to put our ideas into action with our own hands.

In short, my ultimate goal is to make a mind-blowing installation by the end of the semester.