For the end result we wanted to use a variety of charts and graphs, so we chose a line chart, a bar chart and a radar graph. However, the radar graph struggles to represent the data accurately, and within the time we had we could not make many changes. We used these three types of graph because of the nature of our information: the data had uneven anomalies in each week, which meant there were major differences within it. Using different types of charts and graphs gives us variation and makes it more engaging for users to find the errors in the data.
Making the charts interactive allows the user to read the data more easily; because there are major differences in scale, the data is otherwise hard to see clearly. For example, a chart could range from 10 to 400. Also, a lot of the data in some categories would be similar, making it hard to view on a large-scale chart.
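One common way to handle such a wide value range (a hypothetical sketch in TypeScript, not code from our actual build) is to min-max normalise each category before plotting, so a series ranging from 10 to 400 and a tightly clustered series can share the same 0 to 1 axis:

```typescript
// Scale a series into the range [0, 1] so differently-scaled
// categories can be compared on one chart axis.
function normalize(values: number[]): number[] {
  const min = Math.min(...values);
  const max = Math.max(...values);
  // A flat series has no spread to normalise; map it all to 0.
  if (max === min) return values.map(() => 0);
  return values.map((v) => (v - min) / (max - min));
}
```

The trade-off is that normalising hides the absolute values, so an interactive tooltip showing the raw number on hover would still be needed.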
For our first experiments, we used the data exported from the Health App and created three line charts. This helped us visualise the uneven data and determine what kind of narrative we would use for each dataset. We collected the data from the three categories we chose: Move, Exercise, and Stand. From the dataset, we chose to use three weeks of dates. In each chart, there are anomalies that stand out and reveal uneven information.
We came across a news article reporting that the Health App had been used in a murder trial, which helped us learn about the case. The data could be examined and help lead to answers about the murder. This gave us the idea to use our own data and create a visualisation built around our own scenarios, showing the 'truths' within it. What can be accessed and found may not be entirely true to the actual events recorded, and meaning can be inferred from data anomalies, which creates an interactive narrative for our data.
The method of data visualisation we chose for the concept is charts. As this would be web-based, we wanted to make it interactive so users could explore the data and go through our scenarios to find the truth in it. Some of the data we gathered from the watch was uneven: when the watch was not worn, the data would be very different. We chose to represent this with charts because they let us see a comparison of the data easily and efficiently.
At first, we wanted to experiment with three.js for this project. Unfortunately, within the two-week timeframe, we could not use it given how little we knew about it. We would have had to learn a lot, and it would not have fitted into our schedule.
We chose to use the Health App by Apple. The categories we would draw data from are Move (calories), Exercise (minutes) and Stand (hours). This data comes from the Apple Watch, which had four weeks of data on it. These would be the main datasets we would use for the visualisations and narratives.
The final concept of the project is a month-long radar chart showing activity for each day of the week. The chart draws heptagons that compare each week in the month, creating unique shapes based on the user's data that differ every time. It is a heptagon because there are seven days in a week. On the left are minimisable tabs for the weeks, showing the specific daily data in each week of the month and giving an average for that week.
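The heptagon for one week can be sketched like this, assuming seven daily values and a chosen maximum for scaling; the function and parameter names here are illustrative, not taken from the project's actual code:

```typescript
// Compute the [x, y] vertices of a radar-chart polygon.
// With seven daily values this produces the week's heptagon.
function radarPoints(
  values: number[],   // one value per day, e.g. Move calories
  maxValue: number,   // the value that maps to the full radius
  radius: number,     // radius of the chart in pixels
  cx: number,         // centre x
  cy: number          // centre y
): [number, number][] {
  const n = values.length;
  return values.map((v, i) => {
    // Spread vertices evenly around the circle, starting at the top.
    const angle = (2 * Math.PI * i) / n - Math.PI / 2;
    const r = (v / maxValue) * radius;
    return [cx + r * Math.cos(angle), cy + r * Math.sin(angle)];
  });
}
```

Drawing each week's polygon with partial opacity lets the four weekly shapes overlap on one chart, which is what makes the month-to-month comparison readable.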
For the prototypes, I decided to create categories to separate the data and show comparison. The categories I came up with were months, weeks, days and time. The reason is that the user can see their most active times, whether that is a specific hour or a day within the week. The user will be able to see an efficient comparison with the radar chart. The idea is to let the user use the interface to compare the data easily on one screen.
Some of the inspiration came from charts with similar characteristics, using lines and opacity.
I extracted four months of data from my health app into a spreadsheet, showing each day from January to April. It records the distance in miles I moved every day with my phone and the steps I took.
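A spreadsheet export like this can be read and aggregated with a few lines of code. The sketch below is hypothetical (the column names `date,miles,steps` are my assumption about the export layout, not the actual file), but it shows how daily rows could be rolled up into monthly totals for comparison:

```typescript
interface DayRecord {
  date: string;  // ISO date, e.g. "2019-01-15"
  miles: number; // distance moved that day
  steps: number; // steps taken that day
}

// Parse a simple CSV export with a header row: date,miles,steps
function parseCsv(text: string): DayRecord[] {
  return text.trim().split("\n").slice(1).map((line) => {
    const [date, miles, steps] = line.split(",");
    return { date, miles: Number(miles), steps: Number(steps) };
  });
}

// Sum steps per calendar month ("YYYY-MM") to compare months.
function stepsByMonth(rows: DayRecord[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (const r of rows) {
    const month = r.date.slice(0, 7);
    totals.set(month, (totals.get(month) ?? 0) + r.steps);
  }
  return totals;
}
```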
Looking into the health app, I found that all the data for each day can be accessed. The health app displays daily, weekly, monthly and yearly data in bar charts. However, these bar charts can only be seen by sliding through each filtered chart. What this does not offer is an effective way of comparing the data. In my opinion, the data is not compared well, so I will find a solution to improve on this.
The data showing distance walked and steps taken is tracked by a phone or a wearable device linked to the app. It shows the trajectory and movement of bodies in physical space, as recorded by the device. As I do not have the Apple health app fully set up and do not have an Apple Watch, the amount of data tracked is limited. However, I do have quantitative datasets covering a large time frame, which gives me plenty of data to compare and show differences in, and which will demonstrate movement in physical space.
Next, I looked at different charts and methods to represent the data efficiently. After looking at many types of charts, I decided to use a radar chart. I chose this method because it would allow me to show a comparison in the most interesting and efficient way.
Design an interactive visualisation that demonstrates how objects and bodies move through physical space.
To start this project, I asked how I could link data with movement in space: showing trajectory in the data while interacting with movement in space.
My initial idea for this project was to create an interactive data visualisation using the camera. Thinking about trajectories and data, I wanted to use facial recognition. The concept was to count the number of people entering a building or an area, as a live service showing data about bodies moving through physical space. However, while developing it, and based on feedback, I decided to end this idea.
The second idea I had was to incorporate an AR sound map to represent digital data. There would be recorded messages in physical space that could only be found through an AR device's camera: hidden messages in the space, digital data that could only be accessed through movement in physical space. The idea was inspired by Zach Lieberman, a new media artist and computer programmer. However, I did not have the time or skills to accomplish this, so I had to take a few steps back.
I came across the Apple health app and found I had over four months of tracked data about my steps and the distance I had travelled each day. I wanted to use this data for the project, and I needed to think about how I could apply it to demonstrate the trajectory it shows in physical space.
The concept of the system is to identify the variable factors in the game that allow it to determine the soundtrack, based on the background colour and the gradient of the map.
In detail, the system would first identify the colour of the background based on its hue, saturation and RGB values, responding with the closest dominant primary or secondary colour. Next, it would find the curvature of the level, based on the gradient of the hills, and then decide on a BPM to match to the soundtrack. The colour of the map relates to the emotion and feeling of that colour. With these factors, the system chooses the soundtrack that most closely matches these variables.
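A minimal sketch of the first step, snapping a background colour to its closest dominant primary or secondary colour, could use nearest-neighbour matching in RGB space. The palette values and names below are my assumptions for illustration, not taken from the actual system:

```typescript
// Reference palette: the three primaries and three secondaries.
const PALETTE: Record<string, [number, number, number]> = {
  red: [255, 0, 0],
  yellow: [255, 255, 0],
  green: [0, 255, 0],
  cyan: [0, 255, 255],
  blue: [0, 0, 255],
  magenta: [255, 0, 255],
};

// Return the palette colour with the smallest squared RGB distance
// to the sampled background colour.
function nearestDominant(rgb: [number, number, number]): string {
  let best = "";
  let bestDist = Infinity;
  for (const [name, ref] of Object.entries(PALETTE)) {
    const d = ref.reduce((sum, c, i) => sum + (c - rgb[i]) ** 2, 0);
    if (d < bestDist) {
      bestDist = d;
      best = name;
    }
  }
  return best;
}
```

Once the background is classified, the resulting colour name could index into a lookup table of soundtracks, with the hill gradient nudging the BPM choice.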
Here is a concept video of the system in game.
After the Critical Session, I received feedback about the system. My peers suggested that I use more examples of the system and make it interactive, letting users choose different options. Based on this feedback, I decided to modify the system diagram and create another representation of it.
This was the first revamp, which developed into the following.
I then thought about which game to use. The games I considered were Katamari Damacy, Final Fantasy, and Tiny Wings.
Katamari Damacy is a simulation game in which the main objective is to collect random objects and grow the character. However, this game was not suitable for colours, as there would be too many factors in the game. It would be more suitable for my initial idea based on size.
Final Fantasy is an RPG with an adventurous storyline. Depending on which game in the series I used, my idea could be possible. However, this game also has many different colours that change constantly.
Tiny Wings is a level-based arcade game. This game was perfect for my idea, as there is a clear indication of colour change in the game and it is simple. There are many opportunities in a session of the game to see the metadata take effect. For example, each level changes colour, which means a new soundtrack can be played.
First, I looked into how I could link colour to the music chosen for a level in the game. The music I chose was from a game called MapleStory; its maps have dominant colours, which I used as a basis for the decision.
Next, I looked into which factors in the game affect the system's music choice. I decided that the changing background colour and the gradient of the curves in the level are the variables.
Also, thinking about the interface of the system within the game, I imagined a small overlay that shows and hides with each change of level.