Thorn
A project with Fjord @ The Dock - Helping people understand their data interactions.
Something as simple as watching a video on YouTube or posting an image on Instagram has implications both obvious and otherwise – legal, data processing, physical storage infrastructure and energy consumption, to name a few. However, as consumers, we rarely get a glimpse of the underlying reality.
This project involved identifying the end user and addressing their needs around the subject by asking: how might we improve their relationship with data? Heighten their understanding of how it's being used? Show them how they might retract it, and where and how it exists? This is where Thorn comes in.
Thorn
Thorn tracks mood by monitoring vocal data. It recognises the subtleties in tone by noting inflection, amplitude, shakiness and breath, alongside other key biometrics such as heart rate and temperature, and uses these to draw smart conclusions.
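To illustrate the kind of analysis involved, here is a minimal sketch of reducing a voice recording to a rough calmness score. The library choice (librosa), the feature set and the weightings are my own assumptions for illustration, not Thorn's actual implementation.

```python
import numpy as np
import librosa

def mood_score_from_audio(path: str) -> float:
    """Return a rough 0-1 calmness score from a voice recording (illustrative only)."""
    y, sr = librosa.load(path, sr=None, mono=True)

    # Pitch (inflection) via fundamental-frequency estimation.
    f0 = librosa.yin(y, fmin=65.0, fmax=400.0, sr=sr)
    pitch_variability = np.nanstd(f0) / np.nanmean(f0)  # proxy for shakiness

    # Amplitude via short-term energy.
    rms = librosa.feature.rms(y=y)[0]
    loudness_variability = np.std(rms) / (np.mean(rms) + 1e-9)

    # Assumed heuristic: higher variability suggests more stress.
    stress = 0.6 * pitch_variability + 0.4 * loudness_variability
    return float(np.clip(1.0 - stress, 0.0, 1.0))
```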
It’s simple. Each day you grow a branch. The quality of this branch is determined by the quality of your mood throughout the day.
Brambly, thorny sections relate to periods of stress or anger, whereas the growth of fruit and foliage, and even the arrival of animals, relates to moments of calm and happiness.
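As a hypothetical sketch of that mapping, a day's calmness scores could drive the branch's appearance along these lines. The attribute names and thresholds are illustrative assumptions, not the project's real rules.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class BranchDay:
    thorn_density: float    # 0 = smooth, 1 = heavily brambled
    foliage_density: float  # 0 = bare, 1 = lush
    grows_fruit: bool
    attracts_animals: bool

def branch_for_day(calmness_samples: list[float]) -> BranchDay:
    """Map a day's 0-1 calmness samples onto branch qualities (illustrative)."""
    calm = mean(calmness_samples)
    return BranchDay(
        thorn_density=round(1.0 - calm, 2),
        foliage_density=round(calm, 2),
        grows_fruit=calm > 0.7,        # sustained calm -> fruit
        attracts_animals=calm > 0.85,  # very calm days -> animals appear
    )

# Example: a mostly calm day with one stressful dip.
print(branch_for_day([0.9, 0.85, 0.4, 0.8, 0.95]))
```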
Timeline
5 weeks
Skills
Research, wireframing, prototyping.
Year
2021
Discover
Research Aims
Understand the relationships and correlations between personal data sets.
Evaluate how people currently understand their readily tracked data and what it means.
Discover and outline the scope for intervention and commercial use.
Research Findings
Passive data offers significant potential for tracking mental health.
There are strong correlations between different data sets (one directly impacts another).
Vocal data can be utilised to determine mood (anxiety, stress) by recognising inflection, amplitude, pitch and so on.
Define
How might we help people understand how their social circle/family impacts their mental/physical health?
How might we use people's vocal data to help them understand the subconscious effects of their social circle and improve their lives?
How can data be visually represented?
I began looking at how different types of data can be represented through colour, vibration and so on, to begin hypothesising solutions.
I was interested in the concept of mood rings as a metaphor and wondered how I could apply that same process to a more intelligent system.
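A rough sketch of that mood-ring metaphor might map a 0-1 calmness score onto a colour, sliding from red (stressed) towards green-blue (calm). The hue range here is an assumption made purely for illustration.

```python
import colorsys

def mood_colour(calmness: float) -> str:
    """Return a hex colour for a 0-1 calmness score (mood-ring style, illustrative)."""
    calmness = max(0.0, min(1.0, calmness))
    hue = calmness * 0.45  # assumed range: 0.0 = red, ~0.45 = green-blue
    r, g, b = colorsys.hsv_to_rgb(hue, 0.8, 0.9)
    return "#{:02x}{:02x}{:02x}".format(int(r * 255), int(g * 255), int(b * 255))

print(mood_colour(0.2))  # stressed -> reddish
print(mood_colour(0.9))  # calm -> green-blue
```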
Concepting & Sketching
I identified two ways of approaching the problem within the project's timeline:
Dynamically offsetting the issue (anxiety, stress) in the moment.
Tackling the issue over a longer period of time through detailed self-understanding.
I decided to go down the route of detailed understanding after the stresses had occurred, allowing the user to consciously decide whether they should continue to engage in those spaces.
Develop
Style inspiration
I wanted to take inspiration from dark 2D platform games such as Limbo to capture the darkness of the mood at the lower branches. The style guide organically shifted slightly from this to allow for a clearer user experience, but it pointed me in the general direction I needed to go.
Wireframing
I started sketching out solutions for the screens, aiming to combine the timeline with the visual representation of the tree and give daily, weekly, monthly and yearly views.
Designs
The final round of iteration for the designs merged the visual elements to create an intuitive but highly visual flow.
Deliver