Posted by Ian Forrester
The Quantified Self is the practice of tracking aspects of a person’s behaviour and recording them as numerical data. Once your behaviour has been “quantified” in this way, you can make changes to it and see how the associated data changes. This simple concept can be applied to many things: improving fitness by tracking and increasing the number of steps you take in a day, controlling your weight by monitoring and adjusting your calorie intake, or reducing your carbon footprint by recording the locations you visit and how you travel between them, for example.
The concept of the Quantified Self is often associated with wearable technology. “Wearables” are electronic devices which people wear like other personal possessions (bracelets, watches, etc.). In the context of the Quantified Self, they are used to generate data about user behaviour so that it can be tracked.
This niche but growing trend of self-tracking has led to the formation of an international Quantified Self movement, with over 50 local meetups around the world and international conferences including the Quantified Self Europe conference in Amsterdam.
This year a big unofficial theme of the conference was the importance of data privacy and security. The ethics of capturing and manipulating self-tracked data was mentioned several times during the opening keynotes by Gary Wolf (one of the founders of the Quantified Self Labs). During the introductions, Gary said that lifelogging would be an acceptable activity at the conference. However, he made it clear that if anyone felt uncomfortable about being photographed, a request for privacy should be respected. The stories in the press about Google Glass and the controversies associated with lifelogging were at the back of everyone’s minds, and nobody wanted to see them repeated.
The importance of respecting privacy was reflected in the programme, with speakers talking about the inherent conflict between product vendors’ and manufacturers’ business models and users’ control of their own personal data. Many examples were discussed, such as the case of the man who requested all the data Facebook had on him.
The Quantified Self movement has a good history of highlighting issues with privacy and security in a practical way. However, this year (in the light of the Snowden leaks) a number of personal data breakout sessions were added to the line-up, including workshops on “Personal Data Attacks & Defense” and “Strategies for Managing our Data”. Overall, the feeling was that it was time for the Quantified Self movement to take a stronger line against services, devices and ecosystems that do not empower their users.
A week later, the seventh Thinking Digital (a conference that has been described as the UK’s answer to TED) took place in Gateshead. This conference too had a couple of talks this year that focussed less on enthusing the audience and more on getting them to question their own data privacy and usage.
Aral Balkan, a User Experience designer from Brighton, gave a talk with the debate-provoking title “Free is a lie”. Aral asked the audience to question their own personal data sharing, pointing out relevant examples of business practices by technology multinationals, such as Facebook’s ability to work out whether you are about to start a new relationship. This was closely followed by the alarming story of how a large store in America was able to deduce that a young woman was pregnant before her father knew. Aral described this use of personal data as “at best, a concealed trade; at worst, an epic deception.” Later in the day, a panel discussion chaired by Jemima Kiss, the technology editor of the Guardian, ran through a timeline of the Snowden leaks from the Guardian’s perspective. It was fascinating to hear how carefully the newsroom looked through the information, and how close it came to passing the opportunity up.
The Quantified Self conference and Thinking Digital both saw interesting debates about the ethics of how our personal data can be used in ways that benefit both us and the entities that hold it. Long EULAs (End User Licence Agreements) are now so commonplace that consumers cannot realistically even be expected to read them all. At the same time, the commercial value of user data is hard for the companies that hold it to ignore, regardless of whether or not extracting that value benefits the user. Are people being treated fairly? Has a line been crossed, and if so, what can be done about it?
The ethics of data use is something the BBC (along with many other people and organisations) takes very seriously. Privacy is not only very important - it is a human right. However, the question remains: is there a middle ground that allows personal data to be kept secure and under the control of the individual, but which still allows that data to be shared with others (and where appropriate, businesses and other organisations) to deliver the benefits that processing and aggregating that data could offer?
The BBC’s public purposes include “delivering to the public the benefit of emerging communications technologies and services”, and the management of personal data is an area that will affect everyone. The use and misuse of that data has the power to both enrich and ruin people's lives. This is a point that Kaiton Williams (a PhD student from Cornell University) made very clearly in his opening plenary talk at Quantified Self Europe:
“How do we incorporate the perspectives of the many who can’t participate here, are overlooked and marginalized, but whose lives will eventually be affected by practices that spiral out from ours?”
The data collection practices of the Quantified Self movement could be hugely beneficial to the BBC, its licence fee payers and to society as a whole. But the security and privacy issues associated with that data give rise to genuine ethical and technological challenges. These are questions that we will continue to think about as we consider the role of personal data in future BBC services, and the ways that the BBC could contribute appropriately and meaningfully to the wider discussion of these issues.