Take back control of personal data: Who cares?

There are lots of companies now touting solutions to allow people to “take back control” of their personal data. But even after the Cambridge Analytica scandal and countless security breaches, does anyone other than “data zealots” really care?

In every country there is now a “group of zealots – probably about 27” who really care about their personal data, and “then the interest drops like a rock”. The observation came from David Siegel, CEO of data privacy specialist Pillar Project, speaking at the recent Blockchain Summit in London.

Most people won’t manage their personal data, even if they have the tools to do so (and there are some quick wins available, such as switching search engines from Google to the likes of DuckDuckGo). “More than 80 start-ups have said they are going to change this and they are all dead,” added Siegel. This is as much a challenge for Pillar as for anyone else, given that it is building a digital wallet ultimately intended to allow users to hold and secure their data, with the ability to control, share or restrict it.

Why the Apathy?

Certainly, there is increased awareness. The Cambridge Analytica scandal in particular gave many people an insight into what is happening within the social media giants (and it is undoubtedly happening at many other companies that hold large amounts of personal data, even if they have not yet been rumbled). For more on this, see here: https://smartercommunities.media/hindering-data-for-good-the-erosion-of-trust/

Even if they do not know the details, electorates in the US and UK in particular are now aware that all was not as it should have been on social media during the presidential election and the EU referendum. The almost daily news of personal data being hacked has also woken many people up to the issue.

It is also easy to see whether your accounts have been compromised in a data breach, through sites such as https://haveibeenpwned.com/. The results are usually worrying but not surprising.
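For the password side of the same service there is even a free API that can be scripted against. The sketch below is a minimal illustration (assuming Python and the third-party requests package, which are choices of this example rather than anything prescribed here): it queries the Pwned Passwords range endpoint, sending only the first five characters of a SHA-1 hash, so the password itself never leaves your machine.

```python
# Minimal sketch: check a password against the Pwned Passwords range API.
# Only the first five hex characters of the SHA-1 hash are sent over the network.
import hashlib
import requests

def pwned_count(password: str) -> int:
    """Return how many times a password appears in known breaches (0 if not found)."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=10)
    resp.raise_for_status()
    # The response lists hash suffixes and breach counts, one per line: SUFFIX:COUNT
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    print(pwned_count("password123"))  # a commonly breached password returns a large count
```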

However, moving from a general feeling of disquiet to proactively doing something about it is a leap that relatively few have made.

Everyday tracking still often goes unnoticed. On any large commercial publishing site it starts from the first page, and it is likely to intensify as users move deeper into the site, because in doing so they reveal more about their personal preferences.


The collection, storage and use of personal data will only increase, in part as sensors proliferate and the Internet of Things (IoT) becomes reality. IoT also has potentially massive security implications, providing insecure gateways into homes and commercial and public institutions, with flaws in everything from solar panels to air conditioners to smart TVs. And don’t even start on the arrival of Amazon’s Alexa and others, providing the tech giants with that all-important reach into the world’s living rooms.

Companies, governments and others wield ever more power to use our personal data to influence our thinking and our choices, serving us the information they want us to see rather than what we would choose to see.

Developments in China are causing particular disquiet. The government is building up credit scores and social data footprints for all Chinese citizens, working with Tencent’s WeChat social media platform and Alipay, the payments arm of online retail colossus Alibaba. Citizens who do ‘good’, such as volunteering or donating, could be rewarded. In the hands of an authoritarian regime, or even a democratic one for that matter, it is not hard to see the sinister aspects of this: the creation of different tiers of citizens based on their perceived worthiness.

Zero Knowledge Authentication

Out of the woodwork has come an army of start-up companies offering solutions. A key goal is “zero knowledge authentication”: proving something about yourself without handing over the underlying personal data. To give a straightforward example, you could have your age verified once by a trusted third party, which issues, say, a QR code attesting to it. It is then the QR code, not your date of birth, that is presented whenever anyone wants to verify your age.
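To make the idea concrete, here is a minimal sketch of how such an attestation could work, assuming a simple signed credential issued by the trusted third party. The names and the use of the Python cryptography package are illustrative assumptions, and a signed claim is a simplification of a true zero-knowledge proof, not the method of any particular provider.

```python
# Illustrative sketch: a trusted issuer signs an "over 18" claim; a verifier checks
# the signature. All identifiers here are hypothetical.
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The issuer (trusted third party, e.g. an identity checker) creates a key pair once.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

# The credential carries only the claim, never the date of birth itself.
claim = json.dumps({"claim": "age_over_18", "subject": "user-1234"}).encode()
signature = issuer_key.sign(claim)  # claim + signature are what a QR code would encode

# The verifier (e.g. an online shop) checks the issuer's signature on the claim.
try:
    issuer_public_key.verify(signature, claim)
    print("Verified: subject is over 18")
except InvalidSignature:
    print("Credential rejected")
```

The verifier learns only that a trusted issuer vouches for the “over 18” claim; the underlying date of birth is never transmitted.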

An example of a start-up in this area is Blockpass, with a solution that allows users to be pre-verified for future identification needs. The company’s European head of business development, Guy Davies, sees four different phases of identification:

  1. The centralised model of Google, Facebook et al, which brings uncertainty and a lack of security.
  2. “Federated ID”, whereby users gain access to services via other IDs, such as signing up for Spotify via Facebook, which, he feels, “doesn’t resolve any of the underlying issues”.
  3. “Open ID”, where personal data is held in a central store and is interoperable across apps and websites, as per the vision of the FIDO Alliance of hardware, mobile and biometrics-based authenticators.
  4. Self-sovereign ID, which is decentralised, with the user in full control. Davies describes this as Web 3.0, with the arrival of complete digital IDs and new economic models. This would include the touted zero knowledge proof capabilities.

Another would-be provider in this field is Nuggets, offering a biometrics and QR code-based solution that enables a single shared account for login, payments and verification. Nuggets neither holds your data nor tracks you. Like others, it is at an early stage, both in terms of the solution itself and of signing up users and businesses.


Over time, says Nuggets co-founder and CEO Alistair Johnson, the expectation is that individuals and retailers will build up profiles as they complete more and more transactions, becoming identified as “good actors”, which might in turn allow pre-approvals.

By no means will all such companies succeed, so at present it is hard for individuals to know which solutions – if any – to adopt.

Pillar, Blockpass and Nuggets all harness blockchain, the distributed ledger technology that underpins Bitcoin and other cryptocurrencies. It might not be a case of putting the personal data on the blockchain per se (not least because an immutable ledger sits awkwardly with the GDPR requirement to allow users to delete their data), but the means of authentication certainly could use this technology.

The benefits are security, through blockchain’s inherently decentralised nature; immutability, since once data is on the blockchain it cannot be tampered with without this being visible to all; and democratisation – the community has control. For personal security applications that need these sorts of attributes, it looks a good fit.
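A minimal sketch of where that immutability comes from (illustrative only: a real blockchain adds distribution across many nodes, consensus and digital signatures, none of which is shown here): each block stores the hash of the previous one, so rewriting any earlier entry breaks every hash that follows.

```python
# Toy hash chain illustrating tamper-evidence.
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    """Add a block that commits to the hash of the previous block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def chain_is_valid(chain: list) -> bool:
    """Every block must reference the correct hash of its predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
append_block(chain, "credential issued for user-1234")
append_block(chain, "credential revoked")
print(chain_is_valid(chain))      # True
chain[0]["data"] = "tampered"     # altering history...
print(chain_is_valid(chain))      # ...is immediately detectable: False
```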

Is Personal Data Ownership the Answer?

It is becoming ever more common to hear the idea that taking back control of personal data could mean people being able to decide what they provide and then receiving a financial incentive for doing so. However, this theory is fraught with dangers.

For one thing, once the data is out there, it can’t be taken back, so it doesn’t look like a sustainable model. Hetan Shah, executive director of the Royal Statistical Society, speaking at a recent debate on data and inequality (one of a number of data-themed discussions at the British Library), suggested a model for personal data akin to intellectual property rights, whereby large companies would give up the data after, say, five years, perhaps passing it to a charitable trust.

Second, as Siegel asked, where does the money come from? It would come from the social media and online retail giants, and “it only works if it is profitable to them”. Whatever they paid for your data, he argued, you could be certain that they were earning much more than that from you and, if that wasn’t the case, they would quickly decide that they didn’t want your data any more.

A system could develop whereby privacy rights exist only for those who can afford them. Increasingly, it is argued that privacy should be seen as a human right, not something with which to barter.

Shah voiced such worries about the idea of “owning data”. The risk, he felt, is that the poorest people could trade away their data while rich people could opt out. There should be data rights not data ownership, he argued.

Is Help at Hand?

At least there are now plenty of people with these issues on their radar who are trying to address the challenges. They include the Alan Turing Institute, the Oxford Internet Institute, the doteveryone think tank and the Open Data Institute, as well as some national governments.

Doteveryone, founded and chaired by Martha Lane Fox, is fighting for a fairer internet and seeking to raise awareness of the issues, remove inequalities that stem from technology and make the tech industry more accountable to society. The Open Data Institute has designed a ‘data ethics framework’ and argues for some sort of privacy accreditation.

And Shah sees hope in the UK government’s planned £9 million Centre for Data Ethics and Innovation, with its remit to advise on the measures needed to enable safe, ethical and innovative uses of data-driven technologies. At the same time, of course, data does not respect national boundaries, and those that want to avoid one country’s ethics and regulations may simply shift their operations elsewhere.

Even if there is a long way to go before large numbers of citizens move from disquiet to action, there is now far more awareness and debate, and there are growing attempts to confront some of the biggest challenges facing an ever more digital society and the risks and inequalities that it brings.