Data for good: The threat from erosion of trust

So many ‘smart city’ initiatives rely on better use of data. Will ever-diminishing trust hamper such efforts? Smarter Communities joined others for a fascinating day looking at this thorny issue.

No doubt the day’s event on data privacy, hosted by specialist tech company Privitar, had been planned well in advance. But the timing could not have been better, nor could its coup of attracting as a speaker one of those at the eye of the current storm – Chris Wylie, the ex-Cambridge Analytica employee turned whistleblower.

That scandal, which continues to unfold, taking in Facebook and the threat to individuals’ data as well as the potential to undermine democracy, has brought personal data issues into the spotlight in a way that other episodes have not. It coincides with a major tightening of EU rules, with GDPR taking effect from 25th May, but also with national efforts elsewhere, including in China.

“Trust in the data ecosystem is at an all-time low,” said Privitar CEO Jason du Preez at the start of proceedings, and that was pretty much the theme of the day. What can be done to rebuild trust, what are the consequences if it is not rebuilt and, ultimately, do consumers care?

The Backlash and its Implications

“I fear a backlash so that, when data is used badly, we can no longer use it for good,” said Jeni Tennison, CEO of the UK not-for-profit the Open Data Institute. Data can be used to help people in their day-to-day lives; it can help businesses and governments to be more efficient, thereby improving services; and it can be used to better understand what works and what doesn’t, not least in coming up with better drugs to cure diseases.

A prime example of the latter is the work of Ben Goldacre and his team in the health sector. Their openprescribing.net website uses NHS data to show what individual GP practices are prescribing, identifies the outliers, and can provide alerts for those that are in, say, the worst ten per cent. This can bring demonstrable, significant savings to the health service and can improve the quality of what is being prescribed.
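To make that concrete, the sketch below shows the general flavour of such an analysis – flagging practices whose prescribing ratio falls in the worst ten per cent – on invented data; the column names and the example measure are illustrative assumptions, not OpenPrescribing’s actual schema or methodology.

```python
# A rough sketch of percentile-based outlier flagging on prescribing data.
# Column names and the example measure are invented for illustration.
import pandas as pd

# Hypothetical per-practice data: items of a costlier drug vs. total items
df = pd.DataFrame({
    "practice": ["A", "B", "C", "D", "E"],
    "costly_items": [120, 15, 60, 200, 10],
    "total_items": [400, 380, 420, 410, 390],
})

# Proportion of prescriptions going to the costlier option
df["costly_ratio"] = df["costly_items"] / df["total_items"]

# Flag practices in the worst ten per cent (highest ratios) for an alert
threshold = df["costly_ratio"].quantile(0.9)
df["alert"] = df["costly_ratio"] >= threshold

print(df.loc[df["alert"], ["practice", "costly_ratio"]])
```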

However, Goldacre – a doctor, writer and speaker – was scathing about those who put good use of data in jeopardy. Scandals and failures such as that of Google’s DeepMind and the Royal Free Hospital (the hospital failed to comply with the Data Protection Act when it handed over the personal data of 1.6 million patients) “set back the cause of people doing good work”. Similarly, everyone knew the NHS’s earlier care.data data-sharing initiative wouldn’t work, he argued: the UK government’s leaflet sent to all UK households caused panic and ultimately resulted in the plug being pulled on the scheme.

Ben Goldacre

As Tennison also argued, there should have been much earlier communication about what was intended and the potential benefits. Most people only took notice when it was splashed across the front page of newspapers.

Several of the speakers insisted on the Chatham House Rule – in other words, their comments could not be attributed – which is frustrating on the one hand but allows them to be more open on the other. So we had a speaker from one major company admitting that there are “so many security holes in so many institutions” that hackers may sometimes over-estimate how sophisticated they need to be. This is particularly the case where there have been mergers and acquisitions, he felt.

Too often, privacy is considered purely from a company perspective, not that of the consumer. And on occasion, weaknesses are known about but nothing is done until there is a problem, as was the case with the vulnerability in older versions of Windows, including XP and 8, which only caused panic across the globe with last year’s ransomware attack, despite having been widely known about for perhaps three years beforehand.

Goldacre and others made the point that anonymised data is never safe. Often, it can be used to identify individuals simply by cross-referencing other datasets. Similarly, encrypted data is only safe if those holding the encryption key can be trusted to keep it safe and use it responsibly.
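As a toy illustration of that re-identification risk, the sketch below joins an invented ‘anonymised’ dataset to an equally invented public one on shared quasi-identifiers; this is the general attack pattern the speakers alluded to, and every field and value here is made up.

```python
# A toy illustration of re-identification by linking datasets on
# quasi-identifiers (postcode, birth year, sex). All data is invented.
import pandas as pd

# "Anonymised" medical records: names removed, quasi-identifiers kept
medical = pd.DataFrame({
    "postcode": ["AB1 2CD", "EF3 4GH", "AB1 2CD"],
    "birth_year": [1975, 1982, 1990],
    "sex": ["F", "M", "F"],
    "diagnosis": ["diabetes", "asthma", "hypertension"],
})

# A public dataset (say, an electoral roll) sharing the same attributes
public = pd.DataFrame({
    "name": ["Jane Smith", "John Doe"],
    "postcode": ["AB1 2CD", "EF3 4GH"],
    "birth_year": [1975, 1982],
    "sex": ["F", "M"],
})

# Joining on the shared attributes re-attaches names to diagnoses
linked = medical.merge(public, on=["postcode", "birth_year", "sex"])
print(linked[["name", "diagnosis"]])
```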

Privacy – A price worth paying?

There is the adage, ‘if it’s free, then you are the product’. If a product is good enough, will people be willing to pay for it with their data, so that it comes to be viewed as having a price? How explicit will this be? And will privacy become something that only the rich can afford, along the lines of a Swiss bank account?

In part, of course, it depends on the intended use and whether there is the trust that the data will not be used for any other means – something that is at the crux of the Cambridge Analytica scandal.

As one speaker pointed out, when their data is used for profiling so that they are one of thousands to receive a particular advert, people may be comfortable with that. But as the buckets become smaller, down to segments of one, it becomes “creepy”.

This applies not only to data but also to the technology for capturing and manipulating it. For instance, face recognition technology can have positive uses, but what about when it starts to be used for state-level monitoring or to identify targets for drone attacks? Similarly, film production technology could be benign until it is used to create fake news.

There are also plenty of examples of data being used stupidly, bringing unexpected consequences, which do nothing to instil trust. The risks of police using data to predict who might commit crimes, and where and when they might occur, are a prime example, mercilessly dissected in Monika Hielscher and Matthias Heeder’s film, Pre-Crime.

Tennison feels that questions such as whether Facebook should pay users for their data, and whether people can own their data, are wide of the mark. There is now the realisation, she believes, that we all have “data shadows” – “other people know things about us, perhaps even things we don’t know ourselves”. There is a line of thought that monetary value can be attached to this, also driven by, “frankly, a desire for a cut of the profits of the big data monopolies”.

However, she argued that privacy is a fundamental human right and it is a dangerous world where human rights are sold. For one thing, data is rarely limited to an individual. For instance, Facebook data links to friends and family; DNA and medical data can inform about past and future generations; weekly shopping data reveals family preferences. And more sweepingly, the data of, say, a middle-aged white British woman with two children starts to build a picture of the behaviour and preferences of all middle-aged white British women with two children. Individuals have data rights, as embodied within GDPR, but society also has data rights, she argued.

Tennison said language around data ownership and price should be stamped out, particularly among politicians who don’t understand the subtleties. Which, ironically, was exactly the language used in an article published the following day by the leader of the LibDems, Vince Cable.

What can be done?

Tennison’s organisation, the Open Data Institute, has designed a ‘data ethics framework’ and she would like to see a move towards some sort of privacy accreditation. She feels there is a need for this sort of ‘softer’ regulation and intervention, alongside better individual controls, collective action and hard regulation. “It needs to be a messy mix that somehow works and somehow evolves.”

There is also increasing talk about ‘privacy by design’, whereby privacy is embedded within applications. George Danezis from UCL explained how this can be defined and cited the work of Professor Jaap-Henk Hoepman at Radboud University Nijmegen, which sets out how applications can be built to minimise the amount of information that is collected, filtering out what is not needed, and then to separate the data so that only what is needed is shared with different users, with in-built controls for aggregating, hiding and encrypting it. Danezis also pointed out that users should be informed about the use of their data throughout the user experience, not just at the outset.
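As a rough sketch of what two of those strategies – minimising what is passed on and hiding direct identifiers – might look like in code, the example below filters a record down to the fields a particular recipient actually needs and pseudonymises the user ID. The field names and the salted-hash approach are illustrative assumptions, not Hoepman’s specification or any vendor’s product.

```python
# A toy sketch of two privacy-by-design strategies: "minimise" (pass on only
# the fields a recipient needs) and "hide" (pseudonymise direct identifiers).
# Field names, the salt and the hashing choice are invented for illustration;
# a salted hash is a weak pseudonym and would not suffice on its own.
import hashlib

def minimise(record: dict, needed_fields: set) -> dict:
    """Keep only the fields this particular recipient needs."""
    return {k: v for k, v in record.items() if k in needed_fields}

def hide(record: dict, id_field: str, salt: str) -> dict:
    """Replace a direct identifier with a salted pseudonym."""
    out = dict(record)
    pseudonym = hashlib.sha256((salt + str(record[id_field])).encode()).hexdigest()
    out[id_field] = pseudonym[:12]
    return out

raw = {"user_id": "u-1029", "email": "x@example.com", "age": 41, "city": "Leeds"}

# An analytics consumer only needs coarse attributes, with the ID pseudonymised
for_analytics = hide(minimise(raw, {"user_id", "age", "city"}), "user_id", salt="s3cret")
print(for_analytics)
```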

Will consumers care enough to opt out? This depends and will clearly vary by country, age, education levels and other factors. Certainly governments and activists are coming round to caring, which Tennison hopes will drive improvements.

Chris Wylie

Last but not least, what was Wylie’s take on the topic and the systemic issues that he helped to lay bare? He admitted that he is far from perfect: he did, after all, work for SCL Group, which largely works on military information warfare, and then for its subsidiary, Cambridge Analytica, before his concerns about the latter’s operations led him to blow the whistle. The trigger-point, he said, was when Cambridge Analytica took what SCL did and brought it into democracy, “weaponising information and denying consent”.

Harry Davies and Chris Wylie

It was meant to be a Q&A with ex-Guardian researcher, Harry Davies, who was early on the scene reporting on what was happening, but Wylie is an extrovert with a lot to get off his chest, so the occasional question tended to spur lengthy, rapid-fire and enthralling answers.

The story was a year in the planning, he revealed, and he believes that Facebook knew about the issues – app developers harvesting data – as far back as 2015. He said the “brilliant” all-female team at the Observer newspaper had no technology writers, viewing it as a social story first and foremost. And he feels their decision to collaborate with Channel 4 and the New York Times was an enlightened one. There was strength in numbers, particularly when Facebook “freaked out and threatened to sue everyone”, contesting the claim that it had even had a data breach. Each media outlet held its nerve because everyone else was holding their nerve, he said.

Facebook then tried to spike the story, which he felt was a huge own goal. Even now, despite the apologies and the promise to take action, it is still resisting scrutiny. He feels this is reflected in Mark Zuckerberg’s refusal to come to the UK to face questions, preferring “softball questions from senators who don’t understand the internet”.

Wylie believes this is no longer an abstract issue – “it feels like a moment of change”. Previously, liking something on Facebook didn’t feel dangerous, but it “reveals discreet clues, little by little”. He believes 100 likes are sufficient for someone to do a decent job of profiling. Other aspects, including syntax and vocabulary, can also add to the picture.
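As a toy illustration of that kind of profiling, the sketch below treats each page like as a binary feature feeding a simple classifier that predicts an invented trait; the pages, labels and data are all made up, and this is not a reconstruction of Cambridge Analytica’s actual models.

```python
# A toy illustration of trait profiling from page likes: each like is a
# binary feature and a simple model predicts an invented trait. All data,
# page names and labels here are fabricated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

pages = ["page_politics_a", "page_band_b", "page_show_c", "page_brand_d"]

# Rows: users; columns: whether the user liked each page
X = np.array([
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
])
# Invented labels for some attitude or personality trait
y = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Predict the trait for a new user from their likes alone
new_user = np.array([[1, 0, 0, 0]])
print(model.predict_proba(new_user))
```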

At that point, people can be manipulated and democracy is eroded. He likened it to two people on a blind date, where one already knows everything about the other. Governments might also do this, “in which case we have a real problem on our hands”, particularly where they do not respect fundamental human rights.

The onus needs to be on the platform providers, he felt, and he echoed the call for “privacy by design”. Privacy should be viewed the same way as personal safety, he added. Overall, he is encouraged by people now being more sceptical of Facebook and other technology companies and he feels that, compared with other data episodes, this one has had a greater effect on a wider demographic.

Conclusion

It feels like a debate that is still at a fairly early stage but one that will be central to the nature and governance of our society, not least as sensors and cameras proliferate on our streets, in the workplace and in homes. The Cambridge Analytica/Facebook saga has more twists and turns to come and it is hard not to anticipate other revelations – certainly Wylie feels it is an industry-wide issue. In the same way that every like builds more of a personal profile on Facebook, so every scandal chips away at trust, which ultimately makes it harder for those who want to use data for good.
