Tennison feels that questions such as whether Facebook should pay users for their data, or whether people can own their data, are wide of the mark. There is now the realisation, she believes, that we all have “data shadows” – “other people know things about us, perhaps even things we don’t know ourselves”. There is a line of thought that monetary value can be attached to this, driven also by, “frankly, a desire for a cut of the profits of the big data monopolies”.
However, she argued that privacy is a fundamental human right, and a world where human rights are bought and sold is a dangerous one. For one thing, data is rarely limited to a single individual. For instance, Facebook data links to friends and family; DNA and medical data can inform about past and future generations; weekly shopping data reveals a family’s preferences. More sweepingly, the data of, say, a middle-aged white British woman with two children starts to build a picture of the behaviour and preferences of all middle-aged white British women with two children. Individuals have data rights, as embodied in GDPR, but society also has data rights, she argued.
Tennison said the language of data ownership and pricing should be stamped out, particularly among politicians who don’t understand the subtleties. Ironically, that was exactly the language used in an article published the following day by the leader of the Liberal Democrats, Vince Cable.
What can be done?
Tennison’s organisation, the Open Data Institute, has designed a ‘data ethics framework’ and she would like to see a move towards some form of privacy accreditation. She feels this sort of ‘softer’ regulation and intervention is needed alongside better individual controls, collective action and hard regulation. “It needs to be a messy mix that somehow works and somehow evolves.”
There is also increasing talk of ‘privacy by design’, whereby privacy is embedded within applications from the start. George Danezis of UCL explained how this can be defined, citing the work of Professor Jaap-Henk Hoepman at Radboud University Nijmegen. Hoepman sets out how applications can be built to minimise the amount of information collected, filtering out what is not needed; to separate the data, so that each user is shown only what they need; and to have built-in controls for aggregating, hiding and encrypting it. Danezis also pointed out that users should be informed about the use of their data throughout the user experience, not just at the outset.
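To make those first two strategies concrete, here is a minimal sketch of what ‘minimise’ and ‘separate’ might look like in practice. It is illustrative only, not drawn from Hoepman’s or Danezis’s own material: the event fields, consumer names and allow-lists are all invented for the example.

```python
# A hypothetical illustration of two privacy-by-design strategies:
# "minimise" (each consumer declares up front the fields it needs)
# and "separate" (each consumer receives only its own filtered view).
# All field and consumer names below are invented for illustration.

RAW_EVENT = {
    "user_id": "u-1234",
    "email": "jane@example.com",
    "postcode": "SW1A 1AA",
    "page_viewed": "/pricing",
    "timestamp": "2018-04-18T10:15:00Z",
}

# Minimise: an explicit allow-list per internal consumer, so anything
# not listed is never passed on.
ALLOWED_FIELDS = {
    "analytics": {"page_viewed", "timestamp"},  # no identifiers at all
    "billing": {"user_id", "timestamp"},        # identifier, no contact data
}

def view_for(consumer: str, event: dict) -> dict:
    """Separate: return only the fields this consumer may see."""
    allowed = ALLOWED_FIELDS.get(consumer, set())
    return {k: v for k, v in event.items() if k in allowed}

print(view_for("analytics", RAW_EVENT))
# {'page_viewed': '/pricing', 'timestamp': '2018-04-18T10:15:00Z'}
```

The point of the sketch is that filtering happens at the boundary, by declared purpose, rather than handing every consumer the full record and trusting them to ignore the rest; hiding and aggregating would sit on top of the same filtered views.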
Will consumers care enough to opt out? That depends, and will clearly vary by country, age, education level and other factors. Certainly governments and activists are coming round to caring, which Tennison hopes will drive improvements.
Last but not least, what was Wylie’s take on the topic and on the systemic issues he helped to lay bare? He admitted that he is far from perfect: he did, after all, work for SCL Group, which largely works on military information warfare, and then for its subsidiary, Cambridge Analytica, before his concerns about the latter’s operations led him to blow the whistle. The trigger point, he said, was when Cambridge Analytica took what SCL did and brought it into democracy, “weaponising information and denying consent”.