Over the past decade, big tech platforms have productised our identities for their own gain. As we entered our information into social media, transacted on e-commerce sites, applied for loans and swiped right, a picture of our behaviours – and our future intent – was constructed so that it could be commoditised and sold to third parties.
We have a degree of agency: we can delete our social media accounts, stop shopping online or refuse to hand over personal information – at the cost of being unable to function in a digital marketplace or workplace. (True story: WIRED abandoned using an online tool for our 2019 Secret Santa when it became clear that it was harvesting data by masquerading as a fun, elf-friendly service.)
Whether we’re hovering over an item on a website, hailing a rideshare or organising seasonal tat-gifting, we’re compelled to hand over data in exchange for services. The purpose is singular: these organisations are incentivised to make money for themselves and their shareholders – see the ethical code of Mark Zuckerberg, Sheryl Sandberg, Nick Clegg, et al. Reminder: in 2019, Amazon announced that it paid taxes of £220m on £10.9bn in UK revenue. Facebook’s most recent UK tax return shows that it paid £28m on revenues of £1.65bn.
But what if data extraction were aligned with a wider set of values that offered citizens greater control while serving a social purpose? Take data sharing in healthcare: NHS England’s 100,000 Genomes Project uses a “broad consent” model, meaning that participants agree that their data can be accessed by approved individuals.
A 2019 paper from the innovation charity Nesta mapped out an “ecosystem of trust” based on data trusts – institutions “tailored to different conditions of consent, and different patterns of private and public value” – that would seek to impact areas such as education, employment, policing, mobility and healthcare. These public commons raise crucial questions of ethics and governance, not least how to protect privacy. They should also spark a necessary conversation about how organisations – and particularly government – integrate data analytics into day-to-day operations and share data with both public and private data trusts, providing richer samples and enabling the development of new products and services.
Cities have a significant role to play in this aggregation of data for the public good – Barcelona and Amsterdam have both taken a lead in building models in which data that was previously privatised is aggregated in order to improve services such as mobility and sanitation. If individuals have a Personal Data Store – a repository of their data that they can then opt to make available to third parties – they may choose to, for instance, share their personal information with the city transport network, but not with an online retailer.
A Nesta poll in 2018 suggested that 73 per cent of people would share personal data for public benefit if that data was subject to secure and ethical governance. Privacy by design, accountability and standards upheld by audited, licensed public and private trusts could help not only ensure that data is leveraged for the common good, but also rebuild public trust in institutions and private companies. Data sharing can be a powerful tool for everyone, not just to fund billionaires’ space programmes.
Greg Williams is WIRED's editor-in-chief. He tweets from @GregWilliams718
This article was originally published by WIRED UK