UK wants to lead the world in tech ethics…but what does that mean?
Last week we surely reached peak hype on tech ethics, with The Sun warning of an impending 'Terminator-style apocalypse'. The paper was responding to the House of Lords' AI report, the latest in a growing number of interventions from government, industry and civil society seeking to grapple with the ethical and social issues technology poses.
Public debate on tech ethics has been a long time coming. New technologies have been adopted and used with little scrutiny: it has felt like a golden era of ever-improving free communications, search, and content.
Precisely because this has been free, we've been slow to realise the potential costs, risks or harms of giving away information. No longer: the Cambridge Analytica/Facebook saga has triggered a public conversation. It hit home because it's about normal people, sharing things they didn't think were particularly sensitive, in a semi-private space, which may suddenly have contributed to something of global significance: the election of Trump or the vote for Brexit, depending on which side of the pond you sit.
This saga has raised questions about data rights, consent and use; it has triggered debate about where preferences translate into profiling and where micro-targeting becomes manipulation. The delay the ICO faced in being granted a warrant to search Cambridge Analytica's servers, and Mark Zuckerberg's refusal to meet MPs, have led some to argue that we need stronger powers and regulation to redress the power of the tech giants.
We need to grapple with these concerns and challenges while being wary of a 'techlash' that could block innovation. Shutting down data collection and analysis might curb some harms, but it would also limit a lot of good, from innovation in medical diagnosis to more efficient public services.
The Government has set out its ambition, with a sense of urgency, to make the UK the world leader in the ethics of tech and data use. But how can we move that agenda forward, to ensure practice is not just trusted but trustworthy?
I think we need four things:
First, we need informed public deliberation, to enable a diversity of voices to explore the risks and trade-offs posed by the use of technologies. We need companies, the public sector and policy-makers to consider where and how users and citizens should have a say in the use, design and deployment of technology.
User analytics are not an adequate proxy for how the public feels about a service: the argument that users are happy with the status quo because they continue to use a service is no longer persuasive, if it ever was. Understanding is low, and the deck is stacked: people give consent and share data on platforms that have been expertly designed to get them to do just that.
Low levels of understanding and trust need to be seen as a risk, not an opportunity: all it might take is one 'Black Mirror'-type story for people to walk away from a product or platform en masse, or for policy-makers to feel the need to implement knee-jerk regulation at pace.
Second, we need better evidence about the impacts of technology on society. Much of the narrative is being led either by expert investigative journalism, which is inevitably drawn to the worst cases, or by industry, which inevitably profiles what's best. We need to build a richer and more neutral account. To do that, we need to be able to measure and understand how individuals and society as a whole are and could be affected, for good or ill: who is at risk of harm, and how innovation across different sectors and platforms adds up.
Third, we need to embed ethical thinking into the development and deployment of technologies. Social impact needs to become as important to developers and investors as innovation: we need to disrupt the 'move fast, break things and apologise' model of innovation through more conscious thought about the value of what might be broken.
That means common norms and standards to translate ethical principles into practical decisions, including stress-tested frameworks and tools to probe technical elements (data provenance, consent, design), as well as exploring the harder questions: does the underlying data or logic reflect the values and society we want to have in the future? How will platforms or products be used 'in real life': how will they be kept up to date, monitored, assessed and explained, and what knock-on impacts might they trigger?
Last, we need to articulate a vision of a tech-enabled society with social justice and wellbeing at its core, and lay the foundations to realise that vision. Fundamental norms relating to privacy, ownership, rights, civil liberties, wellbeing, work and justice are being destabilised and need to be rebuilt for a data-driven society. We need to undertake considered but ambitious thinking, equal to the disruptive power of technology and unconstrained by political pressures, to collectively harness the power of data for public good.
These issues can’t be solved by a single institution, sector, technical solution, new piece of regulation, or company (although it is heartening to see some organisations making ethics their calling card). This needs to be a collective project: to develop norms, standards, frameworks, research and dialogue to create the sort of data-enabled society we all want: a future where innovation supports wellbeing.
That's why we're playing our part, working in partnership with leading organisations including The Alan Turing Institute, the Omidyar Network and techUK. Together we are setting up the Ada Lovelace Institute to connect diverse actors and specialists, develop practical ethical 'case law', and catalyse longer-term research on the social impact of data, algorithms and AI, and on strategies for social justice in a data-driven society. And to prevent the impending 'Terminator-style apocalypse'.
Imogen Parker is Programme Head – Justice, Rights and Digital Society at the Nuffield Foundation. This blog first appeared as a guest blog on techUK’s website as part of its AI Week.