Season’s greetings: goodbye to 2020 and the 12 months of Ada
Four policy-facing reports, three citizens’ juries, two expert working groups and a global pandemic
18 December 2020
Reading time: 13 minutes
When I took up my role as first Director of the Ada Lovelace Institute in July 2019, I was honoured by the trust placed in me, but – inside – felt like a nervous Masterchef contestant, presented with high-quality, carefully chosen ingredients and asked to write and deliver a recipe, not for a soufflé (that would have been easy!), but for a sustainable and impactful organisation in an area of ever-more-pressing societal need.
I knew I had all the makings of a successful outcome – the support and reputation of the Nuffield Foundation, a handful of brilliant colleagues, and an immense amount of goodwill from founding partners and other stakeholders.
Unlike an aspiring chef, I have not been working alone. Rather, over the past 18 months I have worked hand in hand with a group of wonderful, hardworking colleagues to build the beginnings of an endeavour that we believe will have longevity and resonance, and produce real-world change. And we’ve had the support of colleagues and critical friends from multiple sectors and disciplines around the globe, keeping a close eye on our progress.
At the beginning of the year we published our strategy, and set a marker to report on progress at the end of our first 12 months of operating in earnest. 2020 has disrupted even these small plans: not only have we found ourselves busier than we’d anticipated, producing much-needed evidence and research at pace to inform decision-making on pandemic technologies, but we’re also still in the midst of building and shaping the Ada Lovelace Institute, and not yet ready to suggest we’ve completed what will be an ongoing process of growth and development well into 2021. We’ll be reaching out for feedback and evaluation in the new year, to help us understand what we’ve achieved and how to use those learnings to guide what we do next.
Nevertheless, at the end of a year in which time has taken on a range of new and unusual interpretations, we’re stopping to take stock of some things we have achieved – and some things we’ve faltered on. As a research institute, we are rooted in evidence-based, iterative thought development and committed to transparency of policy and practice. We know many people are invested in helping this organisation succeed, and we’re grateful for that support – so please join us in reflecting on what has worked, what hasn’t, and where we’d like to go in 2021.
Settling into our approach and methodologies
When Ada was established by the Nuffield Foundation in 2018, it was with a clear remit – to ensure data and AI work for people and society – and three ambitious aims:
- to build evidence and foster rigorous research and debate on how data and AI affect people and society;
- to convene diverse voices to create a shared understanding of the ethical issues arising from data and AI;
- to define and inform good practice in the design and deployment of data and AI.
Our team seized these aims as the framework within which to develop a range of methodologies that define Ada’s work.
As a research institute based outside academia, we saw the opportunity to synthesise, translate and build on existing research and scientific expertise, repackaging academic analysis for policymakers and practitioners. This synthesis and translation work includes Exploring principles of data stewardship, in which we compiled a list of more than 100 data-sharing and access initiatives and assessed them against principles developed from Ostrom’s work on governing the commons; Examining the black box: Tools for assessing algorithms and Inspecting algorithms in social media platforms, which map the range of algorithm-accountability tools in development, such as impact assessments and regulatory inspections; Meaningful transparency and (in)visible algorithms; and The data will see you now.
Many of these synthesis pieces doubled as tools and resources to standardise and clarify language, and to surface research agendas. We have also developed other tools and resources throughout the year, including an international monitor of public health identity systems, which has followed the rollout of immunity passports and vaccine certificates in response to the COVID-19 pandemic, and the digital contact tracing tracker, which presents a global comparative view of contact-tracing apps in more than 60 countries.
Our most prominent work this year deployed a different methodology: convening experts and working groups. Exit through the App Store? was a rapid evidence review conducted through an expert convening of more than 20 social science, philosophy, legal and technical scholars. We also established the Rethinking Data working group, which is deliberating over a longer time frame to develop a positive vision for the use of data, one that protects and realises its social value. And we commissioned legal expert Matthew Ryder QC to undertake an independent review of the governance of biometric data.
Another key approach we invested heavily in this year was public deliberation and public attitudes research. With Understanding Patient Data, we ran a citizens’ jury focused on health data partnerships, resulting in the report Foundations of fairness. Following on from our influential public attitudes survey, Beyond face value, we established the Citizens’ Biometrics Council, a long-form public deliberation comprising two groups of 30 members of the public – one based in Bristol, the other in Manchester – who heard expert evidence and deliberated on questions related to the social legitimacy of biometrics technologies, including facial recognition. When the pandemic struck, we faced the not insignificant challenge of uprooting the in-person Council midway and re-establishing it online. In doing so we replicated techniques deployed in our first fully digital public deliberation, convened to scrutinise the development of COVID-19 technologies, the results of which formed the basis of the report Confidence in a crisis.
This year we also began our own primary research, in the form of a digital ethnography conducted in collaboration with UCL and focused on the deployment of a predictive analytics system in use by a London local authority. We conducted more than 100 interviews as part of the research, which will be published in early 2021, and developed a nuanced understanding of the complex factors grounding and influencing local authorities’ use of data and algorithms – an understanding that will inform much of our policy-oriented work going forward.
All the while, we took seriously our remit to convene diverse voices, create community and bring together experts, researchers, policymakers and practitioners. With the support of the AHRC, we established JUST AI, a field-building project led by Dr Alison Powell, with a specific remit to map the field of AI ethics and to begin networking researchers together using working groups, research labs and research commissioning. JUST AI commissioned a cohort of projects on the link between racial justice and AI, and appointed four JUST AI fellows for the duration of the projects.
We held dozens of events, online and off, and attended and presented our work at many more. Aided by the wholesale move to digital that has characterised the pandemic, online convenings became core to our work, with both public and private workshops facilitating engagement with regulators and policymakers on work concerning immunity passports and data sharing, and grounding a collaboration with the ODI, RSS and Institute for Government in response to the National Data Strategy.
We also took our objective to convene diverse voices into the written realm, actively soliciting and commissioning articles and blog contributions from a range of internal and external contributors. By the end of 2020 we will have published around 25 long reads or blogs by external authors, drawn from across academia and civil society, including one unionist and two teams of lawyers. These blog posts, and those published by the Ada team, have attracted more than 30,000 unique views across 2020.
Navigating a changing landscape
We have done this work against a backdrop of continuous flux. Changes brought on by the pandemic, including the move to a fully remote and distributed team, saw us confronting the challenge of onboarding new staff from afar. In this we benefited immensely from the support and compassion of the Nuffield Foundation, our parent organisation, which gave our fledgling Institute the financial and human(e) support we needed in a time of great uncertainty.
The crisis we found ourselves in brought hurdles – we had to move our in-person ethnographic study to a less-than-desirable remote format, for example – as well as opportunities. We developed a mode of delivering online public deliberation at pace, and in doing so discovered an enduring approach that we’ll use going forward. We found new audiences for our events and were surprised to attract large numbers of participants to our early webinars on COVID-19 technologies, particularly digital contact tracing apps. We found policymakers and experts willing to connect via digital tools, and realised that the threshold for convening a range of busy people is more easily cleared with a Zoom meeting than with an in-person one.
We continued to build our Board, adding four new trustees in January, bidding farewell to our founding Chair, Sir Alan Wilson, and welcoming Dame Wendy Hall as Chair in July. Under Wendy’s dynamic leadership we’ve worked hard to build a cohesive Board of diverse skillsets and opinions in the online realm, and again found that tools such as breakout rooms have opened new possibilities for collaboration.
Prompted by the Black Lives Matter movement, we’ve undertaken the difficult internal work of confronting our own contributions to systemic racial injustice, and committed to addressing structural injustices in all aspects of our work. If I’m honest, we’ve still got a long way to go to meet the (rightly) high standards we’ve set ourselves on this front, in particular as we strive to increase the representation of underrepresented groups among our core staff. We’ll be trialling a different approach to recruitment in 2021 to ensure this happens, so please consider this a public commitment and hold us accountable to it.
These internal changes have occurred in parallel with some significant changes to the ecosystem of organisations and actors interested in the societal impact of new technologies. It was with great sadness that we witnessed the winding up of Doteveryone, which had shaped and mobilised the responsible technology agenda in the UK, even as we were excited to see new organisations blossom, including Understanding Patient Data and Foxglove. New research institutes were also born – among them the Institute for Ethics in AI at Oxford’s new Schwarzman Centre, the Minderoo Centre for Technology and Democracy at the Centre for Research in the Arts, Social Sciences and Humanities at Cambridge, the Edinburgh Futures Institute at the University of Edinburgh, and the International Observatory on the Societal Impacts of AI and Digital Technology in Canada.
You choose some issues, and some issues choose you
The commitment we made in our strategy to being ‘agile, responsive and prepared to use moments of crisis to advance change’ has been stretched to its limits in 2020.
We started the year with the intention to work in five main areas: biometrics technologies; rethinking data narratives, practices and regulation; understanding how data and algorithms are used in the public sector and for the delivery of public services; health data and data about health; and sustainability issues connected to AI research and development.
Each of these has remained a strong theme in our work throughout the year, but has taken different forms from those we imagined in the strategy we published in February. The COVID-19 pandemic has fundamentally changed the backdrop against which many of the areas of enquiry we identified have played out.
The use of data to inform decision-making has been front and centre during the crisis, surfacing some of the fault lines in the data governance ecosystem: the power asymmetries in private- and public-sector access to data, the ‘tenuous public trust’ in government use of data and algorithms for decision-making, and the complete control over digital infrastructure exercised by major tech platforms.
It has given new meaning to ‘health data’, rendering – from one view – all data as health data, and forcing complex discussions about surveillance and monitoring of the public in the name of public health. The crisis has shone a light on structural inequalities and brought conversations around data representativeness to the fore, as bias in algorithmic decision-making has received increasing attention.
The pandemic created opportunities to engage with issues we hadn’t planned to address in our research agenda. In particular, the development of digital contact tracing apps, and immunity and vaccine certification technologies became major areas of work, as we sought to translate complex knowledge about the technical efficacy, public legitimacy and societal implications of these technologies into rapid advice and guidance for policymakers and public-health practitioners. The work we have done in this field has led to a new research partnership with a major foundation that we will launch in January, focusing on the impact of data-driven technologies in the pandemic on health and social inequalities.
Taking on new work necessitated pausing other initiatives, and our work on sustainability and AI took a back seat as we developed our responses to the crisis. We remain committed to developing research on climate change and AI, and are currently considering ways to take this agenda forward. Climate and environment will – rightly – resurface as the pressing policy challenge of 2021, so again, please take this as a public commitment and hold us to account.
So, what’s next?
Much as we’d like to take 2021 to recover from 2020, we’re already anticipating a busy schedule of research, events and convenings going into next year. Our Rethinking Data working group will begin preparing its final report, which will take a future-focused approach to imagining a radical new vision for the use of data that will take us to 2050. We will publish Matthew Ryder QC’s independent review on biometrics, and our own report on the outcomes of the Citizens’ Biometrics Council. We’ll also be publishing a major ethnographic study of the use of data and predictive analytics by a London local authority.
We’ll be starting new work looking at research ethics in universities and in companies, and convening discussions to share experiences of interfacing with Institutional Review Boards and ethics committees when doing AI research. We’re undertaking a project with a public body to develop research on best practice for impact assessments, and continuing to think about how regulators can best be equipped to regulate digital platforms. We’ll be gathering evidence and convening experts to debate the rollout of vaccine certification in response to the pandemic. And, with our sister organisation, the Nuffield Council on Bioethics, we’ll be exploring work at the intersection of AI and genomics.
All the while we’ll be thinking about the big questions that 2020 has raised: what does ‘building back better’ mean for our societies, and how might data and AI play a role in the recovery? What will be the long-term impacts of the normalisation of public health surveillance for public notions of privacy? What is the right way to curtail the monopolistic power of big tech entities, and shape a future of trustworthy technology development? How can we ensure the equitable distribution of the benefits of new technologies, and guard against the exacerbation of structural inequalities? And how can we as an organisation continue to put social justice, individual wellbeing and public trust at the centre of conversations about data and AI?
We are proud of what we’ve achieved this year, and humbled by the trust placed in our emerging organisation to help answer these pressing societal questions. We take up the mantle in the knowledge that we can’t do this alone, and your feedback and constructive criticism have helped us shape mission-critical aspects of what we do: we’d love to hear your thoughts on Twitter at @AdaLovelaceInst or by email to hello@adalovelaceinstitute.org.