
The Ada Lovelace Institute in 2023

Reflections on the last year and a look ahead to 2024 from Ada’s outgoing Director Carly Kind and Interim Director Fran Bennett

Carly Kind, Francine Bennett

19 December 2023

Reading time: 4 minutes


The Ada Lovelace Institute was founded with a mission to ensure that data and AI work for people and society. This means that we start with the question ‘What society do we want to live in?’ rather than focusing on emerging technologies and their uses.

The themes and values that run through everything the Institute does – rebalancing power, centring people and society, amplifying diverse voices and fostering a more informed debate – continue to shape the way we think and the work we do.

The prescience of the Nuffield Foundation in establishing the Institute and the importance of our mission were absolutely confirmed in 2023. In this blog post, we look back on some of the year’s biggest developments, milestones and achievements, and look forward to what is in store for 2024.

AI takes centre stage

2023 was the year when AI – which we at Ada have been discussing, thinking about and exploring since 2018 – hit the news and the policy agenda. ChatGPT and other generative AI systems have surprised many with their human-seeming abilities to write, code, generate images and more, kickstarting a wave of excitement and trepidation.

To inform the emerging debates on AI, we at Ada have continued our work unpacking and explaining what these generative models actually are and how they are developed and deployed, and exploring how they could benefit society.

Shaping the policy landscape

Alongside the headlines and hype, governments and institutions in the UK, EU, USA and global majority countries have been grappling with how and whether to regulate AI. The UK’s White Paper on AI Regulation set out the current UK Government’s domestic proposals for regulation, although this was undercut by some of the proposed changes to UK data protection law and swiftly overtaken by a focus on international collaboration and the risks of frontier AI at the UK’s AI Safety Summit at Bletchley Park.

Ada was one of very few civil society organisations to be invited to the Summit, where Fran gave some closing remarks emphasising the fundamental importance of putting people, rather than technology, at the centre of the AI conversation. Ada also signed a joint communique from civil society attendees reflecting on the Summit and calling on governments to prioritise regulation to address well-established harms.

We were also closely involved in convening diverse voices as part of the ‘AI Fringe’, a series of events that emerged in response to the absence of public and civil society voices at the Summit. We delivered a keynote speech and discussion panel on AI safety and a discussion panel on public participation in AI.

As part of the Fringe, we supported Connected by Data’s People’s Panel, which brought together members of the public to observe and discuss Fringe events. The panel put together a series of recommendations for the future governance of AI, based on conversations with experts, attending events at the Fringe, following updates from the AI Safety Summit, and their own lived experience.

2023 also saw the ongoing development of the EU AI Act. After a long and winding process, this Act stands as a globally pioneering piece of legislation to regulate AI, including foundation models and generative AI. Ada has been deeply involved in the AI Act’s evolution, seeking to create useful evidence for EU policymakers to ensure this globally significant legislation has the best chance of success, and that it works for people and society.

Building our evidence base

We have continued to prioritise the critical work of building the fundamental evidence needed for policymakers, industry and civil society to make inclusive, just and equitable decisions about data and AI. We apply multiple research approaches to understand current and new aspects of data and AI, how they affect people’s lives and our institutions, and what people think about and want from data and AI.

A browse through our website library gives a sense of the breadth and depth of our work in 2023. Major research milestones this year have included the publication, with the Alan Turing Institute, of a large-scale survey of public attitudes, which found that more than 60% of the British public support strong laws and regulations to guide the use of AI.

We also published a major piece of work with the Health Foundation on how the accelerated adoption of data-driven technologies created by the COVID-19 pandemic has affected health inequalities. While it might feel as if the latest wave of generative AI is wholly new, the body of evidence on data, AI and society that we and others have been building continues to be extremely relevant to building a better future for all of us.

Looking to the future

We believe that, however fast new technologies move, people and society must remain at the heart of everything we do. We need to pay attention first and foremost to the human experience of data and AI, looking at who is helped and harmed by these technologies. We need to understand who controls and accumulates data and technologies, what power they gain and wield as a result, and how we can work with policy and industry to rebalance this, to benefit everyone.

We end the year with an expanded team (now nearing 30, from tiny beginnings a few years ago), a new Chair and new Board members, and with renewed energy and excitement about the urgency, importance and impact of our work.

Next year will be an exciting one for us, as we will appoint a new Director and work to refine our future strategic direction. There’s no doubt that 2024 will bring more headline-grabbing jumps forward in the technical capabilities of the most powerful AI systems. We’ll spend time understanding, unpacking and working with others to explore how those jumps can contribute to creating better, fairer societies for everyone – and our core principles and mission will continue to inform everything we do.