Report

Spending wisely

Redesigning the landscape for the procurement of AI in local government

Mavis Machirori

14 November 2024



A note on terminology

 

Artificial Intelligence (AI)

AI is an umbrella term for a range of algorithm-based technologies designed to carry out tasks previously considered to require human behaviour, intervention or oversight. There is no commonly accepted scientific definition of ‘AI’ and it is used to refer to a wide range of computational techniques like machine learning, natural language processing and deep learning that are considered capable of performing tasks that might traditionally require human intelligence to complete.[1] This includes ‘narrow’ AI systems such as analytics systems used to make predictions and judgements about individuals, as well as so-called ‘general purpose’ AI systems or foundation models.[2]

 

Some of these technologies are already in widespread use across the economy, while others are less widely deployed or are being considered for future use. Throughout this paper, references to ‘AI’ or ‘AI technologies’ without any other modifier should be understood as references to the wide variety of these systems.

 

Local government

We use the general term ‘local government’ to refer to the range of bodies responsible for delivering a wide range of services in local areas, drawing from the Local Government Association (LGA) definition and noting that the different tiers of responsibility determine the functions of specific authorities.[3]

 

Procurement of AI in local government

We use the term ‘procurement’ to refer to both the responsibilities of local government procurers and the procurement process itself when it comes to buying AI technologies. We acknowledge that in local government, these processes are complicated because procurement capabilities are subject to available capacity and resource. We recognise that a ‘procurement team’ can include other roles and responsibilities beyond the contracting element and may include commissioners, data engineers and equality teams. All these roles are important to procurement and should be represented in some form within the process.

 

Our project focuses on situations where local government procurers are purchasing AI technologies in response to a local need or challenge. We use the terms ‘procurement landscape’ or ‘procurement processes’ to refer to processes covering the procurement of digital, data and technologies, of which AI is a subset.

 

Our research acknowledges that there are various ways that AI is used in local government, including solutions developed in-house or AI upgrades to existing technologies. Although some of the issues discussed would apply to in-house development and upgrades, they are not in scope for this discussion.

Executive summary

The Ada Lovelace Institute (Ada) has undertaken a two-part examination of how procurement of AI is working in local government in England.

While AI technologies are being introduced rapidly, our research finds that there is a lack of coherent support and guidance for those procuring AI in local government and that some areas need significant improvement.

When using AI technologies for public services, procurement is the essential first step of the process. Getting the procurement of AI right – in terms of utility, scope, safety and ethics – is vital for ensuring data and AI work effectively, and in the public interest. As current guidance is not sufficient to aid local government to procure AI effectively, this paper calls for the creation of a national taskforce to help break down silos and ensure a consistent, collaborative approach to the procurement of AI in local government.

The UK Government has hailed the effective use of data and AI to increase productivity and improve efficiencies in public services, with the potential to boost the economy and improve people’s lives.[4] Decisions about which technologies are used have a significant impact on how people access and experience public services. Procurement decisions therefore provide an important opportunity to ensure that AI technologies are legitimate, safe and effective and are used to benefit society.

The enthusiasm surrounding the potential of AI to improve public services contrasts with the reality of limited human and financial resources in the public sector. The rapidly evolving technology presents many opportunities for the public sector, but if adopted uncritically, it brings potential for harms. Recent prominent examples of data and AI systems not working as intended – such as the Post Office’s Horizon software[5] and the Home Office’s visa application streamlining algorithm[6] – have raised important questions about how equipped the public sector is to procure and oversee evolving technologies across different stages of their lifecycle.

In the UK, there is a range of guidance and legislation to aid procurement decisions in the public sector. Most recently, new legislation in the form of the Procurement Act 2023 aims to help the public sector purchase goods and services effectively. The Act aims to streamline and increase transparency in the public sector procurement process,[7] including tackling concerns around competition and underperformance of contractors.

However, there are emerging challenges. The Act itself is broad and does not cover the procurement of AI, creating the need for AI-specific guidance. In the meantime, delays in the implementation of the Act[8] leave work to be done to embed new processes into existing workflows and infrastructure.  Despite these delays, AI technologies continue to proliferate in public services with no real evidence that they address the challenges facing society.[9] This raises questions about how to govern these technologies.[10]

As the UK Government creates its vision and ambition for a ‘digital centre’ that places innovation and better outcomes for people at the core of its agenda,[11] our work plays a small part in contributing to and supporting that agenda.

Findings

Gaps in guidance

In our first report, Buying AI,[12] we analysed 16 key pieces of guidance and legislation that are available to local government procurers and found that the information available to help with decision-making is not fit for purpose.

Our document analysis highlighted the following challenges for procurers:

  • There is a lack of consensus on terms and definitions of key mechanisms or concepts that may help make procurement of AI less complex.
  • There are gaps and inconsistencies on how to describe (and therefore measure) social benefit, with over 50 terms used to describe various concepts related to it (including public benefit, fairness and impact assessments).
  • There is a lack of comprehensive advice on how to implement the information contained in the documents; procurers must spend considerable time and effort interpreting them. There is no standard approach for prioritising competing demands such as value for money, social value, impact assessments or transparency.

Existing guidance and legislation are not fit for purpose and should be better aligned and in some cases redesigned. Meanings and terms need to be clear and specific enough for the changing context of AI technologies. The guidance and legislation should be adaptable to those AI solutions that continue to iterate post-procurement. There should also be more practical support for local government about how to interpret and use the documents.

Experiences on the ground

In this second stage of the research, we explored diverse experiences of and perspectives on procurement of AI in local government in England. We spoke to stakeholders whose roles directly or indirectly intersect with the procurement of AI technologies to understand how well the process is working on the ground. This included people commissioning services, those with governance responsibilities, staff at technology companies, product developers, policy and strategy teams, and regulators. To test our initial recommendations, we also spoke to a mixed group of stakeholders from both central and local government departments, whose roles involve making decisions on AI-related policy and strategy, regulation and ethics.

The discussions – interviews plus a cross-industry workshop – served to highlight the scale of the challenge of making procurement a lever for change, given the gaps between the reality of procurement on the ground and the hopes that the UK Government has for data and AI in the public sector.

Our overarching finding is that while procurement should be an essential mechanism for ensuring data and AI work in the public sector, some areas need significant improvement to ensure procurement is effective.

Reflections from on-the-ground experiences of our stakeholders show that there are several intersecting challenges in the procurement landscape. These challenges prevent procurers from doing their jobs effectively, leaving them under pressure to make informed procurement decisions without the information and tools they need to do so.

Despite the best intentions of local government procurers, interviewees told us that some local government procurement teams are buying AI because of the pressure to be innovative, save money and not be left behind. Even if buying AI is the right solution for specific societal challenges, procurers do not seem to have adequate time or resources to conduct due diligence on the implications of their decisions. Some of these buying decisions are taking place even when existing infrastructure is not equipped to support newer AI technologies. This picture suggests that procuring and adopting teams might not be ready or able to critically engage with what it means to procure certain AI technologies for their local contexts.

The pressure in local government to be innovative and competitive has led to AI-related activities – such as adopting, piloting, or even deciding to stop using AI technologies – taking place in silos. Siloed working leads to duplication of activities, which is costly and inefficient, and prevents shared learning about benefits and harms.

The challenges raised by our stakeholders are broad, but can be condensed into five areas:

  • There is a confusing landscape for the procurement of AI, where diffuse guidance and narrow legislation limit what procurers can do effectively in practice.
  • Poor data and infrastructure are impacting local government procurers’ ability to maximise insights from data, including attending to their statutory data and equality duties.
  • There is technological uncertainty about how AI works in general, which impacts how well procurers can evaluate its outcomes. This uncertainty emerges from gaps in understanding of what AI technologies are and how they work, and a lack of visibility of how AI is actually being deployed across local government.
  • There is an imbalance between local government and industry in the knowledge and expertise around what AI is, how it works and what outcomes it produces – making it difficult for procurers to assess supplier claims on what a technology will do.
  • Market failures have led to an excess of power resting in the hands of a few large suppliers, which prices out small and medium-sized vendors – limiting the choice available to procurement teams and creating market capture or vendor lock-in.

These issues make it harder for procurers to make the complex decisions that are required across the procurement process. For instance, a procurer must show there has been due regard to regulatory and governance concerns, run a fair and transparent tendering process, ensure decisions align with the public interest, and provide social value. This might all take place in an environment where procurers are unable to access or interpret key information about the underlying data and inner workings of an AI technology, making it difficult to implement governance requirements. The reality on the ground is that procurers’ decisions are often made with limited time and resources.

Together, these five areas of concern have led to local government procurers becoming overwhelmingly reliant on the private sector to fulfil their technology needs. There is very little space, ability and time for procurers to critically engage with the claims made by AI vendors or suppliers, limiting procurers’ ability to adequately address matters such as equity and data protection.

Drawing together these reflections and experiences from our interviewees and workshop participants – and later, in testing our recommendations – we suggest that a redesign of the digital, data and technology procurement landscape in local government – with a focus on AI – is needed. This will enable the UK Government to meet its objectives of upgrading public services to be more innovative, efficient and better able to respond to people’s needs.

This redesign should go beyond clarifying and fixing current guidance and should extend to other improvements in the procurement process. This should include skills and training, public engagement, and the AI supply-and-demand ecosystem. Getting this right will improve community protections and more widely, trust in government.

Redesigning the landscape for local government procurement of AI

Fixing procurement of AI requires the input of many different parts of the public sector:

  • UK Government (Ministry of Housing, Communities and Local Government [MHCLG]; Department for Science, Innovation and Technology [DSIT]; Crown Commercial Service [CCS]; Government Commercial Function [GCF] and the Treasury)
  • Regulators (Information Commissioner’s Office [ICO] and Equality and Human Rights Commission [EHRC])
  • Local Government Association (LGA) and its networks.

Because of its complexity, a redesign would need to incorporate perspectives of those with on-the-ground experience of procurement in local government. It should therefore reflect the range of expertise and skills needed to make procurement a lever for change. This should include people with experience in policy and strategy, data governance and engineering, procuring and commissioning, IT and digital, contract management, and public engagement.

Evidence from our interviews shows that extensive work on improving the procurement of AI is taking place in localised, centralised or sometimes mixed-government hubs, and that many are focusing on the post-procurement phase. Our research brought together experts from across central, regulatory and local government bodies and we found that while many are working on discrete parts of the AI ecosystem, the connections to the wider procurement system are not quite joined up.

This fragmented approach is not working well to support the needs of those procuring AI in local government, and we believe that there should be a mechanism and platform to bring these conversations together earlier. What is needed is a holistic approach where all conversations about AI infrastructure, experiences, power imbalances, knowledge, skills and data are brought together in one forum.

Such an approach would start with the understanding that procurement has a central role in the future adoption of AI. Procurement is foundational in that it connects all other areas of regulation, governance and innovation in the AI supply chain. In turn, this connects to people on the ground who are impacted by the use of the procured AI systems.

A National Taskforce for Procurement of AI in Local Government

We recommend that the UK Government establishes a fixed-term National Taskforce for Procurement of AI in Local Government to bring together expert roles to respond collaboratively to the challenges set out in this paper. Ownership of this taskforce could sit with bodies already working on improving procurement of AI in local contexts.

Our research initially prompted several recommendations for specific government departments and agencies rather than a central taskforce. However, since speaking to groups working to address AI-related issues in local government, we believe that a collaborative approach that brings all expertise into one forum is key to improving the procurement of AI in local government.

The taskforce would require robust support in terms of resource and funding. We suggest a fixed term of at least three years, and that its inception coincides with the roll-out and implementation of the Procurement Act 2023. As the momentum of AI adoption is likely to accelerate, the taskforce would need to be responsive to emerging concerns in AI by producing regular and timely interim outputs during its tenure.

The taskforce would present a strong opportunity for the UK Government to evaluate the effectiveness of the Act with specific regard to AI innovation that benefits society and is safe, effective and legitimate. Its remit would be to support local government procurers to do their jobs well by:

  • Conducting targeted research to build evidence on what types of AI are being procured across local government, where they are being procured, and the full scope of challenges emerging in the procurement process. This would help define the scale of the problems that exist in the procurement landscape and the level of resource needed to solve them.
  • Producing evidence-based best-practice guidance and policies that support skill development and clarify key terminology. This would address the current gaps in legislation and guidance.
Taskforce actions

We recommend the group starts by working on the four actions set out below:

  1. Ensure that regulatory and legislative documents are clear, consistent and practicable.
  2. Gather evidence on and set metrics of success for procuring and deploying AI in local government.
  3. Create robust governance structures, (contract) templates and assessment frameworks that strengthen local government bargaining positions and minimise their reliance on private suppliers.
  4. Design and recommend a suite of specific skills and training for local government procuring bodies to be able to critically engage with AI technologies and the claims made by suppliers.

As the UK Government looks to AI solutions to support struggling public services, the taskforce would be a vehicle for ensuring these solutions are effective from the start. While innovation and scale are important, poorly executed procurement of these technologies can erode public trust and cause serious harms – including withdrawal of public services, inaccurate insights, unfair data processing and other equality-reducing outcomes.

Transparency will be vital to the taskforce’s effectiveness. Given the reflections from our stakeholders on the current fragmented and siloed procurement landscape, the taskforce would need to work in the open and regularly share insights along its journey to reimagine the procurement of AI in local government.

Key to the taskforce’s success will be building on – rather than duplicating – the expertise and work of various hubs across local government. Rather than starting from scratch, the taskforce should bring together existing networks already working on improving discrete aspects of the procurement ecosystem. A secondment scheme for local government procurers of AI to contribute to the taskforce could also be valuable. Being collaborative in this way would allow the taskforce to be driven by on-the-ground expertise from local government.

The taskforce must also be in constant discussion with vendors and suppliers, to make sure it is aware of any challenges or tensions on the supplier side that may limit the applicability of any further recommendations it makes.

More detailed recommendations on the taskforce are in the ‘Key actions for the taskforce’ section.

Ensuring effective procurement can lead to positive change

This is a small piece of research looking at procurement in one area of the public sector. It is service-area agnostic, so the findings are not prescriptive. It shows the challenge that local government teams face when trying to procure AI technologies and measure their societal benefit.

As we have shown, the intersection of fast-changing AI, power imbalances between local government and the private sector, and the current market structure are all exacerbating that challenge. The findings highlight the substantial work required to ensure AI is used effectively in the provision of public services and the role that local government can play. But ultimately, it is the UK Government that will need to drive systemic change, through the creation of a National Taskforce for Procurement of AI in Local Government. This would allow for a collaborative, cross-sector, multi-disciplinary approach to redesigning the procurement landscape.

There are considerable risks attached to getting procurement wrong. Potential consequences could include loss of trust in the public sector; financial costs; loss of access to public services; unfair outcomes from AI-related insights in welfare, housing or immigration; and serious personal or physical harm.

Procurement of AI therefore presents an opportunity for local government, and by extension, the public sector more broadly, to ensure its decisions to purchase specific AI technologies positively benefit society.

The stakeholders who took part in our workshop and interviews have considerable knowledge of what is not working well. An immediate response by the UK Government in the form of a well-resourced taskforce would help to acknowledge and address these issues, and would help ensure that procurement of AI in local government is a lever for positive social outcomes.

Introduction

Across the public sector, local governments are under financial pressure to provide better and more efficient services. They are additionally tasked with ensuring they make decisions in the public interest while supporting innovation, and with being transparent and fair in their dealings with the private sector.

In the UK, data and AI are increasingly taking centre stage in discussions about how the public sector will meet all the obligations around service delivery and private-sector relationships. When it comes to scrutinising how and which technologies are adopted into the public sector, procurement is emerging as a key decision point where many considerations around regulation, governance, social value and public interest coalesce.

Procurement therefore plays an essential role in the public sector’s ability to buy data and AI technologies. More widely, procurement decisions impact people’s lives and getting procurement right can lead to positive societal benefits.[13]  However, it is unclear whether procurement is working effectively, and what might be needed to maximise its effectiveness.

As the costs of getting procurement decisions wrong at local level can be considerable (including loss of public trust, individual harm, financial setbacks and reputational damage), we undertook research to explore procurement of AI in local government.[14]

This work is part two of a project looking at procurement of AI in local government. Part one, Buying AI,[15] provides an overview of the key legislative and guidance documents available to procurement teams in local government when buying AI and data-driven systems.

This second part identifies the barriers and levers for change, which influence whether procurement decisions can ensure AI technologies bring societal benefit. It is based on in-depth interviews with people who work across different roles in public sector procurement.

The report is also based on a workshop with public- and private-sector stakeholders from across the AI procurement supply chain. Based on the stakeholders’ experiences and reflections of procurement on the ground, together with insights from Buying AI, this report describes an AI procurement system that is not fit for purpose.

Fit for purpose?

In Buying AI [16] we conducted a document analysis of 16 pieces of guidance and legislation published to aid decision-making in AI procurement. We found that the documentation contained multiple definitions of terms related to AI or social benefit. We found that the range of interpretations and the ability (or lack of ability) to use the terms influenced the process of procuring AI. The lack of alignment of terms in the documents, and the contextual gaps between the documents themselves, highlighted the difficult task of interpreting and applying the guidance to procure AI effectively.

We suggested that the guidance is not fit for purpose as it creates confusing and unsupportive regulatory and legislative environments for local government procurers. In summary, we noted three main themes that have the potential to strengthen procurement decisions, but which are not readily adaptable to, or practicable within, procurement processes:

  • There is a lack of consensus on terms and definitions of key mechanisms or concepts that may help make procurement of AI less complex.
  • There are gaps and inconsistencies on how to describe (and therefore measure) social benefit, with over 50 terms used to describe various concepts related to it (including public benefit, fairness and impact assessments).
  • There is a lack of comprehensive advice on how to implement the information contained in the documents; procurers must spend considerable time and effort interpreting them. There is no standard approach for prioritising competing demands such as value for money, social value, impact assessments or transparency.

This paper complements and builds on these initial findings by presenting the experiences of those on the ground who are tasked with procuring AI for local authorities. Our stakeholders have varying cross-sector roles and interactions with procurement teams. Their experiences and reflections demonstrate that procurers require a high level of knowledge to navigate decisions to effectively procure an AI technology.

Our stakeholders have experience working in the public sector, from across central UK Government and in local government departments. Their roles include data governance and regulation, data engineering, commissioning, ethics, digital roles, and procurement.  Discussions also included views from those supplying technologies to the public sector, who interact with local government procurers and who could articulate how some of the challenges within local government are amplified in their relationships with suppliers. These diverse roles are all important in decisions about procurement of AI. They also help to explain the dynamics of the AI procurement ecosystem, and the extent to which procurement can be a lever for change.

AI technologies can provide benefits to society if designed and adopted legitimately, safely and effectively. This report aims to show how local government in England can be supported to maximise these benefits through the procurement of AI.

How to read this paper

Our report focuses on activities within local government in England, but learnings can be applied to UK-wide local government bodies. Some of the findings will resonate with the procurement of AI across the wider UK public sector. More broadly, the report is for anyone interested in understanding the tensions and opportunities for change that exist in the processes of procurement of AI in England’s local government.

For those working in different roles in the procurement ecosystem, we recommend reading all chapters to gain a cross-industry view of procurement of AI. You can apply our findings to different sectors, service areas and local authority contexts. This will provide a holistic understanding of what would make local government procurement of AI better equipped to address the challenges we set out.

For those in broad policy and strategy roles across regulatory bodies, central and local government who are responsible for streamlining or effecting change in the processes within the public sector, we suggest focusing on ‘More than a gateway’. This gives a bird’s eye view of the breadth of areas where decision-making is key. The section ‘Rethinking the procurement landscape’ is especially important and we encourage you to focus on the transformative actions and areas suggested by those working in the procurement lifecycle across government and industry.

For those in central Government, regulatory and legislative bodies, we suggest focusing on the ‘Executive summary’ and/or the ‘Our recommendation’ chapter of the report. Additionally, the section ‘Rethinking the procurement landscape’ sets out what areas need strengthening and what needs to be put in place afresh.

Team leads across digital and data departments (data engineering, ICT as well as commissioning, data protection and equality specialists) can use our roadmap in the section ‘More than a gateway’ to identify areas where multidisciplinary approaches would strengthen procurement. We also encourage you to look at the section on ‘Rethinking the procurement landscape’.

Finally, we encourage suppliers and vendors of AI technologies to read the ‘Multiple challenges’ section as many tensions emerge because of a misalignment of the values or drivers for procurers versus industry.

Methodology

In this second part of our project on procurement of AI in local government, we undertook scoping interviews with 29 experts across 19 organisations, with the aim of understanding how they thought of and operationalised the key themes we discussed in our document analysis. The diverse cross-sector and cross-domain group was made up of those who had experience in procuring or supplying technologies for local government, or whose roles interacted with these two broad activities. The group included AI suppliers and vendors, and people with experience of working in local government and central Government departments and agencies. We also spoke to people working in regulation, academia and the third sector.

All their experiences highlight the breadth of issues that a procurer must consider when making a decision about procuring AI. The stakeholders’ experiences and reflections therefore highlight the issues facing local government procurers, specifically the challenges that exist in trying to procure legitimate, safe and effective AI.

We also held a cross-industry workshop under the Chatham House Rule: participants are free to use the ideas and insights gathered in their own work, but the identities and affiliations of participants are not to be revealed. There were 17 participants in the workshop, with expertise across the data and AI technology lifecycle.

The main aim of the workshop was to explore how the public sector could better support its procurers to make decisions that lead to positive societal outcomes. The discussion in this workshop was informed by the initial stakeholder interviews, which helped us focus on challenges and opportunities for procurement.

Through this, our secondary aim was to research and describe what the UK Government could do to support local government procurers to buy AI in ways that lead to positive social impact.

Workshop discussions were guided by the following questions:

  1. What are the challenges to realising social benefit (or mitigating harms) when procuring (or supplying) AI technologies for public services?
  2. What are the challenges in assessing/measuring societal impacts?
  3. Would you think differently about procuring different types of AI and, if so, how?
  4. What are the infrastructures or areas that exist to support change?
  5. What trade-offs might be needed to take these opportunities?
  6. What should be in place to tackle the issues and who should be responsible?

Using our notes from the session, we thematically analysed all the data. We grouped the emerging insights into three main themes: challenges, opportunities and changes needed for procurement of AI to be the lever for positive societal outcomes.

Here, we present these findings and then make recommendations about what is needed to make procurement a lever for assessing and ensuring AI produces positive social impact.

Findings

The procurement landscape involves many players and points of contact between procurement teams, suppliers and vendors, and technology developers. In our interviews and workshop with people on the ground, we heard that local government procurement of AI is not working well.

There is no common understanding of the use of AI technologies across England’s local government. This creates a dearth of shared knowledge about which AI solutions are working and which are not. As a result, local government is buying AI technologies with very little insight or understanding of the impacts of the technologies on its communities.

Additionally, the breadth and depth of knowledge needed to assess AI, coupled with the complexity of procurement processes, is leaving local government procurers in potential breach of data protection and equalities legislation.

Overall, we found that procurement processes need to be strengthened to better support the adoption of AI in local government, ensuring alignment with public interest and aiming to achieve societal benefit.

Based on the experiences and reflections of our stakeholders, we suggest that a redesign of the digital, data and technology procurement landscape in local government – with a focus on AI – is needed. This redesign should go beyond clarifying and fixing current guidance, extending to other improvements in the procurement process – from skills and training to public engagement and the AI supply-and-demand ecosystem as a whole. Only then can local government procurement be a strong lever for securing technologies that are safe, effective and legitimate to use.

More than a gateway

The Procurement Act 2023 and the Social Value Act 2012 portray procurement as a straightforward process. However, our findings show that procurement is not a single gateway through which we can ensure that AI has a positive societal impact. Instead, the procurement process is a roadmap involving a collection of relationships, systems and decisions that – if done well – could support the deployment of technologies that have a tangible benefit for people and society.

Such nuance and complexity are not well reflected in current guidance and processes, making the current landscape not fit for purpose. To be effective, procurers need access to clear and practicable regulatory and legislative guidance. They also need to draw from skills and expertise across many domains. Guidance documents therefore need to be useful and easy for different actors to implement at the different points of the decision-making process (such as those shown in Figure 1).

Implicit in this is that the steps before and after a final procurement decision is made are equally important to making procurement of AI a success. These steps include commissioning, which is where societal challenges or problems are defined; clarity on what success looks like once an AI technology is deployed; and how and when those expected outcomes and benefits are assessed and evaluated.

The knowledge and expertise needed for successful decision-making around the procurement of AI is vast. While there is some proficiency in local government around data and digital technologies, our research found that this expertise does not translate well into the procurement of AI.

Some stakeholders told us that a lack of technical knowledge within the public sector more broadly – combined with a lack of expertise on the societal impacts of new technologies – is a barrier to procurers ensuring societal benefit. This is compounded by the knowledge and power imbalances that exist between the UK public sector and those supplying technologies to its various bodies and agencies. This imbalance often sees procurers relying on the expertise of industry, leaving them not always able to make their own independent assessments of societal impacts.

Figure 1: An example roadmap highlighting the multiple areas that need to be considered when buying an AI system

Multiple challenges make assessing or achieving positive social impact difficult

The procurement of AI technologies in local government is about so much more than finding cost-effective solutions. There is a real opportunity for the use of AI in public services to achieve positive social impact.

We asked workshop participants about barriers to procuring AI for societal benefit. These are the challenges they identified:

  • There is a confusing landscape for the procurement of AI, where diffuse guidance and narrow legislation limit what procurers can do effectively in practice.
  • Poor data and infrastructure are impacting local government procurers’ ability to maximise insights from data, including attending to their statutory data and equality duties.
  • There is technological uncertainty about how AI works in general, which impacts how well procurers can evaluate its outcomes. This uncertainty emerges from gaps in understanding of what AI technologies are and how they work, and a lack of visibility of how AI is actually being deployed across local government.
  • There is an imbalance between local government and industry in the knowledge and expertise around what AI is, how it works and what outcomes it produces – making it difficult for procurers to assess supplier claims on what a technology will do.
  • Market failures have led to an excess of power resting in the hands of a few large suppliers, which prices out small and medium-sized vendors – limiting the choice available to procurement teams and creating market capture or vendor lock-in.

Below we elaborate on and provide further evidence for these challenges, amplifying the voices of those most closely involved in this procurement process.

Confusing landscape of AI, and related regulation and guidance

A key challenge for procurers is a confusing landscape of regulation and guidance. Just as we found in our document analysis,[17] the stakeholders we spoke to highlighted that there is no clear definition of AI in the guidance that is meant to support procurers’ decision-making. AI is complex and multifaceted: the term is used to refer both to specific technologies like ChatGPT and to more general, advanced suites of tools, including predictive analytics and risk scoring. Some stakeholders told us that this lack of clarity forces procurers without expertise in AI to fill in the gaps on their own to understand which type of technology offers the best solution for the issues they are trying to address.

Additionally, procurement legislation and guidance documents contain multiple definitions and mechanisms for achieving positive social impact. The Social Value Act 2012 provides three key areas to focus on (economic, environmental and social wellbeing), but does not describe how AI may impact these facets of society. We heard concerns about this from the stakeholders we spoke to, who said that a lack of clarity on the risks of AI deployment could lead to mistakes or serious harms. This is especially true in local government, where essential services are delivered and sensitive data is processed.

The stakes are high when it comes to the deployment of AI technologies by local government. Inherent bias in AI systems could lead to discrimination, and some applications of AI – like biometric technologies – have significant implications for people’s privacy. Our conversations with stakeholders highlighted that the current inability of procurers to fully understand or measure the impacts of AI could leave local government in breach of the Public Sector Equality Duty in England (PSED), the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act (DPA).

We also heard from stakeholders that there is no clear path for local government to seek redress if they realise an AI technology is not working as intended. Those with varying procurement-specific and data-governance roles across local government were very clear in their critique, saying that local government has little clarity on what to do if, when able to assess societal impact, they realise the impact is not in line with expectations of societal benefit.

The overall cross-industry discussions in this workshop eventually centred on how regulators like the ICO and EHRC could support local government procurers through their enforcement action against AI-related breaches. Participants wanted clarity on how regulators would support local government to seek compensation or redress in cases of failures or breaches of contract and confidence.

Data and infrastructure

Data: questions of quality

Data underpins the design of AI systems and presents several challenges for procurers. Local government experts in our workshop expressed concerns around data governance and data access requirements set by AI suppliers or vendors.

While data-driven technologies have the potential to benefit society, use of data that is unauthorised or perceived as illegitimate or intrusive can lead to technologies that are problematic, unsafe, ineffective and untrustworthy. Additionally, as AI tools can sometimes be trained on information outside of public sector data, local government procurers need to be cognisant of potential breaches of intellectual property. When local governments do not fully understand the data powering the AI systems they use, the decisions they make can harm individuals or lead to other negative consequences like data breaches, which can erode public trust[18] and violate UK data protection laws.

Several stakeholders with data governance expertise suggested that to maintain public trust in and legitimacy of AI technologies, the data used to train these models, and resulting outputs, must be governed in line with expectations of members of the public. This means that procurers should be able to explain to people what data is required to build and run a model.

Varied public attitudes and concerns around the use of AI across different domains, including health, policing and employment, support this finding.[19] [20] When these concerns are overlaid with plans to digitise services,[21] the overall effect on public trust in technologies can be amplified. To retain or build public trust, local governments must be able to understand and convey to the public how a specific technology may manipulate, alter or use personal data.[22]

Our stakeholders agreed and suggested that data governance agreements need to go beyond the data a local authority holds, extending to the insights and data generated by the technology company supplying the AI system – as those insights might also hide privacy breaches.

Beyond data access agreements and governance, the data held by local government itself needs to be curated. This means organising the data and preparing it for use in various AI applications, standardising it, and linking it to other sources as necessary. Curation leads to higher-quality data and improved accuracy of insights, which is ultimately better for people and communities.
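For readers who want a concrete picture of what this kind of curation can involve in practice, the short sketch below illustrates standardising fields and linking records across two sources. It is illustrative only: the datasets, column names and tooling are entirely hypothetical and are not drawn from any local authority’s actual systems or from the research described in this report.

```python
import pandas as pd

# Hypothetical service records held by a local authority (illustrative only)
housing = pd.DataFrame({
    "resident_id": ["A1", "a2 ", "A3"],
    "postcode": ["g1 1aa", "G2 2BB", " g3 3cc"],
    "support_need": ["High", "medium", "LOW"],
})

# A second, equally hypothetical dataset to be linked for analysis
benefits = pd.DataFrame({
    "resident_id": ["A1", "A2", "A4"],
    "claim_active": [True, False, True],
})

# Standardise identifiers and categorical fields before any analysis or AI use
housing["resident_id"] = housing["resident_id"].str.strip().str.upper()
housing["postcode"] = housing["postcode"].str.strip().str.upper()
housing["support_need"] = housing["support_need"].str.strip().str.lower()

# Link to the second source, keeping all housing records so gaps remain visible
curated = housing.merge(benefits, on="resident_id", how="left")
print(curated)
```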

But for local government, there is not always adequate resource to do this. Stakeholders in our interviews and workshop questioned the reliability of an AI system’s outputs if the data quality and inputs were inadequate. Poor, incomplete or unreliable data inputs may potentially see procurers breaching duties such as the PSED and UK GDPR by inadvertently causing unfair, discriminatory or unequal outcomes to certain groups of people.[23]

The stakeholders we spoke to also pointed out that data quality is a key challenge when trying to understand the impacts of AI technologies after they are procured. Many reported that local government procurers do not always have the capacity to collect or record information on who has been affected by AI services, and how those issues have arisen. This means that local government may not always have sight of the societal harms stemming from the deployment of an AI technology, such as loss of access to vital services, discrimination and unfair treatment, and illegitimate surveillance and profiling.

Our stakeholders were clear that allowing suppliers to access local government data actually leaves local government at a disadvantage, replicating similar concerns in the public sector more widely. They mentioned that, at times, when local government has provided data for the design of a technology, including to train an AI model as part of the procurement process, they did not always know:

  • whether or not the data would create new value for the vendors, and therefore how to assess the value of a contract
  • that they may not get the data, or the additional newly generated data, back, and would therefore need prior provisions and clarity about data ownership and compliance with data protection law
  • that they might have to pay to access any new data insights in the future, creating an unforeseen cost to the procurement of AI (post-deployment).
The cost of an unstable infrastructure

Our evidence shows that when procurers do not tap into wider sectoral or domain-specific expertise before buying a new AI technology, new cost implications can emerge. For example, there might be insufficient support in place to train staff to use a new tool, or difficulties integrating a new AI system with other existing data and digital services and infrastructure. Ultimately, this could have a knock-on effect on the public purse, leading to further cuts to the services being provided to communities.

Costs also emerge in maintaining AI systems themselves, and the infrastructure they run on. Some workshop participants indicated the need for a holistic view, citing the need for increased scrutiny of data and AI together with the specific local government context in which it is being used and adopted.[24] The challenge facing local government – which stakeholders largely agreed on – is considering how legacy systems may impact what AI solutions can be procured, and what new costs would arise from needing to upgrade existing systems to house a new technology.

Without full economic and social costing, the real cost of procurement may not emerge until later. Ignoring the full context in which an AI system may be introduced only leads to increased financial burden on local government procurement, creating a dynamic where contracts are not tipped towards local government’s advantage.[25] This idea of unbalanced contracts was highlighted in the workshop multiple times as an issue that still needs resolution for local government procurers of AI technologies.

The cross-industry perspectives emerging from our interviews and workshop were unanimous that the challenges of data and infrastructure cannot be solved by local government procurement teams alone. These challenges call for experts in data, digital and AI to work collaboratively with procurers to identify governance, equality and regulatory concerns that may arise before a procurement contract is made.

Technological uncertainty

Despite the patchy but growing evidence of AI use across government departments,[26] there is uncertainty about the exact benefits it may deliver.[27] This concern was raised in our workshop, with debate over which services AI should or should not be deployed in. Overall, stakeholders across the board were cautious about claims that AI would transform public services. Those with expert AI knowledge highlighted that AI tools need to be engaged with more critically, and reflected that many local government procurers do not have the requisite skills to interpret the technical information that suppliers typically present in a tendering process.

Our evidence suggests that procurers are being faced with uncertainty around AI because it is not always clear how AI tools work. Stakeholders reported that it can be difficult to request that suppliers or vendors offer insight into the internal workings of their AI models, as it usually opens up concerns around intellectual property. This prevents procurers from being able to access the system design and training data to fully assess how an AI technology might work in the real world.

Subsequently, frontline staff using an AI technology may not be able to explain its outcomes to those they are serving, something we have seen in our past research.[28] Technological uncertainty can therefore also surface concerns about compliance with legal and data governance obligations.

In the workshop, stakeholders also suggested that the general uncertainty that exists around AI itself is because of the different maturity levels of different types of AI. From their experience, procurement teams were having to grapple with what makes something AI, and which service areas are best for the type of AI they want to procure. They noted that current guidance around buying or assessing AI does not reflect the knowledge and skills procurers need to operationalise that guidance.

Another issue raised by our stakeholders was a lack of visibility of how AI is currently being used across local government. Across the public sector and local government, there is insufficient cross-cutting information about who is using AI, for what purposes and to what effect.[29] [30] [31] Instead, this information remains siloed within specific organisations, which has led to a collective lack of insight into what is working well and what is not.

This lack of shared knowledge and information has a chilling effect on local government’s risk appetite around deploying AI technologies, as they fear innovation failure. According to our stakeholders, these silos and fears mean that local government procurers often have to rely mostly on the claims of AI suppliers rather than on their own critique, making it even more difficult to share information if those claims are not realised.

Overall, workshop participants wanted to see more centralised channels for recording the impacts of specific AI deployments by local government. This would ensure these impacts are both monitored and published as learning resources for other localities. It would also support the function of the Algorithmic Transparency Recording Standard (ATRS) and allow for wider scrutiny of and greater accountability for societal harms that arise from AI systems.

Imbalances in knowledge and expertise

Many of the issues raised in our workshop point to an imbalance in AI knowledge and expertise between local government and private sector suppliers. This reflects a shortage of data analysts in the public sector, which has left local government authorities facing high costs to provide data-related training and upskilling for their staff.[32]

According to the stakeholders we spoke to, a consequence of this knowledge imbalance is that local governments rely heavily on the expertise of AI suppliers. Our conversations with stakeholders suggest that a lack of skills and resources in local government has exacerbated the gap between procurers and suppliers, forcing local government to focus on tool and contract management rather than in-house development of technologies.

We also found that a siloed decision-making culture exists within local government. Some stakeholders we spoke to across data governance and digital roles expressed concern that discussions on what to procure did not always involve their expertise, or that they were brought into the conversation too late, when decisions had already been made.

There is also a dearth of knowledge in local government around monitoring and evaluation of AI technologies – important mechanisms for assessing social impacts. This is creating an ever-increasing reliance on vendors and suppliers of AI technologies to do these tasks, leaving them to ‘mark their own homework’. Local government is sometimes not equipped to assess the claims that are made by vendors, meaning they cannot prepare future mitigations that might be necessary for the tools they procure.

Some stakeholders even suggested that there are instances where a lack of knowledge or expertise stops local government procurement teams from future-proofing contracts, leaving them locked in a contract that might not be producing desired outcomes. This usually arises when procurement teams do not know enough to think through contingencies and put the right requests in their tenders or contracts – all of which leaves them economically disadvantaged.

Failures in the market of AI technologies

In our interviews and workshop, there was considerable discussion of the state of the AI supply chain. The on-the-ground experiences of our stakeholders clearly demonstrated that the current market for supplying AI in local government is tipped in favour of large suppliers. This creates an environment that makes it difficult for small- or medium-sized vendors of AI technologies to compete, paving the way for vendor lock-in, unfair contracting clauses and long contractual agreements that are difficult for local governments to get out of.

Local government procurers of AI are therefore finding themselves subject to market forces that benefit big tech over public sector development. Many of our workshop participants suggested that some of these imbalances and failures emanated from the marketplaces (frameworks) on offer to procurers. Central Government frameworks, where approved suppliers can list their products for sale, were especially reported to price out smaller vendors, as there is usually a relatively high minimum monetary value at which suppliers can bid.

In practice, this reduces the number of diverse suppliers available to local government procurers. Some procurers recalled tensions between the frameworks they would prefer to use, those being pushed by central Government and those their contract managers would prefer – leaving them uncertain about which to choose and how to actually determine best value in procurement. Suppliers and buyers alike reported that procurement frameworks in general offer no obvious assurances of whether suppliers or AI technologies are vetted, and in what ways. Transparency on the makeup of frameworks, and how they differ from each other, was suggested as a potential response to this aspect of market failure.

Those we spoke to from local government suggested that smaller, local suppliers may be better able to provide social value that fits the needs of their communities, but market dynamics mean that this choice is not always available to the procurer. Finding ways to diversify suppliers in the market might mean moving away from current ‘old hands’, but it is vital for supporting solutions that are more aligned to the needs of local communities. If left unchallenged, market failures will continue to price out SMEs that may be more likely to bring social value.

One key reflection from the participants was that vendors on frameworks are not always the suppliers or developers of AI technologies, which adds an additional layer of complexity to the AI supply chain. The more distance between developers and procurers, the more likely it is that information translates poorly in communication across the demand-supply divide. Many of our workshop participants cited challenges that stemmed from this dynamic – with some reporting that outsourcing of AI solutions occurs at different layers in the supply chain, obscuring the cumulative impact of the technology on people and society and making it difficult to trace accountability if something goes wrong.

The power held by a few big tech suppliers has left local government with less ability to direct when and how they may use AI. Our stakeholders mentioned instances where suppliers or vendors have ignored official routes to market and sold solutions directly to local government procurers. In these situations, there is even less scrutiny on these suppliers and their technologies, and little ability for procurers to do due diligence on the veracity of claims. We also heard that vendors sometimes use the lack of information-sharing across local government to their advantage – claiming their technologies are used more widely and with higher levels of success than is actually the case.

Achieving social value for communities is clearly important for local government, yet there is currently little exploration of the societal impacts of using certain technology companies to provide services. While stakeholders pointed to isolated benefits of procuring from certain suppliers – like job creation – we found that procurers in local government are not equipped to comprehensively assess the full social value of these partnerships. A lack of competition in the supply chain can also limit the social value that procurers can ask to see, as they have less choice and little leveraging power against big corporations.

Rethinking the procurement landscape

It is clear from the challenges listed above that for local government to confidently deploy safe, effective and trustworthy AI technologies, there must be significant changes in how these technologies are procured. In this section, we present some responses to these challenges, and call for a joint taskforce to address them.

Our evidence shows that the procurement landscape is fragmented, and a unified approach is needed to ensure that AI is being deployed effectively, safely and fairly by local government. These procurement processes also need far more clarity – from clarity on how terms like ‘AI’ or ‘social benefit’ are defined, to clarity on roles and responsibilities across industry, central Government and local government.

Suggestions from our interviews and workshop show wide-ranging opportunities for change across the process of procuring AI in local government. Stakeholder feedback highlighted several priority areas of focus: data governance, including decisions around user groups; systems and processes that can support or strengthen procurement roles and decision-making; and the need for due diligence checks to ensure that the use of an AI technology is not only legally compliant, but aligned with the public interest.

Improving data governance

Our evidence demonstrates that procurers and suppliers must have better capacity to map out the data sources that are used to build AI systems. Suppliers must also clarify what the onward uses of data might be, and the value they will generate or receive from the data. This transparency about data would help anticipate risks to the public, supporting procurers to put the right safeguards in place. It is also fundamental to documenting and showing due regard to GDPR, the DPA and the PSED. Within local government, procurement teams must ensure they are communicating with relevant data and compliance experts to embed this governance from the beginning.

Another aspect of good data governance is having adequate, centralised data on how people are being impacted by AI technologies, which can influence the current and future use of AI in local government. Suppliers and procurers need to engage with communities at different points throughout the design, procurement and deployment of AI technologies. As the effects of AI are not always immediately clear to those in local government, the role of public voice is key to understanding impacts.

By engaging with the public and assessing and collecting data on a technology’s post-deployment outputs, procurers can compare expectations to actual benefits and harms. Where direct engagement with the public is not possible, procurers and suppliers can work with voluntary, community and social enterprise organisations to collect the information needed to represent the voices of communities in decision-making.

A note on public engagement

Some procurers and academics we spoke to noted that the voices of communities being impacted by AI technologies are frequently excluded in procurement conversations. While public benefit is a key goal in procurement, the lived experience of people using these technologies is often left out, limiting the ability to fully assess the social impact of these tools.

Many stakeholders from our interviews and workshop agreed that public engagement is an area that needs further investment, development and support to be meaningful in the procurement process. While our stakeholder discussions centred on the possible benefits of a centralised feedback mechanism between communities and local government procurers – where impacts or outcomes of AI deployments are reported – there was concern that this might place further burden on already stretched resources.

From our own research at Ada,[33] we know the public are not a monolith, and have nuanced views of the use of AI in different contexts. For this reason, it is important that local governments seek the views of the public when procuring new AI technologies.

The realities of variously resourced local government bodies mean that there is no one way to conduct public engagement activities. However, there are certain parts of the procurement process where public engagement may be particularly useful.

An important starting point is for local government procurers to ask themselves what questions public engagement might help them answer, or what challenges it might help them address. These aims will influence at what points in the procurement process to engage the public, and whether it is best to inform, consult, collaborate or co-produce decisions on procurement of AI.[34]

One way to engage the public is to position them as subject specialists when considering where and how AI will be deployed. Their lived experience could help inform the design of a technology and the context of its use, ensuring it responds to their needs.

Frontline workers in local government can also offer valuable insight, as they will have first-hand experience using existing systems and may have ideas that would influence whether or not to procure technologies.[35]

Another example of public engagement in procurement is the concept of social witnesses.[36] Used in some Latin American countries, social witnesses are individuals or groups from a community who are involved in overseeing and monitoring the entire procurement process. This holds public officials to account for the decisions they make, including why particular bids were selected, and whether the procured services are delivering as promised.

Public voice can also be influential in data governance decisions (e.g. deciding what data is shared or used for building, deploying and assessing AI models). Understanding and responding to the community’s perspectives on their data can help ensure that local government procurers buy technologies that align with public expectations about how their data will be used.

Refining systems and processes

Addressing assessment inadequacies

Assessing the impact of AI is not only important for knowing what safeguards or resources to enhance, but also for complying with the law on data protection and equalities impacts. However, local government procurers currently lack the expertise, support and transparency needed to adequately assess AI technologies.

Our evidence shows that tensions arise when suppliers do not grant procurers access to the data needed for a comprehensive assessment, citing intellectual property concerns. Ada’s research on audit regimes shows that full access to data is vital to conduct a meaningful assessment.[37] Without this access and transparency, it is difficult for procurers or assessors to determine the accuracy of the information provided by the vendor. Often procurers do not have the requisite skills to know exactly what to ask for when it comes to requesting data access, leading to a blanket refusal from suppliers. One remedy for this is ensuring the right local government team with diverse domain expertise is involved in all aspects of the assessment process.

Ensuring assurance

There is an urgent need for development and implementation of enforceable assurance mechanisms for both AI technologies and procurement frameworks themselves, as part of wider governance responses to AI technologies.

Assurance of technologies[38] and their underlying software signals that certain steps have been taken to ensure the technology meets a particular standard. Such mechanisms would increase procurers’ confidence in the claims made by vendors about safety and efficacy. They might also reduce the time burden for local government to assess the background model behind an AI solution, allowing them more time to consider contextual and service-area factors.

Between the research and the writing of this report, the Government launched new guidance on AI assurance with its AI Management Essentials toolkit.[39] While this guidance remains voluntary for now, it offers important ways that those developing or supplying AI services can show a commitment to having and maintaining good AI governance. This could help procurers feel more confident about choosing a supplier that meets a minimum governance threshold. These and other emerging standards[40] should complement related guidance and regulation, with continual alignment to relevant legislation.

However, there remains no clarity on who is responsible for any fallout if the AI technologies procured by local government are not safe or effective when deployed. This lack of accountability prevents local government procurers from confidently and critically analysing claims from suppliers against their own local needs and assessments.

Assurance should therefore go beyond AI technologies, extending to procurement frameworks themselves. This would mean that all frameworks would have the same standards, the same rules and the same consequences if these rules are not followed.

To respond to emerging issues in AI, any assurance mechanism must be agile – able to be updated over time. However, this should be balanced with the potential burden constant iterations of standards and processes may put on procurers.

Establishing baselines of what ‘good’ looks like

All processes and systems within procurement of AI by local government should be subject to the same minimum standards of what ‘good’ looks like. This does not mean all procurement outcomes must be judged in exactly the same way, but that across different contexts and domains, there are unifying and commonly understood baselines that all procurers and suppliers are expected to achieve.

Baselines for the successful deployment of AI in a community would allow for consistency in objectives across local government – and would mean that success would be defined by meeting these objectives, not by local resourcing or capabilities. Setting these baselines would involve balancing the risks and potential benefits of these technologies for society. For example, surveillance technologies could reduce crime rates, but jeopardise privacy. And the digitisation of services could enable more efficiencies, but potentially leave some people behind. None of these trade-offs are new, and existing legislation like the Social Value Act 2012 or the socio-economic duty of the Equality Act 2010[41] could support the creation of these baselines.

Establishing a minimum baseline that suppliers must meet would allow SMEs and large providers to be subject to the same scrutiny, which would reduce market capture and monopolies.

Embedding due diligence

Our evidence suggests that procurers of AI in local government can embed due diligence into their processes through modelling, testing and contracts.

In the absence of a strong evidence base for using AI or knowledge of its impacts on communities, procurers in local government must try to predict potential impacts. Building capacity to model, pilot or test AI in safe and monitored ways will boost procurers’ confidence about how a technology is likely to perform, and their decisions around procuring it. Sharing the results of these activities – from different localities and contexts – across local government will allow procurement teams to better predict whether or not specific technologies will work for them.

Contracts also present an opportunity to codify what ‘good’ looks like in the procurement of AI by local government, as well as the expectations and obligations that suppliers and vendors must meet. This includes how evaluations will be designed and assessed, and the criteria the procured product will be measured against. However, stakeholders reported that the power imbalance between the public sector and industry reduces the potential impact, reach and strength of contracts. Stronger contract clauses – covering expected outcomes, roles and responsibilities, and accountability measures – would be a positive source of change.

Setting the scene for meaningful change

Expecting AI technologies to be transformative without considering the system in which they are acquired – and the context in which they are deployed – sets local government procurers up for failure. As noted in our interviews and workshop, procurement of AI can only be a vehicle for realising positive societal benefit and public legitimacy if we address its many challenges. This includes clarity around data inputs and outputs, rebalancing power between suppliers and local government procurers, and more practicable and joined-up regulatory and legislative support for decision-making.

For local government specifically, doing procurement well is time- and resource-intensive. Not all local governments or service areas can undertake broad change simultaneously. Through our research, we have determined that a redesign of the digital, data and technology procurement landscape in local government – with a focus on AI – will ease this burden and help central Government meet its objectives of upgrading public services to be more innovative, efficient and better able to respond to people’s needs.

This redesign should go beyond clarifying and fixing current guidance and extend to other improvements in the procurement process, from skills and training to public engagement and the AI supply-and-demand ecosystem as a whole. Getting this right will ensure that AI technologies are only procured when they are in the best interest of communities, and will build trust in local government. It will also require considerable collaboration, which we discuss next.

Our recommendation: a National Taskforce for Procurement of AI in Local Government

Our research has found that local government procurers are not adequately equipped and supported to ensure that the AI technologies they buy are safe, trustworthy and effective. Our stakeholders’ experiences and reflections highlight that there are multiple challenges within the procurement landscape which need to be addressed.

With a new Government in office, emerging clarity about the remit of DSIT and its ‘digital centre’ of government,[42] and the upcoming implementation of the Procurement Act 2023, the time is right to tackle these issues. Our initial recommendations for addressing these challenges were in the form of specific actions for specific government bodies. However, when refining our recommendations with stakeholders, it became clear that any effort to redesign the AI procurement landscape would need a collaborative approach – one that pushes against the siloed status quo in this space.

We know that within central Government and across local government, there are numerous networks, units and groups working to build knowledge in the public sector around the procurement and use of AI. However, we have found that while these groups are doing good work, they are under-resourced and struggle to share timely information. Additionally, feedback from local groups on how AI tools are impacting frontline workers, procurers and everyday people is not always reaching central bodies. This lack of communication creates a divide between procurement policies at a local and central level, and in some cases creates a duplication of efforts.

We recommend that the UK Government sets up a National Taskforce for Procurement of AI in Local Government, which would ensure that procurement of AI can be used as a lever for positive societal change. Its remit would be to support local government procurers to do their jobs well by:

  • Conducting targeted research to build evidence on what types of AI are being procured across local government, where they are being procured, and the full scope of challenges emerging in the procurement process. This would help define the scale of the problems that exist in the procurement landscape and the level of resource needed to solve them.
  • Producing evidence-based best-practice guidance and policies that support skill development and clarify key terminology. This would address the current gaps in legislation and guidance.

To be successful, the taskforce should reflect the wide range of expertise and skills needed to make procurement a lever for change – including expertise in policy and strategy, data governance, engineering, procuring, commissioning, IT and digital, contract management and public engagement. Its membership would also require the input of many different parts of the public sector, including central Government (e.g. Ministry of Housing, Communities and Local Government, Department for Science, Innovation and Technology, Crown Commercial Service, Government Commercial Function and Treasury), regulators (Information Commissioner’s Office and Equality and Human Rights Commission) and local government (Local Government Association and its networks).

Its ownership would be shared between the central UK Government and local government. It would need to be well-resourced, ideally for at least three years, with its specific AI focus potentially enhancing the successful implementation of the Procurement Act 2023.

As the UK Government looks to AI solutions to support struggling public services, the taskforce would be a vehicle for ensuring these solutions are effective from the start. While innovation and scale are important, poorly executed procurement of these technologies can erode public trust and cause serious harms – including withdrawal of public services, inaccurate insights, unfair data processing and other equality-reducing outcomes.

Transparency will be vital to the taskforce’s effectiveness. Given the reflections from our stakeholders on the current fragmented and siloed procurement landscape, the taskforce will need to work in the open and regularly share insights along its journey to reimagine the procurement of AI in local government.

The taskforce will also need to keep a finger on the pulse of AI developments. As current governance and regulation lag behind the pace of AI change, the taskforce could be an agile and reactive force, producing and iterating guidance and policies in response to emerging and ongoing concerns around AI.

Key to the taskforce’s success will be building on – rather than duplicating – the expertise and work of various hubs across local government. Rather than starting from scratch, the taskforce should bring together existing networks already working on improving discrete aspects of the procurement ecosystem. A secondment scheme for local government procurers of AI to contribute to the taskforce could also be valuable. Being collaborative in this way would allow the taskforce to be driven by on-the-ground expertise from local government.

The taskforce must also be in constant discussion with vendors and suppliers to make sure it is aware of any challenges or tensions that may limit the applicability of any further recommendations it makes.

Key actions for the taskforce

Based on the insights from our research, we offer four key actions the taskforce can take:

1. Ensure that regulatory and legislative documents are clear, consistent and practicable. The taskforce should be a catalyst for ensuring that public-sector and service-area-specific regulators and legislators make their guidance for local government procurers of AI consistent and practicable. It should also facilitate conversations between regulators, legislators and local government on the barriers to operationalising regulatory and legislative information.

In addition, the taskforce could influence the UK Government to support procurers by offering a path to redress when harms occur. The lack of practicability in current regulation, legislation and guidance has created a situation where suppliers hold no accountability if they do not meet their end of the contract. We heard from our stakeholders that local government procurers are not always sure how to handle scenarios where suppliers breach their contracts, to the detriment of the public purse and the ability of procurers to adequately undertake their own public duties. Having concrete and clearly defined responsibilities would help procurers enforce mechanisms for redress, knowing they would be backed by law or regulation.

2. Gather evidence on and set metrics of success for procuring and deploying AI in local government. A clear theme in our interviews and workshop was that when procuring AI systems, local government is not always clear on what is working and what is not. Sometimes this arises when AI technologies are deployed in situations where there is simultaneously high pressure to innovate and save money, and low resources. This can skew decisions on what to procure and limit discussion across local government.

One action for the taskforce would be to collate evidence of what AI is being used across local government, across which service areas, and with what impact – and then identify what safeguards and regulatory structures are needed. This should draw heavily on diverse local government knowledge of the procurement and deployment of AI technologies.

The taskforce could use existing structures in England, such as combined authorities, to outline ways to help procurement adapt to the changing AI landscape. Combined authorities have close working relationships with councils while also having the independence to set their budgetary focus, and therefore to determine which services they provide and want to see AI deployed in. As case studies, combined authorities would have a holistic view of procurement – from its impact on people to its impact on the public purse. They would also have the ability to gather evidence from, and disseminate knowledge across, their councils in a rapid and unified way.

Any evidence-gathering effort should also involve a feedback mechanism for members of the public to provide information on how they are experiencing AI technologies deployed in their communities.

The taskforce could use this evidence to manage central Government’s expectations around AI solutions in local government, identify common issues around risks and benefits of AI uses, and set policy that can better support procurers’ decisions in buying AI. The policy should concretely describe – across a wide range of services – what the successful deployment of AI in local government can look like, and the lines that should not be crossed. It should also provide clear guidance to support specific localities to determine their own readiness to acquire and deploy AI technologies.

3. Create robust governance structures, (contract) templates and assessment frameworks that strengthen local government bargaining positions and minimise their reliance on private suppliers. Current procurement regulation and governance is not adequate to support procurers of AI in local government. Our evidence shows that procurers need a way to reconcile public benefit expectations with enforceable structures that support their decision-making.

The UK Government could draw inspiration from the USA,[43] [44] [45] where public sector bodies in some jurisdictions have started creating mechanisms like bills and contracting templates that clarify roles and responsibilities around AI deployment and procurement, with explicit routes to pursue redress if needed. This could rebalance the uneven power dynamic between local government procurers of AI and suppliers.

The taskforce should create strong contracting clauses to support procurers seeking access to data held by suppliers, which is not always readily shared. There is also an opportunity for the taskforce to go beyond contracts by opening up and scrutinising how and which vendors end up on which data and digital technology frameworks. If the taskforce managed to influence and embed assurances within procurement frameworks, then local government procurers could prioritise assessing how AI technologies would perform in their local contexts.

4. Design and recommend a suite of specific skills and training for local government procuring bodies to be able to critically engage with AI technologies and the claims made by suppliers. The taskforce could help clarify the requisite skills and knowledge for local government procurers to do their jobs well. This goes beyond general skills and training, extending to the specificities of critically assessing the claims made about AI technologies by suppliers. Having the right skills to interact with and interrogate AI systems is fundamental for local government procurers to show due regard and diligence to the DPA and the PSED. Local procurers would also be better equipped to plan and predict some potential negative outcomes from certain uses of AI, stopping societal harms before they happen.

Building local skills takes time, and will go beyond the lifespan of the taskforce. However, the taskforce could create an independent panel of advisers who would provide targeted support to local government procurers struggling with limited knowledge of AI. This shift in the knowledge landscape would further rebalance power between procurers and technology suppliers – and support procurers in meeting their data and equality obligations.

Our research at Ada shows that a significant barrier to effective governance of AI is a lack of accountability and enforcement.[46] This can further entrench the power imbalances present in public services. For this reason, we recommend that the UK Government provides regulators with the powers and resources needed for them to do their job well. The taskforce can support this by working closely with regulators and legislators, and providing evidence from the ground of the breaches occurring, barriers to reporting these breaches, and the impact on people and society.

Overall, a taskforce would serve to join up a fragmented landscape, create accountability mechanisms that are currently absent, and empower procurers to better understand AI technologies and their impacts on communities. Its outcomes would support the UK Government in its mission to deliver public services that are innovative, personalised and attentive to the needs of society.

Conclusion

Local government procurers of AI in England need to make numerous decisions in the procurement pathway, balancing the expectations of central Government, society and industry. Any approach to tackling the challenges set out by our stakeholders needs to be balanced with the realities of procurement, including knowledge and skills gaps, and reduced human and financial resources.

To determine whether an AI solution is appropriate in a given context, procurers need a comprehensive understanding of both the societal challenge being addressed and the AI solution to be deployed. Across the entirety of our project, we have shown that current regulatory and legislative documents do not adequately support local government in making informed decisions around the procurement of AI. Our roadmap (in the section ‘More than a gateway’) highlights the multiple layers and processes that need to work in a coherent manner for procurement of AI in local government to be effective.

In our interviews and workshop, stakeholders further emphasised that the data and AI strategies of the UK Government are not aligned with the experiences of those working on the ground, and that the landscape is currently too fragmented to effectively address this. Procurers in England’s local government are therefore not well equipped to buy and deploy AI that has societal benefit, nor are they able to fully assess AI’s impact on communities.

Our recommendation to create the National Taskforce for Procurement of AI in Local Government comes from the barriers we identified in our research, and presents an opportunity to overcome these barriers in a joined-up way. We hope the taskforce can provide the evidence, tools and structures needed to change procurement practices and the wider ecosystem. Through its collaborative and multi-disciplinary approach, the taskforce can proactively research and recommend immediate and long-term changes that will ensure the procurement of AI in local government in England works for people and society.

Local government procurement has the potential to be a lever for realising positive social impact from AI. Fixing its landscape will require the creation of new infrastructure and the strengthening of existing processes. It will need to take into account the experiences of those on the ground: procurers, frontline staff who use AI technologies, and the public. Doing this will support a future where AI technologies contribute to thriving public services.

Acknowledgements

This paper was lead authored by Mavis Machirori with input from Anna Studman and Imogen Parker.

We are grateful to Victoria Blyth and Jerry Fishenden for their review of the work.

We thank all the participants in our interviews and workshops (all affiliations reflective of time of research):

  • Andy Snell, Barnsley Hospital NHS Foundation; Barnsley Metropolitan Borough Council
  • Angus Cleary, EHRC
  • Ann Borda, The Alan Turing Institute
  • Chiadi Lionel, Camden Council
  • Claire Lesko, EHRC
  • Cristina Muresan, IEEE
  • Emily Campbell Ratcliffe, Department for Science, Innovation and Technology
  • Ian Makgill, Spend Network
  • James Findlay, Stance Global
  • Jenny McEneaney, Local Government Association
  • Jerry Fishenden
  • Kira Allmann, Manchester City Council
  • Mark Sendak, Duke Institute for Health Innovation
  • Michael Katell, The Alan Turing Institute[47]
  • Miranda Sharp
  • Paul Maltby, Faculty
  • Peter Schofield, Manchester City Council
  • Sam Nutt, London Office of Technology and Innovation
  • Smera Jayadeva, The Alan Turing Institute
  • Steven Blantz, Camden Council
  • Tim Davies, Connected by Data
  • Tom Turngold, Spend Network

Footnotes

[1] ‘The use of digital technology to create systems capable of performing tasks commonly thought to require intelligence.’ See: Department for Science, Innovation & Technology, Office for Artificial Intelligence, and Centre for Data Ethics and Innovation, ‘A guide to using artificial intelligence in the public sector’ (GOV.UK) <https://www.gov.uk/government/publications/understanding-artificial-intelligence/a-guide-to-using-artificial-intelligence-in-the-public-sector#defining-artificial-intelligence> accessed 1 November 2024

[2] ‘What Is a Foundation Model?’ (Ada Lovelace Institute) <https://www.adalovelaceinstitute.org/resource/foundation-models-explainer/> accessed 1 November 2024

[3] ‘What Is Local Government?’ (Local Government Association) <https://www.local.gov.uk/about/what-local-government> accessed 1 November 2024

[4] Department for Science, Innovation and Technology, Department of Health and Social Care, Home Office and Peter Kyle MP, ‘New Data Laws Unveiled to Improve Public Services and Boost UK Economy by £10 Billion’ (GOV.UK, 24 October 2024) <https://www.gov.uk/government/news/new-data-laws-unveiled-to-improve-public-services-and-boost-uk-economy-by-10-billion> accessed 1 November 2024

[5] Ibid

[6] ‘Post Office Horizon Scandal Explained: Everything You Need to Know’ ComputerWeekly.com <https://www.computerweekly.com/feature/Post-Office-Horizon-scandal-explained-everything-you-need-to-know> accessed 1 November 2024

[7] Government Commercial Function, ‘Transforming Public Procurement’ (GOV.UK, 24 May 2024) <https://www.gov.uk/government/collections/transforming-public-procurement> accessed 1 November 2024

[8] Ibid

[9] Narayanan A and Kapoor S, AI Snake Oil: The Bait and Switch behind AI Risk Prediction Tools (Princeton University Press)

[10] Committee on Standards in Public Life, ‘Artificial Intelligence and Public Standards: Report’ (GOV.UK, 2020) <https://www.gov.uk/government/publications/artificial-intelligence-and-public-standards-report> accessed 1 November 2024

[11] Department for Science, Innovation and Technology and Feryal Clark MP, ‘Tech Experts to Shape Government Digital Vision to Drive Innovation and Boost Public Services’ (GOV.UK, 1 October 2024) <https://www.gov.uk/government/news/tech-experts-to-shape-government-digital-vision-to-drive-innovation-and-boost-public-services> accessed 1 November 2024

[12] Ada Lovelace Institute, Buying AI: Is the public sector equipped to procure technology in the public interest? (2024) <https://www.adalovelaceinstitute.org/report/buying-ai-procurement/> accessed 1 November 2024

[13] Redden J and others, ‘Automating Public Services: Learning from Cancelled Systems’ (Collective Wellbeing Carnegie UK 2022) <https://carnegieuktrust.org.uk/publications/automating-public-services-learning-from-cancelled-systems/> accessed 8 November 2024

[14] Nino Bucci, ‘Robodebt Royal Commission Final Report: What Did It Find and What Will Happen Next?’ The Guardian (7 July 2023) <https://www.theguardian.com/australia-news/2023/jul/07/robodebt-royal-commission-final-report-what-did-it-find-and-what-will-happen-next> accessed 5 November 2024

[15] Ada Lovelace Institute, Buying AI: Is the public sector equipped to procure technology in the public interest? <https://www.adalovelaceinstitute.org/report/buying-ai-procurement/> accessed 1 November 2024

[16] Ibid

[17] Ada Lovelace Institute, Buying AI: Is the public sector equipped to procure technology in the public interest? (2024) <https://www.adalovelaceinstitute.org/report/buying-ai-procurement/> accessed 1 November 2024

[18] Stephen Almond, ‘Local Authorities and the AI Revolution’ (Local Gov, 24 August 2023) <https://www.localgov.co.uk/Local-authorities-and-the-AI-revolution/57806> accessed 5 November 2024.

[19] Sylvie Hobden and Dea Begaj, ‘The Tide Is Changing: Monitoring Public Attitudes towards Data and AI’ (Responsible Technology Adoption Unit Blog, GOV.UK, 6 December 2023) <https://rtau.blog.gov.uk/2023/12/06/the-tide-is-changing-monitoring-public-attitudes-towards-data-and-ai/> accessed 5 November 2024

[20] Centre for Data Ethics and Innovation, ‘Public Attitudes to Data and AI: Tracker Survey (Wave 3)’ (GOV.UK) <https://www.gov.uk/government/publications/public-attitudes-to-data-and-ai-tracker-survey-wave-3/public-attitudes-to-data-and-ai-tracker-survey-wave-3> accessed 5 November 2024.

[21] Birchall S, ‘A Digital Divide: Study Exposes “rift” between Local Councils and Residents’ [2023] Government Transformation Magazine <https://www.government-transformation.com/en/citizen-experience/study-exposes-rift-between-local-councils-and-their-residents-over-digital-transformation> accessed 5 November 2024

[22] Stephen Bonner, ‘The Use of AI by Local Authorities’ (Information Commissioner’s Office, 19 January 2023) <https://ico.org.uk/about-the-ico/media-centre/blog-addressing-concerns-on-the-use-of-ai-by-local-authorities/> accessed 5 November 2024

[23] Equality and Human Rights Commission, ‘Artificial Intelligence: Meeting the Public Sector Equality Duty (PSED)’ (1 September 2022) <https://www.equalityhumanrights.com/guidance/artificial-intelligence-meeting-public-sector-equality-duty-psed> accessed 5 November 2024

[24] Shah H, ‘Tony Blair Is Wrong – AI Will Not Magically Solve Our Public Services’ (New Statesman, 9 October 2024) <https://www.newstatesman.com/comment/2024/10/tony-blair-is-wrong-artificial-intelligence-ai-publ> accessed 5 November 2024

[25] ‘Liverpool City Council: Report Finds More Contract Failings’ BBC News (16 June 2022) <https://www.bbc.com/news/uk-england-merseyside-61830195> accessed 12 November 2024

[26] Gareth Davies, ‘Use of Artificial Intelligence in Government’ (National Audit Office 2024) <https://www.nao.org.uk/reports/use-of-artificial-intelligence-in-government/> accessed 5 November 2024; ‘What Is a Foundation Model?’ (Ada Lovelace Institute 2023) <https://www.adalovelaceinstitute.org/resource/foundation-models-explainer/> accessed 5 November 2024

[27] Dan Bateyko, ‘Let LLMs Do the Talking? Generative AI Issues in Government Chatbots’ (Center for Democracy and Technology, 13 December 2023) <https://cdt.org/insights/let-llms-do-the-talking-generative-ai-issues-in-government-chatbots/> accessed 5 November 2024

[28] Ada Lovelace Institute, Critical analytics? Learning from the early adoption of data analytics for local authority service delivery (2024) <https://www.adalovelaceinstitute.org/report/local-authority-data-analytics/> accessed 5 November 2024.

[29] Department for Science, Innovation and Technology, Cabinet Office and Central Digital and Data Office, ‘Find out How Algorithmic Tools Are Used in Public Organisations’ (GOV.UK) <https://www.gov.uk/algorithmic-transparency-records> accessed 5 November 2024

[30] Gareth Davies, ‘Use of Artificial Intelligence in Government’ (National Audit Office 2024) <https://www.nao.org.uk/reports/use-of-artificial-intelligence-in-government/> accessed 5 November 2024

[31] ‘Local Government: State of the Sector: AI’ (Local Government Association 2024) <https://www.local.gov.uk/sites/default/files/documents/Local%20Government%20State%20of%20the%20Sector%20AI%20Research%20Report%202024%20-%20UPDATED_3.pdf>.

[32] Department for Science, Innovation & Technology, Department for Digital, Culture, Media & Sport ‘Quantifying the UK Data Skills Gap – Full Report’ (GOV.UK, 18 May 2021) <https://www.gov.uk/government/publications/quantifying-the-uk-data-skills-gap/quantifying-the-uk-data-skills-gap-full-report> accessed 5 November 2024

[33] Ada Lovelace Institute, ‘Majority of British Public Support “Laws and Regulations” to Guide the Use of AI, According to a New Nationwide Survey’ (6 June 2023) <https://www.adalovelaceinstitute.org/press-release/new-nationwide-ai-survey/> accessed 6 November 2024

[34] ‘Spectrum Evolution – International Association for Public Participation’ <https://www.iap2.org/page/SpectrumEvolution> accessed 6 November 2024; Ada Lovelace Institute, Participatory data stewardship: A framework for involving people in the use of data (2021) <https://www.adalovelaceinstitute.org/report/participatory-data-stewardship/> accessed 6 November 2024

[35] Ada Lovelace Institute, Critical analytics? Learning from the early adoption of data analytics for local authority service delivery (2024) <https://www.adalovelaceinstitute.org/report/local-authority-data-analytics/> accessed 5 November 2024; Ada Lovelace Institute, Participatory data stewardship: A framework for involving people in the use of data (2021) <https://www.adalovelaceinstitute.org/report/participatory-data-stewardship/> accessed 6 November 2024

[36] Cesar Nicandro Cruz-Rubi, ‘Citizen Participation and Public Procurement in Latin-America: Case Studies’ (Hivos 2020) <https://hivos.org/resource/citizen-participation-and-public-procurement-in-latin-america-case-studies/> accessed 6 November 2024

[37] Ada Lovelace Institute, Code & conduct: How to create third-party auditing regimes for AI systems (2024) <https://www.adalovelaceinstitute.org/report/code-conduct-ai/> accessed 6 November 2024

[38] Department for Science, Innovation & Technology, ‘Introduction to AI Assurance’ (GOV.UK, 12 February 2024) <https://www.gov.uk/government/publications/introduction-to-ai-assurance/introduction-to-ai-assurance> accessed 6 November 2024

[39] Department for Science, Innovation & Technology, ‘Guidance for Using the AI Management Essentials Tool’ (GOV.UK, 6 November 2024) <https://www.gov.uk/government/consultations/ai-management-essentials-tool/guidance-for-using-the-ai-management-essentials-tool> accessed 6 November 2024.

[40] ‘IEEE Standards Association’ (IEEE Standards Association) <https://standards.ieee.org/ieee/3119/10729/> accessed 6 November 2024

[41] Government Equalities Office and others, ‘Explanatory Notes to Equality Act 2010’ <https://www.legislation.gov.uk/ukpga/2010/15/notes/division/3/1> accessed 11 November 2024

[42] Department for Science, Innovation and Technology and Feryal Clark MP, ‘Tech Experts to Shape Government Digital Vision to Drive Innovation and Boost Public Services’ (GOV.UK) <https://www.gov.uk/government/news/tech-experts-to-shape-government-digital-vision-to-drive-innovation-and-boost-public-services> accessed 5 November 2024

[43] Automated decision tools 2023 [AB 331: Amended] (California Legislature— 2023–2024 Regular Session) <https://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=202320240AB331> accessed 7 November 2024

[44] City of San José, ‘Government AI Coalition | City of San José’ (www.sanjoseca.gov) <https://www.sanjoseca.gov/your-government/departments-offices/information-technology/ai-reviews-algorithm-register/govai-coalition#overview> accessed 7 November 2024

[45] Rutinel, M, Titone, B and Rodriguez, R, Consumer Protections for Artificial Intelligence [SB24-205] (Colorado General Assembly, 2024 Regular Session) <https://leg.colorado.gov/bills/sb24-205> accessed 7 November 2024

[46] Ada Lovelace Institute, Code & conduct: How to create third-party auditing regimes for AI systems (2024) <https://www.adalovelaceinstitute.org/report/code-conduct-ai/> accessed 6 November 2024

[47] Before the publication of this report, Michael Katell sadly passed away. We thank him for his contribution and extend our sympathies to his loved ones.


Image credit: Liz Leyden
