Responsible innovation: what does it mean and how can we make it happen?
Exploring the foundational premises for delivering ‘world-leading data protection standards’ that benefit people and achieve societal goals
On 10 September, the UK Government published its proposal for amending the current data protection regime (the UK GDPR). The aim is to create ‘a pro-growth and pro-innovation data regime whilst maintaining the UK’s world-leading data protection standards’.
At the Ada Lovelace Institute, our mission is to ensure that data and AI work for people and society. To explore whether the Government’s plans will support these aims, we are organising a series of five events, each looking at different sections, questions, statements and framings in the Government’s consultation and asking what benefits and challenges the proposals bring.
Session 1: Responsible innovation: what does it mean and how can we make it happen?
The first event in our series focuses on responsible innovation. This is the focus of chapter 1 of the consultation, where the Government suggests that parts of the law present barriers to responsible innovation (scroll down for a summary of the relevant consultation proposals).
This event interrogates the pro-growth, pro-innovation framing of the proposed data reform and explores questions such as:
- What is the impact of innovation on society? Is innovation always good? What does responsible innovation mean?
- What do we know about the impact of regulation on innovation (in the data/tech sector or in other domains)? What is the historical relationship between regulation and innovation?
- How would innovation be different if these proposals were in place? What are the benefits, and what are the potential challenges?
Scroll down for a summary of the key points discussed, and/or watch back the event here:
Chair
Panellists
- Tommaso Valletti, Professor of Economics, Imperial College
- Katie Lips, Head of Digital & Scams Policy, Which?
- Ravi Naik, Legal Director, AWO
What does the consultation say about responsible innovation?
The consultation takes the approach that innovation is a central driver of economic growth and that data-driven innovation has the potential to deliver economic and societal benefits. One of the overarching principles guiding the Government’s consultation is that ‘the UK’s data protection regime should be future-proofed with a responsive framework that enables responsible innovation and a focus on privacy outcomes that avoids imposing any rules today that become obsolete as the technological landscape evolves’. The first of the consultation’s five chapters focuses on ‘Reducing barriers to responsible innovation’.
While the consultation document recognises the importance of regulatory certainty and high data protection standards, the proposed changes imply a desire to liberate data from strong protections and loosen regulatory standards in order to promote data-driven innovation. For example, the provisions in Chapter 1 are about facilitating increased data collection and data sharing in the public interest, while the provisions in Chapter 2 on ‘delivering better outcomes for people’ appear to focus on relaxing accountability measures. See the more specific proposals below.
The consultation says some stakeholders have told the Government that elements of the law are creating barriers to responsible innovation. These include definitions that are unclear or lack case law or guidance, concerns about lawfulness where rules are difficult to navigate (such as those on the use and re-use of personal information in research), and uncertainty about when personal data can be processed on legal grounds other than consent.
Summary of the relevant consultation proposals:
The consultation proposes a number of changes, including:
- Bringing together in one place various provisions across the Data Protection Act and General Data Protection Regulation (GDPR) relating to the use of data in research and defining ‘scientific research’ in legislation (section 1.2).
- Either clarifying the existing reasons allowing data to be used in research, or introducing a new legal ground with appropriate safeguards (1.2).
- Allowing people to give permission for their data to be used more widely in scientific research, even where it is not clear at the point of collection what all those uses will be (1.2).
- Removing the requirement for organisations using data for research purposes to give further information to those they’ve directly collected data from about any ‘further processing’ of their data (1.2).
- Changing the rules around ‘further processing’ of data – where people’s data can be used for purposes different to those for which it was collected, used by others not involved in the original collection, or used for reasons similar, but not identical, to those for which it was originally collected (1.3).
- Creating a list of reasons (legitimate interests) for processing someone’s personal data where an organisation does not have to apply a balancing test – this is where the organisation has to show that its interests are not outweighed by those of the people from whom the data was collected (1.4).
- Defining ‘fairness’ in the use of data and artificial intelligence systems (1.5, and see our fourth event in the series).
- Seeking views on whether personal data should be used ‘more freely, subject to appropriate safeguards’ for the purposes of training AI systems, including clarifying the law around using sensitive personal data to monitor bias in AI systems (1.5).
- Reforming or abolishing Article 22 of the GDPR, which is designed to protect individuals where decisions about them are made purely by automated systems (1.5).
- Seeking views on the use of data to profile people and groups, including whether existing safeguards are effective and proportionate (1.5).
- Clarifying the way ‘anonymous data’ about people is defined (1.6).
- Seeking views on what role government should play in supporting data intermediaries – institutions involved in sharing data between different parties (1.7).
Chapter 2 is also relevant, as it discusses ‘reducing burdens on businesses’ through changing how organisations should comply with data legislation.
Key points discussed during the event:
- Innovation may create value, but it may also extract or destroy it. Regulation helps limit the latter two kinds of innovation, while well-designed regulation may help the first kind to survive and flourish.
- The proposed reforms are written to entrench a specific ad-based business model used by big tech companies. This may have been innovative some years ago, but it is now business-as-usual.
- Consumers have an increasing understanding of the ways that their data are used, and they do not want to give up the controls that they already have.
- The GDPR already provides a regime for research – providing more guidance on the current system is preferable to eroding existing protections.
- Cookie ‘pop-ups’ provide a veneer of legality to tracking mechanisms that may actually be unlawful. Enforcement of the law as it stands would remove the mechanisms (and thus the pop-ups).
- A new data protection regime would impose compliance costs on business, and potentially jeopardise adequacy with the EU, stifling innovation.
Speaker interventions:
Tommaso Valletti, Professor of Economics at Imperial College, told the audience that one of the prevailing narratives – that regulation kills innovation – is too simplistic. Instead, he told participants, we should think about three different kinds of innovation: innovation can create, extract, or destroy value.
Innovation that creates value is what we should be aiming for as a society: it creates opportunities, reduces costs, improves efficiency and effectiveness. There is a chance that regulation could stifle this.
On the other hand, regulation can work wonders for restricting the kinds of innovation that we don’t want. Innovation that extracts value increases the profits made by companies that already have a large market share. Tommaso pointed to the acquisition of Fitbit by Google as an example of innovation that increases profits, with no expectation that consumers – let alone citizens – would share in the benefits.
Regulation can also work to limit innovation that destroys value: also known as toxic innovation. He gave the example of the core business model of firms like Google and Facebook, which aim to maximise engagement in order to sell advertising.
In response to a question about whether regulation can only minimise harms, Tommaso pointed out that good regulation should aim to minimise harm but also incentivise positive externalities. Harm-minimising regulation might be more obvious: for example, mandating seatbelt use, or labelling foods. But well-designed regulation might create flourishing ecosystems. In the UK tech sector at present, Tommaso pointed out, companies may aim to innovate in order to be acquired by one of the existing big tech firms: regulation might enable more small innovators to survive, thrive and change the trajectory of technology. Governments may not be good at anticipating technological changes, but regulation at least gives those changes a chance.
Tommaso – who had earlier told the audience that until 2019 he was the European Commission’s chief economist for competition – noted approvingly that discussions about data protection and about competition are no longer happening in separate silos: for tech companies whose business model relies entirely on data, assessing the data is crucial to assessing competition. Unfortunately, he said, tech companies have not given independent researchers access to that data, which diverts academic attention to other industries. He also noted the privacy impacts that come from sharing data, and emphasised that people who are ‘statistically similar’ may have very different privacy perspectives. The precedents in the UK, he said, are not good. Citing the example of patient records from the Royal Free Hospital being shared with the tech company DeepMind (now owned by Google), he pointed out that this data would not even have been available to academic medical researchers. ‘We have privatised something that is a common good and that’s the worst situation that can actually happen.’
Katie Lips, Head of Digital & Scams Policy at the consumer protection body Which?, told the audience that it’s important to consider whether we want ‘disruptive innovation’ or ‘responsible innovation’, and whether these are different from ‘consumer-strategic innovation’. Eroding privacy, she noted, carries a real risk of alienating consumers, who need to be at the heart of innovation processes. Innovation is about challenging and questioning, Katie pointed out: it’s uncomfortable and risky, in comparison to business-as-usual, where the operating model doesn’t change much, even if it is improved a little over time. Incumbents don’t tend to innovate, because doing so requires questioning the fundamentals of their own business, and they rarely see innovators coming.
Big tech companies, she said, were once the innovators, and this brought lots of consumer benefits. But what is being proposed by the Government just seems to accelerate the status quo by allowing ad-based businesses to grow. ‘Innovation today’, she said, ‘should be to change that and to do better than that.’
Consumer relationships with data are starting to change. People are more wary about what they share with private companies, and they want more control. Two reports, ‘Are You Following Me?’ and ‘Are You Still Following Me?’, found that consumers are increasingly data-savvy, and that when they learn how much they are being tracked online, they are often shocked.
Consumer-centric innovation, to Katie, looks like greater control over data, more choice, more rights and more transparency. Changes to the current data protection regime that undermine these rights and hand more power to big tech companies aren’t innovation at all, she noted; they block new innovators from challenging business models that are now business-as-usual. ‘As a consumer,’ Katie said, ‘I want more choice. I want it to be possible for new players to enter the market and not be blocked out. I doubt that true innovation will come from the incumbents.’
Skewing too far towards business needs, she said, alienates consumers – and without them, there’s really no innovation at all. Yet the current consultation doesn’t give enough focus to consumers (or patients, citizens or other groups of people). Katie noted that the proposed reforms are being spun as ‘getting rid of annoying cookie barriers’, but when you explain what people will actually give up, they want to challenge the changes. ‘For us, talking to consumers is always the most valuable thing,’ she said, and Which? is finding that people want to keep the controls that they have.
Ravi Naik, Legal Director at the digital rights agency AWO, started by talking about the GDPR as it currently stands. He described it as a ‘charter of rights to balance those two competing interests: to give individuals control over how their information is used, and regulations over those who control our information’, and pointed out that it is a diligent and nuanced document. A lot of the Government’s proposal focuses on research exemptions, but as Ravi noted, research is already a core element of the GDPR, with a whole well-thought-out framework carved out for it. In fact, Ravi said, he is already working with the European Digital Media Observatory on a code of conduct for both platforms and researchers to facilitate active research and data gathering from big platforms – and this is based on the GDPR as it currently is.
Consolidating the law and providing guidance, especially about responsibility and liability, could solve a lot of the issues raised in the consultation without minimising rights safeguards. Answers to questions about research shouldn’t be buried in a schedule that only Ravi and other lawyers have read, he argued; they should be clear to researchers and to platforms.
Putting data protection at the heart of innovation embeds trust, and enables people to really understand what rights they have and the impact that a technology might have. Ravi also spoke about the proposed extension of the definition of ‘legitimate interests’, and particularly the assumptions that underlie this proposal. The Government states that personalised advertising requires knowledge about an individual, and that cookies are low-risk. Referring back to Katie’s intervention, Ravi noted that both of these claims are open to question – as, in fact, is the legality of some of the tracking mechanisms used. He pointed out that consent gathered through ‘cookie pop-ups’ provides ‘more of a veneer of legality’, and that if regulators enforced the current laws, those pop-ups might disappear, because people would only be tracked in ways that are lawful. The current situation has real-world consequences, he noted, citing the example of vulnerable gamblers being profiled and nudged towards addictive behaviours.
Ravi also noted two possible adverse consequences of the proposals. Firstly, new regulations would mean a new regime of compliance, which poses additional costs to business and might stifle innovation as a result. Secondly, there is the issue of adequacy with the EU. Jeopardising the UK’s adequacy for data protection could risk the future of data flows between Europe and the UK.
Q&A
In the Q&A session at the end of the event, Ravi responded to a question about Data Protection Impact Assessments (DPIAs) by noting that in two cases (Bridges, on facial recognition, and test-and-trace), the failure to carry out a DPIA properly – or at all – has been found to be unlawful. If DPIAs had been carried out, problems and vulnerabilities in those systems would have been identified earlier. He pointed out that not all processing is high-risk, but that it is appropriate to measure the impact of high-risk processing before you do it.
Tommaso pointed out that in Facebook’s early days, it differentiated itself from MySpace by offering more privacy, but once it became a monopoly, that was dropped. So when there was some competition, privacy was one of the differentiators; now, he pointed out, people are stuck with monopolies that are not interconnected. Lawyers like Ravi, he said, talk about privacy as a fundamental human right, but economists think in terms of trade-offs. Tommaso pointed to a recent paper by Daron Acemoglu on the harms of AI, which presents theories that are currently untested because of a lack of data. Without data, he said, we are in the hands of whoever presents the best narratives. We need a diversity of views to think about our digital future: what do we want? Only then, he said, will we be able to think about good solutions to the problems we currently have.
Katie spoke about the ways that people who want to innovate actually work in practice. The text of the GDPR, she said, is not necessarily on the radar of people who work in product management – they want to deliver an experience for consumers.
In response to a question about regulation, Katie said that she would like to see proactive, rather than reactive, regulation. She pointed out that many of the Government’s proposals seem to be reactions to changes we know about today, and don’t go far enough in providing frameworks for what might come five or ten years down the line. Ravi said that as the GDPR is still quite young – it updates some previous provisions, but the powers given to the ICO are still new – he would like to see the ICO more empowered as a regulator. Ravi also praised the opening up of algorithmic audits, including by the Ada Lovelace Institute, as a future-proof development, and noted that he would also like to see more meaningful transparency: ‘meaningful transparency makes a lot of meaningful difference.’