The ethics of recommendation systems in public service media
Exploring the ethical implications of public service media's use of recommendation systems.
The Ada Lovelace Institute worked with the British Broadcasting Corporation (BBC) to explore the development and use of recommendation systems (also referred to as ‘recommendation engines’) in public service media. In particular, we looked at how public service values are operationalised, what optimisation means in this context, what specific ethical considerations arise, and how organisations seeking to serve the public can minimise risks and maximise social value. The project was supported by the Arts and Humanities Research Council (AHRC).
Project background
Recommendation systems are special-purpose software designed to suggest products, services and content to a user of an online service. They are usually a feature of a larger program or platform and are widely used in a variety of online commercial services, including shopping, television, news and music. The recommendations are often based on the stated or inferred preferences of users and people with similar profiles or preferences, or on relationships between users’ previous behaviour and the suggested content.
Well-known examples of recommendation systems include Netflix’s recommendations for films and TV shows, TikTok’s ‘For You’ page, Google Search’s autocomplete function and Amazon’s recommendations for related products.
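To make the mechanics concrete, the sketch below shows a minimal user-based collaborative filtering approach of the kind described above: users are compared by the similarity of their past ratings, and items liked by similar users are suggested. The toy data, function names and cosine-similarity scoring are illustrative assumptions only, not a description of the BBC’s or any other organisation’s actual systems.

```python
# Minimal user-based collaborative filtering sketch (illustrative only).
# Users are compared by cosine similarity over their ratings, and items the
# target user has not seen are ranked by similarity-weighted ratings.

from math import sqrt

# Toy ratings: user -> {item: rating}. Purely hypothetical data.
ratings = {
    "alice": {"drama_a": 5, "news_b": 3, "comedy_c": 1},
    "bob":   {"drama_a": 4, "news_b": 4, "doc_d": 5},
    "carol": {"comedy_c": 5, "doc_d": 2, "news_b": 1},
}

def cosine_similarity(u, v):
    """Cosine similarity computed over the items two users have both rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = sqrt(sum(r * r for r in u.values()))
    norm_v = sqrt(sum(r * r for r in v.values()))
    return dot / (norm_u * norm_v)

def recommend(target, ratings, top_n=3):
    """Rank items the target user has not seen by similarity-weighted ratings."""
    target_ratings = ratings[target]
    scores = {}
    for other, other_ratings in ratings.items():
        if other == target:
            continue
        sim = cosine_similarity(target_ratings, other_ratings)
        for item, rating in other_ratings.items():
            if item not in target_ratings:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

print(recommend("alice", ratings))  # e.g. [('doc_d', ...)]
```

Production systems combine many more signals (viewing history, inferred demographics, editorial rules) and operate at far greater scale, which is where the ethical questions below arise.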
However, the prevalence of recommendation systems has raised a number of questions about their ethical impact:
- Personal autonomy: There are concerns about recommendation systems nudging users of services towards a particular outcome, which may be commercially beneficial to the service but potentially harmful to the welfare of the user.
- User profiling: Predictions for suggested content may stem from assumptions about the user’s demographic information, which may in turn entrench societal biases and inequalities.
- Privacy: Systems often require the collection, storage and use of large amounts of personal data.
- Transparency and explainability: Recommendation systems, particularly those based on modern machine learning techniques, often produce decisions that are difficult to explain to lay audiences.
- Polarisation: There are concerns about the polarising effects within societies of recommendation systems that deliver a personalised perspective of news, media and current events.
The use of recommendation systems by public service media organisations like the BBC adds another layer of ethical complexity.
Unlike private-sector firms, whose business models are driven by shareholder interests and profit, public service organisations are beholden to the public interest. The BBC and other public service media organisations have explored using recommendation systems both as a way to compete with commercial rivals and as a way to meet their public service objectives more effectively.
However, there remains a lack of research into the ethics of recommendation systems used in these contexts, and therefore a lack of clear guidance for how public-sector organisations should consider designing and implementing such systems.
Project overview
Through this project, the Ada Lovelace Institute investigated how public service values are operationalised and optimised for in the development of recommendation systems, what kinds of ethical risks may arise in using these technologies to serve public service ends, and how organisations seeking to serve the public can minimise those risks while maximising social value.
By undertaking a literature review and qualitative interviews with the BBC and other national public broadcasters, academics and developers, the research answers the following questions:
- What are the values that public service media organisations adhere to? How do these differ from the goals that private-sector organisations are incentivised to pursue?
- In what contexts do public service media use recommendation systems?
- What value can recommendation systems add for public service media and how do they square with public service values?
- What are the ethical risks that recommendation systems might raise in those contexts? And what challenges should teams consider?
- What are the mitigations that public service media can implement in the design, development and implementation of these systems?
Read our findings and recommendations in the full report: Inform, educate, entertain… and recommend?
We are grateful to the Arts and Humanities Research Council (AHRC), which supported this work with a £100k grant. Ada’s relationship with the BBC is governed by a memorandum of understanding (MOU), which is available here.
Project publications
Inform, educate, entertain… and recommend?
Exploring the use and ethics of recommendation systems in public service media
Related content
Examining the Black Box
Identifying common language for algorithm audits and impact assessments
Algorithmic accountability for the public sector
Learning from the first wave of policy implementation
Transparency mechanisms for UK public-sector algorithmic decision-making systems
Existing UK mechanisms for transparency and their relation to the implementation of algorithmic decision-making systems