Review of Online Safety Bill
OPEN ACCESS
Perspective
A critical review of the Online Safety Bill
Markus Trengove,1,2,* Emre Kazim,2,3 Denise Almeida,4 Airlie Hilliard,2,5 Sara Zannone,2 and Elizabeth Lomas4
1School of Public Policy, University College London, 29 Tavistock Square, London WC1H 9QU, UK
2Holistic AI, 18 Soho Square, London W1D 3QL, UK
3Department of Computer Science, University College London, Gower Street, London WC1E 6EA, UK
4Department of Information Studies, University College London, Gower Street, London WC1E 6BT, UK
5Institute of Management Studies, Goldsmiths, University of London, New Cross, London SE14 6NW, UK
*Correspondence: [email protected]
https://doi.org/10.1016/j.patter.2022.100544
THE BIGGER PICTURE This critical perspective makes a timely contribution to the tech policy debate concerning the monitoring and moderation of online content.
SUMMARY
The UK Parliament has tabled the Online Safety Bill to make the internet safer for users by requiring providers to regulate legal but harmful content on their platforms. This paper critically assesses the draft legislation, surveying its rationale; its scope in terms of the lawful and unlawful harms it intends to regulate; and the mechanisms through which it will be enforced. We argue that it requires further refinement if it is to protect free speech and innovation in the digital sphere. We propose four conclusions: further evidence is required to substantiate the necessity and proportionality of the Bill's interventions; the Bill risks a democratic deficit by limiting the opportunity for parliamentary scrutiny; the duties of the Bill may be too wide (in terms of burdening providers); and enforcement of a Code of Practice will likely be insufficient.
ecosystem. Instead, we suggest that an ethical design approach would be better suited to resolving the problems that the Online Safety Bill is meant to address, and provides developers with the flexibility to adopt a plurality of preventative measures.

Our intended readership are those with an interest in the regulation of digital services, both in industry and in policymaking. This is not limited to policymakers: we argue that choices at a technical level (i.e., what tools to use to make the platforms safer) have important moral and political implications, and so it is important that those in data science and machine learning understand the consequences of these tools. We intend to contribute to an increasing debate with the hope of improving the UK's digital regulation. The arguments we present here reflect our critical opinion, rather than an exposition of scientific fact, and so we invite critical reply to our claims here. We begin with an overview of the legislation. Those interested in our commentary should move directly to the section "Measuring the Bill through proportionality and necessity."

AN OVERVIEW OF THE ONLINE SAFETY BILL

In 2020, the UK government issued the Online Harms White Paper with the purpose of creating a safer internet ecosystem and repairing public trust in digital platforms and search services. The White Paper has subsequently led to the Online Safety Bill, a draft piece of legislation giving more regulatory substance to the ambitions set out in the original White Paper. In this section, we offer a selected overview of the proposed legislation, where our selection of emphasis is presented with a view to offering commentary on those specific elements.

The Bill's approach is to place a duty of care on internet service providers of both user-to-user services, in which users interact with each other online (as they typically do on platforms like Facebook and Twitter), and search services, which index information and allow users to navigate the internet, such as Google and Bing. The duty of care is framed in broad terms in the Bill, but it is composed of three distinct duties:7

1. To protect users from illegal content (section 9). Although the production and dissemination of CSAM and terrorist propaganda are already illegal, the purpose of the Bill is to place a duty of care upon service providers to control the digital space with a view to limiting the potential spread of illegal content. It is unclear what the Bill is able to accomplish in this space that is not already possible under the existing legal landscape.

2. To take additional protective measures to make their site safe for children, if their service is likely to be used by children (section 10). Children on the internet are vulnerable to grooming, cyberbullying, encouragement to self-harm, and harm to their mental health. The purpose of the Bill is to place a duty of care on service providers to ensure the safety of children on their platforms. The litmus test for "a service likely to be used by children" is not well defined.

3. To take additional measures to protect all users from content that is harmful without being illegal, if the service is of a sufficient reach and magnitude (section 11). Some online interactions, while legal, can nevertheless be harmful. This includes online campaigns of harassment that are not covered by existing criminal law, and the proliferation of disinformation (fake news). The purpose of the Bill is to place a duty of care upon service providers to ensure that their users are protected not only from unlawful harm, but also from lawful and harmful content.

In the sections below, we highlight key features of the legislation and the duty it places on service providers, with a focus on those features that we argue are problematic. The duty is justified with reference to the role of user-to-user and search services in proliferating harm, and so it is ultimately a duty owed to the users of internet services (albeit arbitrated by the government and regulator). The duty requires services to comply with Codes of Practice, and to report their compliance to the regulator to evidence their execution of their duty. Concomitantly, the regulator, Ofcom, is empowered to enact and enforce the Codes of Practice, although the Minister has the power to direct this enactment.

Rationale for duty

In its initial White Paper on the subject (then under the auspices of the Online Harms Bill), the government presented both evidence of the extent of online harms and a précis of existing efforts to regulate online harms. We survey these reasons and highlight potential lacunas here by way of exposition before our critical analysis in section 2, which assesses whether the Bill's rationale as given demonstrates that tighter regulation of service providers is necessary for reducing harm.

The government presents the following online harms as the crucial impetus for the Bill:

• Child sexual exploitation and abuse online: the White Paper cites the Internet Watch Foundation's statistics regarding the circulation of CSAM. The White Paper notes that, of the 80,319 cases of CSAM confirmed by the IWF, 43% of the children involved were between 11 and 15, 57% were under 10, and 2% were under 2.2

• Terrorist content online: the government's concern is that the internet allows the proliferation of terrorist propaganda. The White Paper repeats the Rt Hon Amber Rudd's claim that five of the terrorist incidents in the UK in 2017 included internet elements, implying that the perpetrators were radicalized by international groups, such as ISIS and Daesh.3

• Content illegally uploaded from prisons: the White Paper claims (without citation) that there is an increase in the amount of content transmitted illegally from prisons.

• The sale of opioids online: the White Paper cites the National Crime Agency's statistics that there have been at least 146 opioid-related deaths in the UK since 2016. It claims (without reference) that opioids are sold on "several well-known social media sites."

• Cyberbullying: the White Paper cites National Health Service data that one in five children aged 11–19 has experienced cyberbullying. Of those who experienced cyber-
members propose and pass a vote to reject the Codes. Scrutiny of the Codes of Practice, therefore, is the exception, rather than the default position of Parliament.

Assessing the Bill

In this section, we have surveyed the government's own rationale for the Bill, as well as assessing the scope and content of the duties of care that the Bill imposes. Our focus for the remainder of this paper will be on:

(1) the causal claims in the government's rationale for the Bill and whether they justify the necessity of the legislation;
(2) the authority that the Bill vests in the Minister and in Ofcom;
(3) the wide scope of content that is included in service providers' duty of care per the Bill; and
(4) the Bill's reliance on Codes of Practice to be prescribed by Ofcom in conjunction with the Minister.

In the sections that follow, we argue that: the government's rationale for the Bill provides insufficient justification for the necessity of this intervention, and that more consultation and justification are needed; the Bill grants far-reaching powers of interference to the executive; the scope of content covered by the Bill is worryingly broad; the emphasis on Codes of Practice is inapt; and the Bill creates potential obstacles for small and medium enterprises.

INTERNATIONAL COMPARISONS

It is worth comparing the Bill with its international equivalents, since legislators in a number of jurisdictions have sought to regulate content moderation on social media platforms. These proposed legislative interventions provide us with a useful set of benchmarks against which to measure the Safety Bill.

The Parliament of the European Union is currently considering the proposed Digital Services Act (DSA) to address content moderation in the EU.9 Like the OSB, the DSA is aimed at protecting the human rights of citizens of the EU online. However, the proposal differs in several important regards. The DSA stipulates more detailed, design-based duties with regard to legal but harmful content: user-to-user services must contain clear and accessible terms and conditions, content-reporting procedures, and appeal procedures following content or user removal, as well as requiring large platforms to cooperate with "trusted flaggers" (drawn from expert and professional institutions) who report harmful content (section 3). The DSA also requires very large platforms to perform assessments of their systemic risks, including systemic design features that threaten the exercise of fundamental rights, to declare the parameters of their recommender systems, and to evidence their mitigation strategies for minimizing systemic risk (section 4). Therefore, with regard to legal but harmful content, the DSA is concerned only with systemic design features of user-to-user services.

To help enterprises, the DSA empowers the European Commission to issue guidelines for the fulfilment of their duties stipulated in the Act (article 27). This differs from the Codes of Conduct in three important respects. First, the Commission is obliged to compose the guidelines in collaboration with the services affected and civil society organizations representing stakeholders. Second, the guidelines are meant to represent best practice in the industry, but enterprises can deviate from the guidelines with sufficient reason. Third, the guidelines do not create new duties: rather, they are meant only to help enterprises easily navigate the duties already established in the Act.

The DSA takes a comparable approach to the Platform Accountability and Consumer Transparency (PACT) Act, which has been proposed in the Senate of the United States.10 Concerning legal but harmful content, the PACT Act, like the DSA, focuses on design features of user-to-user services: the Act stipulates transparency and process requirements for acceptable use, complaints, and content moderation, requiring services to submit transparency reports (section 5).

By contrast, the national legislatures in Brazil11 and India12 have both considered much stricter regulation of content monitoring online. The Brazilian executive issued Provisional Measure 1068 to restrict content removal by social media platforms, limiting removal only to cases of nudity, violence, narcotics, and incitement to crime, thereby preventing social media platforms from removing disinformation (such as President Jair Bolsonaro's COVID-19 disinformation removed by Facebook, Twitter, and YouTube).13 The Indian government has similarly issued a number of regulations, including the Information Technology Act14 and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules of 2021,12 which direct user-to-user services to remove a wide range of content (including material that threatens the sovereignty of the Indian state), to use algorithmic systems to monitor and remove harmful content, and to trace encrypted messages to limit online anonymity. Activist groups have claimed that these measures are aimed at curbing dissent against the government, resulting in what they call "digital authoritarianism."15

These comparators are useful in framing the different degrees to which governments have chosen to interfere with services' content monitoring and moderation. The US and EU models are focused on design choices that empower users by making the terms and procedures of user-to-user services transparent and accessible. The Indian and Brazilian models, by contrast, are focused much more explicitly on directing the content that is permissible on user-to-user services. The UK government has intimated its inclination toward the former approach, but this remains relatively underdeveloped in the Bill itself, as we discuss in the following sections.

MEASURING THE BILL THROUGH PROPORTIONALITY AND NECESSITY

The Online Safety Bill will necessarily limit individuals' right to freedom of expression, and place costly positive duties on entrepreneurs that limit their free enjoyment of their property (not to mention downstream effects on their competitiveness). These rights are enshrined in the European Convention on Human Rights (article 10) and the subsequent Paris Protocol (article 1). However, we are concerned here not with the legal right, particularly since the UK parliament reserves the right to enact legislation that is incompatible with its commitment to the Convention. Rather, we are concerned with the normative right that underpins the aforementioned legal creations. As our point of departure, we
synthesis and production, and suicide. Violations are required to be registered with the Federal Roskomnadzor register.41

CONCLUDING REMARKS

We share the government's concerns about the potential hazards of the internet, particularly with regard to vulnerable groups such as children. However, this is not the only imperative at stake: it is also important that the government foster an open internet, on which free speech and innovation can flourish. We accept that the state cannot fully satisfy all of these imperatives simultaneously: the state will necessarily have to make tradeoffs between safety, liberty, and innovation.

Insofar as we have been critical of the Online Safety Bill, it has been because we think it has not yet achieved an optimal balance between these imperatives. First, we argue that the Department must do more to justify this legislative intervention: there is a paucity of justificatory evidence for the scope of the Bill in the current White Papers issued in its support. Second, we have argued that the mechanisms of the Bill do not do enough to protect the liberties of platforms and their users, because it effectively defers much of the power to regulate platforms to the Minister. Third, we argue that the Bill imposes overly wide duties on platforms that can be deleterious to smaller enterprises and increase government intervention. Fourth, we argue that it is imperative for the government to commit to an ethical-by-design approach to the duty of care.

It is our opinion that it is possible for the government to correct the problems with the Bill and the White Papers that we identify here without having to make significant sacrifices to its strategic aims. We suggest that these changes, while seemingly small, will have a significant effect on making the internet freer, more open, and more innovative, as well as making it safe.

AUTHOR CONTRIBUTIONS

M.T., E.K., D.A., and A.H. were responsible for conceptualization and writing (original draft and review & editing). S.Z. and E.L. were responsible for review & editing.

DECLARATION OF INTERESTS

The authors declare no competing interests.

REFERENCES

1. Department of Digital, Culture, Media, and Sport (2020). Online harms white paper. https://www.gov.uk/government/consultations/online-harms-white-paper/online-harms-white-paper.

2. Internet Watch Foundation (2020). Internet Watch Foundation annual report 2020 - face the facts. https://www.iwf.org.uk/about-us/who-we-are/annual-report/.

3. Rudd, A. (2018). Speech at San Francisco Digital Forum.

4. Amnesty International (2017). Amnesty Reveals Alarming Impact of Online Abuse against Women (Amnesty International). https://www.amnesty.org/en/latest/news/2017/11/amnesty-reveals-alarming-impact-of-online-abuse-against-women/.

5. Reuters Institute for the Study of Journalism (2021). Reuters Institute Digital News Report 2021 (Reuters).

6. National Society for the Prevention of Cruelty to Children (2021). Poll Shows Widescale Public Support for Stronger Laws to Protect Children from Online Abuse (NSPCC). http://www.nspcc.org.uk/about-us/news-opinion/2021/poll-shows-widescale-public-support-for-stronger-laws-to-protect-children-from-online-abuse/.

7. Department for Digital, Culture, Media and Sport (2022). Draft Online Safety Bill (DCMS).

8. Mars, B., Gunnell, D., Biddle, L., Kidger, J., Moran, P., Winstone, L., and Heron, J. (2020). Prospective associations between internet use and poor mental health: a population-based study. PLoS One 15, e0235889. https://doi.org/10.1371/journal.pone.0235889.

9. European Commission (2020). Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and Amending Directive 2000/31/EC (European Commission).

10. Schatz, B. (2021). Platform Accountability and Consumer Transparency (PACT) Act (Congress of the United States).

11. Imprensa Nacional (2021). Medida Provisória No 1.068, de 6 de Setembro de 2021 (DOU - Imprensa Nacional).

12. Ministry of Electronics and Information Technology (2021). Information Technology rules. https://prsindia.org/files/bills_acts/bills_parliament/2021/Intermediary_Guidelines_and_Digital_Media_Ethics_Code_Rules-2021.pdf.

13. Satariano, A. (2021). YouTube Pulls Videos by Bolsonaro for Spreading Misinformation on the Virus (N. Y. Times).

14. Indian Computer Emergency Response Team (2008). Information and Technology (Amendment) Act. https://eprocure.gov.in/cppp/rulesandprocs/kbadqkdlcswfjdelrquehwuxcfmijmuixngudufgbuubgubfugbububjxcgfvsbdihbgfGhdfgFHytyhRtMTk4NzY=.

15. Indian government must correct moves toward digital authoritarianism, allow tech platforms to uphold rights (2021). Access Now. https://www.accessnow.org/farmer-protests-india-censorship/.

16. Rodin, D. (2004). War and self-defense. Ethics Int. Aff. 18, 63–68. https://doi.org/10.1111/j.1747-7093.2004.tb00451.x.

17. Rodin, D. (2011). Justifying harm. Ethics 122, 74–110. https://doi.org/10.1086/662295.

18. McMahan, J. (2007). The sources and status of just war principles. J. Mil. Ethics 6, 91–106. https://doi.org/10.1080/15027570701381963.

19. Office for National Statistics (2020). Online bullying in England and Wales: year ending March 2020. https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/bulletins/onlinebullyinginenglandandwales/yearendingmarch2020.

20. Sedgwick, R., Epstein, S., Dutta, R., and Ougrin, D. (2019). Social media, internet use and suicide attempts in adolescents. Curr. Opin. Psychiatr. 32, 534–541. https://doi.org/10.1097/YCO.0000000000000547.

21. DeAndrea, D.C. (2015). Testing the proclaimed affordances of online support groups in a nationally representative sample of adults seeking mental health assistance. J. Health Commun. 20, 147–156. https://doi.org/10.1080/10810730.2014.914606.

22. Santana, A.D. (2014). Virtuous or vitriolic: the effect of anonymity on civility in online newspaper reader comment boards. Journal. Pract. 8, 18–33. https://doi.org/10.1080/17512786.2013.813194.

23. Thaler, R.H., and Sunstein, C.R. (2008). Nudge: Improving Decisions about Health, Wealth, and Happiness (Yale University Press).

24. Bovens, L. (2009). The ethics of nudge. In Preference Change, T. Grüne-Yanoff and S.O. Hansson, eds. (Springer Netherlands), pp. 207–219.

25. Nordfält, J., Grewal, D., Roggeveen, A.L., and Hill, K.M. (2014). Insights from in-store marketing experiments. In Review of Marketing Research, D. Grewal, A.L. Roggeveen, and J. Nordfält, eds. (Emerald Group Publishing Limited), pp. 127–146.

26. Kurz, V. (2018). Nudging to reduce meat consumption: immediate and persistent effects of an intervention at a university restaurant. J. Environ. Econ. Manag. 90, 317–341. https://doi.org/10.1016/j.jeem.2018.06.005.
27. Adam, A., Jensen, J.D., Sommer, I., and Hansen, G.L. (2017). Does shelf space management intervention have an effect on calorie turnover at

34. Pinto, A., Pauzé, E., Mutata, R., Roy-Gagnon, M.-H., and Potvin Kent, M. (2020). Food and beverage advertising to children and adolescents on television: a baseline study. Int. J. Environ. Res. Public Health 17, 1999. https://doi.org/10.3390/ijerph17061999.

35. Alaniz, M.L. (1998). Alcohol availability and targeted advertising in racial/ethnic minority communities. Alcohol Health Res. World 22, 286–289.

41. Roskomsvoboda (2021). Social media self-censorship law comes into force. https://roskomsvoboda.org/post/vstupil-v-silu-zakon-o-samotsenzure-sots/.

About the author
Markus Trengove is a senior researcher in AI ethics, law, and policy at Holistic AI. He holds a PhD in political science from University College London.