
Perspective
OPEN ACCESS

A critical review of the Online Safety Bill

Markus Trengove,1,2,* Emre Kazim,2,3 Denise Almeida,4 Airlie Hilliard,2,5 Sara Zannone,2 and Elizabeth Lomas4

1School of Public Policy, University College London, 29 Tavistock Square, London WC1H 9QU, UK
2Holistic AI, 18 Soho Square, London W1D 3QL, UK
3Department of Computer Science, University College London, Gower Street, London WC1E 6EA, UK
4Department of Information Studies, University College London, Gower Street, London WC1E 6BT, UK
5Institute of Management Studies, Goldsmiths, University of London, New Cross, London SE14 6NW, UK
*Correspondence: [email protected]
https://doi.org/10.1016/j.patter.2022.100544

Patterns 3, August 12, 2022 © 2022 The Author(s). This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).

THE BIGGER PICTURE This critical perspective makes a timely contribution to the tech policy debate concerning the monitoring and moderation of online content.

SUMMARY

The UK Parliament has tabled the Online Safety Bill to make the internet safer for users by requiring providers to regulate legal but harmful content on their platforms. This paper critically assesses the draft legislation, surveying its rationale; its scope in terms of the lawful and unlawful harms it intends to regulate; and the mechanisms through which it will be enforced. We argue that it requires further refinement if it is to protect free speech and innovation in the digital sphere. We propose four conclusions: further evidence is required to substantiate the necessity and proportionality of the Bill's interventions; the Bill risks a democratic deficit by limiting the opportunity for parliamentary scrutiny; the duties of the Bill may be too wide (in terms of burdening providers); and enforcement of a Code of Practice will likely be insufficient.

INTRODUCTION: ONLINE REGULATION IN THE UK


The internet has become increasingly integrated into our individual and communal lives: over 90% of UK citizens are now online. Indeed, with this, existing social maladies have found a new, and complex, expression: child abuse, terrorist propaganda, and the harassment of women and minority groups are among the phenomena that have been transformed by the internet, and that have rendered it an unsafe place for many of its netizens.1 To cite but a few statistics to this effect: the Internet Watch Foundation confirmed 153,383 cases of Child Sexual Abuse Material (CSAM) in the UK in 2020;2 the UK government claimed that all five domestic terrorist incidents in 2017 had online elements, including radicalization by international groups, such as ISIS;3 21% of women in the UK have been victims of misogynistic abuse online;4 and two in every three Britons are concerned about the proliferation of fake news.5

The internet seems a conducive environment for these harms: it is a space that sprawls across jurisdictions, it develops at a faster pace than regulation, and it allows great degrees of anonymity and secrecy for those wanting to commit wrongdoing with impunity. Despite reporting concerns about their safety and the paucity of protective measures,6 Britons still enter online spaces—largely because of the benefits of internet services, but also out of a perceived lack of plausible alternatives, and because of an increasing truism that the internet is effectively an extension of the public square.

It is against this background that the British Parliament is now considering the Online Safety Bill (previously drafted under the title of "The Online Harms Bill").7 The purpose of the Bill is to create "a new regulatory regime to address illegal and harmful content online."7 Key among the stipulated objectives of the legislation are the following:

- A free, open and secure internet:1 individuals must be able to use the internet without restriction, except where limited and proportionate restriction is necessary for protecting individuals' rights and interests.
- Freedom of expression online: each person has the right to express themselves freely and to receive information from others, except where such expression is prohibited by law, as in the case of hate speech or terrorist propaganda. This right includes the expression and transmission of information online.
- An online environment where companies take effective steps to keep their users safe, and where criminal, terrorist, and hostile foreign state activity is not left to contaminate the online space:1 recent controversies—including allegations of Russian social media interference in elections, and terrorist recruitment online by ISIS—have emphasized the need to protect users on social media sites from bad-faith actors.
- Rules and norms for the internet that discourage harmful behavior:1 features of user-to-user services (including the possibility of anonymity) seem to encourage antisocial behavior.
- The UK as a thriving digital economy, with a prosperous ecosystem of companies developing innovation in online safety:1 user-to-user and search services are sites of innovation and growth. Accordingly, they are of great importance to the development of the digital economy. The UK has also been party to world-leading safety innovation previously, in the form of the General Data Protection Regulation (GDPR).
- Citizens who understand the risks of online activity, challenge unacceptable behaviors and know how to access help if they experience harm online, with children receiving extra protection:1 user-to-user services potentially expose vulnerable people—particularly children—to bad-faith actors. It is therefore important to ensure that there is sufficient protection to make their internet use secure.
- Renewed public confidence and trust in online companies and services:1 controversies involving user-to-user and search services have undermined public trust in their reliability as news sources and service providers.

Given the extent of such harms, such policy objectives are commendable; in particular, because they are cognisant of—and indeed affirm—the need for a safe, free, open, and thriving digital environment. In this paper, we critically analyze the mechanisms and rationales for the regulation that the British government has proffered, arguing that the proposed legislation fails to meet the government's own desiderata. The effect of this legislation, in our opinion, will be a less open and free internet, one in which British companies have less access to a thriving digital economy, and in which the most effective steps toward preventing harm are neglected. Per our analysis, the government has not done enough to justify the need for its intervention, and has crafted a regulatory framework that leaves open the possibility of counterproductive and undemocratic interference.

To motivate our opinion, we offer a selected overview of the proposed legislation, where our selection of emphasis is presented with a view to offering commentary on those specific elements. Recognizing that this space is of critical public concern and presents genuinely novel forms of policy interaction (from duty of care; protection of the vulnerable; digital and algorithmic justice; privacy; respect, dignity, and decency in society; freedom of expression; etc.), readers will notice that our interventions highlight points of contestation, which, at a high level, represent calls for further evidence and greater consultation with relevant stakeholders.

We forward our critical review with two main sections: the first is an overview, where we summarize the key components of the Bill and its preceding White Papers.1 Here, we survey the following:

- Rationale: drawing on the government's White Paper, we reconstruct their explanation for the necessity of the Bill, including the social problems it aims to correct and the regulatory lacuna it aims to fill.
- Scope: we survey the wide range of unlawful and lawful harms on digital services that the Bill intends to regulate.
- Enforcement: we survey the mechanisms through which the Bill enforces regulation, particularly by empowering the regulator, Ofcom, and the Minister.
- International comparisons: we survey similar regulatory proposals from other jurisdictions aimed at resolving the same set of issues.

Secondly, we offer our critical commentary on these features of the Bill. In our critical discussion, our main conclusions are the following:

- Further evidence needed: we argue that the government's White Papers for the Online Harms Bill and Online Safety Bill do not provide sufficient evidence for the necessity or efficacy of regulatory intervention. Despite the prevalence of harms online, it is not clear from the White Papers why extensive government interference—with its concomitant limitations on freedom and individual rights—is a necessary or proportionate resort in resolving these issues.
- Possible democratic deficit: we argue that the Online Safety Bill suffers a possible democratic deficit because it delegates extensive authority to Ofcom in its capacity as the industry regulator for digital media, and to the Minister. In assigning Ofcom the power to determine the Code of Practice for digital platforms, the Online Safety Bill empowers Ofcom with sweeping powers to enact rules for the internet with little democratic scrutiny by Parliament and no consultation.
- Duties too wide: we are concerned about the potential range of new powers and duties created by the Online Safety Bill. The Online Safety Bill extends services' duty of care to include the regulation of legal but harmful material. We argue both that this extension overburdens developers with responsibility—at pain of penalty—for legal content and that the specific framing of this provision risks a regulatory slippery slope toward wider censorship.
- Problems with enforcement: we raise concerns about the Online Safety Bill's regulatory model of enforcing a Code of Practice. Although we can only speculate about the content of such a Code of Practice, we argue that a Code of Practice is in principle an inapt tool for a dynamic digital ecosystem. Instead, we suggest that an ethical design approach would be better suited to resolving the problems that the Online Safety Bill is meant to address, and provides developers with the flexibility to adopt a plurality of preventative measures.


Our intended readership are those with an interest in the regulation of digital services, both in industry and in policymaking. This is not limited to policymakers: we argue that choices at a technical level (i.e., what tools to use to make the platforms safer) have important moral and political implications, and so it is important that those in data science and machine learning understand the consequences of these tools. We intend to contribute to an increasing debate with the hope of improving the UK's digital regulation. The arguments we present here reflect our critical opinion, rather than an exposition of scientific fact, and so we invite critical reply to our claims here. We begin with an overview of the legislation. Those interested in our commentary should move directly to the section "Measuring the Bill through proportionality and necessity."

AN OVERVIEW OF THE ONLINE SAFETY BILL

In 2020, the UK government issued the Online Harms White Paper with the purpose of creating a safer internet ecosystem and repairing public trust in digital platforms and search services. The White Paper has subsequently led to the Online Safety Bill, a draft piece of legislation giving more regulatory substance to the ambitions set out in the original White Paper. In this section, we offer a selected overview of the proposed legislation, where our selection of emphasis is presented with a view to offering commentary on those specific elements.

The Bill's approach is to place a duty of care on internet service providers of both user-to-user services, in which users interact with each other online (as users typically do on platforms like Facebook and Twitter), and search services, which index information and allow users to navigate the internet (such as Google and Bing). The duty of care is framed in broad terms in the Bill, but it is composed of three distinct duties:7

1. To protect users from illegal content (section 9): although the production and dissemination of CSAM and terrorist propaganda are already illegal, the purpose of the Bill is to place a duty of care upon service providers to control the digital space with a view to limiting the potential spread of illegal content. It is unclear what the Bill is able to accomplish in this space that is not already possible under the existing legal landscape.

2. To take additional protective measures to make their site safe for children, if their service is likely to be used by children (section 10): children on the internet are vulnerable to grooming, cyberbullying, encouragement to self-harm, and harm to their mental health. The purpose of the Bill is to place a duty of care on service providers to ensure the safety of children on their platforms. The litmus test for "a service likely to be used by children" is not well defined.

3. To take additional measures to protect all users from content that is harmful without being illegal, if the service is of a sufficient reach and magnitude (section 11): some online interactions, while legal, can nevertheless be harmful. This includes online campaigns of harassment that are not covered by existing criminal law, and the proliferation of disinformation (fake news). The purpose of the Bill is to place a duty of care upon service providers to ensure that their users are protected not only from unlawful harm, but also from lawful and harmful content.

In the sections below, we highlight key features of the legislation and the duty it places on service providers, with a focus on those features that we argue are problematic. The duty is justified with reference to the role of user-to-user and search services in proliferating harm, and so it is ultimately a duty owed to the users of internet services (albeit arbitrated by the government and regulator). The duty requires services to comply with Codes of Practice, and to report their compliance to the regulator to evidence their execution of their duty. Concomitantly, the regulator, Ofcom, is empowered to enact and enforce the Codes of Practice, although the Minister has the power to direct this enactment.

Rationale for duty

In its initial White Paper on the subject (then under the auspices of the Online Harms Bill), the government presented both evidence of the extent of online harms and a precis of existing efforts to regulate online harms. We survey these reasons and highlight potential lacunas here by way of exposition before our critical analysis in section 2, which assesses whether the Bill's rationale as given demonstrates that tighter regulation of service providers is necessary for reducing harm.

The government presents the following online harms as the crucial impetus for the Bill:

- Child sexual exploitation and abuse online: the White Paper cites the Internet Watch Foundation's statistics regarding the circulation of CSAM. The White Paper notes that, of the 80,319 cases of CSAM confirmed by the IWF, 43% of the children involved were between 11 and 15, 57% were under 10, and 2% were under 2.2
- Terrorist content online: the government's concern is that the internet allows the proliferation of terrorist propaganda. The White Paper repeats the Rt Hon Amber Rudd's claim that five of the terrorist incidents in the UK in 2017 included internet elements, implying that those involved were radicalized by international groups, such as ISIS (Daesh).3
- Content illegally uploaded from prisons: the White Paper claims (without citation) that there is an increase in the amount of content transmitted illegally from prisons.
- The sale of opioids online: the White Paper cites the National Crime Agency's statistics that there have been at least 146 opioid-related deaths in the UK since 2016. It claims (without reference) that opioids are sold on "several well-known social media sites."
- Cyberbullying: the White Paper cites National Health Service data that one in five children aged 11–19 has experienced cyberbullying. Of those who experienced cyberbullying, the White Paper claims that the propensities of victims towards social anxiety, depression, suicidality, and self-harm were all higher than in corresponding cases of ordinary bullying.

- Self-harm and suicide: the White Paper cites academic research showing that 22.5% of young adults reported suicide- and self-harm-related internet use; 70% of young adults with suicidal intent reported related internet use; and approximately a quarter of children who presented to hospitals following self-harm, and of those who died by suicide, reported suicide-related internet use.8
- Underage sharing of sexual imagery: the White Paper cites statistical evidence suggesting that between 26% and 38% of teenagers have sent sexual images to partners, whereas 12% to 49% of teenagers have received sexual images from partners.
- Online disinformation: the White Paper cites a Reuters report showing that 61% of people want the government to do more to distinguish between real and fake news. The White Paper does not provide statistical evidence as to the prevalence of disinformation, but cites the Russian state's use of disinformation as an example.
- Online manipulation: the government's concern is that the introduction of artificial intelligence will increase the prevalence and effectiveness of psychological manipulation. The government cites the need to replicate the regulation of manipulation in other media, such as the Broadcasting Act 1990, which prohibited subliminal advertising.
- Online abuse of public figures: the government cites the disproportionate amount of abuse received by female public figures. It cites an international survey that found that two in every three female journalists were harassed online, including receiving death threats, cyberstalking, and obscene messages.

Where these activities are already illegal (as in the case of CSAM or terrorist recruitment and propaganda), the purpose of the legislation is not to prohibit the act itself but to establish the duties of online services to protect users from illegal content. This deviates from the existing regime, established by the EU's e-Commerce Directive, which exempts services from liability for illegal content unless they know about the unlawfulness of the content, or have sufficient information to conclude that it is unlawful, and do not act expeditiously to remove it. We cover the form of this duty in the next two subsections.

Where content is legal, the White Paper argues that it falls in a regulatory lacuna, since legal (but harmful) content is already subject to statutory regulation (typically overseen by Ofcom) when it is presented on other platforms, including live television, catch-up television, and subscription services. The Bill, therefore, fills the regulatory gap by extending similar regulatory control over the dissemination of content on user-to-user services.

Scope of the duty

The scope of illegal content is, of course, specified already by preceding legislation, and includes material that promotes terrorism or that is classified as CSAM. However, the Online Safety Bill adds a new layer of liability to these activities because it requires that services adhere to an additional layer of compliance measures, and it empowers the government to hold named directors accountable for their services' failures to comply.

Ordinary sensibilities: what is more novel is the Bill's focus on harms that are not illegal. Citing online harassment in its White Paper, the government has suggested that there is a need to regulate behavior that is not prohibited or regulated otherwise, including speech that does not fall within the remit of hate speech. The Bill requires that service providers protect their users not only from illegal content but also from this species of content that would otherwise be legal, as long as it can reasonably be construed as potentially harmful to children or adults of "ordinary sensibilities."

The scope of this content is potentially very wide. According to the Bill, this duty is triggered if:

  the provider of the service has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities.

The Bill, therefore, uses the "reasonableness" standard that pervades English common law. This standard sets a variable threshold that focuses on the perspective of an epistemically responsible agent. This definition is further supplemented by the following stipulation in the Bill:

  [A] risk of content "indirectly" having a significant adverse physical or psychological impact on an adult is a reference to a risk of either of the following—(a) content causing an individual to do or say things to a targeted adult that would have a significant adverse physical or psychological impact on such an adult; (b) content causing an adult to act in a way that—(i) has a significant adverse physical or psychological impact on that adult, or (ii) increases the likelihood of such an impact on that adult.

This framing is noteworthy both because it includes in its scope a responsibility on the part of services for content that is not illegal otherwise, and because it provides such wide interpretive scope depending on how one chooses to construe "reasonableness." We return to this issue later in this paper.
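The interpretive burden this places on providers becomes clearer if one tries to operationalize the trigger. The sketch below is our hypothetical rendering of the statutory test as a screening predicate, written in Python purely for illustration; every name and number in it (direct_impact, indirect_impact, the two thresholds) is an assumption we introduce, not anything the Bill defines. The point is precisely that the statute leaves these quantities to the provider's judgment.

```python
from dataclasses import dataclass

@dataclass
class HarmAssessment:
    """Hypothetical screening record for one piece of user content.

    None of these fields is defined by the Bill; a provider would have
    to invent each of them to discharge the duty.
    """
    direct_impact: float     # estimated risk of a significant adverse impact
    indirect_impact: float   # estimated risk that the content causes others to inflict it
    audience_is_child: bool  # engages the stricter child-safety duty

# Assumed thresholds: the Bill gives no numbers, so any value here is
# the provider's own construal of "reasonable grounds" and "material
# risk" for an adult of "ordinary sensibilities".
MATERIAL_RISK_THRESHOLD = 0.5
CHILD_RISK_THRESHOLD = 0.2

def duty_triggered(a: HarmAssessment) -> bool:
    """Sketch of the harmful-content trigger as a predicate."""
    threshold = CHILD_RISK_THRESHOLD if a.audience_is_child else MATERIAL_RISK_THRESHOLD
    # "having, or indirectly having, a significant adverse ... impact"
    return max(a.direct_impact, a.indirect_impact) >= threshold

# The same scores flip the outcome depending on an entirely
# provider-chosen threshold.
print(duty_triggered(HarmAssessment(0.4, 0.45, audience_is_child=False)))  # False
print(duty_triggered(HarmAssessment(0.4, 0.45, audience_is_child=True)))   # True
```

Nothing in the Bill fixes how such risk estimates are to be produced or where the thresholds should sit; that gap is the interpretive scope criticized in the sections that follow.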
Enforcing the duty

The Bill and White Paper assert that service providers have, in principle, a duty of care to protect their users; to discharge this duty, the Bill stipulates that services must abide by Codes of Practice. The Bill itself does not specify the content of the Codes of Practice. Instead, it delegates this authority to the industry regulator, the Office of Communications (Ofcom). Ofcom is an independent industry regulator, although its chair is politically appointed.

While Ofcom has the power to set the Codes of Practice, the Bill vests the Minister of State with the power to veto the Codes of Practice, or to order Ofcom to modify the codes so that they align with "government policy." The Bill, in effect, grants the Minister of State wide powers to direct the Codes of Practice.

Parliament, by contrast, has relatively little oversight over the Codes of Practice. The Bill adopts a negative form of parliamentary oversight with regard to the Codes of Practice: Parliament is assumed to have consented to the Codes of Practice unless its members propose and pass a vote to reject the Codes. Scrutiny of the Codes of Practice, therefore, is the exception, rather than the default position of Parliament.


Assessing the Bill

In this section, we have surveyed the government's own rationale for the Bill, as well as assessing the scope and content of the duties of care that the Bill imposes. Our focus for the remainder of this paper will be on:

(1) the causal claims in the government's rationale for the Bill and whether they justify the necessity of the legislation;
(2) the authority that the Bill vests in the Minister and in Ofcom;
(3) the wide scope of content that is included in service providers' duty of care per the Bill; and
(4) the Bill's reliance on Codes of Practice to be prescribed by Ofcom in conjunction with the Minister.

In the sections that follow, we argue that: the government's rationale for the Bill provides insufficient justification for the necessity of this intervention, and more consultation and justification are needed; the Bill grants far-reaching powers of interference to the executive; the scope of content covered by the Bill is worryingly broad; the emphasis on Codes of Practice is inapt; and the Bill creates potential obstacles for small and medium enterprises.

INTERNATIONAL COMPARISONS

It is worth comparing the Bill with its international equivalents, since legislators in a number of jurisdictions have sought to regulate content moderation on social media platforms. These proposed legislative interventions provide us with a useful set of benchmarks against which to measure the Safety Bill.

The Parliament of the European Union is currently considering the proposed Digital Services Act (DSA) to address content moderation in the EU.9 Like the OSB, the DSA is aimed at protecting the human rights of citizens of the EU online. However, the proposal differs in several important regards. The DSA stipulates more detailed, design-based duties with regard to legal but harmful content: user-to-user services must contain clear and accessible terms and conditions, content-reporting procedures, and appeal procedures following content or user removal, and large platforms are required to cooperate with "trusted flaggers" (drawn from expert and professional institutions) who report harmful content (section 3). The DSA also requires very large platforms to perform assessments of their systemic risks, including systemic design features that threaten the exercise of fundamental rights, to declare the parameters of their recommender systems, and to evidence their mitigation strategies for minimizing system risk (section 4). Therefore, with regard to legal but harmful content, the DSA is concerned only with systemic design features of user-to-user services.

To help enterprises, the DSA empowers the European Commission to issue guidelines for the fulfilment of their duties stipulated in the Act (article 27). This differs from the Codes of Practice in three important respects. First, the Commission is obliged to compose the guidelines in collaboration with the services affected and civil society organizations representing stakeholders. Second, the guidelines are meant to represent best practice in the industry, but enterprises can deviate from the guidelines with sufficient reason. Third, the guidelines do not create new duties: rather, they are meant only to help enterprises easily navigate the duties already established in the Act.

The DSA takes a comparable approach to the Platform Accountability and Consumer Transparency Act (PACT Act), which has been proposed in the Senate of the United States.10 Concerning legal but harmful content, the PACT Act, like the DSA, focuses on design features of user-to-user services: the Act stipulates transparency and process requirements for acceptable use, complaints, and content moderation, requiring services to submit transparency reports (section 5).

By contrast, the national legislatures in Brazil11 and India12 have both considered much stricter regulation of content monitoring online. The Brazilian executive issued Provisional Measure 1068 to restrict content removal by social media platforms, limiting removal only to cases of nudity, violence, narcotics, and incitement to crime, thereby preventing social media platforms from removing disinformation (such as President Jair Bolsonaro's COVID-19 disinformation removed by Facebook, Twitter, and YouTube).13 The Indian government has similarly issued a number of regulations, including the Information Technology Act14 and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules of 2021,12 which direct user-to-user services to remove a wide range of content, including material that threatens the sovereignty of the Indian state, to use algorithmic systems to monitor and remove harmful content, and to trace encrypted messages to limit online anonymity. Activist groups have claimed that these measures are aimed at curbing dissent against the government, resulting in what they call "digital authoritarianism."15

These comparators are useful in framing the different degrees to which governments have chosen to interfere with services' content monitoring and moderation. The US and EU models are focused on design choices that empower users by making the terms and procedures of user-to-user services transparent and accessible. The Indian and Brazilian models, by contrast, are focused much more explicitly on directing the content that is permissible on user-to-user services. The UK government has intimated its inclination toward the former approach, but this remains relatively underdeveloped in the Bill itself, as we discuss in the following sections.

MEASURING THE BILL THROUGH PROPORTIONALITY AND NECESSITY

The Online Safety Bill will necessarily limit individuals' right to freedom of expression, and place costly positive duties on entrepreneurs that limit their free enjoyment of their property (not to mention downstream effects on their competitiveness). These rights are enshrined in the European Convention on Human Rights (article 10) and the subsequent Paris Protocol (article 1). However, we are concerned here not with the legal right—particularly since the UK Parliament reserves the right to enact legislation that is incompatible with its commitment to the Convention. Rather, we are concerned with the normative right that underpins the aforementioned legal creations. As our point of departure, we assume that individuals are vested with natural rights to free expression and use of their property.

These rights, of course, are not absolute or insurmountable: it is, by way of exception, permissible to infringe upon these rights in the presence of sufficient countervailing justification. Rights provide "moral breakwaters"16 that protect individual and collective interests, but the breakwaters can always be overcome with sufficient justification. However, since rights function as breakwaters, they cannot be limited anytime that infringement causes a net good. Rather, rights are only defeasible when the cost of respecting the rights is significantly greater than the cost of transgression.17 In other words, rights establish a default position, and deviation from this default requires significant and extraordinary reason. This sets the standard of evidence that the government should (normatively, albeit not legally) adduce to support the infringement of others' rights: their intervention should not simply produce a net good; rather, the net good must be sufficiently weighty to offset the cost of intentional rights infringement.

In normative theory, rights infringements are subject to the requirements of necessity and proportionality.18 Necessity permits rights infringements only as a last resort: rights cannot be transgressed if there are less harmful means available to achieve the same end. Proportionality permits rights infringement only if the expected outcome of the infringement is commensurate with the cost of the infringement—in other words, there must be an apt "fit" between means and ends.

What we want to suggest here is that the evidence presented by the government in justifying the Online Safety Bill (and the infringements it entails) does not meet the thresholds of necessity and proportionality. Given the stringency of the rights affected, it is critical that the government should adduce clear and convincing evidence that its proposal satisfies these requirements. To this end, we suggest that the government's rationale for the Bill raises several general concerns that cast doubt upon the proportionality and necessity of this intervention. We list these concerns below.

Proportionality issues

In several instances, the White Paper cites genuine concerns about the contributions of user-to-user and search services. However, if we investigate these claims more closely, there is a mismatch between the putative justification of the White Paper and the remit of the legislation. In brief, the problem can be framed as follows:

- the interventions in the Bill do relatively little to resolve the concerns raised by the White Paper;
- to resolve the concerns thoroughly would require interfering with legitimate internet use.

This is a problem of proportionality, because the expected benefit of the intervention does not fit the magnitude of the rights infringement entailed by the intervention.

Consider, first, the problem of bullying. The government is right to want to address the problem of cyberbullying: one in five children report being subject to cyberbullying.19 However, the Bill only addresses a fractional part of this problem, because 90% of cyberbullying occurs in private messages between schoolmates.19 The Bill, as it is, is unlikely to resolve the problem that the White Paper outlines, since private communications fall outside of the scope of the Bill. Of course, the government could resolve the problem more effectively by expanding the remit of the Bill to include private communications, but this would clearly constitute an unacceptable breach of individual rights, and would require services to break the encryption of private messaging, which would have extremely deleterious privacy costs.

A similar problem plagues the government's ambitions concerning suicidality and self-harm. The White Paper is correct in emphasizing the correlation between internet use (particularly of search and user-to-user services) and suicidality, depression, and self-harm. However, it is again unclear whether the Bill's interventions present an effective and proportionate solution to the problem. First, although internet use is strongly correlated with suicidality and depression, this correlation is predominantly due to the effects of sleep loss and private cyberbullying, rather than exposure to suicide-related content (which has a very low correlation with suicidality).20,21 Second, where individuals experiencing suicidality have accessed suicide-related content prior to self-harming, the preponderance of this content has not been user-to-user content, but rather fact-based websites.20 It is unclear, therefore, whether the Bill as it is will be able to resolve much of the problem of suicidality and self-harm. Again, the government would have to make much further-reaching interventions to get to the real causes of the problem, but this again would be at the cost of interfering with individual liberties.

Necessity issues

The second point of concern is that, even where the Bill's interventions are effective in preventing or mitigating harm, there are interventions available to the government that would interfere less with individual rights. This would render the interventions in the Bill unnecessary, since they do not constitute the least costly means of addressing the intended harm.

For the purposes of measuring necessity, it is useful here to benchmark the interventions in the Bill against similar interventions in comparable legislation, such as the DSA. The DSA does not focus on content monitoring or moderation by platforms themselves, but rather focuses on setting out clear and easily accessible mechanisms for users to register complaints and to flag content that contravenes the terms and conditions of the service (sections 3 and 4). This approach causes less interference with the rights of individual users, because it means that they are not monitored by default, as well as imposing less burdensome duties on service providers, since they do not have to develop monitoring mechanisms (which we cover later in this paper). This approach also reduces interference by diminishing the possibility of removing non-harmful content, since only content flagged by users (rather than by monitoring AI) will be picked up.

The DSA's approach also avoids the Bill's more costly solution of de-anonymizing user-to-user services. The Bill proposes limiting the ability of individuals to use user-to-user services anonymously as a means of harm prevention. While the restriction of the ability of individuals to be anonymous might have benefits in terms of holding individuals more accountable for their actions, which could promote less toxic interactions online, the limiting of anonymity might be damaging to those who rely on the lack of identification online to access support.22 Chat rooms and support groups can be beneficial for those experiencing mental health issues, and one of the features that can increase their effectiveness is the ability to be anonymous. Individuals are able to access support without their identity being revealed, meaning they are free to discuss their experiences without being concerned about their employer or family finding out, for example. It is unclear how the Bill would balance minimizing harm facilitated by anonymity while avoiding bringing harm to those who rely on anonymity to access support. Relying instead on user reporting is a means of empowering individuals against online harm without limiting their ability to interact freely and anonymously online when it is beneficial to their wellbeing.
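To make the contrast concrete, the DSA-style alternative described above can be captured in a few lines. The sketch below is a minimal, hypothetical flagging pipeline in Python; the names (Flag, FlagQueue) and the policy of reviewing only user-flagged items are our illustrative assumptions, not text from the DSA or the Bill. Its design point is that content enters review only when a user complains, so no default monitoring of unflagged content, and no de-anonymization of the reporter, is needed.

```python
from dataclasses import dataclass, field
from collections import deque
from typing import Optional

@dataclass
class Flag:
    """A user complaint about one piece of content (hypothetical schema)."""
    content_id: str
    reporter_id: str  # may be a pseudonym; flagging requires no real identity
    reason: str       # which term of service the content allegedly contravenes

@dataclass
class FlagQueue:
    """Review queue in which only user-flagged content is ever inspected."""
    _queue: deque = field(default_factory=deque)

    def submit(self, flag: Flag) -> None:
        # Unflagged content never enters this queue, so there is no
        # default, AI-driven monitoring of all user messages.
        self._queue.append(flag)

    def next_for_review(self) -> Optional[Flag]:
        return self._queue.popleft() if self._queue else None

# Example usage: one complaint is filed and then surfaced for review.
queue = FlagQueue()
queue.submit(Flag("post-123", "user-9", "harassment under the service's ToS"))
flag = queue.next_for_review()
if flag:
    print(f"review {flag.content_id}: {flag.reason}")
```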


DEMOCRATIC DEFICIT

Our second concern relates to the enforcement of the duties set out in the Bill. We argue that the structure of the Bill grants undue power to the executive, and deprives the public of the opportunity to exercise democratic oversight of the Bill's content.

In designing the structure of its regulation, the Bill assigns to Ofcom the power to issue Codes of Practice that will determine how user-to-user and search services are to fulfil the more abstract duties set out in the legislation. As mentioned in section 1.3, the Bill also grants the Minister the power to interfere with the Codes of Practice by exercising a veto power or by directing Ofcom to align the Codes of Practice with government policy.

Given that the government will be able to punish services and individuals who are derelict in their duties, and will inform what information can be shared and received on the internet, the Codes of Practice have significant implications for the rights of internet users and services. The Codes of Practice can make information more or less difficult to communicate. Our concern is that the stringency of these restrictions is dependent upon the regulator and ultimately the Minister, with little oversight from Parliament or the public (although the Department has suggested it will consult stakeholders). Since the Minister can direct the Codes of Practice, they are effectively granted the authority to determine how user-to-user services control speech on their platforms.

Our concern is that this creates the possibility of a democratic deficit in the Bill: the Minister retains sweeping powers to interfere with the limits and regulation of speech on the internet's key platforms, with Parliament playing only a minimal negative oversight role. This power is sweeping since the remit of the Bill is wide (particularly in defining the "harmful but legal" content for which services are responsible). This means that the Minister has significant power to interfere with an important set of rights (including free speech and free press) without the particulars of their interventions being vetted by Parliament or subject to public scrutiny. We suggest that this amount of power is susceptible to abuse, and does not accord with the Bill's vision of a free internet.

NEW POWERS AND DUTIES

The Bill empowers Ofcom to enforce both services' duties concerning illegal content and their duties concerning "legal but harmful" content. We have particular concerns about the duty placed upon services to control "legal but harmful" content, given the breadth of the definition of what counts as harmful. Here, we argue that the breadth of the content covered by the Bill would be better addressed by ethics-by-design procedural mechanisms, rather than content-specific regulation.

The Bill defines harmful content in Part 2, Chapter 6, Section 46, repeating its formulation in defining content that is harmful to children. The formulation in this section has a few noteworthy features. The first is that it includes content that causes "indirect" harm and extends the remit of harm to psychological (and not just physical) harm. The second, more worrying, feature of the formulation is that it defines harm in terms of the reasonable understanding of a person of "ordinary sensibilities." The introduction of the reasonableness element imports an interpretive element into the duty without inserting clear boundaries delineating the scope of the duty, which—as we explain in this section—is troublesome for a top-down approach to regulating harm.

Our concern here is not with the cogency of a wider definition of harmful speech as such. It is plausible, we think, to extend the remit of "harmful speech" beyond the remit of what counts in law as "hate speech." However, in our opinion, the Online Safety Bill does not specify this remit with sufficient clarity for the purpose for which it is deployed. This formulation has the capacity to include a vast sweep of material that would be undesirable to limit. Whether an individual piece of content counts as "harmful" on this definition will depend significantly on the context of its use. It is this matter of interpretation that, we think, opens the possibility of over-censorship when enforced by a top-down approach in which the government specifies a list of harmful content or algorithmic systems are used to detect harmful content (as in the Brazilian and Indian cases).

Consider two examples that pose particular interpretive difficulty: manipulation and disinformation.

- Manipulation: "nudging" refers to features of choice architecture that alter an individual's behavior in a predictable way23 and, although disputed, can be said to be a form of manipulation of the choices people make.24 Nudging can take simple forms, including the placement of products in the supermarket, with branded products being placed at eye level to encourage consumers to spend more.25 Nudging has also been used in public health interventions, including in reducing meat consumption26 and the encouragement of healthier food purchasing,27 as well as in pro-environmental behavior.28 Evidently, nudging (and manipulation) is present in analog settings and is not novel to online, algorithmic-driven settings, such as social media. Indeed, traditional information flow theory can be adapted to algorithmic nudging.29 This is not to say that more analog forms of manipulation and algorithmic manipulation, or algorithmic personalization as it is commonly referred to in the literature, are exactly equivalent. Algorithmic manipulation can be more covert than human manipulation,29 which, in part, is due to the often black box nature of algorithms.30 It also allows a greater level of personalization, since the large number of data points collected about an individual from their online activity enables highly accurate profiles to be generated about them.31 However, targeted advertising, which falls within the remit of nudging/manipulation,32 can again occur offline, although with less personalization; television advertisements are targeted at the intended audience of the station, including targeting to children,33,34 and the placement of billboards enables targeting to specific demographics.35,36 It is not clear, in the context of the Bill, how we are to distinguish meaningfully between legitimate techniques of persuasion and bad-faith manipulation.

- Disinformation: the White Paper provides seemingly contradictory directives with regard to limiting disinformation. The White Paper claims, simultaneously, that user-to-user services have an obligation to "improve how their users understand" the "trustworthiness" of news (7.29), but it also confirms that the purpose of regulation should be "protecting users from harm, not judging what is true or not" (7.31). It is not clear how these two imperatives can be squared. If "harm" is simply content that is already illegal (including hate speech, defamation, and unlawful political interference), then it is unclear what additional protection the Bill will contribute. However, if "harmful" is construed more widely, then the Bill will invariably have to set parameters for the kind of information that counts as disinformation (rather than simply misinformation). This is a delicate interpretive task that depends upon the context in which information is disseminated, and so requiring stricter monitoring will require tradeoffs in which services will have to limit expression.

Our concern here is that the top-down monitoring of content (either by the government or by AI deployed by services)—given these interpretive difficulties—will increase the risk of excessive censorship. Whether an individual piece of content constitutes "harm" by the definition above will be highly sensitive to the context of its use. The context sensitivity of this definition suggests important technical difficulties for enterprises. Enterprises will presumably have to develop tools to scan content for a number of harms. Automatic detection of harm is an open problem far from being solved.37 The academic community researching automated detection of cyberbullying, for example, has made appeals for more universal and specific criteria concerning cyberbullying definitions, more consistent evaluation methods, and better-quality datasets.37,38 It is easy to see how larger companies with access to more data and highly skilled technical personnel would be better placed to solve the task, whereas smaller firms will struggle to meet this serious technical challenge.

Given the importance of the issue for the safety and human rights of users, we endorse the research community's call for a clearer set of criteria for "harm." We also recommend supporting the creation of universal tools, which could be achieved by collecting or sharing datasets and existing technologies, and which would remove the burden from small and medium enterprises. However, more generally, it is our opinion that an ethics-by-design approach would mitigate much of this concern, because it would empower users to inform the moderation themselves with the help of the appropriate procedural mechanisms.
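To illustrate why automated harm detection remains an open problem, the toy classifier below trains a standard bag-of-words baseline on a handful of invented examples. It is a sketch under obvious assumptions (a tiny, made-up dataset; scikit-learn as the toolkit), not a moderation system: a lexical model of this kind cannot see the context that, as argued above, determines whether a given message is harmful.

```python
# A deliberately minimal harm-detection baseline: TF-IDF features fed
# into logistic regression. The training data is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "you are worthless and everyone hates you",    # harmful
    "nobody at school likes you, just disappear",  # harmful
    "great goal last night, see you at practice",  # benign
    "thanks for helping me revise for the exam",   # benign
]
train_labels = [1, 1, 0, 0]  # 1 = harmful, 0 = benign

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# Similar words, different context: banter between friends versus an
# attack on a stranger. A purely lexical model scores both on shared
# vocabulary, which is exactly the context-sensitivity problem
# described above.
tests = [
    "haha you absolute idiot, miss you mate",
    "you idiot, nobody wants you here",
]
for text in tests:
    p_harm = model.predict_proba([text])[0][1]
    print(f"estimated harm probability {p_harm:.2f}: {text}")
```

The failure mode this sketch exposes (no access to the relationship between speakers, the history of the exchange, or the target's vulnerability) is one reason the cited literature calls for better definitions, evaluation methods, and datasets before such systems are relied upon at scale.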
CODES OF PRACTICE

Our final concern relates to the Bill's focus on Codes of Practice deferred to Ofcom as the main regulatory mechanism. Our concern here is that the legislation's open-ended references to Codes of Practice open the possibility of inappropriate regulatory tools. As we intimate in the previous section, our concern is that the Codes of Practice leave open the possibility that regulation will restrict particular pieces or kinds of content. This would, of course, place an unduly onerous burden on service providers, and hold them responsible for activities on their sites for which they should not be held liable.

We note that the GDPR and the DSA include a similar mechanism that permits regulators to establish Codes of Conduct or best-practice guidelines.39 However, it is important to note that the Codes in these cases do not establish new rules that are not grounded in the legislation: rather, they provide efficient means for enterprises to comply with their duties established by complex legislation. Our concern is that the Online Safety Bill is sufficiently open-ended that the Codes of Practice will, in this case, amount to the creation of new rules, since the duties in the Bill are multiply realizable and open to a wide range of interpretations. This is because the Bill outlines only in broad terms the duties that services have to protect users, but does not prescribe (as the DSA and GDPR do) which features of their platforms are in the scope of the regulation (i.e., whether they have a duty to monitor and moderate specific pieces of content, or whether they only have a duty to adjust the design features of their services). The most sensible approach, we argue, would be to adopt an ethical design approach that (1) focuses on the ethical features of the design process and (2) provides services with sufficient space to adopt flexible and innovative solutions to the social problems present on their platforms. A Code of Practice runs the risk of focusing less on design, and of rigidifying the solutions that providers can use to solve problems.

In a recent memorandum on the topic, the Department of Digital, Culture, Media, and Sport has indicated that they will focus their Codes of Practice on design and process features of user-to-user and search services.40 However, we would appeal to Ofcom, Parliament, and the DCMS to concretize this commitment to assuage concerns about content-specific censorship. It is important for the purposes of clarity that this be confirmed.

Compared with those jurisdictions that have taken a content-specific approach to regulation, the Online Safety Bill is less stringent and specific. Indeed, India's Information Technology Rules,12 which echo the sentiment of the Online Safety Bill, define the content that must be reviewed under the rules, listing discrimination, psychotropic substances and smoking, imitable behavior (such as content depicting self-harm), and offensive language (such as expletives, nudity, sexual content, and violence). The Rules also require the appointment of a Chief Compliance Officer in social media companies to ensure compliance and cooperation, and the identification of the first poster of the unacceptable content in some cases. Likewise, the Russian law On Information, Information Technologies and Information Protection requires social media sites to monitor and restrict content related to the advertising of alcohol and online casinos, disrespect for society, information on drug synthesis and production, and suicide. Violations are required to be registered with the Federal Roskomnadzor register.41


CONCLUDING REMARKS

We share the government's concerns about the potential hazards of the internet, particularly with regard to vulnerable groups, especially children. However, this is not the only imperative at stake: it is also important that the government foster an open internet, on which free speech and innovation can flourish. We accept that the state cannot fully satisfy all of these imperatives simultaneously: the state will necessarily have to make tradeoffs between safety, liberty, and innovation.

Insofar as we have been critical of the Online Safety Bill, it has been because we think it has not yet achieved an optimal balance between these imperatives. First, we argue that the Department must do more to justify this legislative intervention: there is a paucity of justificatory evidence for the scope of the Bill in the current White Papers issued in its support. Second, we have argued that the mechanisms of the Bill do not do enough to protect the liberties of platforms and their users, because it effectively defers much of the power to regulate platforms to the Minister. Third, we argue that the Bill imposes overly wide duties on platforms that can be deleterious to smaller enterprises and increase government intervention. Fourth, we argue that it is imperative for the government to commit to an ethical-by-design approach to the duty of care.

It is our opinion that it is possible for the government to correct the problems with the Bill and the White Papers we suggest here without having to make significant sacrifices to its strategic aims. We suggest that these changes—while seemingly small—will have a significant effect on making the internet freer, more open, and more innovative—as well as making it safe.

AUTHOR CONTRIBUTIONS

M.T., E.K., D.A., and A.H. were responsible for conceptualization and writing (original draft and review & editing). S.Z. and E.L. were responsible for review & editing.

DECLARATION OF INTERESTS

The authors declare no competing interests.

REFERENCES

1. Department of Digital, Culture, Media, and Sport (2020). Online harms white paper. https://www.gov.uk/government/consultations/online-harms-white-paper/online-harms-white-paper.
2. Internet Watch Foundation (2020). Internet Watch Foundation annual report 2020 - face the facts. https://www.iwf.org.uk/about-us/who-we-are/annual-report/.
3. Rudd, A. (2018). Speech at San Francisco Digital Forum.
4. Amnesty International (2017). Amnesty Reveals Alarming Impact of Online Abuse against Women (Amnesty International). https://www.amnesty.org/en/latest/news/2017/11/amnesty-reveals-alarming-impact-of-online-abuse-against-women/.
5. Reuters Institute for the Study of Journalism (2021). Reuters Institute Digital News Report 2021 (Reuters).
6. National Society for the Prevention of Cruelty to Children (2021). Poll Shows Widescale Public Support for Stronger Laws to Protect Children from Online Abuse (NSPCC). http://www.nspcc.org.uk/about-us/news-opinion/2021/poll-shows-widescale-public-support-for-stronger-laws-to-protect-children-from-online-abuse/.
7. Department for Digital, Culture, Media and Sport (2022). Draft Online Safety Bill (DCMS).
8. Mars, B., Gunnell, D., Biddle, L., Kidger, J., Moran, P., Winstone, L., and Heron, J. (2020). Prospective associations between internet use and poor mental health: a population-based study. PLoS One 15, e0235889. https://doi.org/10.1371/journal.pone.0235889.
9. European Commission (2020). Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and Amending Directive 2000/31/EC (European Commission).
10. Schatz, B. (2021). Platform Accountability and Consumer Transparency Act (PACT) Act (Congress of the United States).
11. Imprensa Nacional (2021). Medida Provisória No 1.068, de 6 de Setembro de 2021 (DOU - Imprensa Nacional).
12. Ministry of Electronics and Information Technology (2021). Information Technology rules. https://prsindia.org/files/bills_acts/bills_parliament/2021/Intermediary_Guidelines_and_Digital_Media_Ethics_Code_Rules-2021.pdf.
13. Satariano, A. (2021). YouTube Pulls Videos by Bolsonaro for Spreading Misinformation on the Virus (N. Y. Times).
14. Indian Computer Emergency Response Team (2008). Information and Technology (Amendment) Act. https://eprocure.gov.in/cppp/rulesandprocs/kbadqkdlcswfjdelrquehwuxcfmijmuixngudufgbuubgubfugbububjxcgfvsbdihbgfGhdfgFHytyhRtMTk4NzY=.
15. Indian government must correct moves toward digital authoritarianism, allow tech platforms to uphold rights (2021). Access Now. https://www.accessnow.org/farmer-protests-india-censorship/.
16. Rodin, D. (2004). War and self-defense. Ethics Int. Aff. 18, 63–68. https://doi.org/10.1111/j.1747-7093.2004.tb00451.x.
17. Rodin, D. (2011). Justifying harm. Ethics 122, 74–110. https://doi.org/10.1086/662295.
18. McMahan, J. (2007). The sources and status of just war principles. J. Mil. Ethics 6, 91–106. https://doi.org/10.1080/15027570701381963.
19. Office for National Statistics (2020). Online bullying in England and Wales: year ending March 2020. https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/bulletins/onlinebullyinginenglandandwales/yearendingmarch2020.
20. Sedgwick, R., Epstein, S., Dutta, R., and Ougrin, D. (2019). Social media, internet use and suicide attempts in adolescents. Curr. Opin. Psychiatr. 32, 534–541. https://doi.org/10.1097/YCO.0000000000000547.
21. DeAndrea, D.C. (2015). Testing the proclaimed affordances of online support groups in a nationally representative sample of adults seeking mental health assistance. J. Health Commun. 20, 147–156. https://doi.org/10.1080/10810730.2014.914606.
22. Santana, A.D. (2014). Virtuous or vitriolic: the effect of anonymity on civility in online newspaper reader comment boards. Journal. Pract. 8, 18–33. https://doi.org/10.1080/17512786.2013.813194.
23. Thaler, R.H., and Sunstein, C.R. (2008). Nudge: Improving Decisions about Health, Wealth, and Happiness (Yale University Press).
24. Bovens, L. (2009). The ethics of nudge. In Preference Change, T. Grüne-Yanoff and S.O. Hansson, eds. (Springer Netherlands), pp. 207–219.
25. Nordfält, J., Grewal, D., Roggeveen, A.L., and Hill, K.M. (2014). Insights from in-store marketing experiments. In Review of Marketing Research, D. Grewal, A.L. Roggeveen, and J. Nordfält, eds. (Emerald Group Publishing Limited), pp. 127–146.
26. Kurz, V. (2018). Nudging to reduce meat consumption: immediate and persistent effects of an intervention at a university restaurant. J. Environ. Econ. Manag. 90, 317–341. https://doi.org/10.1016/j.jeem.2018.06.005.
27. Adam, A., Jensen, J.D., Sommer, I., and Hansen, G.L. (2017). Does shelf space management intervention have an effect on calorie turnover at supermarkets? J. Retail. Consum. Serv. 34, 311–318. https://doi.org/10.1016/j.jretconser.2016.07.007.
28. Byerly, H., Balmford, A., Ferraro, P.J., Hammond Wagner, C., Palchak, E., Polasky, S., Ricketts, T.H., Schwartz, A.J., and Fisher, B. (2018). Nudging pro-environmental behavior: evidence and opportunities. Front. Ecol. Environ. 16, 159–168. https://doi.org/10.1002/fee.1777.
29. Soffer, O. (2021). Algorithmic personalization and the two-step flow of communication. Commun. Theor. 31, 297–315. https://doi.org/10.1093/ct/qtz008.
30. Reviglio, U., and Agosti, C. (2020). Thinking outside the black-box: the case for "algorithmic sovereignty" in social media. Soc. Media Soc. 6, 205630512091561. https://doi.org/10.1177/2056305120915613.
31. Yeung, K. (2018). Five fears about mass predictive personalization in an age of surveillance capitalism. Int. Data Priv. Law 8, 258–269. https://doi.org/10.1093/idpl/ipy020.
32. Barnard, N., and Ehrenberg, A.S.C. (1997). Advertising: strongly persuasive or nudging? J. Advert. Res. 37, 21–31.
33. Czoli, C.D., Pauzé, E., and Potvin Kent, M. (2020). Exposure to food and beverage advertising on television among Canadian adolescents, 2011 to 2016. Nutrients 12, 428. https://doi.org/10.3390/nu12020428.
34. Pinto, A., Pauzé, E., Mutata, R., Roy-Gagnon, M.-H., and Potvin Kent, M. (2020). Food and beverage advertising to children and adolescents on television: a baseline study. Int. J. Environ. Res. Public Health 17, 1999. https://doi.org/10.3390/ijerph17061999.
35. Alaniz, M.L. (1998). Alcohol availability and targeted advertising in racial/ethnic minority communities. Alcohol Health Res. World 22, 286–289.
36. Luke, D. (2000). Smoke signs: patterns of tobacco billboard advertising in a metropolitan region. Tob. Control 9, 16–23. https://doi.org/10.1136/tc.9.1.16.
37. Emmery, C., Verhoeven, B., De Pauw, G., Jacobs, G., Van Hee, C., Lefever, E., Desmet, B., Hoste, V., and Daelemans, W. (2021). Current limitations in cyberbullying detection: on evaluation criteria, reproducibility, and data scarcity. Comput. Humanit. 55, 597–633. https://doi.org/10.1007/s10579-020-09509-1.
38. Rosa, H., Pereira, N., Ribeiro, R., Ferreira, P.C., Carvalho, J.P., Oliveira, S., Coheur, L., Paulino, P., Veiga Simão, A., and Trancoso, I. (2019). Automatic cyberbullying detection: a systematic review. Comput. Hum. Behav. 93, 333–345. https://doi.org/10.1016/j.chb.2018.12.021.
39. Publications Office of the European Union (2016). General data protection regulation. http://op.europa.eu/en/publication-detail/-/publication/3e485e15-11bd-11e6-ba9a-01aa75ed71a1.
40. Department for Digital, Culture, Media and Sport (2021). Memorandum from the Department for Digital, Culture, Media and Sport and the Home Office to the Delegated Powers and Regulatory Reform Committee. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/985030/Delegated_Powers_Memorandum_Web_Accessible.pdf.
41. Roskomsvoboda (2021). Social media self-censorship law comes into force. https://roskomsvoboda.org/post/vstupil-v-silu-zakon-o-samotsenzure-sots/.

About the author

Markus Trengove is a senior researcher in AI ethics, law, and policy at HolisticAI. He holds a PhD in political science from University College London.
