European Legislative Initiative for Very Large Communication Platforms

Jan Christopher Kalbhenn

Digital Services Act and Digital Markets Act

Background
In December 2020, the European Commission presented the European Democracy Action Plan. For example, Germany, France and Austria already have or are planning initial laws to combat hate speech on social networks.5 Germany has also already enacted the first media law regulations for communication platforms.6
A similar picture emerges in competition law. In recent years, the
European Commission has increasingly conducted proceedings against
the major platform companies and has regularly found abuse of market
power.7 National antitrust authorities in the Member States have also
made high-profile decisions in this area, such as the German Federal Cartel
Office prohibiting Facebook from combining user data from its Facebook,
WhatsApp and Instagram services.8

The Digital Services Act (DSA) has two main purposes. On the one hand,
creation of uniform rules for all Member States is intended to promote
the – digital – single market.9 Another objective is to ensure protection of
EU citizens' fundamental rights on the internet.10 This primarily involves
protection of freedom of expression, protection of the personal rights of
those affected by hate speech, and protection of freedom of information.
The Digital Markets Act (DMA) is also intended to impose harmonised rules on core platform services throughout Europe by way of a regulation, thus ensuring competition and fair digital markets throughout the Union in which gatekeepers operate.

The size of online platforms also plays a role. Small platforms are excluded from the scope of specific obligations and are spared in favour of innovativeness.13 Very large online platforms, on the other hand, are subject to significant obligations. These are online platforms that have at least 45 million average monthly active users in the EU.14 Very large online platforms include Facebook, Twitter, YouTube, Twitch, Instagram, and TikTok.

The Digital Markets Act imposes further binding obligations on these digital companies. It focuses on ‘core platform services’, a series of services that are listed exhaustively. They include online brokerage services such as AirBnB, online search engines such as Google Search, social networks such as Instagram and TikTok, video sharing platform services such as YouTube, and messenger services such as WhatsApp, among others.

The new ABC of European platform regulation

Content moderation

Downgrading, blocking access or removal are given as examples. Also included are measures that restrict the ability of users to provide information, as well as the closure or temporary suspension of a user account for content moderation purposes. This definition is very broad: the Digital Services Act thus covers all means available to platforms to manage content.

Illegal content

Illegal content is a special category of content to which the Digital Services Act attaches certain legal consequences.

Advertising

Recommendation systems

General terms and conditions

These are any terms, conditions or specifications, regardless of their name or form, that govern the contractual relationship between the provider of intermediary services and users (Art. 2 lit. q DSA).

Online platforms must also clearly state in their terms and conditions
how they handle account suspensions.

Very large online platforms must therefore present the most important parameters of recommendation systems in an accessible and easily understandable way in their general terms and conditions. All options with which the most important parameters can be changed or influenced are to be pointed out. User autonomy is to be strengthened by providing at least one profiling-free (as defined by the GDPR) option.32 The Digital Services Act makes a design specification in the event that several such options are provided. In that case, the design of the user interface must provide an ‘easily accessible function’ for the user to select the recommendation system.
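Translated into software terms, such a requirement could take the form of a user-controllable switch between ranking strategies, one of which uses no personal data at all. The following Python sketch is purely illustrative (all names and the scoring logic are invented; the DSA prescribes no particular implementation):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int       # e.g. seconds since epoch
    engagement: float    # platform-internal popularity score

def ranked_feed(posts, mode, interest_scores=None):
    """Return posts ordered according to the selected recommendation option.

    'chronological' uses no personal data (a profiling-free option);
    'personalized' weights posts by per-author interest scores derived
    from the user's behaviour (profiling in the GDPR sense).
    """
    if mode == "chronological":
        # Profiling-free option: newest first, no user data involved.
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)
    if mode == "personalized":
        scores = interest_scores or {}
        return sorted(
            posts,
            key=lambda p: scores.get(p.author, 0.0) * p.engagement,
            reverse=True,
        )
    raise ValueError(f"unknown recommendation mode: {mode}")

posts = [
    Post("alice", timestamp=100, engagement=5.0),
    Post("bob", timestamp=200, engagement=1.0),
]
# The same feed, ranked under each of the two user-selectable options.
print([p.author for p in ranked_feed(posts, "chronological")])
print([p.author for p in ranked_feed(posts, "personalized", {"alice": 2.0})])
```

The ‘easily accessible function’ the DSA demands would then simply expose the `mode` parameter in the user interface.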

Introducing the Systems Approach and the Statutory Duty of Care
Lorna Woods

A Traditional Approach to Liability for Content

Policy Developments in the USA to Address Platform Information Disorders*
Sarah Hartmann

Introduction and Overview

Online platforms are intrinsically linked to information disorders, acting as a petri dish that allows extreme content, conspiracy theories and false information to multiply.

Information disorders include many buzzword phenomena such as “fake news” and “hate speech”, but are not limited to these vague terms.

Examples range from widespread misinformation about COVID-19, such as the alleged inefficacy of wearing face masks,7 to allegations of election fraud culminating in the unprecedented Capitol riots of January 6th, 2021.8 Discussions of the fallout inevitably zeroed in on the role of online platforms9 and future preventive measures, with the US Congress holding a hearing10 on the role of social media platforms in promoting misinformation and extremist content in late March 2021.

Lack of Reliable Sources – Measures against the Decline of Local News



The Local Journalism Sustainability Act,37 proposed in July 2020 by
Representative Kirkpatrick, chooses a different approach, not relying on
antitrust law but rather creating tax incentives in order to support local
media.

Lack of Platform Accountability – Draft Laws to Shrink Section 230 Immunity

Section 230(c) in its current form prevents platforms, as “providers of interactive computer services”, from being treated as the publisher or speaker of information provided by another information content provider. Furthermore, the Good Samaritan clause in Section 230(c)(2) excludes civil liability for removal or restriction of content in “good faith”.

Critics from opposing ends of the political spectrum focus on different aspects, for example alleging left-leaning bias in content moderation52 and “censorship” by platforms of political opinions,53 or suggesting a systemic failure to sufficiently protect vulnerable groups and prevent crime.

Amplification, Recommendation or Monetization of Content

The idea of “earned” immunity has been discussed by Citron and Wittes, for example, on the condition of reasonable moderation practices,73 and recommended in the Stigler report in the form of a “quid pro quo” for fulfilment of obligations mainly relating to transparency.74 In the context of the recent congressional hearing, Mark Zuckerberg of Facebook expressed support for a similar system of conditional immunity, requiring compliance with best practice standards of content moderation and systems to identify and remove harmful content.75

Lack of Competition – Introducing Portability and Interoperability

Platforms would
be obligated to implement a system for the transfer of user data in a
structured, commonly used and machine-readable format to other communication
providers at the discretion of the user.
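In engineering terms, “structured, commonly used and machine-readable” usually means a documented schema in a format such as JSON or CSV. The following Python sketch shows what such an export could look like; the schema and field names are invented for illustration, as the proposal prescribes no particular format:

```python
import json

def export_user_data(user):
    """Serialize a user's account data into a structured, machine-readable
    JSON document that another communication provider could import."""
    payload = {
        "schema_version": "1.0",
        "profile": {
            "handle": user["handle"],
            "display_name": user["display_name"],
        },
        "contacts": sorted(user["contacts"]),  # deterministic ordering
        "messages": [
            {"to": m["to"], "sent_at": m["sent_at"], "body": m["body"]}
            for m in user["messages"]
        ],
    }
    return json.dumps(payload, indent=2)

user = {
    "handle": "jdoe",
    "display_name": "Jane Doe",
    "contacts": ["bob", "alice"],
    "messages": [
        {"to": "alice", "sent_at": "2021-03-01T12:00:00Z", "body": "hi"},
    ],
}
archive = export_user_data(user)
print(archive)
```

The legal obligation would attach to the existence and completeness of such an export interface, not to the particular serialization chosen.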

The most prominent topic of recent policy debate in connection with platforms has been the reform of Section 230 immunity. To a certain extent, Section 230 has become a symbol of many things regarded as “wrong” with the current framework for online platforms.

Interoperability of Messenger Services. Possibilities for a Consumer-Friendly Approach
Jörg Becker, Bernd Holznagel, Kilian Müller
Introduction

While in 2016 around 67% of all users aged 14 and above used messenger services,2 by 2018 this figure had increased to almost 90%.

Messenger services

Bundesnetzagentur

classifies
messenger services and other digital platforms in the category of "over-thetop"
(OTT) services

enable communication and other services via


the Internet.5

As stated above, according to Wegner, there are basically two ways to ensure interoperability between messenger services: standardization and the creation of interfaces.8 The creation of federated systems uses an interface to mediate between different protocols or domains on the basis of a common standard. A federation can thus be seen as a combination of both approaches.

Standardization of communication protocols is another option for enabling interoperability between messenger services. Many functionalities of the messenger services depend on the protocols used, so it must be ensured that a protocol is selected or created that supports all the functionalities that are to be interoperable. For this purpose, each messenger service must migrate its implementation to the new protocol or support multiple protocols.
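Supporting multiple protocols corresponds to the classic adapter pattern: each service translates between its native wire format and a common message model. A hypothetical Python sketch (the protocol names and wire formats are invented for illustration):

```python
import json
from abc import ABC, abstractmethod

class Message:
    """Common message model shared by all interoperable services."""
    def __init__(self, sender, recipient, text):
        self.sender, self.recipient, self.text = sender, recipient, text

class ProtocolAdapter(ABC):
    """Translates between a service's native wire format and the common
    message model, so services with different protocols can interoperate."""
    @abstractmethod
    def encode(self, msg: Message) -> str: ...
    @abstractmethod
    def decode(self, wire: str) -> Message: ...

class ColonProtocol(ProtocolAdapter):
    # Hypothetical service A: plain-text "sender:recipient:text"
    def encode(self, msg):
        return f"{msg.sender}:{msg.recipient}:{msg.text}"
    def decode(self, wire):
        sender, recipient, text = wire.split(":", 2)
        return Message(sender, recipient, text)

class JsonProtocol(ProtocolAdapter):
    # Hypothetical service B: JSON wire format
    def encode(self, msg):
        return json.dumps({"from": msg.sender, "to": msg.recipient,
                           "text": msg.text})
    def decode(self, wire):
        d = json.loads(wire)
        return Message(d["from"], d["to"], d["text"])

def bridge(wire, source: ProtocolAdapter, target: ProtocolAdapter) -> str:
    """Re-encode a message from one service's protocol into another's."""
    return target.encode(source.decode(wire))

print(bridge("alice:bob:hello", ColonProtocol(), JsonProtocol()))
```

The practical difficulty the authors point to is visible even here: any feature not representable in the common `Message` model (read receipts, reactions, end-to-end encryption) is silently lost at the bridge.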

As already mentioned, Facebook has market shares of over 90%. If “Facebook-external” users, i.e. users without a Facebook messenger service, want to communicate with users of a Facebook messenger service such as WhatsApp, they are currently forced to adopt a Facebook messenger service themselves.


Regulatory approach

Six Problems with Facebook’s Oversight Board. Not enough contract law, too much human rights.
Mårten Schultz

Facebook is, arguably, the most important catalyst for freedom of expression in human history. When Facebook set up an independent institution and gave it the power to overrule its decisions and build its own “case law”, it also established the most influential arbitrator of expression in human history. That alone is cause for concern. There are other reasons to be concerned as well. This article puts forward six problems with the OB as it has developed in its still early stage.

The Oversight Board Charter (“the Charter”) is the foundational steering document for the OB.7 The Charter makes clear that a case can be submitted to the OB either by a user or by Facebook itself (which is how the decision to remove President Donald Trump from the platform came before the Board). It is up to the Board to decide which cases it should take on, but the Charter states that it should prioritize cases “that have the greatest potential to guide future decisions and policies.”8


In the Charter Facebook commits “to the board’s independent oversight on content decisions and the implementation of those decisions.”11 The Board not only has the power to overrule Facebook decisions regarding content on the platform; it can also make advisory statements on Facebook/Instagram policy.12 Facebook can choose whether to follow these recommendations or not.

Six problems
The narrative

A simple description of what characterizes a court in a modern Rechtsstaat illustrates why the label is misleading even as a metaphor. A court is, at minimum, an institution within a national state that exercises public authority. The OB is nothing of the sort. Its scope is narrow (content moderation decisions by Facebook), its authority is narrow (it can decide either that Facebook needs to put back content it has removed or that its decision stands), and it lacks the possibility to exercise any public power.

The most contested issue in this context is whether private entities (companies and persons) could be held responsible under human rights rules, an issue discussed under the heading of “horizontal human rights” or “direkte Drittwirkung”.22 It has also been a hot topic in international law.23 However, to my knowledge, there are no examples in any jurisdiction of direct application of a general human rights catalogue as a basis for duties of private companies. There are examples of constitutions that apply human rights law to (humans and) companies, but only in a limited sense.24

The bias

“Freedom of expression is a fundamental human right. Facebook seeks to give people a voice so we can connect, share ideas and experiences, and understand each other. Free expression is paramount, but there are times when speech can be at odds with authenticity, safety, privacy, and dignity. Some expressions endanger other people’s ability to express themselves freely. Therefore, it must be balanced against these considerations.”

The OB thus has protection of freedom of expression as its primary goal.31 This is an unfortunate formulation. The Board here uses the term in the way Facebook’s critics have often used it when the company is accused of “censorship”. Removal of content by Facebook restricts the possibility to reach other people, but it is not a restriction of freedom of speech. It may be a breach of contract, if Facebook has failed to follow the terms of the agreement, but it is not censorship.

When the Board taps into the language of freedom of speech and thereafter, in its first batch of decisions, overrides most of Facebook’s content moderation decisions (none of which were clearly in conflict with the terms of service), it sends a signal: “When in doubt: restore”. Most of all, however, it is unfortunate because it is questionable to assign freedom of speech – or any fundamental (negative) human right or freedom – a general priority.

The rules

The relationship between Facebook and its users is contractual. When conflicts arise between two parties to a contract, the first question is: “What does the contract say?” When a decision maker, for instance a judge, settles a contractual dispute, the starting point of the analysis is always the set of rules that forms the contract.

The OB has taken another path. Already in the first decisions it became clear that the Board uses three sets of norms in its handling of cases: Facebook’s Community Standards, Facebook’s values, and international human rights law. The OB uses the formulation “Relevant Human Rights Standards considered by the Board”. More specifically, the Board refers to “The UN Guiding Principles on Business and Human Rights (UNGPs)”, which were endorsed by the UN Human Rights Council in 2011. These principles establish “a voluntary framework for the human rights responsibilities of private businesses”.36

When someone sets up an account with Facebook, a contract is formed. The contract includes the different terms that the parties agree upon. These terms include the Community Standards and also Facebook’s values, but no reference to the UNGPs. When a dispute between Facebook and a user is resolved under principles of human rights law, it means not only that Facebook’s actions are tested against a normative framework it has not accepted, but also that the decision maker overrides the rules that both parties had agreed upon. The inclusion of human rights principles in the OB’s set of rules thus amounts, in a way, to a disregard of the will of both Facebook and its users as expressed through the contract.

A second problem with the norm sets the OB has chosen is unpredictability.

The process

“I think in any kind of good-functioning democratic system, there needs to be a way to appeal.”37 This statement comes from Mark Zuckerberg, in one of the earlier interviews in which he talked about the need for independent judicial review. Zuckerberg later wrote, in an open letter in connection with publication of the Charter: “If someone disagrees with a decision we’ve made, they can appeal to us first, and soon they will be able to further appeal to this independent board.”38

“The Oversight Board will select cases for review that raise important issues pertaining to respect for freedom of expression and other human rights and/or the implementation of Facebook’s Community Standards and Values. These cases will be of critical importance to public discourse, directly or indirectly affect a substantial number of individuals, and/or raise questions about Facebook’s policies. These cases will reflect the user base of Facebook and ensure regional and linguistic diversity.”42 The practicalities of the selection process are regulated in the Bylaws.43 In other words, the OB will not provide every user with a fair and equal chance to get the Board to review their case.

The decisions

“After concluding deliberations, a board panel will draft a written decision, which will include: a determination on the content; the rationale for reaching that decision; and, if desired, a policy advisory statement. The decision will also include any concurring or dissenting viewpoints, if the panel cannot reach consensus.”

In the Oversight Board decision on whether Facebook was right to restrict then President Donald Trump from posting on the platform – the Board found that Facebook’s decision was not in itself wrong but that the sanction, indefinite suspension, was not supported by the company’s rules – it was mentioned that a minority had a different opinion on some issues, albeit not on the main issue of whether it was within Facebook’s rights to suspend the president.4

Whether dissenting opinions are a good thing or not has long been widely discussed in legal circles, but it is a fact that a dissent can provide important contributions to a discussion of how to weigh different interests against each other. Particularly good examples of this can be found in the area of freedom of speech. Oliver Wendell Holmes’ dissent in Abrams v. United States sparked a debate that changed and broadened freedom of speech discourse in the USA.

The power shift

There has always been a way to “appeal” Facebook decisions: the national courts. In practice, however, it is often difficult and risky to bring a company such as Facebook to court. Moreover, it is not always clear what it would mean to win a case regarding wrongful moderation of content.48 Even if one believes the company has made the wrong decision, it will not be worth the trouble or cost to take Facebook to court. Not even Donald Trump has thought it worth the effort.

The members
of the OB are becoming the most powerful people in deciding the limits of
speech in human history. This concentration of power is in itself worrying.

The Community Standards were continuously changed. Before changing the rules, Facebook would seek input from people and organizations around the world.55 At the end of the day, it was Facebook that decided what kind of rules it wanted, and it was the users’ decision whether to stay on the platform or to leave.

“Open with Caution”. How Taiwan Approaches Platform Governance in the Global Market and Geopolitics
Kuo-Wei Wu, Shun-Ling Chen, Poren Chiang1

Why Taiwan’s approaches differ (and must differ) from the EU’s


GAFAM is only part of the problem

Of the top messaging apps, the reach rate of Facebook Messenger (26.8%) is only slightly higher than that of WeChat (21.4%). LINE, the most popular messaging app, is ultimately a Japanese company.


Digital Platform Regulation in Japan – does the soft approach work?
Izumi Aizu

Introduction: Three areas and two approaches to platform regulation

In the debate on digital platform regulation in Japan, there are three major policy areas:
1) Social and political issues including hate speech, harmful and illegal content, and fake news;
2) Economic concerns including protection of domestic small and medium-sized businesses (SMEs) against Big Tech;
3) Consumer protection including protection of privacy and personal data.
When it comes to regulatory frameworks, two different approaches can be observed:
a) Hard approach – use of an existing legal framework or establishment of new legislation with strong enforcement;
b) Soft approach – reliance on voluntary activities of citizens, local autonomy, and industry self-regulation.

Hate speech regulation in Japan

The most visible cases are the direct threats made in the street to Korean students in Japan who wear the traditional Korean outfit called chima jeogori on their way to Korean schools. They are met with abuse such as “Go home!” or “We will kill you” while commuting on trains. But that is just the tip of the iceberg: Korean residents frequently encounter other hostile acts by some Japanese citizens.
