Digital Services Act: the revolution will be televised


The Digital Services Act is upon us and, with its bestie the Digital Markets Act, promises to force powerful changes in the digital ecosystem currently in place in the European Union, and even globally. Power is shifting back to the people with the Digital Services Act, and intermediary service providers had better listen to their complaints about unclear and deceptive terms and conditions of service, their takedown notices for illegal content, products and services, and their concerns about bullying, breaches of free speech, and the unfair targeting of minors, minorities, etc. Otherwise, the European Commission and the national Digital Services Coordinators in the 27 European Union member-states will take swift action to force online platforms, other types of intermediary service providers and search engines to change their ways and comply, with fines of up to 6 percent of worldwide annual turnover. Be warned, the Googles, Apples, Microsofts and X/Twitters of this world: the revolution will be, indeed is, televised.

1. What is the Digital Services Act?

Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act) (‟DSA”) is a regulation from the European Union (‟EU”) that regulates online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores and online travel and accommodation platforms.

The DSA is part of a ‟package” of new EU rules focused on achieving Europe’s digital targets for 2030 and the digital ecosystem ‟Shaping Europe’s digital future”, along with the Digital Markets Act, the recently adopted AI Act, as well as the Data Act and Data Governance Act, which form a single set of rules that apply across the EU, to implement the following two goals:

  • create a safer digital space in which the fundamental rights of all users of digital services are protected by setting clear and proportionate rules, and
  • establish a level playing field to foster innovation, growth and competitiveness, both in the European single market and globally.

More specifically, the DSA creates an EU-wide uniform framework dealing with four issues as follows:

  • the handling of illegal or potentially harmful online content;
  • the liability of online intermediaries for third-party content;
  • the protection of users’ fundamental rights online, and
  • the bridging of information asymmetries between online intermediaries and their users.

2. Who is affected by the Digital Services Act? Providers of online intermediary services

2.1. Intermediary services

The DSA applies to all intermediary services offered to EU users (natural persons and legal entities), irrespective of where the providers of these intermediary services have their place of establishment.

‟Intermediary services” are defined as:

  • a ‟mere conduit” service, consisting of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network (for example, ‟mere conduit” services include generic categories of services, such as internet exchange points, wireless access points, virtual private networks, DNS services and resolvers, top-level domain name registries, registrars, certificate authorities that issue digital certificates, voice over IP and other interpersonal communication services);
  • a ‟caching” service, consisting of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making the information’s onward transmission to other recipients more efficient, upon their request (for example, ‟caching” services include the sole provision of content delivery networks, reverse proxies or content adaptation proxies), and
  • a ‟hosting” service, consisting of the storage of information provided by, and at the request of, a recipient of the service (for example, cloud computing, web hosting, paid referencing services or services enabling sharing information and content online, including file storage and sharing).

Intermediary services may be provided in isolation, as part of another type of intermediary service, or simultaneously with other intermediary services. Whether a specific service constitutes a ‟mere conduit”, ‟caching” or ‟hosting” service depends solely on its technical functionalities, which might evolve in time, and should be assessed on a case-by-case basis.

2.2. Providers of intermediary services

Therefore, all companies providing online intermediary services on the EU single market, whether established in the EU or not, must comply with the DSA. These include:

  • intermediary service providers offering network infrastructure (internet access providers, caching operators);
  • hosting service providers;
  • online platforms (including social media platforms, social networks, app stores, online travel and accommodation websites, content-sharing websites, collaborative economy platforms and marketplaces), and
  • search engines.

In the DSA, companies are subject to obligations which are proportionate to their size, role, impact and audiences in the online ecosystem, in particular:

  • micro-companies and small businesses (with fewer than 50 employees and annual sales of less than 10 million Euros) are exempt from some of the DSA’s obligations, and
  • very large online platforms and very large online search engines, reaching more than 45 million monthly active users in the EU, are subject to additional, stricter obligations.

2.3. Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs)

The European Commission (the ‟Commission”) has begun to designate VLOPs and VLOSEs based on the user numbers which all platforms and search engines, regardless of size (except micro and small enterprises), were required to publish by 17 February 2023. Platforms and search engines must update these figures at least every six months.

Once the Commission designates a platform as a VLOP or search engine as a VLOSE, the designated online service has four months to comply with the DSA. The designation triggers specific rules that tackle the particular risks such large services pose to Europeans and society when it comes to illegal content, and their impact on fundamental rights, public security and wellbeing. For example, the VLOP or VLOSE needs to:

  • establish a point of contact for authorities and users;
  • report criminal offences;
  • have user-friendly terms and conditions, and
  • be transparent as regards advertising, recommender systems or content moderation.

The Commission will revoke its designation decision if the platform or search engine no longer reaches the threshold of 45 million monthly users for one full year.

So who are those VLOPs and VLOSEs, identified by the Commission as early as April 2023? The most notable include, inter alia:

  • Alibaba (Netherlands) B.V. is a VLOP under the DSA, for the designated service AliExpress;
  • Amazon Services Europe S.à.r.l. is a VLOP under the DSA, for the designated service Amazon Store;
  • Apple Distribution International Limited is a VLOP under the DSA, for the designated service App Store;
  • Aylo Freesites Ltd. is a VLOP under the DSA, for the designated service Pornhub;
  • Booking.com B.V. is a VLOP under the DSA, for the designated service Booking.com;
  • Google Ireland Ltd. is a VLOSE under the DSA, for the designated service Google Search, and a VLOP under the DSA, for the designated services Google Play, Google Maps, Google Shopping and YouTube;
  • LinkedIn Ireland Unlimited Company is a VLOP under the DSA, for the designated service LinkedIn;
  • Meta Platforms Ireland Limited (MPIL) is a VLOP under the DSA, for the designated services Facebook and Instagram;
  • Microsoft Ireland Operations Limited is a VLOSE under the DSA, for the designated service Bing;
  • Pinterest Europe Ltd. is a VLOP under the DSA, for the designated service Pinterest;
  • Snap B.V. is a VLOP under the DSA, for the designated service Snapchat;
  • TikTok Technology Limited is a VLOP under the DSA, for the designated service TikTok;
  • Twitter International Unlimited Company is a VLOP under the DSA, for the designated service X;
  • Wikimedia Foundation Inc. is a VLOP under the DSA, for the designated service Wikipedia, and
  • Zalando SE is a VLOP under the DSA, for the designated service Zalando.

On 18 December 2023, the Commission opened formal proceedings to assess whether X may have breached the DSA in areas linked to risk management, content moderation, dark patterns, advertising transparency and data access for researchers. This decision to open proceedings was motivated by the analysis of the risk assessment report submitted by X in September 2023, X’s transparency report published on 3 November 2023, and X’s replies to a formal request for information, which, among others, concerned the dissemination of illegal content in the context of Hamas’ terrorist attacks against Israel.

3. What are the obligations that providers of online intermediary services have, under the DSA?

The DSA establishes a new liability framework for companies in the digital sector, meaning they are now subject to a multitude of obligations.

3.1. Key obligations for all intermediary service providers

Here is a summary of the key obligations imposed on different levels of digital intermediary service providers by the DSA:

  • Governance: all providers at all levels must establish two single points of contact, one for direct communication with supervisory authorities, and the other for the recipients of the services. Providers not established in the EU, but offering services in the EU, will be required to designate a legal representative in the EU. Online platforms will need to have an out-of-court alternative dispute resolution mechanism, publish annual reports on content moderation, including the number of orders received from the authorities and the number of notices received from other parties, for removal and disabling of illegal content or content contrary to their terms and conditions, and the effect given to such orders and notices. VLOPs and VLOSEs must perform systematic risk assessments, share data with regulators and appoint a compliance officer;
  • Obligations for VLOPs and VLOSEs to prevent abuse of their systems, by taking risk-based action. Platforms must mitigate risks such as disinformation or election manipulation, cyber violence against women, or harm to minors online. These measures must be carefully balanced against restrictions on freedom of expression, and are subject to independent audits of their risk management measures;
  • Responsible online marketplaces: online platforms and VLOPs will have to strengthen checks on the information provided by traders and make efforts to prevent illegal content so that consumers can purchase safe products and services;
  • Measures to counter illegal content online, including illegal goods and services: the DSA imposes new mechanisms allowing users to flag illegal content online, and for platforms to cooperate with specialised ‟trusted flaggers” to identify and remove illegal content;
  • New rules to trace sellers on online marketplaces, to help build trust and go after scammers more easily; a new obligation for online marketplaces to randomly check against existing databases whether products or services on their sites are compliant; sustained efforts to enhance the traceability of products through advanced technological solutions;
  • Ban on dark patterns on the interfaces of online platforms, referring to misleading tricks that manipulate users into choices they do not intend to make; providers must not manipulate users (a practice commonly known as ‟nudging”) into using their service, for example by making one choice more prominent than another. Cancelling a subscription to a service must also be as easy as subscribing;
  • Wide-ranging transparency measures for online platforms, including better information on terms and conditions, as well as transparency regarding the algorithms used to recommend content or products to users; also, enhanced transparency for all advertising on online platforms and for influencers’ commercial communications;
  • Bans on targeted advertising on online platforms: targeted advertising to minors or targeted advertising based on special categories of personal data, such as ethnicity, political views or sexual orientation, is prohibited for online platforms and VLOPs;
  • Protection of minors on any platform in the EU: for services aimed at minors, the providers of intermediary services must provide an explanation on the conditions and restrictions of use in a way that is understandable to minors;
  • Recommender systems: VLOPs will be required to offer users a system for recommending content not based on profiling. Transparency requirements for the parameters of recommender systems will be included;
  • ‟Notice and action” procedure: providers of intermediary services must explicitly describe, in their terms and conditions, any restrictions that they may impose on the use of their services, such as their content moderation policies, and must act responsibly in applying and enforcing those restrictions. Users will be empowered to give notice of illegal online content. Online platforms and VLOPs will have to react quickly through a clearer ‟notice and action” procedure. Victims of cyber crime will see the content that they report removed promptly;
  • Protection of fundamental rights: stronger safeguards must be put in place to ensure user notices are processed in a non-arbitrary and non-discriminatory way, and safeguards must protect fundamental rights, such as data protection and freedom of expression;
  • Effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions based on the obligatory information platforms must now provide to users when their content gets removed or restricted; users have new rights, including a right to complain to the platform, seek out-of-court settlements, complain to their national authority in their own language, or seek compensation for breaches of the rules. Now, representative organisations are able to defend user rights for large scale breaches of the law;
  • Accountability: EU member-states and the Commission will be able to access the algorithms of VLOPs and VLOSEs;
  • Access to key platforms’ data for researchers, in order to scrutinise how platforms work and how online risks evolve;
  • A new crisis response mechanism in case of a serious threat to public health or security, such as a pandemic or a war;
  • A unique oversight structure: the Commission is the primary regulator of VLOPs and VLOSEs, while other intermediary service providers are under the supervision of the member-states where they are established. Indeed, national Digital Services Coordinators (‟DSCs”), designated by each one of the 27 EU member-states, are responsible for supervising, enforcing and monitoring the DSA in that country. In France, the ‟Autorité de régulation de la communication audiovisuelle et numérique” (‟Arcom”) is the DSC. The Commission has enforcement powers similar to those it has under antitrust proceedings. An EU-wide cooperation mechanism is currently being established between the national regulators, the DSCs, and the Commission.

While the DSA does not define what illegal content online is, it sets out EU-wide rules that cover detection, flagging and removal of illegal content, as well as a new risk assessment framework for VLOPs and VLOSEs on how illegal content spreads on their services.

What constitutes illegal content, though, is defined in other laws, either at EU level or at national level – for example, terrorist content, child sexual abuse material and illegal hate speech are defined at EU level. Where content is illegal only in a given EU member-state, as a general rule it should only be removed in the territory where it is illegal.

The DSA stipulates that breaches must be subject to proportionate and dissuasive penalties, determined by each member-state. Intermediary service providers can be fined up to 6 percent of annual worldwide turnover for breaching the DSA and up to 1 percent of worldwide turnover for providing incorrect or misleading information.

3.2. Key obligations specific to VLOPs and VLOSEs

Once designated as such, VLOPs and VLOSEs must follow additional rules that apply only to them, due to the potential impact they can have on society. This means that they must identify, analyse and assess systemic risks that are linked to their services. They should look, in particular, at risks related to:

  • illegal content;
  • fundamental rights, such as freedom of expression, media freedom and pluralism, discrimination, consumer protection and children’s rights;
  • public security and electoral processes, and
  • gender-based violence, public health, protection of minors and mental and physical wellbeing.

Once the risks are identified and reported to the Commission for oversight, VLOPs and VLOSEs are obliged to put measures in place that mitigate these risks. This could mean adapting the design or functioning of their services or changing their recommender systems. This could also consist of reinforcing the platform internally with more resources to better identify systemic risks.

Those designated as VLOPs and VLOSEs also have to:

  • establish an internal compliance function that ensures that the risks identified are mitigated;
  • be audited by an independent auditor at least once a year and adopt measures that respond to the auditors’ recommendations;
  • share their data with the Commission and national authorities so that they can monitor and assess compliance with the DSA;
  • allow vetted researchers to access platform data when the research contributes to the detection, identification and understanding of systemic risks in the EU;
  • provide an option in their recommender systems that is not based on user profiling, and
  • have a publicly available repository of advertisements.

To conclude, the DSA is a first-of-its-kind regulatory toolbox globally, and sets an international benchmark for a regulatory approach to online intermediaries. Designed as a single, uniform set of rules for the EU, it will give users new protections and businesses legal certainty across the whole single market. Moreover, the DSA will complement the distance selling regulations and EU consumer contract legislation well, empowering consumers and businesses to do more business and deals online. While we are super glad to be Europeans and therefore to benefit from these wonderful protections, we highly recommend that providers of intermediary services take the DSA very seriously, and work their socks off to become compliant with it immediately, even where their online platforms, such as Easyjet, have not yet been designated as VLOPs by the Commission.

Crefovi’s live webinar: Digital Services Act – the revolution will be televised – 29 March 2024

Crefovi regularly updates its social media channels, such as LinkedIn, Twitter, Instagram, YouTube and Facebook. Check our latest news there!
