Top of the VLOPs: largest online players in countdown to compliance

Written By

Francine Cunningham

Regulatory and Public Affairs Director
Belgium | Ireland

As Regulatory & Public Affairs Director in Brussels, I assist companies facing an unprecedented wave of new EU regulation that will have an impact on every business operating in the digital and data-related economy. I help companies navigate complex EU decision-making processes and understand the practical application of the law to their sectors.

Online intermediaries designated as Very Large Online Platforms (VLOPs) or Very Large Online Search Engines (VLOSEs) under the Digital Services Act (DSA) have until 25 August 2023 to comply with the Regulation's substantial new obligations. The European Commission announced on 25 April 2023 that it had designated 17 VLOPs and two VLOSEs, each reaching at least 45 million monthly active users in the European Union.

Under the terms of the DSA, these very large players will now face stricter obligations designed to guard against illegal or harmful content on their systems. The designated services include app stores, social media networks and online marketplaces:

Very Large Online Platforms

  • Alibaba AliExpress
  • Amazon Store
  • Apple AppStore
  • Booking.com
  • Facebook
  • Google Play
  • Google Maps
  • Google Shopping
  • Instagram
  • LinkedIn
  • Pinterest
  • Snapchat
  • TikTok
  • Twitter
  • Wikipedia
  • YouTube
  • Zalando

Very Large Online Search Engines

  • Bing
  • Google Search

According to the Commission, its designation decisions were based on the user numbers that platforms were obliged to publish by 17 February 2023. The Commission is also believed to be considering designating an additional four or five platforms in the near future.

As a reminder, the DSA introduces asymmetric due diligence requirements for different types of online intermediaries, according to the size and impact of their services, with the aim of ensuring a safer and more transparent online environment. All platforms, except the smallest (those employing fewer than 50 people and with an annual turnover of less than EUR 10 million), must comply with a core set of obligations. These include providing a complaint and redress mechanism and out-of-court dispute settlement mechanisms, cooperating with “trusted flagger” organisations that have expertise in tackling illegal content online, verifying the credentials of traders on online marketplaces and providing transparency for users with regard to online advertising, among other measures.

Very large platforms (those whose users represent at least 10% of the EU population) face stricter obligations, in line with their pivotal role in facilitating public debate and economic transactions. Significant additional obligations will apply only to these designated “very large” platforms, with the purpose of ensuring that they have robust content moderation tools in place and that their systems are not misused. Designated VLOPs and VLOSEs will have to comply with risk management, external risk auditing and public accountability obligations, provide transparency around their recommender systems and give users choice in how they access information, and share data with authorities and researchers.

With respect to enforcement, the DSA introduces a system of national and EU-level cooperation to oversee how online intermediaries comply with the new rules. Each Member State must appoint a Digital Services Coordinator, an independent authority responsible for supervising the intermediary services established in that Member State, by 17 February 2024. Digital Services Coordinators will be able to impose penalties, including financial fines, as set out in their national laws and in line with the Regulation.

In the case of VLOPs and VLOSEs, the Commission will have direct supervision and enforcement powers and can, for serious violations, impose fines of up to 6% of a service provider's global turnover. Digital Services Coordinators and the Commission will also have the power to require immediate action where necessary to address very serious harms, and platforms may offer commitments on how they will remedy such harms. Authorities may also ask a court to temporarily suspend a digital service in the event of an ongoing refusal to comply with important obligations under the DSA.

To support its enforcement of the DSA, the Commission also recently announced the launch of the European Centre for Algorithmic Transparency (ECAT) in Seville. ECAT will provide the Commission with technical and scientific expertise to ensure that algorithmic systems used by VLOPs and VLOSEs comply with the risk management, mitigation and transparency requirements in the new Regulation.

For more information, please contact Francine Cunningham

