As Regulatory & Public Affairs Director in Brussels, I assist companies facing an unprecedented wave of new EU regulation that will affect every business operating in the digital and data economy. I help companies navigate complex EU decision-making processes and understand how the new rules will apply in practice to their sectors.
New binding rules to tackle illegal content online, stricter rules for targeted advertising and an EU body with the ability to impose fines are just some of the demands of Members of the European Parliament for the forthcoming Digital Services Act. The proposed legislation, which is due before the end of the year, will usher in new regulation for online platforms and digital marketplaces.
Three parliamentary committees have now adopted their reports on the Digital Services Act. The full Parliament is scheduled to vote on the reports during the plenary session of 19-22 October. The reports from the Legal Affairs Committee and the Internal Market Committee, both adopted on 28 September, take the form of legislative initiative reports, meaning that MEPs are asking the Commission to submit a proposal. Meanwhile, on 22 September, the Civil Liberties Committee adopted an initiative report which will serve as the Parliament's opinion on specific fundamental rights issues in the forthcoming Digital Services Act.
Key requests of Legal Affairs Committee
In his report for the Legal Affairs Committee, entitled "Digital Services Act: adapting commercial and civil law rules for commercial entities operating online", rapporteur Tiemo Wölken (S&D, DE) called on the Commission to put forward new legislation on online content management. Specifically, the report asks the Commission to come forward with legislative proposals covering the following:
Illegal and harmful content: the Digital Services Act should make a clear distinction between illegal and harmful content. Future rules on content management should also address the spread of harmful content that is not illegal, such as fake news or disinformation;
Updated "notice and action" mechanism: EU-wide standards should be introduced regarding the way in which hosting platforms should moderate content and updated “notice and action” procedures should be introduced to protect users from illegal or harmful content;
Notification and redress: when content is flagged or taken down, users should be notified and able to seek redress through a national dispute settlement body. Final decisions should be taken by an independent judiciary and not by private undertakings;
No upload filters: platforms should not employ upload filters or any form of ex-ante content control for harmful or illegal content;
Targeted ads: targeted advertising must be regulated more strictly in favour of less intrusive, contextualised forms of advertising that require less data and do not depend on previous user interaction with content;
Transparency: the DSA should include rules on the terms under which data may be accumulated for advertising purposes and provide for the right to use digital services anonymously whenever possible;
Monitoring and compliance: the Commission should examine options for a European body to monitor the compliance of online platforms with the new rules and impose fines, such as an EU entity or a coordinated network of national authorities.
Key requests of Internal Market Committee
Meanwhile, in a report entitled "Digital Services Act - Improving the functioning of the Single Market" by Alex Agius Saliba (S&D, MT), the Internal Market Committee outlined its ambition for the EU to become a standard setter for other parts of the world when it comes to digital regulation. According to the compromise amendments adopted, the guiding principle for the DSA should be "what is illegal offline is also illegal online", along with a firm approach to user safety and consumer protection.
Specific demands outlined in the compromise amendments adopted by the Internal Market Committee for the DSA include:
Counterfeit and unsafe products: platforms and online intermediation services should improve their efforts to detect and take down false claims and to tackle rogue traders, e.g. those selling fake medical equipment or dangerous products online, as seen during the COVID-19 pandemic;
Liability: an effective and legally enforceable notice-and-action mechanism must be set up so that users can notify online intermediaries about potentially illegal online content or activities. The new rules should preserve the underlying legal principle that passive online intermediaries should not be held directly liable for the actions of their users;
Hate speech: harmful content, hate speech and disinformation should be addressed through enhanced transparency obligations and by improving the media and digital literacy of consumers;
Know your business customer: a "Know your business customer" principle should require platforms to identify and stop fraudulent companies from using their services to sell illegal and unsafe products and content;
AI-driven services: consumers should have a right to be informed if a service is enabled by AI or makes use of automated decision-making, machine learning or automated content recognition tools. They should also have the right to redress and be able to opt out;
Specific ex-ante rules for "gatekeepers": the DSA package should include a separate proposal for an internal market instrument imposing ex-ante obligations on large platforms that act as "gatekeepers" for market access. The aim would be to prevent, rather than merely remedy, market failures caused by such "systemic operators" and to open markets to new entrants, including start-ups and SMEs.
Recommendations of Civil Liberties Committee
Additionally, on 22 September the Parliament's Civil Liberties Committee adopted an initiative report by Kris Peeters (EPP, BE) on the "Digital Services Act and fundamental rights issues posed". The MEPs in the Civil Liberties Committee recommended retaining the limited liability regime for intermediaries established by the e-Commerce Directive, which dates from 2000, maintaining the ban on imposing general monitoring obligations on online platforms, and keeping the country of origin principle.
Furthermore, the report warns against obliging platforms to use automated content filtering tools and recommends measures allowing consumers to opt out of content curation.
Next steps
The vote to adopt the above-mentioned reports is due to take place during the 19-22 October plenary session. These reports will then be sent to the Commission to feed into its proposal for a Digital Services Act package, currently expected to be published on 2 December 2020.