Digital Services Act: New rules and responsibilities for online platforms

Today, the Digital Services Act (DSA) comes into force and with it a new set of EU rules for a safer and more accountable online environment. The DSA sets out rules for digital services that connect consumers to goods, services, or content, such as online marketplaces, social networks, and other platforms, with the aim of limiting the spread of illegal content and illegal products online, increasing the protection of minors, and providing users with more choice and better information.

The DSA is part of a comprehensive framework to ensure a safer digital space, covering online services from simple websites to internet infrastructure services and online platforms. It follows the Digital Markets Act (DMA), which came into force on 1 November 2022.

As a regulatory toolbox, the DSA applies specifically to the following categories of services:

  • Online intermediaries and hosting services (cloud and web hosting) offering network infrastructure (e.g. internet access providers, domain name registrars) will have to follow new rules on how they design their services and procedures, and comply with wide-ranging new transparency obligations to increase accountability and oversight, for example through new flagging mechanisms for illegal content.
  • Very large platforms and search engines with more than 45 million users (i.e. more than 10 percent of the 450 million EU consumers) are subject to further obligations, including comprehensive annual assessments of the risks of online harms on their services and of how those risks affect fundamental rights, especially in light of exposure to illegal goods or content or the dissemination of disinformation.
  • Online platforms (e.g. online marketplaces, app stores, sharing-economy platforms, and social media) will have to introduce new protections for freedom of expression to limit arbitrary content moderation decisions, and offer new ways for users to take informed action against a platform when their content is moderated. For example, users of online platforms will now have multiple means of challenging content moderation decisions, including when these decisions are based on a platform's terms and conditions. Users can complain directly to the platform, choose an out-of-court dispute settlement body, or seek redress before the courts.
  • New rules also require platforms' terms to be presented in a clear and concise manner and to respect users' fundamental rights.

Margrethe Vestager, Executive Vice-President for a Europe Fit for the Digital Age, noted: "With the Digital Services Act, we now have clear legislation. Online platforms are at the core of some of the key aspects of our daily lives, democracies, and economies. It’s only logic that we ensure that these platforms live up to their responsibilities in terms of reducing the amount of illegal content online and mitigating other online harms, as well as protecting the fundamental rights and safety of users."

Source: EC
