Digital Services Act – a safer internet through platform regulation

The text, approved by Parliament today with 530 votes in favour, 78 against and 80 abstentions, will serve as the mandate for negotiations with the French presidency of the Council, which represents the member states.

Christel Schaldemose (S&D, Denmark), who leads Parliament’s negotiating team, said after the vote: “Today’s vote shows that MEPs and EU citizens want ambitious, forward-looking digital rules. In the twenty years since the adoption of the e-Commerce Directive, online platforms have come to play an ever more prominent role in our lives, bringing new opportunities but also new risks. It is our responsibility to ensure that what is illegal offline is also illegal online. We must make sure that digital rules work for the benefit of consumers and citizens, and with this mandate we will be able to fulfil that task.”

Removing illegal content and preventing the spread of disinformation

The Digital Services Act proposal clearly defines the obligations and responsibilities of intermediary service providers, in particular online platforms such as social media and online marketplaces.

The text establishes a notice and action mechanism for illegal products, services or content online, together with appropriate safeguards. Upon receipt of such a notice, hosting service providers must act “without undue delay, taking into account the type of illegal content being notified and the urgency of taking action”. MEPs also added stronger safeguards to ensure that notices are processed in a non-arbitrary and non-discriminatory manner and with respect for fundamental rights, including freedom of expression.

MEPs also want online marketplaces to ensure that consumers can buy safe products online, by strengthening the obligation to trace traders (the “know your business customer” principle).

Additional responsibilities for very large platforms

Very large online platforms will be subject to specific obligations, because they pose particular risks when it comes to the dissemination of both illegal and harmful content. The new rules will help tackle harmful content, including content that is not illegal, and curb the spread of disinformation. This will be achieved through provisions on mandatory risk assessments, risk mitigation measures, independent audits and the transparency of recommender systems (the algorithms that determine what users see).

Other important considerations

Parliament made several additional changes to the Commission’s proposal, including:

  • exempting micro and small enterprises from certain obligations under the regulation;

  • compensation – recipients of digital services and organisations representing them must be able to seek redress for any damage caused by platforms that do not comply with their due diligence obligations;

  • a ban on online platforms using deceptive interface design to steer or manipulate users’ choices;

  • targeted advertising – all service recipients should have more transparent and informed choice, including information about how their data will be monetised; minors should be better protected from direct marketing, profiling and behaviourally targeted advertising for commercial purposes;

  • more choice over algorithm-based ranking – very large online platforms should offer at least one recommender system that is not based on profiling.

Additional amendments approved in plenary address the need for service providers to respect freedom of expression and media freedom and pluralism in their terms and conditions, as well as a new provision on the right to use and pay for digital services anonymously.
