

Platforms must ‘enhance electoral integrity’ under DSA
European Parliament in Brussels during the night of the EU elections on 26 May 2019. Pic: Shutterstock

03 Apr 2024


The European Commission has published guidelines to mitigate systemic risks online that could affect electoral integrity, ahead of the upcoming European Parliament elections in June.

The measures include a crackdown on foreign information-manipulation and interference in elections.

Protecting electoral integrity is a priority for the Digital Services Act (DSA), to prevent negative effects on democratic processes and civic discourse.

Compliance

Dialogue has been underway with large platforms since late August last year to monitor effective compliance with the DSA.

The guidelines recommend mitigation measures and best practice for Very Large Online Platforms (VLOPs) and search engines before, during, and after elections.

Under the DSA, designated services with more than 45 million active EU users must mitigate risks related to electoral processes, while safeguarding fundamental rights – including the right to freedom of expression.

Very Large Online Platforms and search engines must:

  • Reinforce internal processes to improve mitigation measures,
  • Implement election-specific risk mitigation tailored to the local context. Official information and media-literacy initiatives should be promoted, and recommender systems should empower users while reducing the monetisation and virality of content that threatens electoral integrity. Political advertising should be clearly labelled as such, in anticipation of the new regulation on the transparency and targeting of political advertising,
  • Adopt specific mitigation measures for generative AI by clearly labelling AI-generated content, such as deepfakes, adapting terms and conditions accordingly, and enforcing them adequately,
  • Co-operate with EU and national authorities on Foreign Information Manipulation and Interference (FIMI), disinformation, and cyber-security,
  • Adopt an incident-response mechanism for incidents that could affect the election outcome or turnout,
  • Assess the effectiveness of these measures through published post-election reviews, and invite public feedback on risk-mitigation measures.

The guidelines include specific measures ahead of the upcoming European elections.

VLOPs and search engines have been told to ensure that sufficient resources and proportionate risk-mitigation measures are in place.

The guidelines also encourage close cooperation with the European Digital Media Observatory (EDMO) Task Force on the 2024 European elections.

The guidelines consider the input received from the public consultation launched by the Commission on 8 February.

The European Commission also worked with Digital Services Coordinators on the guidelines, which encourage third-party scrutiny and research into mitigation measures.

Next steps

If platforms’ actions do not measure up to best practice, the European Commission can request further information or start formal proceedings under the Digital Services Act.  

The Commission also plans a stress test at the end of April on the most effective use of these instruments and co-operative mechanisms.

Gazette Desk
Gazette.ie is the daily legal news site of the Law Society of Ireland

Copyright © 2024 Law Society Gazette. The Law Society is not responsible for the content of external sites – see our Privacy Policy.