
How the Digital Services Act will affect public communication

Published: 10 February 2024 at 07:15
Last updated: 15 February 2024 at 15:22
By Patrice Razet

A few years ago, the introduction of the General Data Protection Regulation (GDPR) significantly changed certain digital communication practices, particularly in public communication. In the summer of 2023, a new piece of EU legislation, the Digital Services Act, came into force. What changes does it have in store for public communication?


By Patrice Razet, former head of digital services for the Maine-et-Loire département and communications consultant at Canévet et associés.

What is the Digital Services Act?

The EU Digital Services Act (DSA) is designed to protect the freedom of expression and consumer rights of European citizens and to strengthen democratic oversight of the internet, for example by curbing the manipulation of information. This protection is to be achieved through tighter regulation of the activities of online platforms.

The Digital Services Act, which came into force at the end of August 2023, currently covers the 17 platforms with more than 45 million European users per month.

The list published by the European Commission includes Meta, Google, Twitter/X, TikTok, Snapchat and Wikipedia, as well as the search engine Bing. These platforms are required to comply with a series of new obligations, on pain of substantial fines of up to 6% of their global annual turnover.

From 17 February 2024, all "online intermediary service providers" will be affected, in other words web hosts, social networks, search engines, travel and accommodation platforms and e-commerce sites. Public communication is therefore not directly covered, although this does not mean that the DSA will not affect it.

5 changes for public communication

There are five ways in which public institutions could be affected by the entry into force of the Digital Services Act.

1. Less targeted online advertising

The first change introduced by the DSA concerns the core business model of internet platforms: online advertising. Unless explicit consent is given, it is now forbidden to target EU citizens with advertising based on so-called "sensitive" data (ethnic origin, political opinions, sexual orientation or health information). This issue was already widely recognised following the entry into force of the GDPR, but it has now been reaffirmed. Targeted advertising aimed at minors is also prohibited. Minors can still be shown adverts, but with fewer targeting options (see the Meta group's notice). This will make it considerably more difficult for local authorities to run campaigns on public health or social issues, or campaigns aimed at young people.
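To make the constraint concrete, here is a minimal TypeScript sketch of a compliance check on a campaign targeting spec. The TargetingSpec fields and the checkDsaTargeting function are hypothetical illustrations, not part of any platform's real advertising API.

```ts
// Hypothetical targeting spec; the field names are illustrative and do not
// correspond to any real advertising API.
type TargetingSpec = {
  ageMin?: number;
  interests?: string[];
};

// Categories the DSA treats as "sensitive" when used for ad profiling.
const SENSITIVE_CATEGORIES = [
  "ethnic origin",
  "political opinions",
  "sexual orientation",
  "health",
];

// Returns the list of DSA problems found in a targeting spec, if any.
function checkDsaTargeting(spec: TargetingSpec): string[] {
  const issues: string[] = [];
  // Targeted advertising aimed at minors is prohibited.
  if (spec.ageMin !== undefined && spec.ageMin < 18) {
    issues.push("targeting reaches minors");
  }
  // Profiling on sensitive data is off limits.
  for (const interest of spec.interests ?? []) {
    if (SENSITIVE_CATEGORIES.some((c) => interest.toLowerCase().includes(c))) {
      issues.push(`sensitive targeting attribute: "${interest}"`);
    }
  }
  return issues;
}

// e.g. a public-health campaign spec that would now be flagged twice
console.log(checkDsaTargeting({ ageMin: 16, interests: ["health conditions"] }));
```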

2. Algorithms that are easier to understand

The DSA now requires platforms to offer alternatives to algorithmic content ranking (the famous "For You" feed adopted by the various networks). We are waiting to see what options the platforms will offer (for example, a chronological feed of content from the accounts a user follows). Beyond this, the DSA also requires platforms to explain how their algorithms work. Meta has already introduced this transparency by publishing a guide on the subject. Careful reading of this guide (and of those its competitors will no doubt publish) would certainly seem worthwhile for the public communication sector, which has an opportunity to understand these rules and use them more effectively to increase its visibility.
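As an illustration of the distinction, here is a minimal sketch contrasting the two orderings, assuming a hypothetical Post type with an engagement score; no real platform's ranking is this simple.

```ts
// Two feed orderings the DSA distinguishes: an engagement-ranked "For You"
// feed versus a profiling-free chronological alternative.
type Post = { author: string; postedAt: Date; engagementScore: number };

// Engagement-ranked: the opaque default most platforms apply.
function forYouFeed(posts: Post[]): Post[] {
  return [...posts].sort((a, b) => b.engagementScore - a.engagementScore);
}

// Chronological: newest first, limited to accounts the user actually follows.
function chronologicalFeed(posts: Post[], following: Set<string>): Post[] {
  return posts
    .filter((p) => following.has(p.author))
    .sort((a, b) => b.postedAt.getTime() - a.postedAt.getTime());
}
```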

3. Tools for monitoring the sector

Platforms must also be transparent about the advertising they display and how it is targeted. Most of them have therefore put an "ad library" online so that users can check who is advertising what. Facebook has already done this for political advertising in recent years; LinkedIn, Google, TikTok and X have recently introduced similar services. This is undoubtedly one of the most useful consequences for public communication: these libraries are inspiring tools for sector intelligence.
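As a sketch of what such monitoring could look like in practice, the query below targets Meta's Ad Library API (the ads_archive Graph API endpoint). The endpoint and parameter names follow Meta's public documentation as best I recall; an access token from a Meta developer account is required, and the details should be verified against the current documentation.

```ts
// Sketch of querying Meta's Ad Library API for political/issue ads.
const ACCESS_TOKEN = "YOUR_TOKEN"; // placeholder: requires Meta developer access

async function searchAds(searchTerms: string, country: string) {
  const params = new URLSearchParams({
    search_terms: searchTerms,
    ad_reached_countries: `["${country}"]`,
    ad_type: "POLITICAL_AND_ISSUE_ADS",
    fields: "page_name,ad_creative_bodies,ad_delivery_start_time",
    access_token: ACCESS_TOKEN,
  });
  const res = await fetch(`https://graph.facebook.com/v19.0/ads_archive?${params}`);
  if (!res.ok) throw new Error(`Ad Library API error: ${res.status}`);
  return (await res.json()).data;
}

// e.g. ads on a public-health theme shown in France
searchAds("santé publique", "FR").then(console.log);
```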

4. Recourse in the event of removal or banning

The DSA sets out to prevent the spread of disinformation and online hate. The platforms concerned must now offer a tool for easily reporting illegal content... and process such requests quickly so that the content can be removed if necessary. Conversely, users must now be informed before content is deleted or before they are banned from a platform: the platform must explain why it has taken the decision, and users will be able to appeal. The local authorities and public bodies whose accounts have been arbitrarily closed in recent years will certainly appreciate the introduction of such a mechanism. Beyond that, this greater attention to content moderation should help make platforms healthier, and make it easier to judge which ones may be worth leaving.
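As a rough sketch, the explanation the DSA requires (its "statement of reasons") boils down to a data structure like the following; the field names are illustrative, not the Commission's official schema.

```ts
// Sketch of the information a DSA statement of reasons must convey to the
// affected user. Field names are illustrative, not an official schema.
type StatementOfReasons = {
  decision: "content_removed" | "visibility_restricted" | "account_suspended";
  ground: string;                // the law or term of service that was breached
  factsAndCircumstances: string; // why the platform reached its conclusion
  automatedMeansUsed: boolean;   // whether detection or decision was automated
  redressOptions: string[];      // internal appeal, out-of-court body, court
};

// e.g. the notice a local authority might receive before a page is suspended
const example: StatementOfReasons = {
  decision: "account_suspended",
  ground: "Community standard on inauthentic behaviour",
  factsAndCircumstances: "Repeated posts flagged by multiple users as spam",
  automatedMeansUsed: true,
  redressOptions: ["internal appeal", "certified out-of-court settlement body"],
};
```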

5. Dark patterns banned

The DSA also tackles "dark patterns": design tricks in the interfaces of internet tools and services that exploit cognitive biases to deliberately influence users' choices. Ultimately, all online digital services will have to avoid "misleading or manipulating" internet users through their design choices. But a consensus is still needed on what precisely counts as a dark pattern. The famous "only one place left" on hotel or cultural booking platforms obviously comes to mind, as does the repeated request for cookie consent until the user finally gives in. Another example is making it significantly more burdensome to cancel a service (e.g. a newsletter) than to subscribe to it. Some local authorities are not exempt from these practices, whether deliberate or not. At a time when user experience design is becoming increasingly strategic, the line between guidance (facilitating the user's journey in the interests of an institution) and manipulation promises long technical and legal debates, in the absence of an exhaustive list of reprehensible interface biases.
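By way of illustration, here is a browser-side sketch of a cookie banner that avoids two of the patterns mentioned above: it gives "Accept" and "Refuse" equal prominence, and it remembers a refusal instead of asking again. The storage key and markup are illustrative, not any specific consent-management product.

```ts
// Consent flow sketch: symmetric choices, no re-prompting after refusal.
const CONSENT_KEY = "analytics-consent";

function showConsentBannerIfNeeded(): void {
  // Respect a previous decision, including "refused": no nagging.
  if (localStorage.getItem(CONSENT_KEY) !== null) return;

  const banner = document.createElement("div");
  for (const choice of ["accepted", "refused"]) {
    const button = document.createElement("button");
    button.textContent = choice === "accepted" ? "Accept" : "Refuse";
    button.className = "consent-button"; // identical styling for both options
    button.onclick = () => {
      localStorage.setItem(CONSENT_KEY, choice); // one click, either way
      banner.remove();
    };
    banner.append(button);
  }
  document.body.append(banner);
}
```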

It would be useful for the public communication sector to take part in this debate, by identifying the practices that contribute to ethical design in the service of the general interest.

Read also:
European Declaration on Digital Rights and Principles for the Digital Decade