The Digital Services Act is in force – how does it affect medium-sized and smaller platforms?

Since 17 February 2024, the Digital Services Act (“DSA”) has also applied to medium-sized and smaller online platforms. By introducing the DSA, the European Commission (“EC”) aims to better protect users’ fundamental rights. To this end, the DSA imposes new obligations with respect to the design and use of advertising, reviews and rankings on online platforms (e.g. dark patterns). This blog centres on the question of what these obligations mean for your platform.

The DSA aims to create a safe, predictable and reliable online environment. Unlike the Digital Markets Act (“DMA”) and the Platform-to-Business Regulation (“P2B Regulation”), the DSA serves and safeguards the interests of consumers/users.

Practice shows that the Netherlands Authority for Consumers and Markets ("ACM") is increasingly tightening its supervision of online platforms. Addressing dark patterns in particular is high on its agenda (see also this blog). Providers of digital services risk high fines if they fail to comply (in a timely manner) with the obligations under the DSA. All the more reason for your company to check whether your online platform complies with these obligations.

To whom does the DSA apply?

The DSA has applied to the 19 largest platforms (such as Google, Amazon and Facebook) since August 2023; as of 17 February 2024 it also applies to medium-sized and smaller platforms. These are platforms with fewer than 45 million monthly active users in the EU (such as Marktplaats, Bol.com, Vinted, eBay, Beslist.nl, Autotrack and Thuisbezorgd).

By its nature, the DSA has a very broad scope. In principle, all providers of online platforms whose users (or consumers) are located in the EU fall within the scope of the DSA. It is not a requirement that the online platform itself be located in the EU.

Relationship between the DSA and international regulations

The DSA is part of a package of regulations regarding digital services, which also includes the DMA and the P2B Regulation. Under both the P2B Regulation and the DSA, online platforms are obligated to disclose in their general terms and conditions the method of profiling and the parameters that they use for this purpose (see also this blog). The P2B Regulation has a broader scope in this regard. Whereas the DSA primarily aims to protect end users, the DMA mainly sets rules for so-called gatekeepers to promote competition (see this blog). The DSA therefore does not detract from these (and other) European regulations (Article 2 of the DSA).

New content, advertising and transparency rules for platforms

Article 25 – Design and organisation of online interfaces

Article 25 of the DSA imposes new obligations regarding the design and organisation of online interfaces, such as websites and apps. The basic principle is that the online interface of the digital service may not serve to (directly or indirectly) mislead, nudge or manipulate users. The user’s autonomy, decision-making powers and available options must be safeguarded at all times.

Article 25 of the DSA furthermore targets the increasing use of dark patterns: manipulative design practices that steer consumers towards choices that are not in their best interest. Article 25 builds on the joint offensive by national consumer authorities against the use of dark patterns by commercial parties.

Specifically, this means that the platform’s interface must be designed so that choices are presented in a neutral manner. This entails that, among other things, visual and auditory components on the platform’s interface may not prompt users to make a particular choice. In other words, the user’s decision-making may not be improperly disrupted or impeded. The following, for instance, are prohibited:

  • giving certain choices more weight by displaying them larger or using prominent markings;
  • repeatedly asking a customer to make a choice when it has already been made;
  • making certain choices more time-consuming; or
  • making it unreasonably difficult to discontinue purchases or change default settings.

Article 26 – Advertising on online platforms

The DSA also introduces additional obligations for advertising that may be displayed on the platform. Such advertising may now be displayed only in a clear, concise and unequivocal manner. In doing so, the online platform must take into account the comprehension abilities of the ‘average’ consumer. In other words, users must be able to easily determine that what they are seeing is an advertisement.

To this end, users must have sufficiently individualised information at their disposal on the basis of which they can independently (in real time) establish (Article 26(1) of the DSA):

  1. that the information displayed is advertising;
  2. on whose behalf the advertising is being shown;
  3. whether the information shown is paid or unpaid advertising; and
  4. why the relevant advertising is being shown (parameters).

It is mandatory, for instance, to disclose that certain positive ratings (reviews) on the platform have been obtained against payment. The platform must make clear that such reviews are in fact advertising. It can do so, for instance, by placing a ‘sponsored’ icon next to the review.

Lastly, providers may not show advertising based on profiling using special categories of personal data, as referred to in Article 9 of the General Data Protection Regulation (GDPR). This applies, for instance, to personal data relating to health, religion, racial or ethnic origin, political opinions, etc.

Article 27 – Transparency of recommendation systems (ranking)

Article 27 of the DSA sets additional rules on targeting techniques and profiling.

The activities offered on a platform are highly dependent on (i) how information is prioritised (the ranking) and (ii) how that information is presented on the online interface. Algorithms are often used in the prioritisation of that information. The algorithms adjust the visual representation of the information based on the information provided by the user (possibly without the user realising it): in other words, the needs and preferences of the individual user.

The downside of the use of rankings is that it can significantly influence the user’s decision-making and options, which is not always in their interest.

The DSA therefore introduces additional obligations regarding the transparency of these ranking systems. This means that users of platforms must be informed about how the ranking system affects the information displayed to them. This may be done on the basis of the following parameters, for instance:

  • behaviour of other customers with the same stated preferences;
  • previous click behaviour of the user;
  • paid advertisements (see Expedia, for instance); or
  • consumer reviews by other users.

Users must also be informed of any possibilities of changing or influencing the key parameters. The online platform must offer separate functionality for this purpose, allowing the user to directly and easily change the prioritisation of the information or rank it differently from the default option. The use of misleading designations, such as ‘sort by relevance’ when the sorting actually takes place on the basis of advertising commission, is expressly prohibited.
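Purely by way of illustration (the DSA does not prescribe any particular implementation, and all names below are hypothetical), such a functionality could expose honestly labelled sort options instead of disguising a commission-based ranking as ‘relevance’:

```python
# Hypothetical sketch: user-selectable sort modes with honest labels.
SORT_OPTIONS = {
    "price_low_to_high": lambda items: sorted(items, key=lambda i: i["price"]),
    "user_rating": lambda items: sorted(items, key=lambda i: i["rating"], reverse=True),
    # Labelled for what it is, not hidden behind a "relevance" label:
    "sponsored_first": lambda items: sorted(items, key=lambda i: not i["sponsored"]),
}

def rank(items, mode="price_low_to_high"):
    """Rank items by the mode the user chose; no hidden commission-based sort."""
    return SORT_OPTIONS[mode](items)
```

The point of the sketch is that each ranking mode is named after the parameter it actually uses, and the user can switch away from the default directly.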

Providers must furthermore explain in their general terms and conditions in ‘clear and understandable language’ which parameters are being used in the recommendation system. This in any event includes (Article 27(2) of the DSA):

  • the main criteria for determining the proposed information; and
  • the relative importance of those parameters.

In future: high fines for non-compliance

The entry into force of the Digital Services Regulation Implementation Act (the Implementation Act) will give ACM (among other things) the power to impose an administrative penalty and/or an order subject to a penalty. This administrative penalty may amount to 6% of the worldwide turnover of the platform in question. The maximum amount of the order subject to a penalty is 5% of the worldwide turnover of the platform in question.
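As a simple illustration of these caps (the turnover figure below is hypothetical, not taken from the source), the maximum amounts can be computed as follows:

```python
def max_administrative_penalty(worldwide_turnover):
    """Maximum administrative penalty under the DSA: 6% of worldwide turnover."""
    return worldwide_turnover * 6 / 100

def max_penalty_order(worldwide_turnover):
    """Maximum order subject to a penalty: 5% of worldwide turnover."""
    return worldwide_turnover * 5 / 100

# Hypothetical platform with EUR 200 million worldwide annual turnover
turnover = 200_000_000
print(f"Max administrative penalty: EUR {max_administrative_penalty(turnover):,.0f}")
print(f"Max penalty order:          EUR {max_penalty_order(turnover):,.0f}")
```

For the hypothetical EUR 200 million turnover, the caps work out to EUR 12 million and EUR 10 million respectively.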

ACM, as well as the Dutch Data Protection Authority (AP), will also be given the power to impose an ‘independent order’ (subject to approval by the supervisory judge). This will allow both authorities to impose a binding order on a platform, with the aim of enforcing a specific measure. In the event of failure to comply with the independent order, they still have the power to impose an administrative penalty or an order subject to a penalty.

ACM currently has no specific enforcement powers in relation to the obligations under the DSA. The lack of these powers is partly due to the Dutch government’s failure to implement the Implementation Act on time. Progress has now been made, the Council of State has issued its opinion, and the bill was submitted to the Lower House on 3 April 2024. The bill is expected to be adopted soon.

This will allow ACM to take enforcement action in the short term. Practice shows that enforcement against online deception is a high priority for ACM (see this blog).

Although ACM currently cannot take any enforcement action, the Minister of Economic Affairs has already drawn up a temporary regulation authorising ACM to receive notifications. This enables ACM to share these notifications with other authorities in the EU, allowing it to exercise (indirect) supervision to a certain extent.

Rules of thumb to avoid deception

In light of ACM’s upcoming enforcement powers, platforms are well advised to take note of the requirements arising from the DSA and, if necessary, to make any adjustments in good time. In doing so, at least the following rules of thumb can be used:

  1. Ensure that choices (such as purchase decisions) are displayed in a balanced manner and do not repeatedly ask consumers to make a choice when it has already been made.
  2. Identify the parameters on the basis of which advertising is shown to users. Check whether it can be established that advertising is involved, for instance by having the platform’s interface assessed by a reference group.
  3. Make it clear on the interface that advertising is being shown (for instance by placing an AD icon next to the advertisement or displaying a disclaimer).
  4. Provide information in clear and understandable language about the parameters used in the rankings. Have this explanation reviewed by a reference group.

More information on consumer rules can be found at consumentenrecht.info.

Information on dawn raids by ACM can be found at invalacm.nl.

Follow Maverick Advocaten on LinkedIn

Contact details

Cyriel Ruers

T +31 20 238 20 15
M +31 6 10 257 754

Martijn van de Hel

T +31 20 238 20 02
M +31 6 21 210 853

Paul Breithaupt

T +31 20 238 20 05
M +31 6 39 177 993