“Dark patterns” targeted by EU institutions
“Dark patterns” used by online platform providers have been controversial for some time, but they have recently drawn growing attention, in particular due to actions undertaken by EU and national data protection and consumer protection authorities. (For an overview of cases and decisions by EU and national authorities, see the European Commission’s “Behavioural study on unfair commercial practices in the digital environment: Dark patterns and manipulative personalisation, Final Report,” pp. 61–70.) Primarily, these measures are intended to combat deceptive practices in the digital environment, but also to educate consumers and draw their attention to the most common types of such practices.
The harmfulness and prevalence of dark patterns have also been noticed by EU lawmakers, who expressly banned such practices by online platform providers in Art. 25(1) of the Digital Services Act (Regulation (EU) 2022/2065 on a Single Market for Digital Services, the DSA). The DSA entered into force on 16 November 2022, but most of the obligations in the regulation will apply from 17 February 2024. Therefore, the use of dark patterns may violate not only data protection laws (especially the General Data Protection Regulation) and consumer protection laws, but also (from February 2024) the Digital Services Act.
What are “dark patterns”?
There is currently no legal definition of “dark patterns,” but the term is understood to mean practices in digital interfaces designed to direct, deceive, coerce or manipulate users into making choices against their best interests (“Behavioural study,” p. 70). As explained in Recital 67 of the DSA, “Dark patterns on online interfaces of online platforms are practices that materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions. Those practices can be used to persuade the recipients of the service to engage in unwanted behaviours or into undesired decisions which have negative consequences for them.”
In guidelines issued in February 2023, the European Data Protection Board (EDPB) uses the term “deceptive design patterns,” i.e. “interfaces and user journeys implemented on social media platforms that attempt to influence users into making unintended, unwilling and potentially harmful decisions, often toward a decision that is against the users’ best interests and in favour of the social media platforms’ interests, regarding the processing of their personal data. Deceptive design patterns aim to influence users’ behaviour and can hinder their ability to effectively protect their personal data and make conscious choices” (Guidelines 03/2022 on deceptive design patterns in social media platform interfaces: How to recognise and avoid them, version 2.0).
According to the Commission, such practices are increasingly being used by online platform providers (in particular online stores) regardless of their size or the goods or services they offer. A Commission survey found that 97% of the most popular websites and applications used by consumers in the EU employed at least one dark pattern.
Dark patterns in studies by the Commission and actions by national consumer protection bodies
At the end of January 2023, the European Commission and national consumer protection authorities in the Consumer Protection Cooperation Network (including Poland’s Office of Competition and Consumer Protection, UOKiK) published the results of an inspection campaign covering retail websites (online stores). Of the 399 online stores inspected, operated by retailers offering products in various sectors, 148 (nearly 40%) used at least one of the following three manipulative practices (dark patterns) to take advantage of consumers’ vulnerability or deceive them:
- Fake countdown timers, pressuring customers to purchase a product by a bogus deadline (a hypothetical sketch of this technique follows the list)
- Hiding essential information about a product or service by using a tiny font or low-contrast colours, or placing key information (e.g. on delivery costs, product composition, or the availability of a cheaper option) in an obscure place
- Web interfaces designed to induce consumers to make a purchase, subscribe, or make other choices (from subscriptions to more expensive products or delivery options) through the use of specific graphics or language.
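To make the first of these practices concrete, below is a minimal sketch of how a fake countdown timer typically works. It is a hypothetical illustration in TypeScript, not code from any audited store: the deadline is not tied to any real offer, and quietly resets whenever it runs out.

```typescript
// Hypothetical sketch of a "fake countdown timer" dark pattern.
// The deadline is not linked to any real offer: when the timer
// expires, it silently restarts, so the "last chance" never ends.
const OFFER_WINDOW_MS = 15 * 60 * 1000; // always "15 minutes left"

let deadline = Date.now() + OFFER_WINDOW_MS;

function renderCountdown(): void {
  let remaining = deadline - Date.now();
  if (remaining <= 0) {
    // The deceptive step: reset instead of ending the promotion.
    deadline = Date.now() + OFFER_WINDOW_MS;
    remaining = OFFER_WINDOW_MS;
  }
  const minutes = Math.floor(remaining / 60_000);
  const seconds = Math.floor((remaining % 60_000) / 1000);
  console.log(`Offer ends in ${minutes}:${String(seconds).padStart(2, "0")}`);
}

setInterval(renderCountdown, 1_000);
```

A genuine time-limited offer would instead let the timer reach zero and withdraw the promotion.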
The Commission has urged national consumer protection bodies to take appropriate action to eliminate these practices.
An overview of the most common manipulative practices in the online environment and their assessment from the perspective of EU law is set forth in an April 2022 study by the Commission.
“Deceptive design patterns” in EDPB Guidelines 03/2022
The European Data Protection Board’s revised Guidelines 03/2022 on deceptive design patterns in social media platform interfaces, adopted on 14 February 2023, were updated in light of public feedback. Among other things, the original reference to “dark patterns” in the title was replaced with the broader term “deceptive design patterns,” which the EDPB believes better captures the diversity of emerging deceptive practices. (Version 1.0 was adopted on 14 March 2022.)
The guidelines include not only specific examples of deceptive design patterns, but also practical recommendations for providers, designers and users of social media platforms on how to avoid deceptive design patterns in social media interfaces that would violate the GDPR.
Although the guidelines refer to social media platforms, they are relevant to the entire digital sector and provide guidance on when dark patterns may violate the GDPR. Indeed, deceptive design pattern practices appear not only on social media platforms, but, as mentioned above, also on other platforms, websites or apps.
Among the main types of such practices, the EDPB includes:
- Overloading: confronting users with an avalanche or large quantity of requests, information, options or possibilities in order to prompt them to share more data or unintentionally allow personal data processing contrary to the expectations of the data subjects
- Skipping: designing the interface or user journey so that users forget, or do not think about, all or some of the data protection aspects
- Stirring: affecting the choices users make by appealing to their emotions or using visual nudges
- Obstructing: hindering or blocking users from becoming informed or managing their data by making the action hard or impossible to achieve (a hypothetical sketch of this pattern follows the list)
- Fickle: designing the interface in an inconsistent and unclear way, making it hard for users to navigate the different data protection control tools and to understand the purpose of the processing
- Left in the dark: designing the interface to hide information or data protection control tools, or to leave users unsure of how their data is processed and what kind of control they might have over it regarding the exercise of their rights.
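As a purely hypothetical illustration of “obstructing” (the categories above are the EDPB’s; the code below is not from the guidelines), consent journeys are often asymmetric: accepting processing takes a single click, while refusing is routed through additional screens.

```typescript
// Hypothetical sketch of an "obstructing" consent journey: accepting
// takes one step, while refusing is pushed through extra screens.
type Screen = "banner" | "settings" | "confirmRefusal" | "done";

function nextScreen(current: Screen, choice: "accept" | "refuse"): Screen {
  if (choice === "accept") {
    return "done"; // accepting always succeeds in one click
  }
  // Refusal is deliberately routed through additional screens.
  switch (current) {
    case "banner":
      return "settings";
    case "settings":
      return "confirmRefusal";
    default:
      return "done";
  }
}

// Accepting: banner -> done (1 step).
// Refusing:  banner -> settings -> confirmRefusal -> done (3 steps).
let screen: Screen = "banner";
let steps = 0;
while (screen !== "done") {
  screen = nextScreen(screen, "refuse");
  steps++;
}
console.log(`Refusal took ${steps} steps; acceptance takes 1.`);
```

A design consistent with the EDPB’s recommendations would make “refuse” as easy to reach as “accept.”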
According to the EDPB, the use of these deceptive design patterns may violate such principles of the GDPR as:
- Lawfulness, fairness and transparency (Art. 5(1)(a) GDPR)
- Transparent information (Art. 12 GDPR)
- Accountability (Art. 5(2) GDPR)
- Purpose limitation and data minimisation (Art. 5(1)(b)–(c) GDPR)
- Data protection by design and by default (Art. 25 GDPR)
- Conditions for consent (Art. 4(11) and 7 GDPR)
- Provisions on the exercise of rights by data subjects (in particular, Art. 21 GDPR).
Also, the EDPB guidelines include a list of best practices and specific recommendations for user interface design to facilitate effective implementation of the GDPR.
Ban on dark patterns in the Digital Services Act
The DSA, to be applied from 17 February 2024, explicitly bans dark patterns by online platforms: “Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions” (Art. 25(1) DSA). An “online interface” means “any software, including a website or a part thereof, and applications, including mobile applications” (Art. 3(m) DSA).
As enumerated in Recital 67 DSA, such practices include, but are not limited to:
- Exploitative design choices to direct the recipient to actions that benefit the provider of online platforms, but which may not be in the recipients’ interests
- Presenting choices in a non-neutral manner, such as giving more prominence to certain choices through visual, auditory, or other components, when asking the recipient of the service for a decision
- Repeatedly requesting a recipient of the service to make a choice where such a choice has already been made
- Making the procedure of cancelling a service significantly more cumbersome than signing up to it
- Making certain choices more difficult or time-consuming than others
- Making it unreasonably difficult to discontinue purchases or to sign out from a given online platform allowing consumers to conclude distance contracts with traders
- Deceiving recipients of the service by nudging them into decisions on transactions, or via default settings that are very difficult to change, thus unreasonably biasing their decision-making in a way that distorts and impairs their autonomy, decision-making and choice (a hypothetical sketch of such biased defaults follows the list).
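By way of a hypothetical sketch of the last example (the setting names below are invented for illustration), the pattern typically combines provider-friendly pre-selections with a guilt-laden prompt on opting out:

```typescript
// Hypothetical sketch of biased default settings: the options most
// favourable to the provider are pre-selected, and switching one off
// first triggers a loaded warning ("confirm-shaming").
interface Settings {
  personalisedAds: boolean;
  shareActivityWithPartners: boolean;
}

// Defaults favour the platform, not the recipient of the service.
const defaults: Settings = {
  personalisedAds: true,
  shareActivityWithPartners: true,
};

function disable(setting: keyof Settings, current: Settings): Settings {
  // The nudge: the user must get past a guilt-laden message first.
  console.log("Are you sure? You will lose offers picked just for you!");
  const updated = { ...current };
  updated[setting] = false;
  return updated;
}

console.log(disable("personalisedAds", defaults));
// -> { personalisedAds: false, shareActivityWithPartners: true }
```

A neutral design would default such options to off (or to no pre-selection at all) and let the user change them without friction.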
This is an open-ended catalogue, and the Commission may issue guidelines on application of the ban in Art. 25(1) to specific practices (Art. 25(3)).
On its face, the ban on dark patterns in the DSA overlaps with other EU laws, in particular on data protection and consumer protection. Indeed, the use of dark patterns by online platforms may also violate other legal acts, in particular the GDPR and the Unfair Commercial Practices Directive (2005/29/EC, implemented in Poland through the Act on Combatting Unfair Commercial Practices of 23 August 2007). However, the DSA stipulates that it does not apply to practices covered by Directive 2005/29/EC or the GDPR, limiting the scope of its application and giving those acts priority (Art. 25(2) DSA).
This means that if a given practice (dark pattern) of an online platform provider violates the GDPR, its legality will be assessed by the national data protection authority according to the requirements of the GDPR, not the DSA. Examples of such practices are included in EDPB Guidelines 03/2022. Similarly, if a practice violates national laws implementing the Unfair Commercial Practices Directive, those laws, enforced by the relevant consumer protection authorities, will apply.
But it should be pointed out that this framing of the DSA’s interrelationship with the GDPR and Directive 2005/29/EC resolves the overlap between these provisions only on paper, as in practice it is not always possible to determine unequivocally that a given practice violates “only” the GDPR, “only” Directive 2005/29/EC, or “only” the DSA. Hence, parallel application of these laws in the digital sector is likely to raise jurisdictional issues between national data protection authorities, consumer protection authorities and digital services coordinators, and will require closer cooperation between them.
However, as stated in the preamble to the DSA, the rules banning dark patterns should not be understood as preventing service providers from interacting directly with service recipients or offering new or additional services to them, and legitimate practices, for example in advertising, in compliance with EU law, should not in themselves be regarded as constituting dark patterns (Recital 67).
This article was first published on newtech.law.
Iga Małobęcka-Szwast, attorney-at-law, New Technologies practice, Wardyński & Partners