
Regulation of Dark Patterns: Lessons for India [PART I]


About the author:

The author is a practicing lawyer specializing in intellectual property and technology law. She also works as a Researcher on artificial intelligence and fairness with the Institute for Internet and Just Society in Berlin. She graduated from OP Jindal Global University in 2020.


Introduction

On 15 March 2021, California banned the use of ‘dark patterns’ through additional regulations approved in an effort to strengthen the enforcement of the California Consumer Privacy Act (“CCPA”). Through this ban, California is attempting to “ensure that consumers will not be confused or misled when seeking to exercise their data privacy rights”, as stated by the California Attorney General. In essence, dark patterns are user interfaces on websites or applications that are designed to mislead users into giving away their rights and personal data, or to trick them into making choices that they may otherwise not make. With legislators across the USA and the EU attempting to devise solutions to effectively regulate dark patterns, it is crucial for regulators in India to take note. This is especially so given the ongoing deliberations of the Joint Parliamentary Committee (“JPC”) on the Personal Data Protection Bill, 2019 (“PDP Bill”), and the additional extension it has received to present its report in the monsoon session of Parliament in 2021.

This post is divided into two parts. Part I examines the need to regulate dark patterns, the existing regulation of dark patterns in other jurisdictions, and the challenges faced in doing so. Part II draws on the lessons learned from the regulation of dark patterns in these jurisdictions to inform regulation in India. It also argues that consumer protection laws must be used in tandem with data protection laws, and can be an effective avenue for regulating dark patterns where data protection laws fall short.

Part I: Existing Regulation of Dark Patterns – The What, Why and How

What are dark patterns?

The U.S. Federal Trade Commission (“FTC”) has defined dark patterns (a term first coined by Harry Brignull) as “design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent.” These are common techniques that users likely encounter several times every day. While there are many applications of such patterns, I will illustrate four.

First, a user may be unable to locate an option to opt out of, or unsubscribe from, unwanted communication or collection of personal information. The second is ‘confirmation-shaming’, where the option to decline a certain feature is worded in a way that ‘shames’ the user into compliance; for instance, where a user is offered the chance to save money by buying an additional product, and the option to decline is worded as “No, I do not want to save X amount of money”. The third is ‘forced continuity’, a digital trap that makes it difficult for a user to exit a membership, typically by initiating an unwanted card transaction without notifying the user of the option to cancel. The fourth is the ‘roach motel’, where the user finds it easy to sign up for a particular feature but exceedingly difficult to figure out how to discontinue it. Broader than forced continuity, a ‘roach motel’ is not limited to unwanted financial transactions, and could also take the form of a privacy policy so complicated that it disincentivizes a user from opting out of sharing data. Default location-sharing settings, misdirection through continuous advertisements, and advertisements disguised as surveys are other examples. An empirical study conducted at Princeton University found dark patterns on over 1,200 shopping websites, with around 200 of these websites being blatantly deceptive.

‘Nudges’ are design choices that influence the decisions individuals make, and they exist on nearly all interfaces. They are not necessarily negative. For example, an email prompt that alerts a user to a missing attachment is a positive nudge. Dark patterns, however, are a more problematic form of nudge, because they inherently depend on deceiving users into giving up a free choice that they may otherwise have exercised. It is from this need to protect free choice and privacy rights that the case for regulating dark patterns stems.

Global Approaches to Regulation

The USA:

As discussed above, ‘nudges’ are not negative per se. Dark patterns, however, are inherently problematic, since they fundamentally rely on deceiving the user with the end goal of coercing the user into giving up certain data or rights. Recognising that not all ‘nudges’ are deceptive dark patterns, the newly approved regulation under the CCPA draws an even more granular distinction, prohibiting only those dark patterns which have “the substantial effect of subverting or impairing a consumer’s choice to opt-out” of providing their personal data for the entity’s usage. Indicative examples in the regulation include the use of double negatives such as “Don’t Not Sell My Personal Information”, or requiring users to “search or scroll through the text of a privacy policy or similar document or webpage to locate the mechanism for submitting a request to opt-out.”

In an attempt to standardise compliance, the regulation goes so far as to suggest an “eye-catching” opt-out icon for businesses to adopt, along with a 30-day timeline for compliance. The above-mentioned press release by the Attorney General links to the icon that businesses are encouraged to adopt. Interestingly, the suggested icon was designed by Carnegie Mellon University’s CyLab and the University of Michigan’s School of Information, and was tested against alternative designs, being found the most effective at communicating privacy choices to users. The California Privacy Rights Act (“CPRA”), a ballot initiative which recently passed, provides that consent obtained through a dark pattern does not constitute valid, informed consent for the collection of user data.

In addition to the CCPA and CPRA, the FTC has recently taken cognisance of dark patterns. In December 2020, the FTC filed a complaint for a permanent injunction against Age of Learning, Inc., the operator of ABCmouse, for making the cancellation of recurring subscription fees difficult: free trials automatically converted into paid memberships, and users seeking to cancel were taken through a “labyrinth” of pages urging them not to cancel. The complaint further alleges that ABCmouse used a ‘negative option feature’, wherein the mere absence of affirmative action by the user to cancel the subscription was deemed an acceptance of the offer. The terms of the proposed settlement between Age of Learning and the FTC include providing a simple mode of cancelling subscriptions and making disclosures regarding negative options, including obtaining users’ informed consent. An additional proposed $10 million payment is indicative of the FTC’s intention to closely monitor the use of dark patterns going forward.

Legislation in the USA appears to be attempting to strike a balance between user rights and compliance requirements, given that the prohibition of dark patterns is contingent on the facts and circumstances of each application and its impact on user choice. There have been broader attempts, such as the Deceptive Experiences To Online Users Reduction Act (DETOUR Act), introduced on 9 April 2019, which sought to outlaw dark patterns more widely and explicitly sought to regulate large online operators (those with more than 100 million users in a 30-day period). The DETOUR Act would have combined a wide definition of dark patterns for the purposes of enforcement with self-regulation by large online operators. However, it was never brought to a vote before the US Congress.

The European Union:

Although the General Data Protection Regulation (“GDPR”) does not explicitly provide guidance on dark patterns, their use runs contrary to its requirement of informed consent, as well as its requirement to adopt technical and organisational measures to ensure privacy by design, obligations that websites commonly implement through the integration of Consent Management Platforms (“CMPs”). An international study conducted in January 2020, examining dark patterns in CMPs after the GDPR came into effect, found a lack of enforcement in this area, concluding that “dark patterns and implied consent are ubiquitous; only 11.8% meet the minimal requirements that we set based on European law.” The Data Protection Authority of France, the Commission Nationale de l’Informatique et des Libertés (“CNIL”), released a report in 2019 emphasising the need for a framework to regulate dark patterns through the enforcement of privacy by design and a focus on the user experience rather than individual choice, so as to ensure that consent is provided freely.

In Part II of this post, I will explore the implications of the regulation of dark patterns in the USA and the EU, and the lessons Indian regulators can draw from this global experience. In this regard, I will evaluate the possibilities for regulation through the PDP Bill, making a case for the explicit prohibition of dark patterns, stronger provisions on privacy by design, and strong regulatory oversight. I will also make a case for adopting a rights-based approach to data protection rather than a purely consent-based framework, in order to more effectively regulate dark patterns, which thrive on exploiting the loopholes of a framework built solely on informed consent. Lastly, I will argue that consumer protection laws provide an avenue for regulatory pluralism and must be used in tandem with data protection laws.

