116 B, Mittal Towers, Nariman Point, Mumbai, India

Have you ever tried to deactivate or delete your Instagram account from the mobile application? You can’t. The app has no feature to that effect. A user can temporarily disable or deactivate their account only by logging into the website version. Even then, the user has to navigate a maze of settings to reach the disable or deactivate feature. Additionally, several interstitials and intrusive questions stand in the way, as shown in the picture below.


This design architecture is purposely placed in such apps to fuel the attention economy. The tactic is known as a “dark pattern”, a term coined by English UX specialist Harry Brignull, who defines it as “a user interface that has been carefully crafted to trick users into doing things which they did not intend to do”. Under the guise of a harmless marketing tool, dark patterns manipulate the user into engaging with the interface in a manner beneficial to the owner of the interface and detrimental to the user.

Types of Dark Patterns
Several kinds of dark patterns exist. Brignull, in his website’s “Hall of Shame”, has collated some of the most abrasive, manipulative practices that compromise the rationality of a consumer. In today’s age, the most common example is “Privacy Zuckering”, wherein, owing to the continuous introduction of new features on social media platforms, people inadvertently share more information in the digital space than they had intended to. Instagram, for instance, introduced Reels after the popular app TikTok was banned in India, and its daily usage has since grown to over a hundred million active users. This increased usage exemplifies “Privacy Zuckering”, as people now share information in the digital world more readily because of the advent of Reels.

Other examples include “confirmshaming”, whereby the user is guilted into opting for something they otherwise would not have. To illustrate, when an application asks you to rate it on the relevant app store, the two options offered are often “yes”, which directs you to the app store, and “no, I’m not enjoying this app”, thus guilt-tripping the consumer into rating the application despite having no intention to do so. A summary of the various kinds of dark patterns can be found below:


Another common example is the “sneak into basket” method employed by marketplace e-commerce entities. Under this method, when consumers purchase something, additional products are added to their basket without their knowledge. For example, on MakeMyTrip, donations to COVID-relief organisations are automatically added to the ticket price when booking a flight, as shown in the picture below. While the consumer has the choice to opt out at the last stage of billing, the additional “confirmshaming” attached to declining a charitable donation minimises the likelihood of the consumer actually doing so. This practice is also known as negative option billing and is explicitly banned in the United Kingdom under the Consumer Protection (Distance Selling) Regulations.
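The mechanics of negative option billing are easy to sketch. The checkout flow below is a hypothetical illustration (the `Cart` class, item names, and amounts are invented for this example, not taken from any real platform): the add-on is selected by default, so the total is inflated unless the consumer notices it and actively opts out.

```python
# Hypothetical sketch of a "sneak into basket" checkout flow.
# All class names, item names, and amounts are invented for illustration.

class Cart:
    def __init__(self):
        self.items = {}          # products the consumer chose: name -> price
        self.preselected = {}    # add-ons the platform ticks by default

    def add(self, name, price):
        self.items[name] = price

    def sneak_in(self, name, price):
        # The dark pattern: the add-on is opted IN by default;
        # the consumer must spot it and actively remove it.
        self.preselected[name] = price

    def opt_out(self, name):
        self.preselected.pop(name, None)

    def total(self):
        return sum(self.items.values()) + sum(self.preselected.values())

cart = Cart()
cart.add("flight ticket", 4500)
cart.sneak_in("charity donation", 100)  # added without an explicit request
print(cart.total())  # 4600 -- unless the consumer opts out at the last step
cart.opt_out("charity donation")
print(cart.total())  # 4500
```

The asymmetry is the point: choosing the add-on requires no action, while declining it requires attention and an extra step, which is exactly what the Distance Selling Regulations prohibit.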


Privacy Concerns
Dark patterns tap into the consumer's psyche and prioritise eliciting the consumer's engagement over helping them make informed decisions. For instance, dark patterns dilute a consumer's privacy by nudging them to share information and details that are prima facie linked to availing benefits on the application. When signing up for LinkedIn, the more information you share, the greater the “strength” of your profile, which is directly linked to the user’s visibility on the platform. To receive an “All-Star” rating, over fifteen fields must be filled in. Thus, a user’s privacy is often compromised relative to the amount of information they intended to share.
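This gamification of disclosure can be sketched in a few lines. The example below is hypothetical: the field list, thresholds, and tier names are invented for illustration and are not LinkedIn's actual criteria. It shows the pattern's logic, in which each additional personal detail raises a visible meter, tying visibility to disclosure.

```python
# Hypothetical sketch of a "profile strength" meter that ties visibility
# to data disclosure. Field names and thresholds are invented for
# illustration; they are not LinkedIn's actual criteria.

PROFILE_FIELDS = [
    "name", "photo", "headline", "location", "industry",
    "current_role", "past_roles", "education", "skills",
    "summary", "phone", "birthday", "interests", "certifications",
    "languages", "volunteer_work",
]

def profile_strength(filled: set) -> str:
    ratio = len(filled & set(PROFILE_FIELDS)) / len(PROFILE_FIELDS)
    if ratio > 0.9:
        return "All-Star"      # maximum visibility on the platform
    if ratio > 0.5:
        return "Intermediate"
    return "Beginner"

# Each extra field raises the meter, nudging the user to disclose more
# personal data than is strictly needed to use the service.
print(profile_strength({"name", "photo", "headline"}))  # Beginner
```

The privacy cost is invisible in the interface: the meter rewards disclosure but never signals what the platform gains from each additional field.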


Scope of regulation in India
Dark patterns are increasingly being deployed by commercial entities and thereby target users both as consumers and as data subjects. Consequently, both the consumer protection and the data protection regimes are engaged.

In the European Union, personal data protection is a fundamental right enshrined in Article 8 of the EU’s Charter of Fundamental Rights. The Charter also delineates data protection principles, with transparency at the forefront. For instance, the European Union’s General Data Protection Regulation (“GDPR”) mandates that, when data is collected from the data subject, information be provided in a “concise, transparent, intelligible and easily accessible form, using clear and plain language”. Informed consent is a prerequisite under the GDPR. Similarly, in the United States, in March 2021, the California Senate banned dark patterns under the California Consumer Privacy Act (“CCPA”), with the two-fold objective of mitigating misleading information and increasing consumer autonomy. The Federal Trade Commission (“FTC”), the body responsible for consumer protection in the United States, has likewise been regulating dark patterns under the provisions that make “unfair and deceptive acts or practices” illegal.

In India, dark patterns can be targeted by the Personal Data Protection Bill, 2019 (“PDP Bill”), which is yet to be passed by Parliament, as well as by the Consumer Protection (E-Commerce) Rules, 2020 (“Rules”), issued under the Consumer Protection Act, 2019.
  1. Consumer Protection (E-Commerce) Rules, 2020
    All digital and electronic facilities that perform electronic commerce, such as Amazon, Myntra, Nykaa and Flipkart, fall under the Rules. The Rules place an obligation on platforms not to engage in “unfair trade practices”. Additionally, a platform cannot display “misleading advertisements” in the course of business or otherwise. Most importantly, the Rules mandate that all marketplace and inventory e-commerce entities provide information in a “clear and accessible manner, displayed prominently to its users, to enable consumers to make informed decisions at the pre-purchase stage”. Even where this requirement is complied with, an argument can be made that the assumption that consumers are capable of comprehending the information provided to them is not always accurate.
     
  2. The Personal Data Protection Bill, 2019
    Section 11 of the PDP Bill requires entities that control the use of personal data to obtain the user’s consent before processing it. Consent must be free, informed, specific, clear, and capable of being withdrawn. A dark pattern such as an overly complex privacy policy would violate Section 11. Additionally, while a dark pattern might seemingly obtain the user’s consent and satisfy the statutory threshold, it would nonetheless reduce the user’s control over the information they have consented to have processed.

    However, the language of the PDP Bill might favour data fiduciaries. For example, Section 5 of the Bill imposes a purpose limitation on the processing of personal data: personal data may be processed only for the purpose consented to by the user, or for purposes incidental and connected to that consent. What qualifies as “incidental” or “connected” remains ambiguous. Therefore, dark patterns might escape liability because of these unclear thresholds of consent.
Conclusion
The chilling effect of dark patterns on the privacy of a consumer cannot be ignored. The United States and the European Union have taken steps in the right direction to address these concerns; India must follow suit. Protection under the E-Commerce Rules, while present, is limited. Further, given the plethora of dark patterns that exist and the ambiguous language of the PDP Bill, protection from dark patterns is diluted. An explicit ban under the PDP Bill, similar to the CCPA, or standalone legislation that caters exclusively to dark patterns, is recommended. Dark patterns are an affront to an individual's privacy and may have significant implications for the conduct of digital entities in the coming years. If left unregulated, they might distort user autonomy entirely.