Rohit Gupta & Shravya Devaraj
The advertising technology (‘AdTech’) ecosystem is built upon data gathering through ‘internet cookies’ (small textual data files). Advertisers, through the host website, place third-party cookies on a webpage to collect user data, establishing a digital footprint for marketers and enterprises. This provides a detailed profile of a user’s likes, buying patterns, and other inclinations. Thereafter, ad banners are deployed, for example through Google Ads and AdSense, to target and retarget consumers.
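The cross-site tracking described above can be sketched as follows. This is a deliberately simplified illustration (not any real ad network's code, and the names are invented): because the embedded ad request carries the same third-party cookie regardless of which first-party site the user is on, a single cookie ID stitches visits on unrelated sites into one profile.

```python
class AdServer:
    """A hypothetical third-party ad server embedded on many sites."""

    def __init__(self):
        # cookie_id -> list of (site, page) visits observed
        self.profiles = {}

    def serve_ad(self, cookie_id, site, page):
        # The ad request carries the same cookie no matter which
        # first-party site embeds the banner, so visits accumulate
        # under one identifier.
        self.profiles.setdefault(cookie_id, []).append((site, page))

tracker = AdServer()
tracker.serve_ad("uid-42", "news.example", "/politics")
tracker.serve_ad("uid-42", "shop.example", "/espresso-machines")
tracker.serve_ad("uid-42", "blog.example", "/coffee-reviews")

# One cookie ID now ties together browsing across three unrelated sites.
print(tracker.profiles["uid-42"])
```

It is this silent correlation across contexts, rather than the cookie file itself, that makes the digital footprint so detailed.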
In this comment, we take a deep dive into the law on internet cookies in India, contrasting its dearth with best practices developed in the Global North. While the legal developments of the draft Data Protection Bill, 2021 (‘DPB, 2021’) are discussed, the question of specific and granular guidance for cookie consent management is also raised.
Given the plethora of unanswered questions, data literacy concerning cookies in India remains underdeveloped. This allows the exploitation of data subjects, who not only face a lack of remedial channels but also struggle to become aware of such harms in the first place. In light of this, we must turn to specifically identifying these harms and locating their solutions.
III. Psycho-Social Factors ‘Nudging’ Cookie Consent
The drawbacks and difficulties associated with online cookies stem from uncertainty about how websites acquire personal data, which in turn stems from users not knowing exactly what a web cookie is. Users also feel a lack of control over online cookies and do not always understand why cookies are being used to harvest personal data. Additionally, several studies have evidenced the use of ‘dark patterns’ in cookie banners/notices that attempt to ‘nudge’ user consent towards accepting non-essential third-party cookies. As noted by Luxembourg’s National Commission for Data Protection (‘CNPD’), these include: (1) different sizes (e.g., a large “agree” button with a small hyperlink for “refuse”); (2) distinctive fonts (e.g., an “accept” button in a legible font and a “refuse” button in an unintelligible one); and (3) contrasting colours/highlights (e.g., an “accept” button in strong contrast, making it clearly visible, alongside a “refuse” button in very low contrast with the rest of the banner, and therefore barely visible). Other forms of nudges include auto-enrollment (the ‘opt-out’ method), maintaining an information dump/deficit, or prompting acceptance by noting that a majority of web-visitors accept cookies. In some such cases, such as auto-enrollment, cookie acceptance rates have been seen to double compared to the ‘opt-in’ method. Other studies have documented the negative effects of relegating consent information to a separate webpage or of referring users to browser settings for consent management. Resultantly, several regulators operating under the GDPR have specifically outlawed such practices and penalized their use by companies like Google and Facebook.
IV. Global Cookie Standards and Best Practices
Under the General Data Protection Regulation (‘GDPR’), companies must establish a legal basis for processing personal data, consent being one such basis (Article 6(1)(a)). To be valid, consent must be freely given, specific, informed, unambiguous, explicit, revocable, and requested in an intelligible and easily accessible manner (Articles 4(11) and 7). Further, Recital 32 of the GDPR, as confirmed in the Planet49 GmbH case, clarifies that consent given via a preselected tick in a checkbox does not imply active behaviour by the user, and that pre-ticked boxes therefore do not constitute consent. The purpose limitation and data minimization principles also feature strongly in Article 5(1)(b), with regulators such as the Finnish and Belgian DPAs laying down the contours of the availability (manner), explicitness (language), and specificity (extent) with which such information must be disclosed.
The ePrivacy Directive (‘ePD’) supplements the GDPR in the area of electronic communications, such as websites. Unlike the GDPR, a regulation that is directly enforceable across all member states, the ePD is a directive that each member state must transpose into its national legislation. Under the ePD, website publishers must obtain user consent before collecting and processing personal data through non-essential cookies (those not strictly necessary for the service requested by the user) or other tracking technologies (Article 5(3)). Moreover, Recital 66 of the ePD is quite explicit in directing that “the methods of offering the right to refuse should be as user-friendly as possible”.
Additional guidance issued under the GDPR by several individual DPAs has also specifically outlawed certain malpractices. The Dutch DPA, for example, stated that cookie walls, which demand that website visitors agree to their internet browsing being tracked for ad-targeting as the ‘price’ of entry to a site, are not compliant with the GDPR. Similarly, sites claiming to obtain ‘consent by scrolling’ post a small banner at the top of the screen with their GDPR consent information and indicate that “by continuing to scroll, you grant consent to the usage of cookies.” According to the Greek DPA, this cannot constitute valid consent, as the act is indistinguishable from scrolling to read the remainder of the page. The United Kingdom’s Information Commissioner’s Office, for its part, has identified cookies that may be classified as strictly necessary and those which do not fall under that category, in order to promote data literacy.
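The distinction the regulators draw can be reduced to a simple gate: strictly necessary cookies may be set without consent, while every other category requires a prior opt-in. The sketch below is only illustrative of that logic; the cookie names and category labels are invented, not drawn from any regulator's taxonomy.

```python
def cookies_to_set(requested, consented_categories):
    """Return the cookies a site may lawfully set.

    requested: list of (cookie_name, category) pairs the site wants to set.
    consented_categories: set of categories the user has opted into.
    """
    return [
        name for name, category in requested
        # Strictly necessary cookies are exempt; everything else
        # needs an explicit prior opt-in for its category.
        if category == "strictly-necessary" or category in consented_categories
    ]

requested = [
    ("session_id", "strictly-necessary"),
    ("csrf_token", "strictly-necessary"),
    ("ad_tracker", "advertising"),
    ("heatmap", "analytics"),
]

# A user who accepted analytics but refused advertising:
print(cookies_to_set(requested, {"analytics"}))
```

The point of the exercise is that the default answer for any non-essential cookie is “no” until the user actively says otherwise, which is precisely what pre-ticked boxes and consent-by-scrolling fail to capture.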
The California Consumer Privacy Act (‘CCPA’) took effect on January 1, 2020. It applies to companies gathering data on Californians that meet any one of three criteria: earning over $25 million in annual revenue, processing the data of 50,000 or more consumers, households, or devices, or deriving at least 50% of their annual revenue from selling the personal information of California residents.
V. Looking Forward: The Data Protection Bill, 2021
In November 2021, the Joint Parliamentary Committee on the Personal Data Protection Bill, 2019, introduced a new draft version of the bill in the form of the DPB, 2021, which subsumes non-personal data under its ambit. While the draft, and the accompanying report, still fail to acknowledge cookies as a matter of legislative concern, the codification of several best-practice principles (such as the need for express, rather than implied, consent) is very welcome. Note, however, the acute lack of any discussion of the above-mentioned issues, including the use of dark patterns. Among the several concerns which arise, those highlighted below may directly impact the collection and processing of cookie data.
A. Disclosure of Algorithmic Fairness in Processing Personal Data
Clause 23(1)(h) of the DPB, 2021 now demands disclosure of the “fairness of the algorithm and method used” in processing personal data. This means that intermediaries could potentially be required to disclose granular information regarding key processes conducted on personal data. For instance, intermediaries may be required to delineate how cookie data is used to conduct targeted advertising, including the process of selecting which targeted advertisement will be placed before the end user and the manner in which it will be displayed (frequency, positioning, etc.). However, the vague language of the provision has been criticized as rendering the amendment toothless. The precise content and extent of the information required to be disclosed remain unmentioned, with the question of exact compliance seemingly left to forthcoming regulations or judicial interpretation (albeit without mentioning either). Additionally, the entity responsible for assessing whether certain information contributes to determining the ‘fairness’ of the algorithm used is also unassigned. Inspiration as to the requisite level of specificity may be drawn from the February 2021 report of the NITI Aayog on ‘Responsible AI’, which sets out, in great detail, a self-assessment procedure for AI usage and a method for determining and maintaining transparency in artificial intelligence systems. The report also maps ethical considerations to corresponding regulatory and technological management options.
B. Denial of Service for Want of Consent
C. Special Concerns regarding Processing of Children’s Data
The current draft of the DPB, 2021 retains the age of majority as 18 years (note that the corresponding age is 16 years under the GDPR, with member states permitted to lower it to 13 years). Additionally, Clause 16(2) now requires both prior age-verification and parental consent before the processing of personal data belonging to minors. This may impose onerous burdens upon platforms. For instance, age-verification mandates, depending on their required stringency, may require the collection of more information than previously necessary (such as Aadhaar/One-Time-Password verification). Moreover, since most parents may not themselves be users of the websites frequented by minors, consent flows must move off-platform, introducing not only issues such as consent fatigue but also impacts on active users and revenue models and, in turn, an increased risk of verification fraud. The psychological effects of allowing parents to solely govern the nature of online content accessed by children must also be acknowledged: studies have pointed to potentially hampered mental development and the narrowing of legitimate avenues of free expression when children are robbed of agency over the content they consume.
Perhaps most important for the present discussion, however, is Clause 16(4), which states that data fiduciaries are “barred from profiling, tracking, or behavioural monitoring of, or targeted advertising directed at children”. This prohibition applies without the caveat that harm accrued by virtue of such processing must be demonstrable. A complete crackdown has thus been undertaken with respect to the AdTech industry’s engagement with minor users. While seemingly extreme, this may not necessarily be a negative development, given the high degree of impressionability of minors and the recent rising trend of political polarization via social media (on the effects of polarization on mental health, see here). However, the report of the Joint Parliamentary Committee fails to delineate the considerations that led to this suggestion, and one may therefore doubt its potential for considered application.
D. Cross-Border Transfers of Cookie Data
Recently, both the Austrian DPA and the European Data Protection Supervisor held that intentional or unintentional transfers of personal or sensitive personal data, as applicable, collected through cookies to foreign jurisdictions must comply with the ‘essentially equivalent’ standard laid down by the Court of Justice of the European Union in the Schrems-II decision. The former, in fact, interpreted ‘transfers’ to mean that even a company based in the European Economic Area may not use a cookie management provider that relies on a US-based service to collect personal data, regardless of whether the data actually leaves the EEA, absent an adequate transfer mechanism ensuring an ‘essentially equivalent’ standard of data protection. Naturally, there is no such specific guidance as to what may constitute ‘transfers’ in India, much less a threshold for determining an ‘adequate level of protection’ as required under Clause 34(1)(b)(i)/(ii) of the DPB, 2021.
Further, the DPB, 2021 does not address previous concerns raised as to the non-compliance of its predecessor with the ‘essentially equivalent’ threshold laid down in the Schrems-II Decision. Inter alia, these points of non-compliance include: (1) provisions for unbridled government access to personal data and exemption of key government agencies from the ambit of the DPB, 2021 (Clause 35); and (2) the lack of independent oversight mechanisms, since the Data Protection Authority is to be constituted by the Central Government, on the recommendation of key government officials (Clause 42).
VI. Alternatives to AdTech’s Reliance on Third-Party Cookies
In a bid to address the several privacy concerns inherent in the operation and transfer of cookie data, and the consequent difficulty of framing and enforcing legal policies on cookies, several alternatives have been proposed that attempt either to ease the process of consenting or to do away with third-party cookie advertisements entirely. While not all such developments are technically or legislatively viable, they may be considered for industry facilitation.
A. noyb’s Advanced Data Protection Control
None of Your Business (‘noyb’), a privacy group, has proposed an automated browser-level mechanism that addresses the acute issue of ‘consent fatigue’. It proposes an automated signal layer that would allow users to establish advanced consent choices, such as being prompted to authorize cookies only on websites they visit often. By automating the yeses and noes, the mechanism avoids the repetitive banner prompts that have made cookie consent acquisition so cynical an exercise. noyb notes that the GDPR’s Article 21(5), as well as the proposed ePrivacy Regulation, already permits automated browser signals to alert websites as to whether a user consents to data processing.
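The core idea of such a signal layer can be sketched as follows. This is a simplified illustration of the concept only: the header name (`ADPC-Consent`) and its comma-separated value format are assumptions invented for this example, not the actual ADPC specification.

```python
def consent_decision(request_headers, purpose):
    """Check an automated browser consent signal for a given purpose.

    Returns True/False if the browser signalled a decision, or None if
    no signal is present and the site must still ask via a banner.
    """
    # Hypothetical header carrying the user's pre-configured choices.
    signal = request_headers.get("ADPC-Consent")
    if signal is None:
        return None  # no automated signal; fall back to a consent banner
    granted = {p.strip() for p in signal.split(",") if p.strip()}
    return purpose in granted

# A browser pre-configured to allow analytics but not ad targeting:
headers = {"ADPC-Consent": "analytics"}
print(consent_decision(headers, "analytics"))      # True
print(consent_decision(headers, "ad-targeting"))   # False
print(consent_decision({}, "analytics"))           # None
```

Because the decision travels with every request, the user states a preference once at the browser level rather than clicking through a banner on every site, which is precisely the remedy for consent fatigue.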
B. Google’s Cohorts
Google’s Federated Learning of Cohorts (‘FLoC’) analyses a user’s Chrome browsing history to categorize them into ‘cohorts’. By masking each individual within a ‘crowd’ of at least a thousand people, consumers are stripped of unique identification. FLoC shares only a designated ‘cohort ID’ with websites and advertisers, which may then target audiences based on their categorization. Furthermore, FLoC stores user data locally rather than sending it over the internet, preventing the reconstruction of cross-site browsing histories.
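The two moving parts just described, mapping a local browsing history to a cohort and suppressing the ID when the cohort is too small, can be sketched as below. This is a crude stand-in for illustration only: Chrome's actual design used a SimHash-based clustering so that *similar* histories land in the same cohort, whereas this sketch simply hashes the history, and all thresholds and bucket counts are assumptions.

```python
import hashlib

# A cohort ID is only shareable if at least this many users carry it
# (the k-anonymity "crowd" described above).
K_ANONYMITY_THRESHOLD = 1000

def cohort_id(history, num_cohorts=1024):
    """Deterministically map a browsing history to a cohort bucket.

    Computed locally on-device; the raw history never leaves the browser.
    """
    digest = hashlib.sha256("|".join(sorted(history)).encode()).hexdigest()
    return int(digest, 16) % num_cohorts

def shareable_cohort(history, cohort_sizes):
    """Return the cohort ID to expose to sites, or None if the cohort
    is too small to hide the individual in a crowd."""
    cid = cohort_id(history)
    if cohort_sizes.get(cid, 0) >= K_ANONYMITY_THRESHOLD:
        return cid
    return None  # suppress: revealing it would risk re-identification
```

Note that even in this toy version, the site only ever sees a bucket number, never the pages that produced it, which is the privacy claim FLoC rested on.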
These advancements, however, come with a hidden risk: automated systems might perpetuate prejudice. FLoC’s clustering technique may reproduce potentially discriminatory algorithmic behavioural targeting by grouping people according to ethnicity, sexual orientation, or political affiliation. The system may also infer sensitive information from general habits and interests. Consequently, Google itself would hold even more personal data through the combination of its services and cohort IDs. The proposal itself highlights that websites aware of an individual’s personally identifiable information, such as their email address, can record and reveal their cohort. Thus, FLoC may function as a ‘behavioural credit score’, capable of distorting the very anonymity it seeks to preserve.
Recently, Google itself announced ‘Topics’, the replacement for FLoC. Topics will categorize browser activity (of up to three weeks) into one of three hundred preformulated ‘topics’, excluding sensitive categories such as ‘race’ or ‘gender’. Tests are currently underway to determine how different its operation is from FLoC’s.
C. Contextual Advertising
Rather than depending on user data, contextual targeting serves advertisements based on the content of the webpage being visited. For instance, a person reading a blog about coffee may be served adverts for coffee or coffee machines. Contextual targeting works well on websites with “highly-themed” material and for marketers with niche audiences. Recently, an Irish Council for Civil Liberties analysis concluded that publishers who converted from tracking-based to contextual advertising saw greater income. The Dutch publisher NPO Group, for instance, witnessed a 149% rise in advertising income after switching to contextual advertising.
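A toy sketch makes the contrast with cookie-based targeting concrete: the only input is the text of the page currently being viewed, with no user profile anywhere in the pipeline. The ad inventory and keywords below are invented for illustration.

```python
# Hypothetical inventory mapping each advert to its target keywords.
AD_INVENTORY = {
    "coffee machines": {"coffee", "espresso", "brew"},
    "hiking boots": {"trail", "hiking", "mountain"},
}

def pick_ads(page_text):
    """Rank adverts by keyword overlap with the page content alone."""
    words = {w.strip(".,:;!?\"'").lower() for w in page_text.split()}
    scores = {ad: len(keywords & words) for ad, keywords in AD_INVENTORY.items()}
    # Best-matching adverts first; adverts with no overlap are dropped.
    return [ad for ad, score in sorted(scores.items(), key=lambda kv: -kv[1]) if score > 0]

page = "A beginner's guide to espresso: how to brew better coffee at home"
print(pick_ads(page))  # only the coffee-machine advert matches this page
```

Since the ranking is recomputed from scratch on every page, nothing persists between visits, which is why this model needs neither cookies nor consent banners for tracking purposes.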
D. Relevance for India
Since the DPB, 2021 embraces and encourages the use of regulatory sandboxes, the AdTech industry should regularly conduct trials of innovative technologies that could reduce algorithmic bias and undue profiling of users. In light of the Cambridge Analytica revelations, corporations must also reckon with their role as instrumentalities of political strategy. While the advertising business will inevitably run on monetary advantage, industry standards need to be formulated to serve as ethical guidance for self-regulation. Of prime importance here is giving full effect to the obligation to implement ‘Privacy-by-Design’ (under Clause 22 of the DPB, 2021). This represents not only the practice of preparing privacy policies and auditing the safeguard systems deployed, but also the attitudinal reform of building products and services with privacy as a centerpiece: not merely an accessory, but a sine qua non.
The authors are final-year students of the West Bengal National University of Juridical Sciences, Kolkata.