
Removing the Category of ‘Sensitive’ Personal Data under the DPDPA: Heightening Data Principal Vulnerability

-Sriya Sridhar*



Among the key changes in the Digital Personal Data Protection Act 2023 is the removal of the category of ‘sensitive personal data’, with all personal data instead treated uniformly in terms of legal protection. This article analyses this change through Malgieri and Niklas’s work on applying theories of vulnerability to data protection law, and argues that the removal of this categorization does not take into account the information asymmetry between Data Principals and Data Fiduciaries, and does not account for the differential effects that data processing has on different categories of Data Principals. I argue that the BN Srikrishna Committee’s conceptualization of categorizing sensitive personal data came closer to such a vulnerability-aware approach, and that the current version of the Act heightens Data Principals’ vulnerability by applying a uniform approach which does not reflect the realities of power imbalance in the digital sphere.



The Digital Personal Data Protection Act, 2023 (‘DPDPA’ or ‘Act’) marks a significant departure from the earlier data protection regime, from previous drafts of the legislation, and from other jurisdictions – it removes the categorization of ‘personal’ and ‘sensitive personal’ data. The removal of this categorization, which previously led to a higher degree of protection for ‘sensitive’ personal data (‘SPD’), has been a subject of debate.

The 2022 draft and the current version of the Act in force do away with a category of SPD, instead treating all forms of personal information on par. The Act in its current form also does away with the heightened requirement of ‘explicit’ consent for any form of data, providing a notice and consent framework and data subject rights which are, in essence, substantially similar to those under the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 (‘SPDI Rules’) (i.e., the rights to review, access, and erasure).

In this article, I argue that doing away with such a categorization fails to acknowledge the vulnerability of data subjects in relation to certain categories of personal information, and does not adequately protect SPD, the collection and processing of which could interfere with the fundamental rights of data subjects and cause greater harm if misused. Further, I argue that had the removal of this categorization resulted in a heightened level of protection for all types of data, the negative effects might have been counteracted (albeit with an increased compliance burden). Finally, I examine the vulnerability-aware approach to data protection law and show how removal of the SPD category goes against this approach. [Note: All terms capitalized but not defined in this article are as defined in the DPDPA.]

Previous position and the position in other jurisdictions

The concept of ‘sensitive’ personal data was first introduced in India through the SPDI Rules. While the SPDI Rules contained drafting ambiguity in their interchangeable usage of the terms ‘personal information’ and ‘sensitive personal data or information’, sensitive personal data was subject to certain heightened compliance requirements for its collection, including explicitly notifying the individual about the purpose of collection, and allowing for access, review, and erasure. Individuals were also required to be notified of the intended recipients and provided an option not to provide consent at the time of collection.

The BN Srikrishna Committee, constituted after the Puttaswamy judgment, endorsed creating a category of SPD since the processing of certain categories of personal data is likely to cause greater harm to an individual, necessitating that such data be delineated specifically. The Committee deliberated upon multiple approaches to categorise SPD, finally arriving at the following factors – (i) the likelihood that processing the category of data would cause significant harm to the data principal, (ii) any expectation of confidentiality that might attach to that category of personal data, (iii) whether a ‘significantly discernible’ class of data principals could suffer harm of a similar and relatable nature, and (iv) the adequacy of general rules in protecting that category of personal data.

The Committee also recommended that the collection and processing of SPD be subject to a higher standard of ‘explicit’ consent, with the grounds for such collection limited. This was drawn from legislation such as the GDPR, which prohibits the processing of what it terms ‘special category’ data by default, subject to certain grounds for its processing, such as protecting the vital interests of the data subject or where such data is manifestly made public by the data subject.

The Committee also recommended that the Data Protection Authority, once formed, should have the residuary power to notify categories of SPD in the future, accounting for new categories of data which may come to be collected.

The 2018 and 2019 drafts of the Act, as well as the version proposed by the Joint Parliamentary Committee, contained provisions reflecting the recommendations of the Srikrishna Committee, namely, (i) a definition carving out categories of SPD, (ii) specific grounds under which SPD could be collected and processed, (iii) the requirement for ‘explicit’ consent from the Data Principal over and above the consent requirements under the Act, and (iv) residuary powers granted to the DPA to notify further categories.

The EU GDPR, the UK GDPR, and US state laws such as the California Consumer Privacy Act follow a similar categorization, specifying certain types of data which are ‘sensitive’ and merit higher protection. Recital 51 of the GDPR specifies that special category data should be granted a higher degree of protection due to its effect on fundamental rights and freedoms. The types of data which fall under the ambit of ‘special category’ data are influenced by European anti-discrimination laws. The UK’s Information Commissioner’s Office has articulated a similar rationale, influenced by the UK Equality Act – its guidance also covers inferences which have the potential to reveal information classified as special category data.

Therefore, it is worth questioning why India has chosen to depart from a widely accepted standard. Since no official rationale has been provided, speculation as to the reasons is beyond the ambit of this article. Nevertheless, I argue that this departure leads to adverse consequences for the rights of data subjects, which I elaborate upon in the following section.


Towards a Vulnerability-Aware Approach to the DPDPA

I have previously argued that the scope and applicability of the Act is not adequate to address privacy harms such as those caused by behavioural advertising. In addition, failing to grant legal recognition to SPD leaves such data vulnerable to being used without heightened security standards, on par with other forms of personal data, such as one’s name. In Puttaswamy, the Supreme Court also observed that Aadhaar data is a more sensitive form of data which deserves a higher degree of protection – in the context of biometric information, the Court considered that such information is ‘intimately connected to the individual concerned’, since it involves using the human body as a basis for identification and authentication. The Court also discussed the different aspects of privacy involved in the collection of biometric information, namely informational privacy and physical privacy.

Had the heightened protections for the collection and processing of SPD in the previous drafts of the Act been applied to all forms of data, the removal of this categorization would not have had an adverse effect. However, the Act as it reads conversely brings down the level of protection for all forms of data to the bare minimum requirements, which were in any event present in the SPDI Rules – such as providing a privacy notice, notifying the Data Principal about the purposes of processing, and notifying the Data Principal about her rights and the means to make a complaint to the Board. Although the definition of consent has been aligned with the Indian Contract Act, no category of information of any kind is subject to a higher degree of ‘explicit’ consent or protection.

While one could argue that it is desirable that data subject rights can now be exercised regardless of the form of data that the Data Fiduciary collects, failing to acknowledge that some of that very data is more sensitive ignores the inherent imbalance of power between Data Fiduciaries and Data Principals, where Data Principals will always be less informed than the Data Fiduciary about the manner in which their data is processed and further utilised. Therefore, creating one uniform category of data, as opposed to granting a higher level of legal protection for more sensitive forms of data, heightens the negative effects of this information asymmetry rather than counteracting it.

Malgieri, in Vulnerability and Data Protection Law (OUP 2023), discusses the importance of acknowledging power dynamics in the digital sphere, where there may be multiple sources of domination – one-to-one domination between the relevant Data Controller (as defined in the GDPR) and the individual, or the larger market forces which prevent users from exercising free choice online with respect to their data. The origins and severity of such vulnerability may differ depending on the socio-economic contexts within which users have had their lived experiences, making this assessment contextual in nature.


Malgieri and Niklas (2020) have argued for adopting a ‘vulnerability-aware’ interpretation of data protection law, to account for the ‘many layers’ of data subjects, including their levels of awareness, their understanding of the manner in which their data is used, and their weaknesses in the context of how their information might be exploited. An example of a specific ‘vulnerable’ category of individuals recognized in data protection law, including the DPDPA, is children. The vulnerability-aware approach can also help re-evaluate the relationships of power between individuals, corporations, and institutions by assessing how different categories of individuals may perceive privacy, and how the use of their data may leave them vulnerable to heightened levels of harm, thus leading to the incorporation of legal safeguards through data protection legislation. Drawing on areas such as consumer protection law, where certain categories of vulnerable individuals are legally recognized in the EU, they argue that the recognition of such categories of individuals in data protection law can lead to the implementation of duties of care, and help address discrimination and manipulation.

This is a useful framework to assess the removal of the category of SPD. Consider two illustrations in the DPDPA. The illustration to Section 5(2) (Notice) deals with an individual who has provided her consent to an e-commerce service provider for processing personal data for use of an online shopping application. The illustration to Section 6 (Consent) deals with an individual providing consent for telemedicine services. Regardless of the substance of the sections themselves, the two illustrations deal with very different categories of data – the former represents someone who typically needs to allow the Data Fiduciary to process data such as their name and address, while the latter depicts someone who is likely to disclose sensitive health-related information. The vulnerability of these individuals is likely to change depending on context – such as their gender or caste. An added layer of vulnerability is present if both these Data Fiduciaries aggregate these personal data points for predictive algorithms or online behavioural advertising. However, the DPDPA would treat both these scenarios on par as far as the applicability of its provisions is concerned. This also perpetuates a problematic assumption of an average data subject who is aware of what data is being collected from them and the purposes for which it is used, and does not account for varying levels of digital literacy, a lack of awareness of one’s rights in the digital sphere, and differences in languages spoken across the country. In Puttaswamy, the Supreme Court also acknowledged, in the context of Aadhaar data, that the effect of intrusions into privacy may not be uniform, and may be particularly problematic for people of certain cultural or religious backgrounds. The Court also discussed how the possession of such information can lead to the monitoring and persecution of groups of civilians on the basis of race, ethnicity, and religion.

To address this power imbalance and the differential effect depending on the category of data subjects in question, Malgieri and Niklas have (in the context of the GDPR) posited adopting Florencia Luna’s layered theory of vulnerability – where vulnerability is conceptualised not as a fixed attribute of individuals or groups, but as something continually constructed by social circumstances, time, and location, affecting the severity of privacy harms depending on this context. This would inform the requisite level of legal protection for such individuals or groups, an intersectional analysis of the origins and consequences of such vulnerability, and the adoption of different strategies for mitigation of harms based on this analysis. The current position in the DPDPA is in fact a step back from even having a discussion of vulnerability, given the lack of legal acknowledgment that there is a category of data which could give rise to increased harms, or affect fundamental rights, if misused. This puts Indian Data Principals at a higher risk of such data being collected and processed in ways that they are not aware of, without a higher degree of protection or specific grounds for the collection of SPD. This is also likely to more adversely affect Data Principals with certain vulnerabilities – for instance, those who necessarily need to give up their health data for chronic conditions, women who may use health tracking on fem-tech applications, internet users who are not digitally literate (and could be more susceptible to financial fraud), asylum seekers, or activists.

Malgieri proposes three factors to classify such vulnerabilities – the mental or social characteristics of the data subjects (in India, Data Principals), the hierarchical or informational power of the data controller (in India, the Data Fiduciary), and the effects of vulnerability drivers (elements which can be used to manipulate decision-making or discriminate against a data subject). I argue that the factors initially proposed by the Srikrishna Committee to determine the classification of SPD are aligned with the vulnerability-aware approach discussed here, since they involved a contextual analysis, one focused on the possibility of harm to groups of Data Principals. Had that approach been adopted, it would nevertheless have merited more scrutiny as to the criteria for identifying a ‘significantly discernible’ group of Data Principals, the definition of harm, and how the adequacy of the data protection framework would be determined.



In this article, I have discussed how removing the category of SPD from the DPDPA has multiple adverse consequences. Firstly, it does not grant legal recognition to the fact that there are multiple types of data, the processing of which could lead to a higher level of harm for individuals and marginalized groups. Secondly, rather than increasing the level of protection for all forms of data, the Act brings down the level of protection for data which would have been classified as SPD to a level which, I argue, is substantially similar to the SPDI Rules, with certain additional components for consent. Thirdly, I suggest that the vulnerability-aware approach to data protection law posited by scholars in the field provides useful insights to analyse the effects of this change in the DPDPA. It demonstrates how the lack of acknowledgment of sensitive personal data in the law also has the consequence of not recognizing that data collection and processing can have disparate effects on Data Principals, depending on context, power imbalance, marginalization, socio-economic circumstances, digital literacy levels, and mental states. Finally, I argue that the Srikrishna Committee’s initial contextual approach to classifying categories of SPD would have been a step closer towards acknowledging these factors and the information asymmetries present between Data Fiduciaries and Data Principals. Providing residuary power to a Data Protection Authority could ensure that the categorization of SPD is suitably updated as newer technologies emerge.


*Sriya is an academic at a law school in Chennai, and is currently pursuing her LLM (Master of Laws) in Innovation, Technology and the Law from the University of Edinburgh. Her research interests include data protection and privacy legal theory and compliance, examining regulation and innovation as co-dependent processes, and the dynamics between information, power, and society. She also consults on matters relating to technology and data protection law and policy.


