
High Hopes, Higher Hurdles: Analysis of the DPDP Act and its Draft Rules (Part I)


-Pushpit Singh* and Silvia Tomy Simon**

[This is the first part of a two-part contribution which discusses the Indian Data Protection regime and analyses the friction between the DPDP Act and the draft DPDP Rules.]

Abstract


India’s emerging data protection framework contains a structural inconsistency. The Digital Personal Data Protection Act, 2023 presents itself as a risk-based, innovation-oriented regime. However, the Draft DPDP Rules, 2025: (a) superimpose open-ended ministerial control over cross-border transfers, (b) impose a single list of data security safeguards whose required depth of implementation is undefined, and (c) grant broader exemptions to public bodies than to private entities. This combination generates legal uncertainty. It pushes smaller firms toward costly over-compliance, leaves high-risk processing of personal data without tailored safeguards, and disrupts artificial intelligence development that relies on continuous, iterative data use. This article responds to these problems in three stages. First, it dissects the friction between the DPDP Act and the Draft DPDP Rules, focusing on data-transfer controls, data security obligations, and data erasure requirements. Second, it assesses the practical impact on Data Principals’ rights and on data-driven industries such as artificial intelligence, financial technology, and cloud services. Third, it draws on the EU GDPR to propose concrete reforms, including sector-specific codes of practice, tiered safeguards that scale with risk, and objective thresholds for anonymisation.

  1. INTRODUCTION

 

The Digital Personal Data Protection Act, 2023 (“the Act” or “DPDP Act”) seeks to establish a unified legal framework to govern how entities in India collect, store, and process personal data in digital form. Its objectives are to safeguard personal data, enhance individual autonomy, and foster data-driven economic growth. The recently released Draft Digital Personal Data Protection Rules, 2025 (“Draft DPDP Rules” or “the Draft Rules”), issued under the Act’s enabling provisions, are designed to bring clarity to specific operational aspects. They attempt to formalise notice requirements, consent mechanisms, security obligations, and breach disclosures, among other key areas. The Draft Rules provide much-needed substance in some respects, particularly in delineating how notices must be given and how Consent Managers may be structured. However, they also raise concerns about how the regime will govern cross-border data transfers and about the administrative complexity that could burden businesses, especially smaller tech start-ups. These issues are not merely technical; they bear directly on India’s ability to present itself as a reliable and innovation-friendly digital jurisdiction.

 

In this regard, it is instructive to examine global data protection regimes for reference points. The authors have chosen the European Union’s General Data Protection Regulation (“GDPR”). The comparison is intended only to shed light on the tools the EU has implemented to facilitate continued and lawful data flows. It is not meant to place the EU model on a pedestal or to claim that it is infallible; it simply shows that regulatory certainty and privacy protection can co-exist.

 

This comparison will help us understand how India’s approach is presently underdeveloped, particularly in its reliance on executive “general or special orders” to govern data exports. This creates legal uncertainty, since the Draft Rules provide no structured fallback mechanisms, clear legal pathways, or institutional oversight. As a result, businesses are left in a position of uncertainty, particularly those in data-intensive sectors such as artificial intelligence, fintech, and cloud services. Businesses that depend on artificial intelligence face fresh challenges in particular: large-scale data processing, which is essential to AI development, is constrained by overlapping rules on retention, anonymisation, and security. These restrictions risk slowing down innovation at a time when global competition demands stable and supportive frameworks.

 

In light of these challenges, this article shall first examine the legal and structural tensions between the DPDP Act and the Draft Rules, with a focus on cross-border data flows and regulatory uncertainty. Second, it shall analyze the impact of the regime on Data Principals’ rights and civil liberties. Third, it shall assess the implications for the AI sector, before concluding with reflections on the path ahead for India’s data protection landscape.

 

  2. TENSIONS IN THE INTERSECTION OF THE DPDP ACT AND THE DRAFT DPDP RULES

 

Section 2(n) of the Act adopts a broad definition of “digital personal data,” referring to it simply as “personal data in digital form.” Section 16 further empowers the Central Government to issue notifications governing cross-border data transfers. The primary concern lies in the combination of this expansive definition and the reliance on future government notifications to clarify critical aspects of such transfers. This uncertainty is further compounded by the Act’s broad territorial scope. Section 3 of the DPDP Act extends its scope to all digital personal data processed within India, as well as to data processed outside India if it relates to goods or services offered to individuals within the country. Rule 14 of the Draft Rules builds on this by allowing the Central Government to issue “general or special orders” that can either restrict or permit cross-border data transfers on a case-by-case basis. This open-ended approach creates the possibility of sudden policy shifts, making compliance unpredictable for global technology companies. As a result, international firms must constantly adapt their internal frameworks to align with evolving Indian data transfer regulations. The risk is that previously lawful transfers may be halted or reclassified without adequate notice, leading to operational uncertainty. For multinational businesses, this may translate into repeated system overhauls, higher compliance costs, and the need for ongoing legal reassessment each time a new order is introduced or modified.

 

Contrasting this with the GDPR, one can observe that the EU regime provides a relatively predictable system of adequacy decisions, standard contractual clauses (“SCCs”), and binding corporate rules (“BCRs”) for cross-border transfers. In other words, under the GDPR, organizations have several well-defined and transparent avenues to lawfully transfer personal data outside the European Economic Area. These include adequacy decisions, where the European Commission certifies that a non-EU country provides data protection similar to EU standards. It is pertinent to note that this certification is done through a political process that has been criticised for lacking transparency and timeliness. However, it is essential to understand that the resulting adequacy decisions are published, legally binding, and offer a degree of regulatory stability that enables long-term planning.

 

In the absence of an adequacy decision, the GDPR provides fallback instruments, most notably SCCs (pre-approved contractual templates that permit compliant data sharing between EU and non-EU entities) and BCRs (internal group policies that ensure intra-group transfers meet GDPR requirements). Both appear to have been widely adopted in response to judicial scrutiny. Following the Court of Justice of the European Union’s Schrems II ruling, thousands of organisations modified their existing SCCs, conducted transfer impact assessments, and updated their internal governance systems. The common thread across these options is predictability: companies know what procedures to follow and what safeguards to adopt to facilitate cross-border data flows without fear of running afoul of EU data protection authorities.

 

However, in the Indian context, the Draft Rules do not establish a coherent legal infrastructure for authorizing cross-border transfers of personal data. As it stands, there is no formal procedure for recognising jurisdictions as offering “adequate” levels of data protection, nor are there standardised contractual mechanisms akin to those used in other jurisdictions to legitimise such transfers. Rule 14 of the Draft Rules merely contemplates the issuance of “general or special orders” by the Central Government. These orders may either restrict or permit transfers on an ad hoc basis. In the absence of any published evaluative criteria or procedural safeguards governing these executive determinations, entities are left to anticipate policy shifts without any settled compliance roadmap.

 

From an operational standpoint, this legal uncertainty imposes disproportionate burdens on sectors that rely on seamless data mobility — particularly artificial intelligence, digital finance, and cloud computing. For such enterprises, datasets are often acquired, processed, and deployed across multiple jurisdictions in iterative, transnational cycles. Abrupt regulatory shifts — introduced without sufficient notice or transition provisions — may disrupt these workflows, force repeated system reconfigurations, and deter sustained investment in Indian data infrastructure. This regulatory volatility is especially detrimental for India’s aspirations to serve as a global node for AI research and data-intensive innovation.

 

Another layer of complexity that arises from the Draft DPDP Rules is the treatment of technical and organisational safeguards. In particular, Section 8(5) of the DPDP Act imposes a general obligation on Data Fiduciaries to implement “reasonable security safeguards” to prevent personal data breaches. At first glance, this appears to adopt a context-sensitive, principles-based standard, allowing entities to determine the adequacy of their safeguards based on the scale, sensitivity, and purpose of data processing. However, Rule 6 significantly narrows this flexibility by prescribing a fixed list of technical controls, including encryption, masking, and logging, along with a mandatory one-year log retention period. This dual structure — one that invokes “reasonableness” while simultaneously mandating fixed measures — creates both conceptual and practical ambiguity. This is because smaller enterprises or early-stage start-ups may find the prescribed standards disproportionate to their processing risks or operational capacities, yet may be penalised for non-compliance even where less stringent safeguards would have sufficed. By contrast, the European Union’s GDPR adopts a clearer risk-calibrated framework under Articles 24 and 32, instructing data controllers to implement “appropriate technical and organisational measures” tailored to the likelihood and severity of harm to data subjects.

 

It is true, as some scholars have pointed out, that the GDPR’s reliance on the evolving notion of “appropriateness” introduces a degree of uncertainty. Regulatory authorities and courts have, at times, diverged on whether relatively minor deviations from evolving industry standards contravene Article 32. However, it is argued that this interpretive elasticity enables a more proportionate and scalable enforcement model. Building on this flexibility, supervisory authorities across member states have increasingly issued sector-specific guidance, allowing industries such as finance, healthcare, and cloud services to benchmark their practices against evolving expectations. This guidance helps bridge the gap between abstract statutory standards and practical compliance obligations, offering regulated entities clearer direction while retaining the adaptability needed in a fast-changing digital landscape. Moreover, the jurisprudence emerging from EU enforcement actions such as the Meta and Centric Health inquiries, while still developing, has begun to crystallise minimum expectations over time, particularly in relation to data breach preparedness and pseudonymisation standards. Therefore, it is not that the GDPR model eliminates uncertainty altogether, but that its risk-based logic allows for both innovation and proportional enforcement.

 

Rule 6(1) prescribes a single set of uniform, generic categories of security safeguards that every Data Fiduciary must adopt, such as encryption or masking, access control, logging, business-continuity arrangements, contractual obligations on processors, and “technical and organisational measures.” The safeguards are uniform in the sense that the same generic categories are prescribed for protecting personal data in every processing context. What is non-uniform is the scope of implementation, because each safeguard is qualified by terminology like “reasonable” or “appropriate.” For example, Rule 6(1)(a) obliges all entities to apply “appropriate data-security measures” such as encryption, obfuscation, or tokenisation, yet it does not articulate any thresholds, such as algorithmic strength, key-management practices, or coverage ratios, by which a Data Fiduciary can concretely decide whether its chosen measure is indeed “appropriate.” Rule 6(1)(b) mandates “appropriate measures to control access,” but does not specify what those measures should include: whether multi-factor authentication, privilege-tiering, or routine password changes suffice is left open, and, in the absence of such benchmarks, organisations are left to their own subjective judgment as to what level of access control is sufficient. Rule 6(1)(d) requires “reasonable measures for continued processing” after a compromise, but is silent on recovery-time or recovery-point objectives, leaving organisations to guess what business-continuity posture will satisfy the Data Protection Board. Rule 6(1)(f) compels Data Fiduciaries to embed “appropriate provision” for security in processor contracts, without indicating the level of due diligence, audit rights, or indemnities that would meet the standard. Rule 6(1)(g) repeats the requirement of “appropriate technical and organisational measures” to ensure observance of security safeguards, again without any metrics. Thus, while Rule 6(1) creates an ostensibly uniform checklist of safeguard categories, it gives no uniform yardstick for the degree of rigour each safeguard must entail in implementation. The indeterminate notions of “reasonable” and “appropriate” introduce subjectivity that may foster divergent interpretations, regulatory uncertainty, and uneven enforcement across the data economy.
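To make this indeterminacy concrete, consider the following minimal sketch, written in Python purely for illustration. It shows the kinds of concrete choices a Data Fiduciary’s engineers would have to make to satisfy Rule 6(1)(a) and 6(1)(b); every specific parameter in it (the cipher, the key-rotation interval, the privilege tiers, the use of multi-factor authentication) is an assumption supplied by the authors, not a threshold the Rule prescribes.

```python
# Illustrative sketch only: hypothetical choices one Data Fiduciary might make
# when implementing Rule 6(1)(a) (data-security measures) and Rule 6(1)(b)
# (access control). None of these parameters is fixed by the Draft Rules.

import hashlib
import hmac
import os

from cryptography.fernet import Fernet  # symmetric authenticated encryption

# Rule 6(1)(a): "appropriate" encryption, obfuscation, or tokenisation.
# Cipher choice, key storage, and rotation interval are left to the Fiduciary.
ENCRYPTION_KEY = Fernet.generate_key()   # in practice, held in a managed key store
TOKEN_SALT = os.urandom(16)              # per-deployment salt for tokenisation
KEY_ROTATION_DAYS = 90                   # assumed rotation policy, not mandated

def encrypt_field(plaintext: str) -> bytes:
    """Encrypt a single personal-data field, e.g. a phone number."""
    return Fernet(ENCRYPTION_KEY).encrypt(plaintext.encode())

def tokenise_field(plaintext: str) -> str:
    """Replace an identifier with a keyed, non-reversible token."""
    return hmac.new(TOKEN_SALT, plaintext.encode(), hashlib.sha256).hexdigest()

# Rule 6(1)(b): "appropriate measures to control access". Whether MFA or
# privilege tiers are required is undefined; both are assumed here.
ROLE_PRIVILEGES = {
    "engineer": {"read_masked"},
    "data_protection_officer": {"read_masked", "read_clear"},
}

def may_access(role: str, operation: str, mfa_verified: bool) -> bool:
    """Grant access only within a role's privilege tier and after a second factor."""
    return mfa_verified and operation in ROLE_PRIVILEGES.get(role, set())
```

The point of the sketch is not the particular values chosen, but that each of them is a judgment call the Rule delegates to the Data Fiduciary without guidance.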

 

The result is a “uniformity conundrum”: although the type of generic safeguard to be adopted is uniform, the scope and degree of implementation remain non-uniform. This has the potential to produce over-compliance by smaller players and under-compliance in high-risk contexts where more stringent controls may be warranted. The conundrum risks flattening the nuance necessary for a dynamic digital economy, and it is particularly ill-suited to the context of AI, where training data is often ephemeral and handled by lean teams working on experimental pipelines.

 

To address this, India may consider empowering the Data Protection Board to issue sector-specific codes of practice, akin to the codes approved by the United Kingdom's Information Commissioner's Office under the UK GDPR. These codes provide industry-specific compliance guidance that, once approved, carries statutory weight and helps clarify how legal obligations apply in real-world contexts. Alternatively, India could adopt a tiered system of obligations, in which basic safeguards apply as a “baseline,” with stricter requirements kicking in only when data is sensitive or processed at a large scale. This would give organisations clearer compliance thresholds without relying on rigid technical checklists, and it would preserve proportionality while offering more structure than a vague “reasonableness” standard. In their current form, the Draft Rules risk promoting box-ticking compliance rather than meaningful safeguards, which could discourage innovation, especially among smaller or more agile players. A more flexible approach, whether drawn from the EU or adapted from other models, would strike a better balance between protecting data and supporting growth.
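To illustrate how such a tiered model might be framed, the following short Python sketch scales required safeguards with the sensitivity and volume of data processed. The tier names, control names, and numeric thresholds are hypothetical choices made for illustration; neither the Act nor the Draft Rules prescribes any of them.

```python
# Hypothetical tiered-obligation model: baseline safeguards for everyone,
# stricter controls only where data is sensitive or processed at scale.
# All tier names, control names, and thresholds below are assumptions.

from dataclasses import dataclass

@dataclass(frozen=True)
class Tier:
    name: str
    required_controls: frozenset

BASELINE = Tier("baseline", frozenset({"encryption_at_rest", "access_logging"}))
ELEVATED = Tier("elevated", BASELINE.required_controls | {"mfa", "annual_audit"})
SIGNIFICANT = Tier("significant",
                   ELEVATED.required_controls | {"dpia", "breach_response_drills"})

def classify(processes_sensitive_data: bool, records_processed: int) -> Tier:
    """Assign a tier from two assumed risk signals: sensitivity and scale."""
    if processes_sensitive_data or records_processed > 1_000_000:
        return SIGNIFICANT
    if records_processed > 100_000:
        return ELEVATED
    return BASELINE

# Example: a start-up processing 50,000 non-sensitive records faces only the
# baseline controls rather than the full Rule 6(1) checklist.
print(sorted(classify(False, 50_000).required_controls))
```

Clear cut-offs of this kind would replace the open-ended “reasonable” and “appropriate” qualifiers with thresholds a Data Fiduciary can actually verify against.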

 

A further layer of complexity emerges from Rule 8, read in conjunction with Section 8(7) of the DPDP Act. These provisions impose a mandatory forty-eight (48) hour waiting period before a Data Fiduciary may erase personal data: the fiduciary must notify the Data Principal and delay deletion for two days thereafter. The arrangement is evidently intended to guard against covert or premature data destruction and to enable individuals to object or seek clarification before their data is permanently removed. While the normative intent behind the Rule, i.e., ensuring transparency and user empowerment, is commendable, its blanket application to all deletion scenarios risks generating unintended consequences. Many AI-driven systems operate on cyclical or iterative models of data use, in which short-lived, non-identifying datasets are routinely discarded as part of automated optimisation processes. In such environments, a hard waiting period complicates the architecture of real-time deletion workflows and may lead to technical inefficiencies, elevated storage costs, and the retention of data that no longer serves any operational or legal purpose.

The impact goes beyond administrative burden. The requirement may also produce retention creep, where data is kept longer than necessary, and this carries significant risks. First, the principle of purpose limitation is diluted, because a legal measure designed to increase transparency simultaneously mandates the prolonged storage of data whose purpose may have already expired. Second, prolonged possession may expand an organisation’s “attack surface”: it has been shown that the probability of a data breach may increase with both the time and the volume of data held, so every extra hour that obsolete data remains live widens the window of vulnerability should a cybersecurity incident occur. Third, extended retention may destabilise data-governance controls. Once outdated data is architecturally sequestered in a “pending deletion” tier, it may become harder to ensure that access controls, encryption keys, and audit trails are applied with the same rigour as to frontline production datasets. This segregation can create grey zones in which engineers or data scientists, intentionally or inadvertently, repurpose residual datasets for model tuning, A/B testing, or other ancillary analytics. These risks matter because a measure intended to enhance the Data Principal’s agency ends up encouraging Data Fiduciaries to hold larger sets of obsolete data, thereby magnifying both breach exposure and function-creep potential. In effect, the mandatory notice-and-wait protocol transforms the right of erasure into a right of delayed erasure, undermining its immediacy and making it less effective in fast-moving, AI-driven environments. The outcome may end up weakening, rather than strengthening, the privacy rights of Data Principals.

 

This is not to suggest that procedural safeguards around data erasure are inherently misguided; the issue is one of proportionality and scope. Article 5(1)(e) of the GDPR, for example, requires that personal data be erased when it is no longer necessary for the purpose for which it was collected, but it does not prescribe a uniform advance notice period. Instead, it permits contextual assessment by the data controller, subject to ex post review by supervisory authorities. While this approach is not without interpretive challenges, it allows for differentiation across processing contexts, particularly between routine operational deletions and data removals that implicate core privacy rights. The Indian framework, by contrast, imposes the same procedural obligation across all processing environments, without regard to the nature of the data, its sensitivity, or the scale of processing. This one-size-fits-all model may be excessive in low-risk contexts, especially where deletion is driven by technical efficiency rather than any rights-implicating motive. A more calibrated approach would preserve the protection of Data Principals while alleviating unnecessary burdens on sectors such as AI and data infrastructure services. One possible resolution is a rule-based carve-out: for example, exempting log-based deletion of ephemeral datasets from the 48-hour rule where no personally identifiable information is involved, or permitting immediate deletion where such actions form part of an automated, pre-audited workflow subject to Data Protection Board oversight. Ultimately, the question is not whether the protection of data subjects should be deprioritised in favour of business convenience; it is whether the form of protection adopted by the Draft Rules is proportionate to the risk it seeks to mitigate. Where a uniform safeguard imposes high compliance burdens without yielding a corresponding privacy benefit, legal design must consider more flexible, risk-tiered instruments. In that sense, the GDPR’s structure, though imperfect, provides a useful comparative lens: not as a wholesale model to emulate, but as a system that builds in contextual variation while preserving robust rights protections.
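A minimal Python sketch of such a carve-out appears below. It assumes hypothetical record attributes (an “ephemeral” flag, a PII indicator, and a marker for a pre-audited automated workflow) that the Draft Rules themselves do not define: records meeting the exemption are deleted immediately, while everything else follows the default notice-and-wait path.

```python
# Minimal sketch of the rule-based carve-out proposed above. All field names,
# the "pre_audited_workflow" flag, and the notification helper are hypothetical.

from datetime import datetime, timedelta, timezone

HOLD_PERIOD = timedelta(hours=48)  # the Draft Rules' notice-and-wait window

def notify_data_principal(record_id: str, erase_at: datetime) -> None:
    """Placeholder for the notice the Draft Rules require before erasure."""
    print(f"Record {record_id}: erasure scheduled for {erase_at.isoformat()}")

def handle_erasure(record: dict) -> str:
    """Route an erasure request under the proposed risk-tiered carve-out."""
    ephemeral = record.get("ephemeral", False)
    contains_pii = record.get("contains_pii", True)
    pre_audited = record.get("pre_audited_workflow", False)

    # Proposed exemption: ephemeral, non-identifying data deleted by a
    # pre-audited automated pipeline is erased immediately.
    if ephemeral and not contains_pii and pre_audited:
        return "deleted_immediately"

    # Default path: notify the Data Principal and hold for 48 hours,
    # i.e. the "pending deletion" tier discussed above.
    erase_at = datetime.now(timezone.utc) + HOLD_PERIOD
    notify_data_principal(record["id"], erase_at)
    return f"pending_deletion_until_{erase_at.isoformat()}"

# Example: a cache entry from a recommendation pipeline versus a user profile.
print(handle_erasure({"id": "cache-123", "ephemeral": True,
                      "contains_pii": False, "pre_audited_workflow": True}))
print(handle_erasure({"id": "user-456"}))
```

The design point is that the 48-hour hold becomes the default rather than the universal rule, confining the “pending deletion” tier to data whose erasure actually engages a Data Principal’s rights.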


The implementation of a fixed notice-and-wait protocol under Rule 8(5) of the Draft DPDP Rules also raises structural concerns for large-scale platforms, such as e-commerce marketplaces and real-time recommendation engines, where data is updated, analysed, and discarded on a rolling basis. For such entities, data cleaning is often part of automated maintenance and optimisation cycles. Requiring manual intervention or delay before every deletion introduces friction into these workflows, complicating compliance and embedding delay into core infrastructure. This is particularly pronounced in AI-intensive systems, where model performance and training integrity often depend on the rapid elimination of outdated or noise-heavy datasets.

 

Scholarly work has increasingly recognised that overly rigid retention and erasure timelines can disrupt data lifecycle management in AI pipelines. It has been noted, for instance, that the GDPR’s data minimisation and purpose limitation principles, while sound in theory, have produced operational uncertainty in machine-learning contexts where data reuse and deletion timelines are not always linear or foreseeable. These findings indicate that while erasure safeguards serve important purposes, particularly in promoting user transparency and accountability, their implementation must be risk-based and context-sensitive, especially in high-velocity data environments. While the Draft Rules seek to protect Data Principals from covert or non-consensual data deletion, they may unintentionally restrict the technical flexibility needed for experimentation and iterative development. Unless the framework includes clear exemptions or streamlined processes, particularly for anonymised or low-risk datasets used in closed AI systems, it risks becoming overly rigid. Such an approach may prioritise formalistic compliance at the expense of innovation and resource efficiency.

*Pushpit Singh is a TMT lawyer in Bengaluru, India. He has experience in dealing with commercial, technology, and data privacy matters. He is a BBA LLB graduate from Symbiosis Law School, Hyderabad (Symbiosis International University, Pune) and an Advanced Diploma Holder in Alternative Dispute Resolution from NALSAR, Hyderabad, India.


**Silvia Tomy Simon is a BA LLB graduate from Symbiosis Law School, Hyderabad (Symbiosis International University, Pune). She holds a certificate in Data: Law, Policy and Regulation from the London School of Economics and Political Science (LSE), UK. Her areas of interest revolve around technology, data privacy, and artificial intelligence.

 
 
 
