Children’s Data: Rethinking Schedule IV Exemptions

India’s Digital Personal Data Protection Act, 2023 (DPDP Act) creates heightened safeguards for children’s personal data, including requirements for verifiable parental consent (VPC) and a ban on tracking, behavioural monitoring, and targeted advertising for minors. However, the DPDP Rules introduce Schedule IV, which exempts wide categories of data fiduciaries and purposes from these child-specific protections. This article analyses whether these exemptions are justified, excessive, or structurally risky — and whether India has departed too far from global best practice in protecting children online.

The exemptions in Schedule IV of the DPDP Rules carve out wide classes of data fiduciaries and broad categories of purposes from the strict parental-consent and anti-tracking requirements under Sections 9(1) and 9(3) of the DPDP Act. The exemptions extend to the healthcare, education, and child and infant care sectors, as well as to third-party service providers engaged by such educational institutions and childcare establishments. Also exempted is processing for the exercise of any power or performance of any function in the interests of children, the delivery of public funds, subsidies, and benefits, the creation of email accounts, real-time location tracking, and ensuring restricted access to information, services, and advertisements directed at children. These exemptions are simultaneously pragmatic and problematic. They are pragmatic because they respond to clear implementation constraints; they are problematic because they hollow out the strongest legal protections for children and create regulatory complexity that will advantage well-resourced actors and confuse smaller ones. The central question, then, is whether these exemptions are excessive or required, and whether India should have retained VPC as the normative rule for processing children’s data.

While some exemptions are necessary, their current scope is so wide that it comes close to defeating the purpose of the protections themselves. On one hand, the Act’s parental consent mandate and its categorical prohibition on behavioural monitoring of children are administratively heavy: meaningful VPC at scale demands reliable, ubiquitous age-verification infrastructure that India does not yet possess. Requiring authentication for routine public-interest services such as education, health, welfare delivery, or disability support would render these systems unworkable. The Rules’ drafters clearly intended to avoid paralysing essential services and to protect the informal economy by permitting carve-outs for state functions, healthcare, schooling, and benefits delivery. From an administrative-feasibility perspective, this is defensible: a legal rule that cannot be implemented in practice produces chaos and inequitable compliance. In this sense, some form of exemption was not merely advisable but inevitable.

On the other hand, the exemptions are overbroad in form and risk creating perverse incentives. First, Schedule IV’s Parts A and B do not simply list narrow exceptions; they cover entire sectors and wide functional categories: “education,” “healthcare,” even “creation of user accounts for communication.” This drafting invites interpretation disputes and allows commercial actors to strategically characterise their processing to fall within an exempted category. For example, an ed-tech platform offering ad-supported features can plausibly argue that it is an “education service” and thus escape the VPC and anti-tracking obligations, even when engaging in behavioural analytics for monetisation. The practical effect is a shift from clear statutory protection toward a discretionary, fact-dependent exercise that empowers sophisticated actors to structure themselves into exemptions while leaving small schools, NGOs, and local service providers uncertain and overburdened.

Second, these exemptions aggravate enforcement and accountability asymmetries. One would expect that the highest-risk actors, such as large digital platforms and ad-driven engagement ecosystems, should face the strictest constraints when processing children’s data. Yet the sectoral carve-outs create pathways for precisely such actors to claim exemption. Regulators will be forced into complex, purpose-driven interpretation disputes, a resource-intensive task that disproportionately benefits well-funded incumbents capable of deploying legal and technical argumentation. Meanwhile, smaller actors, fearing non-compliance, may collect unnecessary data or avoid collecting information that is genuinely needed, weakening their services and potentially excluding vulnerable children. This tension between usability and meaningful protection has been exacerbated by the carve-outs under Schedule IV.

Furthermore, from a child-rights and ethical standpoint, the exemptions represent a significant step backwards. The purpose of child-specific provisions is to recognise children’s limited capacity to assess digital risks and their heightened susceptibility to manipulation. By treating VPC as dispensable in most contexts, the Rules implicitly subordinate child protection to administrative convenience. While this may be justifiable for narrowly defined public-interest functions, it is far harder to defend across sectors where commercial incentives to profile and nudge minors remain strong. When the law ceases to provide a strong presumption in favour of parental oversight, responsibility shifts to ad hoc regulatory interventions and voluntary corporate restraint — approaches that have historically been inadequate.

Comparative global practice reinforces this concern, although different jurisdictions use different legal mechanisms. In the United States, COPPA requires verifiable parental consent for the collection of personal information from children under 13 by online services directed to children or services with actual knowledge of a child user; its exemptions are narrow and mostly operational (such as security, support for internal operations, or legal compliance). In the European Union, Article 8 of the GDPR requires parental authorisation for processing the personal data of children under 16 in the context of “information society services” offered directly to them, with Member States permitted to lower that age to no less than 13. The GDPR does not rely on broad sectoral carve-outs; essential services instead proceed under other lawful bases such as “public interest” or “legal obligation”, which still impose strict necessity and proportionality tests. The United Kingdom’s Age-Appropriate Design Code (Children’s Code) does not create a separate parental-consent regime but imposes stringent design-level obligations such as high privacy by default, limits on profiling unless it is demonstrably in the child’s best interests, and mandatory data protection impact assessments (DPIAs) for services likely to be accessed by children. California’s emerging approach (under the California AADC, currently under litigation but indicative of policy direction) similarly emphasises safety-by-design standards and assessments rather than relying solely on parental consent.

Taken together, these regimes differ in technique but converge on a common principle: special protections for children should be the baseline, and any relaxation of those protections must be narrow, justified, and accompanied by strong structural safeguards. Against this backdrop, India’s choice to exempt wide classes of data fiduciaries and broad categories of purposes through Schedule IV represents an unusually permissive model in comparative terms.

There are also practical steps that would preserve necessary flexibility while restoring protective force. The Rules should adopt narrow, tightly worded exemptions tied to specific tasks rather than entire sectors; mandate baseline technical and organisational safeguards, including data minimisation and default bans on profiling; require child-specific impact assessments; and create a public register of entities claiming Schedule IV exemptions. Most importantly, the burden of proof must be inverted: data fiduciaries seeking exemption should be required to publicly justify why VPC is infeasible and how risks to children are mitigated.

In short, while exemptions were necessary to avoid operational paralysis, the current sweep of Schedule IV is excessive and risks institutionalising weaker standards for child data protection. A more transparent, proportionate, and child-centric regulatory model grounded in VPC as the normative rule and supported by narrowly tailored exceptions would better reconcile India’s administrative realities with its substantive responsibility to protect minors in an increasingly data-driven ecosystem.

Image Credits:

Photo by Stock Dignity on Canva

