The National Human Rights Commission (NHRC) has issued notices to several central ministries seeking, within 15 days, a report on action taken over alleged widespread violations of child data protection norms by artificial intelligence, social media and ed-tech platforms in India.

In a notification dated March 24, 2026, the Commission said it had taken note of a complaint based on a research report by the Institute for Advanced Study in Asia (ASIA), which cited “serious, widespread and systematic abuses” by digital, social media, educational technology and artificial intelligence platforms that are widely accessible to children in India.
The notice has been sent to the Ministry of Electronics and Information Technology (MeitY), Ministry of Home Affairs, Ministry of Women and Child Development, Ministry of Education, Department of Communications, among others.
The ASIA report, titled ‘DPDP Compliance in relation to Children’s Data’, assesses 14 widely used platforms under the provisions of the Digital Personal Data Protection (DPDP) Act 2023. It finds that across 196 compliance checks, 71% were non-compliant, 16% were partially non-compliant, and only 13% were relatively compliant.
The report classifies platforms into four risk levels. Instagram, xAI Grok, Canva, ChatGPT, Perplexity, and WhatsApp fall into the very high risk category, with scores ranging from 89% to 100%. Gemini, Notebook LM, Microsoft Math Solver, and Claude are rated as high risk, while Photomath, Khan Academy, and SATHEE are in the medium risk category.
DIKSHA, the government-run platform, is the only platform to be placed in the low risk category with a score of 46%.
To be sure, the DPDP framework is being implemented in phases, starting with the notification of the Rules in November 2025, which brought the underlying legal framework into force along with the activation of institutions such as the Data Protection Board. A second phase follows 12 months later, around November 2026, when key systems such as consent managers and related compliances are expected to become operational. Full compliance is required after 18 months, around May 2027, when all provisions are expected to be in effect, including notice and consent requirements, protections for children’s data, breach reporting obligations, and the wider responsibilities of data fiduciaries. The report frames its findings against this phased rollout, as platforms move toward compliance.
The IT Ministry is, however, consulting industry on fast-tracking this timeline and shortening compliance periods, in some cases from 18 months to immediate compliance.
The report identifies recurring vulnerabilities across platforms, including the absence of verifiable parental consent mechanisms, reliance on self-declared age verification, behavioral tracking and profiling of minors, and sharing of children’s data with third parties without adequate safeguards. It also highlights the mismatch between Indian law and platforms’ policies, noting that while the DPDP Act defines a child as anyone under the age of 18, most platforms use a minimum age of 13, creating a “five-year regulatory gap.”
The NHRC said in its notification that the findings prima facie indicate that these platforms, which act as data fiduciaries, are “failing to fulfill their legal obligations,” exposing children to risks such as unlawful data processing, behavioral monitoring, profiling, and algorithmic manipulation.
Such entities allegedly enable “continuous tracking, behavioral profiling, and manipulation of minors using algorithms,” along with the widespread sharing of children’s data with third parties without informed consent, the Commission added. It described these matters as indicating “serious fiduciary misconduct in relation to data” and violations of principles such as purpose limitation and data minimization.
A response from the ministries is awaited, and this report will be updated when it is received.