03/25/2026 | Press release | Distributed by Public on 03/25/2026 04:12
The Government recognises the transformative potential of emerging digital technologies, particularly in enhancing access to education, information, and services for children. At the same time, the Government is cognizant of the associated risks, including exposure to harmful content, cyberbullying, and issues related to excessive screen time and digital dependency, thereby necessitating appropriate safeguards for children in the digital environment.
As per the information received from the Ministry of Electronics and Information Technology (MeitY), the Government has put in place a comprehensive legal and regulatory framework under the Information Technology Act, 2000, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (as amended), and the Digital Personal Data Protection Act, 2023 to ensure a safe, secure and accountable online environment for users, including children.
The Information Technology Act, 2000, read with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, as amended from time to time, provide a comprehensive legal framework to ensure a safe and accountable online environment.
The Act contains specific penal provisions to address cyber offences, including computer-related offences (Section 43 read with Section 66), identity theft (Section 66C), cheating by personation (Section 66D), violation of privacy (Section 66E), and publication or transmission of obscene, sexually explicit or child sexual abuse material (Sections 67, 67A and 67B). It also provides for blocking of unlawful content (Section 69A) and abetment of offences (Section 84B), and empowers law enforcement agencies to investigate offences and take appropriate action (Sections 78 and 80).
Further, the Act, together with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, establishes a robust framework to prevent the hosting or transmission of unlawful and harmful content online and prescribes due diligence and accountability obligations for intermediaries, including social media platforms.
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 mandate intermediaries to exercise due diligence and prohibit hosting or transmission of unlawful content, including content that is obscene, pornographic, invasive of privacy, harmful to children, promotes hate or violence, impersonates individuals, or threatens national security or public order. The Rules also require intermediaries to periodically inform users of their policies and the consequences of non-compliance, including removal of content or termination of access.
The Digital Personal Data Protection Act, 2023, along with the Rules framed thereunder, provides a comprehensive framework for protection of personal data, including that of children, and mandates lawful processing with appropriate safeguards and accountability. The Act lays down specific safeguards for children by requiring verifiable consent of parents or lawful guardians prior to processing of their personal data and prohibits practices such as tracking, behavioural monitoring and targeted advertising directed at children. It also provides for the right to withdraw consent, with corresponding obligations on data fiduciaries to erase such data in accordance with the provisions of the Act.
The Ministry of Electronics and Information Technology (MeitY) has issued advisories from time to time, including on 26.12.2023, 15.03.2024 and 29.12.2025, reiterating the due diligence obligations of intermediaries under the Information Technology Act, 2000 and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. These advisories, inter alia, emphasise the need to prevent dissemination of unlawful content, including obscene, pornographic, paedophilic material, and emerging harms such as malicious synthetic media and deepfakes.
In view of the increasing use of generative AI and the risks associated with synthetically generated content (SGI), including deepfakes, as well as the potential misuse of such technologies to create or generate SGI of an obscene, vulgar or sexually explicit nature, including CSEAM, which may cause user harm, spread misinformation, manipulate elections, or enable impersonation of individuals, the Government, after due consultation, has notified amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 on 10.02.2026, which came into force on 20.02.2026.
The amendments strengthen due diligence obligations of intermediaries, including social media and significant social media intermediaries, by requiring deployment of appropriate technical measures to prevent dissemination of unlawful AI-generated content, including content that is obscene, misleading, impersonating, or harmful to children. The Rules also mandate clear labelling and traceability of permissible synthetic content, enhanced user awareness, and stricter compliance requirements.
Further, the framework explicitly covers harms such as child sexual exploitation material, non-consensual intimate imagery and impersonation, and prescribes stricter timelines for action, including removal of unlawful content within 3 hours upon appropriate directions, time-bound grievance redressal, and expedited action in cases involving sensitive content.
Through these measures, the Government of India is committed to fostering AI innovation and the growth of digital platforms in a responsible and secure manner, while protecting children and other vulnerable users from emerging risks in the digital ecosystem.
The Ministry of Education issued the PRAGYATA Guidelines on Digital Education in July 2020, which provide a framework for safe and effective online learning, including promotion of students' well-being and responsible use of social media and electronic devices. CBSE has supplemented these efforts through guidelines on digital etiquette, cybersecurity training for teachers, publication of the 'Cyber Security Handbook', and advisories to schools for establishing Cyber Clubs to promote cyber safety awareness. NCERT has also incorporated cyber safety in its curriculum, including a chapter on "Societal Impacts" in Classes XI and XII (https://ncert.nic.in/textbook.php?kecs1=ps-11), and CIET-NCERT has developed and disseminated resource materials on cyber safety (https://ciet.nic.in/pages.php?id=booklet-on-cyber-safety-security&ln=en).
Further, in line with the National Education Policy, 2020 and the National Curriculum Framework for School Education, 2023, Artificial Intelligence (AI) is being integrated in school education as a key 21st-century skill. NCERT and CBSE have been tasked with developing age-appropriate AI curriculum across K-12, and AI-related content has been incorporated in textbooks, senior secondary curriculum and skill-based learning. Capacity-building initiatives for teachers are being undertaken through training programmes, workshops and online courses, with participation of institutions such as NCERT, CBSE, Kendriya Vidyalaya Sangathan and Navodaya Vidyalaya Samiti. A Centre of Excellence for AI in Education is also being established, along with collaborations with academic and industry partners, to promote safe and responsible adoption of AI in schools.
The data relating to crime against children is maintained by the National Crime Records Bureau (NCRB) under the Ministry of Home Affairs (MHA), which may be seen at https://www.ncrb.gov.in/. However, specific data on the year-wise and State/UT-wise number of child suicides attributable to digital addiction is not maintained separately.
This information was given by the Minister of State for Women and Child Development Smt. Savitri Thakur in Rajya Sabha in reply to a question.
****
SS