
The Indian government has amended the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, to strengthen transparency and accountability in the removal of unlawful online content. These changes, effective from November 1, 2025, aim to curb the growing misuse of AI-generated content and to ensure that content takedown processes are subject to greater oversight.
Key Amendments to Rule 3(1)(d)
Under the revised Rule 3(1)(d), intermediaries must remove unlawful content upon receiving actual knowledge of it, either through a court order or a notification from the appropriate government. The amendments introduce several safeguards to make this removal process more transparent and accountable; a minimal sketch of how an intermediary might process such notices appears below.
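To make the compliance workflow concrete, the following is a minimal sketch of how an intermediary’s internal tooling might record an incoming takedown notice and check that it identifies the content and its legal basis before acting on it. The field names and checks are illustrative assumptions, not terms defined in the Rules.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record of a takedown notice received under Rule 3(1)(d).
# Field names are illustrative assumptions, not terms from the Rules.
@dataclass
class TakedownNotice:
    source: str             # "court_order" or "government_notification"
    issuing_authority: str  # court or appropriate government department
    legal_basis: str        # statutory provision cited in the notice
    content_url: str        # specific URL or identifier of the content
    received_on: date

def is_actionable(notice: TakedownNotice) -> bool:
    """Basic completeness check before the notice enters the removal queue."""
    if notice.source not in {"court_order", "government_notification"}:
        return False
    # A notice that does not identify the content or its legal basis
    # cannot be acted on transparently.
    return all([notice.issuing_authority, notice.legal_basis, notice.content_url])

# Example usage with hypothetical values
notice = TakedownNotice(
    source="government_notification",
    issuing_authority="Appropriate Government (MeitY)",
    legal_basis="Section 79(3)(b), IT Act 2000",
    content_url="https://example.com/post/12345",
    received_on=date(2025, 11, 1),
)
print(is_actionable(notice))  # True
```

In practice an intermediary would layer its own review, logging, and escalation steps on top of a record like this; the sketch only shows the minimal bookkeeping that a transparent takedown process implies.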
Addressing AI-Generated Content
In addition to these procedural reforms, the Ministry of Electronics and Information Technology (MeitY) has introduced measures to address the rise of AI-generated content. The amendments define “synthetically generated information” as content created or altered using AI or algorithmic tools in a way that makes it appear authentic. Platforms are now expected to clearly identify and label such content for users, as illustrated in the sketch below.
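As a purely illustrative sketch, and not a format prescribed by the Rules, a platform might stamp a visible disclosure on generated images before serving them. The label text, placement, and the use of the Pillow library here are assumptions made for the example.

```python
from PIL import Image, ImageDraw

def add_synthetic_label(path_in: str, path_out: str,
                        label: str = "AI-generated content") -> None:
    """Overlay a visible disclosure label on a synthetically generated image.

    The label text and placement are illustrative; the Rules do not
    prescribe this particular format.
    """
    img = Image.open(path_in).convert("RGB")
    draw = ImageDraw.Draw(img)
    width, height = img.size
    # Draw a dark strip along the bottom edge and write the label on it.
    strip_height = max(24, height // 10)
    draw.rectangle([(0, height - strip_height), (width, height)], fill=(0, 0, 0))
    draw.text((10, height - strip_height + 5), label, fill=(255, 255, 255))
    img.save(path_out)

# Example usage (hypothetical file names)
add_synthetic_label("generated.png", "generated_labelled.png")
```

A real deployment would likely combine a visible label like this with machine-readable provenance metadata, so that both human viewers and downstream systems can recognise the content as synthetic.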
Implications for Tier 2 Cities
The amendments are particularly relevant for Tier 2 cities, where internet penetration is rising rapidly and digital literacy varies widely. Greater transparency in content removal will help users in these regions better understand how platforms handle online content, reducing the impact of misinformation and other harmful material.
Conclusion
The government’s amendments to the IT Rules signal a proactive approach to regulating online content and strengthening user trust in digital platforms. By making content removal processes more transparent and accountable, the government aims to create a safer and more reliable online environment for all users.