Centre Tightens IT Rules to Enhance Transparency in Online Content Removal

The Indian government has amended the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, to strengthen transparency and accountability in the removal of unlawful online content. These changes, effective from November 1, 2025, aim to curb the growing misuse of AI-generated content and to ensure that content takedowns are carried out with greater oversight.

Key Amendments to Rule 3(1)(d)

Under the revised Rule 3(1)(d), intermediaries are required to remove unlawful content upon receiving actual knowledge, either through a court order or a notification from the appropriate government. The amendments introduce several safeguards to enhance transparency and accountability:

  • Senior-Level Authorization: Content removal orders can now be issued only by senior officials not below the rank of Joint Secretary or equivalent. In the absence of such an officer, a Director or equivalent may issue the order. For police authorities, such intimations may be issued only by an officer not below the rank of Deputy Inspector General (DIG) who has been specially authorized for the purpose.
  • Reasoned Intimations: Every content removal order must state a clear legal basis, the statutory provision invoked, the nature of the unlawful act, and the specific URL or identifier of the content to be removed (the sketch after this list illustrates these elements). This replaces the earlier broad reference to ‘notifications’ with ‘reasoned intimation,’ aligning the rules with the requirement of ‘actual knowledge’ mandated under Section 79(3)(b) of the IT Act.
  • Monthly Review: All intimations issued under Rule 3(1)(d) will be subject to a monthly review by an officer not below the rank of Secretary of the appropriate government, to ensure that such actions remain necessary, proportionate, and consistent with the law.
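
Because the amended rule spells out exactly what a valid intimation must contain, a platform's compliance tooling can treat it as a structured record rather than a free-form notice. The Python sketch below is purely illustrative: the field names, rank strings, and completeness check are assumptions for the sake of the example, not an official schema prescribed by the rules.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical sketch only: field names, rank strings, and checks are
# illustrative and do not reflect any official schema under the rules.
AUTHORIZED_RANKS = {
    "Joint Secretary",           # default rank for issuing an intimation
    "Director",                  # permitted in the absence of a Joint Secretary-level officer
    "Deputy Inspector General",  # for police authorities, when specially authorized
}

@dataclass
class TakedownIntimation:
    issuing_officer_rank: str          # rank of the officer issuing the intimation
    legal_basis: str                   # clear legal basis for removal
    statutory_provision: str           # the specific statutory provision invoked
    unlawful_act: str                  # nature of the unlawful act alleged
    content_identifiers: list[str] = field(default_factory=list)  # specific URLs or identifiers
    issued_on: date = field(default_factory=date.today)

    def is_complete(self) -> bool:
        """Return True only if every element the amended rule describes is present."""
        return (
            self.issuing_officer_rank in AUTHORIZED_RANKS
            and bool(self.legal_basis.strip())
            and bool(self.statutory_provision.strip())
            and bool(self.unlawful_act.strip())
            and len(self.content_identifiers) > 0
        )

# Example: an intimation that omits the specific URL fails the completeness check.
order = TakedownIntimation(
    issuing_officer_rank="Joint Secretary",
    legal_basis="<clear legal basis>",
    statutory_provision="<statutory provision invoked>",
    unlawful_act="<nature of the unlawful act>",
    content_identifiers=[],
)
print(order.is_complete())  # False: no URL or identifier supplied
```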

Addressing AI-Generated Content

In addition to procedural reforms, the Ministry of Electronics and Information Technology (MeitY) has introduced measures to address the rise of AI-generated content. The amendments define “synthetically generated information” as content created or altered using AI or algorithmic tools to appear authentic. Platforms are now required to:

  • Label AI-Generated Media: All synthetic media must be clearly labeled to ensure transparency.
  • Obtain User Declarations: Platforms must obtain declarations from users on whether the content they upload is synthetically generated.
  • Deploy Verification Mechanisms: Platforms must implement technical measures to verify user declarations and the authenticity of the content; a rough sketch of how these pieces might fit together follows this list.
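
Read together, the three obligations describe a simple labelling pipeline: collect a declaration at upload, run a platform-side check, and attach a visible label to anything flagged as synthetic. The sketch below is a minimal, hypothetical illustration of that flow; the class, function, and label wording are assumptions, not a format prescribed by the amended rules.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: names and label wording are illustrative,
# not a format prescribed by the amended rules.
SYNTHETIC_LABEL = "This content is synthetically generated"

@dataclass
class Upload:
    content_id: str
    user_declared_synthetic: bool    # the user's declaration collected at upload
    detected_synthetic: bool         # result of the platform's own verification check
    visible_label: Optional[str] = None

def apply_labelling_policy(upload: Upload) -> Upload:
    """Attach a clear label whenever the declaration or the verification check flags the content as synthetic."""
    if upload.user_declared_synthetic or upload.detected_synthetic:
        upload.visible_label = SYNTHETIC_LABEL
    return upload

# Example: a declared AI-generated upload is published with a visible label.
item = apply_labelling_policy(
    Upload(content_id="vid-001", user_declared_synthetic=True, detected_synthetic=False)
)
print(item.visible_label)  # "This content is synthetically generated"
```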

Implications for Tier 2 Cities

The amendments are particularly relevant for Tier 2 cities, where internet penetration is rising rapidly and digital literacy varies widely. Greater transparency in how content is removed will help users in these regions understand why material is taken down, and should reduce their exposure to misinformation and harmful content.

Conclusion

The government’s amendments to the IT Rules signify a proactive approach to regulating online content and enhancing user trust in digital platforms. By ensuring that content removal processes are conducted with greater transparency and accountability, the government aims to create a safer and more reliable online environment for all users.

Sakshi Lade
