11 February 2026

INDIA: IT Intermediary Rules Amended – New Obligations For AI / Synthetic Content

Remfry & Sagar

Established in 1827, Remfry & Sagar offers services across the entire IP spectrum with equal competence in prosecution and litigation. Engagement with policy makers ensures seamless IP solutions for clients and contributes towards a larger change in India’s IP milieu. Headquarters are in Gurugram, with branches in Chennai, Bengaluru and Mumbai.

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026 have been notified and will take effect on February 20, 2026, significantly expanding the regulatory framework governing synthetically generated (AI-generated) audio-visual content.

Key Takeaways:

1. Statutory recognition of "synthetically generated information"

The Rules now expressly define "synthetically generated information" to include AI-created or AI-altered audio, visual and audio-visual content that appears real or is indistinguishable from real persons or events (i.e., deepfakes). Routine or good-faith editing, accessibility improvements, and good-faith educational or design uses are expressly excluded.

Further, any reference to "information" used for unlawful acts under the Rules now expressly includes synthetically generated information.

2. Proactive compliance obligations for AI-enabling platforms

Intermediaries that enable or facilitate the creation or dissemination of synthetically generated content must:

  • Deploy reasonable and appropriate technical measures – including automated tools – to prevent the generation or circulation of unlawful synthetic content (including child sexual abuse material, non-consensual imagery, obscenity, false documents, explosives or weapons-related material, and deceptive impersonation or deepfakes that misrepresent individuals or real-world events);
  • Ensure all lawful AI-generated content is clearly and prominently labelled and embedded with permanent metadata / provenance identifiers, to the extent technically feasible (an illustrative sketch follows this list); and
  • Prevent removal of, or tampering with, such labels or embedded metadata.
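
For teams assessing what "technically feasible" embedding might look like, the sketch below shows one possible way a platform could write a provenance marker into a generated image's metadata. It is a minimal, hypothetical illustration: the function name, manifest fields and file paths are invented for this example, it assumes PNG output and the Pillow library, and the Rules do not prescribe any particular standard or tooling.

```python
# Minimal sketch (hypothetical): embedding a provenance label in a PNG's metadata.
# Assumes Pillow is installed; the "provenance" key and manifest fields are illustrative only.
import json
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def embed_provenance_label(in_path: str, out_path: str, generator: str) -> None:
    """Write a 'synthetically generated' marker into a PNG's text metadata."""
    manifest = {
        "synthetic": True,                 # flags the content as AI-generated
        "generator": generator,            # tool or model that produced it
        "label": "AI-generated content",   # human-readable label to display with the file
    }
    image = Image.open(in_path)
    metadata = PngInfo()
    metadata.add_text("provenance", json.dumps(manifest))
    image.save(out_path, pnginfo=metadata)

# Usage (hypothetical paths):
# embed_provenance_label("output.png", "output_labelled.png", "example-model-v1")
```

Plain text metadata of this kind is easy to strip, which is one reason the Rules pair the labelling duty with an obligation to prevent removal or tampering; more durable approaches bind the provenance record to the content itself, for example through cryptographically signed manifests.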

3. Mandatory user disclosures and enforcement consequences

Earlier, intermediaries were required to inform users annually of the consequences of non-compliance with platform rules. The revised framework now mandates that users be informed at least once every three months.

In addition to reiterating the intermediary's right to suspend or terminate access and remove content, platforms must now expressly warn users that unlawful activity may attract statutory penalties, and that offences requiring mandatory reporting – including under the Bharatiya Nagarik Suraksha Sanhita, 2023 and the POCSO Act – will be reported to authorities.

Where platforms enable synthetically generated content, they must additionally notify users that misuse of such tools may result in disclosure of their identity to victims (where legally permissible).

4. Enhanced obligations for significant social media intermediaries

Before publishing content, significant social media intermediaries must:

  • Require users to declare whether content is synthetically generated;
  • Technically verify such declarations; and
  • Prominently label AI-generated content.

Failure to do so may be treated as a lack of due diligence.

5. Sharply reduced response timelines

Under the amended framework:

  • The overall grievance resolution period has been reduced from 15 days to 7 days.
  • The 72-hour window for specified removal requests has been shortened to 36 hours.
  • The 24-hour timeline for action on sensitive content (including non-consensual intimate imagery and impersonation) has been reduced to 2 hours.
  • Compliance with lawful takedown directions issued by authorised officers has been reduced from 36 hours to 3 hours.

The amendments further require intermediaries to act promptly when they become aware, either independently or upon complaint, of violations involving synthetically generated information, including disabling access, suspending accounts and reporting offences where legally mandated.

Importantly, the amendments clarify that removing or disabling access to synthetically generated information in compliance with the revised rules does not jeopardise safe harbour protection under Section 79(2) of the Information Technology Act, 2000.

6. Alignment with new criminal laws

References to the Indian Penal Code have been replaced with the Bharatiya Nyaya Sanhita, 2023, aligning intermediary obligations with India's updated criminal framework.

Why this matters

These amendments raise the compliance bar for platforms that host or enable AI-generated content. The focus is clearly on labelling, traceability and preventing misuse of synthetic media, with direct implications for AI platforms, social media companies and content-hosting intermediaries.

While the final rules broadly follow the October 2025 draft rules, they make some important changes. The earlier proposal had set very specific display requirements for labels. The final version instead requires clear and prominent labelling, subject to technical feasibility. At the same time, it introduces firmer due-diligence duties and much shorter response timelines.

Notably, unlike the current proactive-monitoring provision applicable to significant social media intermediaries, the new synthetic-content clause for all intermediaries does not expressly repeat safeguards such as proportionality, human oversight or bias review, reflecting a stronger emphasis on compliance and enforcement.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
