February 10, 2026 marks a significant step by the Ministry of Electronics and Information Technology ("MeitY") towards regulating a specific output of Artificial Intelligence ("AI") tools, namely "Synthetically Generated Information" ("SGI"), through amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 ("SGI Amendment Rules"). The amendments require intermediaries to adopt active measures to mitigate SGI in order to retain the benefit of safe harbour under Section 79 of the (Indian) Information Technology Act, 2000. Although notified on February 10, 2026, the SGI Amendment Rules will come into effect from February 20, 2026.
In this alert, we examine the operational implications of these amendments for entities using and/or deploying AI tools in India.
Introduction of "Synthetically Generated Information"
The SGI Amendment Rules define SGI as any artificial or algorithmically created or modified "audio, visual or audio-visual information" generated using a computer resource in a manner that such information appears to be "real, authentic or true" and depicts any individual or event in a manner that may be "perceived" as indistinguishable from a natural person or real event. Accordingly, intermediaries are now required to prevent, monitor and respond to the creation, alteration, modification, generation, etc. of SGI in the manner detailed below.
Exempt from this definition is any:
- "routine" or "good faith" activity, i.e., (a) editing that does not "materially alter, distort, or misrepresent" the substance, context, or meaning of the underlying audio, visual or audio-visual information; and (b) creation of documents, PDFs, training or research materials and the like, that does not result in false electronic records; and
- use of a computer resource "solely" to improve accessibility, clarity, quality, translation, description, searchability or discoverability, without generating, altering or manipulating any "material part" of the underlying audio, visual or audio-visual information.
While not explicitly stated in the amendment itself, the Frequently Asked Questions ("FAQs") clarify that text-only AI outputs do not constitute SGI. However, intermediaries remain subject to their general obligations regarding unlawful information with respect to text-only AI outputs.
Business Impact: The SGI Amendment Rules effectively reshape the compliance perimeter governing the development, deployment, and licensing of AI tools within India's safe harbour framework. By defining SGI through a subjective lens, such as "perception" standards, tools from marketing automation to video editing may now face enhanced liability, requiring defensible compliance strategies to either fall within the exemption or demonstrate that the AI-generated output is distinguishable from the real person or event. While the accompanying FAQs indicate that exemptions from SGI classification are limited to basic editing tools, a closer reading suggests that these carve-outs may extend more broadly to other AI applications that do not materially alter or misrepresent underlying content.
Interestingly, outputs from large language model-based tools, enterprise copilots, and customer-support chatbots may not be directly regulated as SGI unless they generate synthetic audio-visual outputs, giving such entities some immediate relief.
Enhanced Compliance Requirements for Synthetically Generated Information
- Periodic Notifications: Every 3 (three) months, intermediaries are required to inform users that: (a) a breach of their terms and/or privacy policy can result in suspension or termination of the account and/or removal of content; (b) posting illegal content can result in penalties under applicable laws and personal legal consequences; and (c) where a violation relates to the commission of an unlawful act that is mandatorily reportable, the intermediary will report the user.
Business Impact: The mandatory quarterly notification requirement creates significant operational and user-engagement challenges for intermediaries. By forcing platforms to repeatedly warn users about account termination, legal liability, and mandatory crime reporting every 3 (three) months, the rule introduces permanent friction into the user experience that risks driving user attrition, suppressing content creation, and damaging brand perception. Quarterly legal warnings create notification fatigue, conditioning users to dismiss platform communications and undermining the effectiveness of genuinely important updates. This is particularly problematic for user-generated content platforms, where the chilling effect of repeated criminal liability warnings may reduce posting frequency and encourage self-censorship among legitimate users.
- SGI-Specific Warnings by Intermediaries Facilitating SGI: In addition to the 3 (three)-month notification requirement applicable across all intermediaries and service functions, intermediaries that specifically "enable, permit or facilitate the creation, generation, modification, etc. of SGI" must also inform their users that the creation, generation, modification, etc. of SGI may attract penalties, lead to the suspension or termination of their accounts and/or disclosure of the identity of the violating user, and require the intermediary to report such user in case the violation relates to a reportable crime. The SGI Amendment Rules, however, remain silent on the periodicity of these warnings.
Business Impact: SGI-enabling platforms must deliver both the universal quarterly warnings and separate SGI-specific warnings on an undefined schedule, compounding notification fatigue and user friction while creating enforcement risk, since platforms cannot determine whether daily, per-session, or per-feature warnings satisfy the unspecified "periodicity" requirement.
- Reporting Obligation vis-à-vis SGI: If the intermediary "becomes aware", whether on its own accord, through receipt of actual knowledge or based on any grievance, of the creation, alteration, modification, hosting, displaying, uploading, publishing, transmitting, storing, updating, sharing or other dissemination of information as "SGI", the intermediary is required to immediately disable access, suspend or terminate the user account, disclose the identity of the violating user, and report such user in case the violation relates to a reportable crime.
Business Impact: The requirement for immediate action upon SGI awareness creates a compliance paradox: over-moderate to avoid liability for delayed responses, or risk exposure to regulatory penalties and potential criminal liability for under-enforcement.
- Deploying Detection Tools: Any intermediary that may enable, permit, or facilitate the creation, generation, modification, alteration, publication, transmission, sharing, or dissemination of information as SGI must employ reasonable and appropriate technical measures, including automated tools or other suitable mechanisms, to prevent any user from creating, generating, modifying, altering, publishing or transmitting SGI.
Business Impact: Presently, the standard such technical measures must meet remains unknown, requiring intermediaries to invest substantially in detection technology that does not yet reliably exist at scale, while bearing liability for any SGI that bypasses their controls.
- Labeling Obligations: Intermediaries must implement a dual-layer disclosure system combining user-facing labels with technical tracking mechanisms.
On the user-facing side, visual content such as images and videos must display a prominent, easily noticeable label that clearly identifies the material as SGI, i.e., the disclosure should be positioned conspicuously and worded in a manner that ordinary viewers can immediately recognize that the content was created or modified by AI. Similarly, audio content must carry an audio disclosure of SGI prefixed to the content.
Beyond these visible and audible warnings, intermediaries must also embed permanent metadata and technical provenance mechanisms within the content files themselves, including unique identifiers that trace back to the specific computer resource or platform used to create the synthetic content. This technical layer creates a digital fingerprint that persists even when files are downloaded or shared, enabling verification and accountability. Importantly, the rules acknowledge practical limitations by requiring these technical measures only "to the extent technically feasible," recognizing that certain file formats or distribution channels may not support robust metadata preservation.
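To make the provenance mechanism concrete, the sketch below shows one simplified way a platform might generate a unique identifier and tamper-evident fingerprint for a piece of synthetic content. It is illustrative only and not prescribed by the SGI Amendment Rules: the function name, field names, and detached-JSON approach are hypothetical, and a production system would more likely embed signed assertions inside the media file itself, for example via the C2PA standard referenced below.

```python
import hashlib
import json
import uuid
from datetime import datetime, timezone

def build_provenance_manifest(content: bytes, platform_id: str) -> dict:
    """Build an illustrative provenance record for a piece of SGI.

    Simplified sketch: real deployments would embed signed metadata
    in the file itself (e.g. a C2PA manifest), not a detached record.
    """
    return {
        # Unique identifier tracing the content back to the platform
        # or computer resource that generated it
        "identifier": str(uuid.uuid4()),
        "platform": platform_id,
        # Content hash acts as a tamper-evident digital fingerprint
        "sha256": hashlib.sha256(content).hexdigest(),
        "label": "Synthetically Generated Information (SGI)",
        "generated_at": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical usage for a rendered video file's bytes
manifest = build_provenance_manifest(b"<rendered video bytes>", "example-ai-studio")
print(json.dumps(manifest, indent=2))
```

Because the hash is derived from the content bytes, any alteration of the file after labeling would break the fingerprint, which is the verification property the rules' "traceability" language appears to contemplate.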
Business Impact: Labelling necessarily degrades the appeal and shareability of AI-generated content, which directly undermines the value proposition of creative AI tools used for marketing, entertainment, and social media. The requirement to embed permanent metadata and unique identifiers in content files demands substantial engineering investment in provenance tracking systems (such as implementation of the C2PA standard), creates ongoing maintenance costs for metadata preservation across format conversions and distribution channels, and exposes platforms to liability when users strip metadata or share content through third-party services that do not preserve technical markers. While the "to the extent technically feasible" qualifier provides some flexibility, it is little respite, since platforms must prove infeasibility rather than assume it, creating compliance uncertainty and potential enforcement risk when metadata is lost through legitimate technical limitations.
- Shortened Take Down Timelines: Apart from the 3 (three)-month (from yearly) notification to users discussed above, set out below is a snapshot of all shortened timelines:

| Compliance | Erstwhile Timeline | New Timeline |
| --- | --- | --- |
| Take down for any unlawful act (and not just SGI) pursuant to actual knowledge from a Court Order or intimation by an officer | 36 hours | 3 hours |
| Resolve any grievance complaint | 15 days | 7 days |
| Take down of content in respect of a grievance complaint | 72 hours | 36 hours |
| Removal of sexual, nude or morphed content, from receipt of complaint | 24 hours | 2 hours |
Enhanced Obligations upon Significant Social Media Intermediaries ("SSMI")
An intermediary with 50,00,000 (fifty lakh) (i.e., 5,000,000 (five million)) registered users is required to ensure that its users declare whether any information is SGI and must necessarily deploy AI detection tools (unlike other intermediaries, which may merely endeavor to do so). To the extent that the declaration or technical verification confirms that the information is SGI, appropriate labels or notices must be placed to indicate that the content is SGI. The SGI Amendment Rules further clarify that it is the responsibility of the SSMI to take reasonable and appropriate measures to verify the correctness of declarations made by users and to ensure that no SGI is published without a label.
Business Impact: Beyond mandatory deployment of detection tools, SSMIs must operationally integrate mandatory user disclosure, verification workflows, and automated labelling controls, effectively shifting onto intermediaries the responsibility and cost of detecting and preventing the publication of unlabeled SGI, and increasing compliance and moderation burdens.
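The declare-verify-label sequence for SSMIs described above can be sketched as a simple decision flow. The function, field names, and outcome labels below are hypothetical illustrations of one possible internal workflow, not anything prescribed by the SGI Amendment Rules; the actual detection logic is abstracted behind a boolean flag.

```python
from dataclasses import dataclass

@dataclass
class Upload:
    content_id: str
    user_declared_sgi: bool   # user's mandatory SGI declaration
    detector_flags_sgi: bool  # result of the platform's detection tooling

def moderation_decision(upload: Upload) -> str:
    """Illustrative SSMI workflow: label content as SGI if either the
    user declares it or automated verification detects it, and flag
    declaration/detection mismatches for correctness review."""
    if upload.detector_flags_sgi and not upload.user_declared_sgi:
        # Possible false declaration: label, and review the user's
        # declaration as the rules require verifying its correctness
        return "label-and-review"
    if upload.user_declared_sgi or upload.detector_flags_sgi:
        return "label-as-sgi"
    return "publish-unlabelled"

print(moderation_decision(Upload("vid-1", True, False)))   # → label-as-sgi
print(moderation_decision(Upload("vid-2", False, True)))   # → label-and-review
```

The mismatch branch reflects the rule that an SSMI cannot simply rely on user declarations: undeclared content flagged by detection is the case where the verification obligation bites hardest.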
Conclusion
The SGI Amendment Rules mark a shift in India's intermediary liability framework from reactive content moderation toward proactive governance of SGI. While the amendments formally target synthetic audio-visual content, their operational effect extends more broadly, requiring platforms to redesign onboarding, moderation, detection, and labeling workflows to preserve safe harbour protection.
For AI developers and deployers, the immediate impact will be most acute where tools enable generation or distribution of realistic SGI. Businesses should therefore undertake prompt reviews of product architecture, user flows, and risk controls to ensure that AI-enabled services can continue to operate within India's evolving regulatory perimeter.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.