Introduction
A deepfake denotes highly realistic synthetic media of a real person, generated using artificial intelligence. While a parallel can be drawn between photo-alteration technology and deepfakes, the latter is inherently more insidious because it makes the doctoring difficult to detect. Leaving this technology unregulated carries severe risks, as it can be used to disseminate misinformation with drastic political, reputational and financial implications.
Recently, the Ministry of Electronics and Information Technology organised the Digital India Dialogues on Misinformation and Deepfakes. The government cautioned social media and other internet-based intermediaries that a failure to act on deepfake content posted on their platforms, amongst the eleven other ‘user harms’ stipulated in the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules”), could warrant penalisation under the relevant sections of the Indian Penal Code, 1860 (“IPC”). This position could well trigger an influx of litigation, especially given the dearth of effective guidelines and adjudicatory mechanisms to adjudge intermediary liability.
Intermediary Liability in India
The liability of digital intermediaries vis-à-vis cybercrimes has undergone a marked shift in recent years. We have previously written an article on the legal liability of intermediaries in India, which can be accessed here. Section 79 of the Information Technology Act, 2000 (“IT Act”), introduced the ‘safe-harbour immunity clause’, which absolves an intermediary of liability for third-party content on its platform, provided it complies with the prescribed due diligence requirements. The 2008 amendment to the IT Act modified the provision to lay down the conditions that a digital intermediary must satisfy to qualify for this immunity.
The IT Rules have furthered this regulatory regime. Rule 3 of the IT Rules substantiates the standard of due diligence that must be maintained, which includes inter alia: (a) publication of regulations, policies and user agreements and their enforcement; (b) disabling access to unlawful information within 36 (thirty-six) hours of being notified by the competent authority; (c) undertaking reasonable security practices; and (d) reporting cyber security incidents and sharing related information with the requisite enforcement agencies. Rule 4 of the IT Rules provides specific due diligence requirements for significant social media intermediaries. The accountability expected of digital intermediaries has thus gradually hardened into a liability, the breach of which attracts legal consequences.
This position is mirrored by an advisory issued by the Government of India, dated November 7, 2023, to significant social media intermediaries. It fastens upon them the responsibility of ensuring expeditious action against deepfakes within the timelines stipulated in the IT Rules, a contravention of which would result in the loss of their immunity under Section 79(1) of the IT Act. This dilution of the safe-harbour principle was also a central tenet of the proposed Digital India Bill, 2023, which signalled the need to reconsider the extent to which social media intermediaries, among others, can avoid liability. It was observed that the platforms for which the principle was created have morphed into functionally diverse platforms requiring different regulatory thresholds.
Liability under existing laws
Deepfakes can be used for identity theft, criminal intimidation, hate speech, the creation of false representations of individuals, or the manipulation of public opinion, causing reputational and credibility loss and spreading misinformation, in fundamental breach of the right to privacy vested in every individual. These acts are punishable under the IPC and require adjudication of criminal liability by competent courts of law.
Section 79(3) of the IT Act, however, shifts liability onto intermediaries in two circumstances: (a) where they have conspired, abetted, aided or induced the commission of an unlawful act; or (b) where they have failed to act expeditiously to disable access to data residing in a resource within their control that is being used to commit an unlawful act. Such provisions, however, place a substantial evidentiary burden on the injured party.
Under the IT Act, failure to block access to deepfakes when directed to do so by a competent authority is punishable with imprisonment for a term of up to seven years and a fine. Consequences as severe as charges of cyber terrorism, or of waging war against the Government of India, could also follow if the deepfake content is found to erode the sovereignty of the nation and an intermediary fails to take action against it. An intermediary can also be punished for the publication and transmission of obscene material in electronic form. Provisions on criminal breach of trust, criminal conspiracy and cheating under the IPC may also be attracted, depending on the circumstances of each case.
Challenges of Adjudication
The IT Act provides a two-tiered framework to enable victims to hold wrongdoers accountable and seek damages, namely the Adjudicating Officer and the Cyber Appellate Tribunal (“CAT”). In practice, cybercrime cells of police stations and, subsequently, courts possessing criminal jurisdiction usually deal with offences under the IT Act. However, the implementation of this hierarchical system has not been as envisioned.
Firstly, the lack of guiding jurisprudence on the investigation and adjudication of ever-evolving cyber contraventions gravely affects the capacity of these bodies to decide nuanced matters. For instance, offences involving deepfakes often entangle constitutional principles such as privacy, rights in rem such as intellectual property rights and, at times, the cross-border application of laws.
Secondly, following the Parliamentary Standing Committee’s 2015 report on the functioning of CATs in the country, the CAT was merged with the Telecom Disputes Settlement and Appellate Tribunal (“TDSAT”) in 2017. Given the distinct subject matters of the two forums, appellate determinations now lack the requisite technical backing.
Thirdly, the multiplicity of forums for reporting fraud and hacking, such as the National Cyber Crime Reporting Portal, creates scope for conflict between the authorities vested with power under the IT Act and other sectoral authorities.
Key Takeaways
As the technology evolves, the speed at which deepfakes are generated will grow exponentially. Ascertaining liability is therefore key and will have to be adjudicated speedily, weighing the sophistication of the deepfake technology employed against the standard of diligence maintained by the intermediary. The unique need to combine technical expertise with judicial principles, against the backdrop of a rapidly developing field, necessitates a special and distinct adjudicatory system.
While the Ministry of Electronics and Information Technology has been engaging with digital intermediaries through the Digital India Dialogues on Misinformation and Deepfakes, there is arguably a greater need for stricter provisions within the IT Rules to fasten liability on intermediaries that fail to take down deepfake content.
However, the imposition of criminal liability under the IPC for such omissions or non-fulfilment of due diligence obligations marks a legal paradigm shift, which will have to be tested against the capacity of the judicial system to enforce it. Creating a specialised division within the existing criminal justice system, dealing exclusively with offences under the IPC arising from the use of generative AI systems, could therefore balance the technical nuances of regulating AI-related legal liabilities such as deepfakes, while simultaneously safeguarding digital users’ right to freedom of speech and expression against unwarranted censorship.
For further information, please contact:
Ankoosh Mehta, Partner, Cyril Amarchand Mangaldas
ankoosh.mehta@cyrilshroff.com