The Implications of the ‘Fact Check Unit’ Under the IT Rules 2021

K Chakra Pani

Introduction

The Information Technology Rules (IT Rules) were enacted in 2021 primarily to regulate online intermediaries hosting content on their platforms, whether social media or otherwise. The IT Rules must remain within the scope of their parent legislation, the Information Technology Act, 2000, which governs the conduct of persons on media platforms and provides for supervision over most online activities. The IT Rules were amended in 2023, largely in response to the growing space of online gaming and money games, with more platforms emerging with similar offerings. Although the Amendment majorly focuses on online gaming and related platforms, it also provides for a ‘Fact Check Unit’ (FCU) of the Central Government under Rule 3, which prescribes the due diligence to be observed by an intermediary. Before examining the same, it must be noted that the general rule for subordinate legislation, such as Rules, is that it must not travel beyond the parent Act; otherwise, it will be ultra vires the Act and can be declared invalid by the Courts.[1]

Controversy around the FCU

Rule 3(1)(b)(v) of the Rules provides that the intermediary shall take reasonable steps not to display content or information that deceives or misleads a person, or that intentionally and knowingly spreads information that is untrue and misleading. Where any information is connected with the ‘business of the Central Government’, it may be identified as such by the FCU of the Central Government, which shall instruct the intermediary to take it down.[2] The FCU receives information relating to the functioning or business of the Central Government; if that information is false, fake, or misleading, the FCU can notify the intermediary to make reasonable efforts to prevent its spread. While this prima facie seems reasonable and justified, the problem lies in the terms used, such as ‘business’, ‘fake’, ‘false’, and ‘misleading’, which have no specific meaning attached to them. The use of such abstract terms and phrases empowers the government not only to take down content it finds misleading but also to censor comments and criticism directed against it. Essentially, the FCU exercises special supervision over the ‘business of the Central Government’, and such supervision extends to no other area of concern, like fake news, adulterated content, or others.

Rule 3 requires the intermediary to make ‘reasonable efforts’ in exercising due diligence over the content specified in Rule 3(1)(b)(i) to (xi). However, the Unit is established only for information relating to the ‘business of the Central Government’. Several other categories of significantly important information, such as pornographic content, content threatening the sovereignty, unity, and integrity of the nation, or content disrupting public order, receive no such scrutiny. For such content, the responsibility to make efforts to take it down rests with the intermediary. But in the case of information, intentional or otherwise, on the business of the government, the FCU has complete authority over its publication, and the intermediary is a mere facilitator. The need for an FCU for only this specific category of information and no other content is questionable. The Central Government is concerned with the security, integrity, and unity of the nation through its Defence and Home Ministries, yet content threatening these is not regulated through an FCU but by the intermediary itself. For the abstract ‘business’, however, an FCU is created, raising doubts over the scope of ‘business’, which could extend to anyone’s comments, criticism, sarcasm, and other forms of online speech.

Rule 4 provides for additional due diligence to be followed by a significant social media intermediary, which includes further functions and responsibilities over content published on its platform.[3] Clause (4) of the Rule states that the intermediary shall use technology-based measures to identify content that depicts rape, child sexual abuse, or any related content. The proviso states that the measures to regulate such content shall be ‘proportionate concerning the interests of free speech and expression.’ A plain reading of this clause implies that content depicting rape and child sexual abuse does not require a separate body like an FCU; the responsibility to take such measures rests on the intermediary. However, no such consideration of freedom of speech and expression is provided for content relating to the ‘business of the Government’. This implies that the FCU has blanket supervision to declare any information on the Central Government’s business fake, false, or misleading, with no restriction on grounds of freedom of speech and expression. Rule 4(4) accounts for the interests of freedom of speech and expression under Article 19(2) of the Constitution of India; no such ground is mentioned for regulating content on government business.

The Courts on the establishment and validity of FCU

Soon after the Amendment was brought out, there was widespread criticism of and concern over the powers and functions of the FCU in regulating content. The concern was that any content or information could be taken down unfairly under the garb of the ‘business of the government’, thereby potentially targeting criticism, sarcastic comments, and the like. The stand-up comedian Kunal Kamra filed a petition before the Hon’ble Bombay High Court contending that the Rules violated his rights under Articles 19(1)(a), 19(1)(g), and 14 of the Constitution.[4] The Bombay High Court delivered a split verdict, following which the matter was heard by a third judge, who held that the Amendment was not unconstitutional and could be notified for implementation. Before the Supreme Court, the petitioners, Kunal Kamra and others, challenged the notification implementing the Rules; the notification was subsequently stayed and could take effect only after final orders were delivered by the Bombay HC.

The petitioners contended that the Rule curtailed the freedom of speech and expression and was not a reasonable restriction under Article 19(2). The Rule makes no reference to ‘national security’, ‘public interest/order’, or the other grounds under that Article. Those grounds appear in another clause of the Rules, and ironically, content violating them does not warrant the interference of the FCU, as it is not ‘the business of the government.’ Such content cannot be restricted on the ground of ‘public interest’, as that ground does not fall within Article 19(2), and Article 19(2) likewise does not cover the ‘business of the government’. The State cannot compel the intermediary to curtail certain speech or content from being published; that amounts to censorship. The challenge under Article 14 was that setting up an FCU for a special purpose fails the Article 14 test, as there is no reasonable classification to ensure equality in the regulation of distinct categories of content published online.

The respondents contended that no fundamental rights were violated and that the FCU is a valid body to regulate content relating to the business of the government. The phrase ‘business of the government’ can be defined as the Executive desires. Relying on a plain reading of the Rule, the respondents stated that the FCU would act only when information is ‘patently false’ or affects the business of the government as notified by the FCU, and prima facie does not curb free speech. The FCU merely regulates the information published on the intermediary’s platform and does not criminalise it, thereby falling under Article 19(2) as a reasonable restriction. The respondents’ broader argument was that the Rule seeks to ascertain ‘authentic information’ for the ‘containment of public harm to the public at large.’

The current status is that the IT Amendment Rules 2023, insofar as they provide for the setting up of an FCU, stand stayed vide the order of the Supreme Court.[5]

Is there a need to establish an FCU?

The Press Information Bureau (PIB) has been notified as the Fact Check Unit under the IT Rules. This raises concerns about the government’s direct involvement in regulating online content. The FCU is a wing of the Government that can regulate content connected with the ‘business of the government’. Implicitly, the government can choose to take down content about itself that it does not find true, under the garb of fake, false, or misleading news. Before the Amendment Rules 2023, the intermediary bore the entire responsibility for regulating content on its platform, failing which it would lose its safe harbour protection. If any dispute arose over the publication of content, the matter would be referred to the various authorities set up under the IT Rules. The Amendment Rules, however, introduce a new regulating body for the specific category of content concerning the government’s business: the Fact Check Unit constituted by the Central Government. Any information relating to its business can be identified by the Unit, and the intermediary would be notified to take such content down or else lose its safe harbour protection. The intermediary already exercises complete regulation over every other kind of content and information specified in sub-clauses (b)(i) to (b)(xi) of Rule 3(1), except that concerning the business of the government. Ironically, the PIB, the top authority for authenticating information on the business of the government, is now also the FCU. It can thus choose which content may or may not be displayed without specifying any grounds for its decision. The labels of false, fake, or misleading information are the only cover the FCU needs to order the intermediary to take down select content, leaving room only for extremes: absolute truth and absolute falsehood.

Ambiguous scope of powers of FCU

The amended Rule 3(1)(b)(v) leaves considerable room for interpretation that can go beyond the scope of the parent Act, the IT Act. Section 69A of the Act does not grant the Central Government, or any unit constituted by it, the power to regulate content on the ground that it is fake, false, or misleading. The present Rule, read with its parent Act, is invalid as it travels beyond the scope of the Act, which already sets out the specific powers to regulate content online.

Rule 3(1)(b)(v) covers three different types of content that can be regulated. The first is content that deceives or misleads a person about its origin. The second is intentionally and knowingly communicating any false, fake, or misleading information. The third is any false, fake, or misleading information concerning the business of the government; for this category, no element of intention or knowledge is required. The mere publication of content can be identified and ordered taken down by the FCU. The FCU provides no justification or reasons for taking down content; it merely classifies such content as fake, false, or misleading. In essence, the FCU, and not the people of the nation, is the authority that decides what is true or false. The Central Government, through this Unit, acts as a censorship authority, deciding what can and cannot be published, and anything that goes against its business can be brought under the garb of false, fake, or misleading information. Vagueness in the powers conferred by provisions of law, leading to unguided and unlimited power, can be a ground for invalidation.[6]

Another point to be noted is the ‘mandate’ or ‘compulsion’ under Rule 7 of the IT Rules: if the intermediary does not act on the orders of the Government given under the Rules, it loses its safe harbour privilege and can be held liable for the acts and publications of its users. Safe harbour is an essential component of the existence of an intermediary, without which it can be held liable for the violative acts of its users. The implication is that when the FCU identifies content as false, fake, or misleading, the intermediary must take it down without any justification being given. Neither the intermediary nor the user who published the content is given an opportunity to be heard. This creates indirect censorship over anything posted online, including comments and criticism, thereby restricting the freedom of speech and expression of users publishing on the intermediary’s platform. The intermediary has no opportunity to assess whether content on the business of the government is in fact false, fake, or misleading, unlike the other categories where the intermediary has discretion and authority over the content published. Such categories include pornographic content, infringement of copyright or trademark, content threatening the unity, integrity, and security of the nation (which, ironically, is not classified as the ‘business of the Central Government’), or content violating any existing law.

Conclusion

Government intervention in online media is needed to prevent and prohibit acts of complete immorality and illegality like deepfakes, fake news, and the like, but such interference must itself be regulated. Like every other law, the IT Rules are a double-edged sword: capable of being used for the benefit of the people if executed with the right intention, or of being misused in the name of the ‘protection of the public from public harm’, as quoted by the respondents before the Bombay High Court during the hearings on the validity of the IT Amendment Rules in Kunal Kamra v. Union of India.[7] The law can be applied according to its legislative intention (which, to the contrary, suggests that the FCU is a tool for censorship by the Government) or executed according to its textual interpretation, which is rarely the case given the Centre’s consistent misinterpretation of laws for its own benefit. The FCU is a necessary evil, but it must be wielded with the right intention, or it amounts to nothing short of censorship over people’s voices.

[1] State of Tamil Nadu v. P Krishnamurthy & Ors. (2006) 4 SCC 517.

[2] Rule 3(1)(b)(v), Information Technology Rules, 2021 (inserted by the IT Amendment Rules, 2023).

[3] Rule 4, Information Technology Rules, 2021.

[4] Kunal Kamra v. Union of India, W.P. No. 9792/2023 (Bombay HC).

[5] Anmol Kaur Bawa, Supreme Court Stays Centre’s Notification Of ‘Fact Check Unit’ Under IT Rules Till Final Decision By Bombay HC, LiveLaw (Mar. 21, 2024), https://www.livelaw.in/top-stories/supreme-court-kunal-kamra-editors-guild-notifying-fact-check-unit-it-rules-2023-252998.

[6] Government of Tamil Nadu & Ors. v. R Thamaraiselvam & Ors. (2023) 7 SCC 251.

[7] Supra note 4.