
SABU MATHEW GEORGE V.

UNION OF INDIA: CASE ANALYSIS IN LIGHT OF SECTION 79


OF THE IT ACT, 2000 WITH SPECIAL FOCUS ON THE CONCEPT OF INTERMEDIARY
LIABILITY
(Project towards fulfilment of the assessment in the subject of Law relating to Cyber Crimes)

SUBMITTED BY: Abhijeet Singh


Roll No. 1264
Ashish Kumar

Roll No. 1269

SUBMITTED TO: Ms. Preeti Badola


Faculty of Law

NATIONAL LAW UNIVERSITY, JODHPUR


WINTER SESSION
(JANUARY – MAY 2020)
INTRODUCTION

In 2008, Sabu Mathew George, a gender activist and doctor, filed a writ petition in the
Supreme Court of India to ban advertisements related to pre-natal sex determination from
search engines like Google, Bing and Yahoo. It was contended by the petitioner that
the display of these results violated Section 22 of the Pre-Conception and Pre-Natal
Diagnostic Techniques (Prohibition of Sex Selection) Act, 1994 (“PCPNDT Act”). In their
reply, the respondents argued that they are “conduits” and not content providers and hence
protected under Section 79 of the IT Act. It was also argued that there are innumerable
activities banned by law, but information about them is still available online and offline.
Disentitling anyone from receiving information or gaining knowledge on a subject is
violative of Article 19(1)(a) of the Constitution, which includes the right to know and
right to receive or access information. Over the course of proceedings, the court issued
interim orders directing Google, Microsoft and Yahoo to ‘auto-block’ pre-natal
sex determination ads from appearing in search results. The court also drew up a list of forty
keywords that were to be auto-blocked if anyone attempted to look them up. Expert in-house
committees were directed to be formed by search engines to evaluate and delete content
violative of Section 22 of the PCPNDT Act “based on its own understanding.” The
Supreme Court also directed the Central Government to constitute a nodal agency for
receiving complaints from anyone who came across anything that has the nature of an
advertisement or has any impact in identifying a boy or a girl in any method, manner or
mode by any search engine. The nodal agency was then required to convey actionable
complaints to the concerned intermediaries, who were obliged to delete the content in
question within 36 hours and intimate the nodal agency. This petition was disposed of in
December 2017, with the apex court issuing additional directions to the newly formed
nodal agency and expert committee to hold a meeting with the assistance of the petitioner’s
legal team, “so that there can be a holistic understanding and approach to the problem”.
Google, Yahoo and Microsoft were also directed to work with the committee to
identify and implement a “constructive and collective approach to arrive at a solution”.

LIABILITY UNDER THE IT ACT BEYOND SECTION 79

Section 81 of the IT Act is a non-obstante clause, providing for an overriding effect of the
IT Act over all other laws in times of conflict. However, this clause carves out an exception
for copyright and patent holders.1 The Intermediaries Guidelines also require
intermediaries to notify their users not to upload content that “infringes any patent,
trademark, copyright or other proprietary rights”2 and to not host or publish such content on
their platforms.

ARE SEARCH ENGINES ‘CONDUITS’ OR ‘CONTENT-PROVIDERS’?

An earlier order in this case, dated December 4th, 2012, states that the respondents argued
that they “provided a corridor and did not have any control” over the information hosted on
other websites.

There is often confusion surrounding the characterization of search engines as either


‘conduits’ or ‘content-providers’. A conduit is a ‘corridor’ for information, otherwise known
as an intermediary. A content provider however, produces/alters the displayed content. It has
been suggested by authors like Frank Pasquale that search engines (Google specifically) take
advantage of this grey area by portraying themselves as conduits or content-providers, to
avoid liability. For instance, Google will likely portray itself as a content-provider when it
needs to claim First Amendment protection in the United States, and as a conduit for
information when it needs to defend itself against First Amendment attacks. When concerns
related to privacy arise, search engines attempt to claim editorial rights and freedom of
expression. Conversely, when intellectual property matters or defamation claims arise, they
portray themselves as ‘passive conduits’.

In the Indian context, there has been similar dissonance about the characterization of search
engines. In the aftermath of the Sabu Mathew George judgment, the nature of search engines
was debated by a few. Apar Gupta pointed out that the judgment would contradict the
Supreme Court’s decision reading down Section 79(3)(b) of the Information Technology Act,
2000 (IT Act) in Shreya Singhal v. Union of India,3 where the liability of intermediaries was
restricted. Therefore, he characterized search engines as passive conduits/intermediaries.
According to him, the Sabu Mathew George judgment would effectively hold intermediaries
liable for content hosted unbeknownst to them. Others have criticised this argument, stating
that if Google willingly publishes advertisements through its AdWords system, then it is a
publisher and not merely an intermediary. This portrays Google as a content-provider.

1 Information Technology Act 2000, Section 81
2 Intermediaries Guidelines (Amendment) Rules, 2018, Rule 3(2)(d) and Rule 3(3)
3 Shreya Singhal v. Union of India, (2015) 5 SCC 1

‘ORGANIC SEARCH RESULTS’ AND ‘SPONSORED LINKS’

One important distinction in this case is between ‘organic search results’ and ‘sponsored
links’. A submission by DeitY explaining the difference between the two was not
addressed by the Supreme Court in the order dated December 4th, 2014.

The PCPNDT Act criminalizes the display of ‘advertisements’,4 but does not offer a precise
definition for the term. The respondents argued that ‘advertisement’ would relate to
‘sponsored links’ and not ‘organic search results’. As per the order dated September 19th,
2016, Google and Microsoft agreed to remove ‘advertisements’ and stated that search results
should not be contemplated under Section 22 since they are not ‘commercial
communication’. However, on November 16th, 2016, the Supreme Court stated that the block
would extend to both ‘sponsored links’ and ‘organic search results’. The respondents
expressed concern about this rationale, stating that legitimate information on pre-natal sex
determination would become unavailable and that the ‘freedom of access to information’ would
be restricted. The Court stated that this freedom could be curbed for the sake of the larger good.

THE ‘DOCTRINE OF AUTO-BLOCK’

By the order dated September 19th, 2016, the Court discussed the ‘doctrine of auto block’
and the responsibility of the respondents to block illegal content themselves. In this order, the
Court listed roughly 40 search terms and stated that the respondents should ensure that any
attempt at looking up these terms would be ‘auto-blocked’. The respondents also agreed to
disable the ‘auto complete’ feature for these terms.

However, an empirical study conducted by the Centre for Internet & Society found that
blocking these specific search terms has not proven successful, since websites relating to
sex selection still show up.5

In addition, Google has blocked search terms from its auto-complete system in several
other countries, often with little success. One study points out that illegal search terms
relating to child pornography have been allowed on auto-complete, while more innocuous
terms like ‘homosexual’ have been blocked by Bing, showing that this system of blocking has
several discrepancies.

4 Pre-Conception and Pre-Natal Diagnostic Techniques Act, 1994, Section 22
5 Centre for Internet & Society, https://cis-india.org/internet-governance/blog/search-engine-and-prenatal-sex-determination

Beyond its chilling effect on free speech, disabling auto-complete can also have other
adverse effects. In one instance, the owner of a sex-toy store complained about her business
not benefitting from the autocomplete feature, like several others had. She stated that “…
Google is … making it easier for people to find really specific information related to a search
term. In a sense it’s like we’re not getting the same kind of courtesy of that functionality”.
Similarly, several legitimate websites discussing pre-natal sex determination might lose
potential readers or viewers if ‘autocomplete’ is disabled.

ANALYSIS

A. Conflict with the Shreya Singhal Judgement and International Norms

The orders of the Supreme Court in the Sabu Mathew George case suggest that intermediaries
can be held liable even for content of which they had no prior knowledge. This is surprising,
considering the
landmark judgment of the Supreme Court in the case of Shreya Singhal v. Union of India 6,
where S.79(3)(b) of the Information Technology Act was read down by the Supreme Court to
mean that an intermediary can be held liable only when, “the intermediary upon receiving
actual knowledge that a court order has been passed asking it to expeditiously remove or
disable access to certain material must then fail to expeditiously remove or disable access to
that material”.

The decision of the Supreme Court also conflicts with the view expressed by a host of
committees on Internet rights and freedom of speech. The UN’s Special Rapporteur on
Freedom of Opinion and Expression has expressly stated that,

“while a notice-and-takedown system is one way to prevent intermediaries from actively


engaging in or encouraging unlawful behaviour in their services, it is subject to abuse by
both State and private actors. Users who are notified by the service provider that their
content has been flagged as unlawful often have little recourse or few resources to challenge
the takedown. Moreover, given that intermediaries may still be held financially or in some
cases criminally liable if they do not remove content upon receipt of notification of users
regarding unlawful content, they are inclined to err on the side of safety by over-censoring
potentially illegal content. Lack of transparency in the intermediaries’ decision making
process also often obscures discriminatory practices or political pressure affecting the
companies’ decisions. Furthermore, intermediaries, as private entities, are not best placed to

6 Shreya Singhal v. Union of India, (2015) 5 SCC 1

make the determination of whether a particular content is illegal, which requires careful
balancing of competing interests and consideration of defences.”7

B. Concerns that arise from the case

We are concerned with the developments in the Sabu Mathew George case and consider them
problematic for a host of reasons.

Firstly, the doctrine of auto-block can easily lead to legitimate information being censored.
Even if the website uses the words in a legal manner, access is blocked because of the mere
presence of the questionable terms. This leads to censorship of potentially legal information,
and thereby dilutes the right of freedom of speech and expression.
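The over-blocking problem can be made concrete with a minimal sketch. The following Python fragment illustrates keyword-based auto-blocking of the kind contemplated by the interim orders; the blocklist entries and sample queries are hypothetical stand-ins, not the Court’s actual list of forty terms, and no real search engine necessarily filters this way.

```python
# Hypothetical sketch of keyword-based auto-blocking: a query is suppressed
# if any listed term appears anywhere in it, regardless of context.
BLOCKED_TERMS = {"gender selection", "prenatal sex determination"}

def is_blocked(query: str) -> bool:
    """Return True if the query contains any blocklisted term."""
    q = query.lower()
    return any(term in q for term in BLOCKED_TERMS)

# An advertisement-like query is blocked, as intended...
assert is_blocked("affordable prenatal sex determination clinic")
# ...but so is a legitimate research query, because presence-based
# filtering cannot distinguish an advertisement from a discussion of it.
assert is_blocked("PCPNDT Act case law on prenatal sex determination")
```

Because the filter tests only for the presence of a term, a query citing the PCPNDT Act itself is suppressed alongside an advertisement, which is precisely the dilution of legitimate speech described above.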

Secondly, the nodal agency established as per the order dated November 16th, 2016 goes
against the Information Technology (Procedure and Safeguards for Blocking for Access of
Information by Public) Rules, 2009. The Rules prescribe that a Committee for Examination of
Requests will review each blocking request and verify whether or not it is in line with
S.69A of the IT Act. There is no such review committee under the Sabu Mathew George orders,
as the nodal agency is merely responsible for intimating the request to the intermediary.
Furthermore, the Rules provide intermediaries a 48-hour window to respond to a blocking
request, whereas the Sabu Mathew George orders provide only a 36-hour window.

Thirdly, the nature of the ban is generic rather than content-specific. The censorship rests
on a list of words and their combinations, and these words are not necessarily illegal in
themselves; blocking them could often result in the censorship of medical literature and
other legitimate discussions on the subject.

In light of these reasons, the Sabu Mathew George orders have caused a deep apprehension
that the safeguards provided to intermediaries under the IT Act and as per the Shreya Singhal
case would be diluted and would no longer effectively protect the intermediaries. Therefore,
we urge that the Union Government - after obtaining legal opinion of its law officers - should
make the following submissions in Court by filing a counter affidavit to support the rights of
Internet users:

7 Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, available at https://documents-dds-ny.un.org/doc/UNDOC/GEN/G16/095/12/PDF/G1609512.pdf?OpenElement

That the doctrine of auto-block and the blocking of auto-complete not be used, as they lead
to over-censorship and impact the right of freedom of speech and expression. The doctrine of
auto-block is a dangerous act of censorship that lacks proper safeguards.

That the safe harbours for intermediaries, as contained in the judgment of Shreya Singhal v.
Union of India, be restated and their dilution argued against. The position of law with
respect to intermediary liability stated in Shreya Singhal should not be undermined by
carving out exemptions, as is being attempted in Sabu Mathew George.

C. Sabu Mathew George defies existing legal standards

As mentioned above, the Sabu Mathew George judgment contradicts the Supreme Court’s
decision in Shreya Singhal, where the liability of intermediaries was read down under Section
79(3)(b) of the IT Act. The Court in Shreya Singhal8 held that intermediaries would only be
compelled to take down content through court orders or government notifications. However, in
the present case, the Supreme Court has repeatedly ordered the respondents to devise ways to
monitor and censor their own content and even resort to ‘auto-blocking’ results.

The order dated November 16th, 2016 also contradicts the Blocking Rules under the
Information Technology Act, 2000. In the order, the Supreme Court directed the Centre to
create a ‘nodal agency’ which would allow people to register complaints against websites
violating Section 22 of the PCPNDT Act.9 These complaints would then be passed on to the
concerned search engine in the manner described below:

“Once it is brought to the notice of the Nodal Agency, it shall intimate the concerned search
engine or the corridor provider immediately and after receipt of the same, the search engines
are obliged to delete it within thirty-six hours and intimate the Nodal Agency.”

The functioning of this nodal agency would circumvent the Blocking Rules under the
Information Technology Act. Under the Blocking Rules, the Committee for Examination of
Requests reviews each blocking request and verifies whether it is in line with Section 69A
of the IT Act. The Sabu Mathew George orders have no such review system in place and also
lower the 48-hour time limit to 36 hours. While the authors acknowledge that the Blocking
Rules do not statutorily bind the nodal agency, its actions could still lead to over-blocking.

8 Shreya Singhal v. Union of India, (2015) 5 SCC 1
9 https://www.thehindu.com/news/national/body-set-up-to-deal-with-sex-determination-info-on-internet-govt-tells-sc/article19625444.ece

CONCLUSION AND SUGGESTIONS

The authors would like to make two broad suggestions. First, the functioning of the nodal
agency should be revisited. The recommended system lacks accountability and transparency,
is likely to lead to over-blocking, and will have a chilling effect on speech. Second,
search engines should not be given overarching powers to censor their own results. It is
well established that this leads to over-censorship. In addition to contradicting Section
79(3)(b) of the IT Act, the Court would also be delegating judicial authority to private
search engines.

The Supreme Court appears to be imposing similarly arbitrary requirements upon search
engines in other proceedings. Recently, the Court ordered Google, Microsoft and Yahoo to
create a ‘firewall’ that would prevent illegal videos from being uploaded to the internet,
citing the example of China’s firewall to demonstrate the feasibility of the order.
