Digital Threats to Free and Fair Elections

Monitoring Political Content On Social Media

Radhika Jha*


Social media and online platforms have become the new battleground for electoral politics. Although access to smartphones and the internet remains limited to relatively privileged sections of society, even this restricted access has empowered historically marginalised voices. Platforms like Facebook and YouTube, initially intended for entertainment and social connection, now host a wide range of political opinions and serve as primary news sources for many users.

The ability to harness digital platforms to influence public emotions and perceptions can significantly sway electoral choices, and political parties are investing heavily in digital campaigns to shape voter perceptions.

This article explores the use and misuse of digital space by Indian political parties, its role in shaping electoral politics, the legal framework in place, and efforts by civil society to hold tech platforms accountable for spreading misinformation and hatred.

Mapping the Landscape of Misinformation and Hate Speech

A 2022 Reuters Institute survey found that 63 per cent of English-speaking online users in India rely on social media for news. Further, according to the Internet in India Report 2023, 55 per cent of Indians, or 821 million people, use the Internet. Another survey found that 33 per cent of social media users frequently read news related to politics on these platforms (Lokniti-CSDS, 2019). Social media’s role in disseminating news has thus grown exponentially. However, the lack of oversight or monitoring of fake news, misinformation, and hate speech on these platforms poses serious risks.



From the Cambridge Analytica scandal of the 2010s (1) to the attack on the US Capitol in 2021 (2), there are many examples of how big data harvesting, microtargeting and the manipulation of social media can seriously impact electoral choices and the political climate even in developed countries, which typically have stronger legal mechanisms to monitor and prevent such digital manipulation and to ensure digital privacy and autonomy.

The Indian electorate is far from immune to such harmful influences of social media, and there is far greater opacity regarding the extent to which political parties manipulate these platforms. Reports indicate that tech companies favour certain political parties—Facebook, for instance, offered cheaper advertisement deals to the Bharatiya Janata Party (BJP) (Sambhav & Ranganathan, 2022)—and fail to curb misinformation and hate speech, especially around elections, while the government has made little to no effort to hold the tech companies accountable in any way. Sophisticated methods of spreading misinformation, such as deepfake videos, have additionally made it challenging to distinguish between real and fake content.


Surrogate and shadow advertisements by political parties have been used extensively over the last decade, as various investigations have documented. A 2024 study by multiple organisations found, among other things, that 22 far-right shadow advertisers spent one million US dollars on Meta over a 90-day period; that 36 ads on Meta promoted hate speech, Islamophobia, communal violence and misinformation over that period; and that a coordinated network of ads promoting the BJP and terming the opposition, women lawmakers, journalists and activists “anti-India” drew more than 23 million interactions over the same period (Ekō, ICWI & The London Story, 2024).

Whistleblowers like Frances Haugen have exposed how bots and fake accounts tied to Indian political figures disrupt elections. In one instance, a researcher created a new Facebook test account as a person living in Kerala and, for three weeks, followed all the recommendations generated by the platform’s algorithms. The result was an inundation of hate speech, misinformation and celebrations of violence, including graphic images of dead people, violence and gore (Frenkel and Alba, 2021). Despite India being Facebook’s largest market, only a fraction of the company’s budget is allocated to combating misinformation in the country (Zakrzewski, De Vynck, Masih & Mahtani, 2021).

Different Countries, Different Policies

In fact, not only do tech companies distribute their resources to combat disinformation unevenly, their policies also vary significantly across countries, with fewer safeguards in place for non-English-speaking countries. One study examined 200 different policy announcements from Meta, TikTok, X and Google (the owner of YouTube) and found that nearly two-thirds were focused on the US or the European Union (Madung & OSR&I, 2024).

Several investigations by fact-checkers and civil society organisations (CSOs) have revealed the extent of the problem in India. An investigation by Alt News found that masked websites were spending crores of rupees on Facebook advertisements promoting the BJP. Several of these websites were identical in appearance and carried verbatim-identical content, differing only in their domain names (Kumar, 2023).

Another investigation, by the Reporters’ Collective and ad.watch, mapped political advertisements on Meta from February 2019 to November 2020 and found that, aside from the official accounts, at least 23 ghost and surrogate advertisers placed 34,884 ads costing more than Rs 58.3 million, mostly to promote the BJP or denigrate its opposition (Sambhav, Ranganathan & Jalihal, 2022).

More recently, in 2024, Access Now and Global Witness tested YouTube’s safeguards by submitting 48 advertisements containing content prohibited by the platform’s advertising and election misinformation policies. Although YouTube reviews ad content before it can run, the platform approved every single ad for publication (Access Now & Global Witness, 2024).

Some Legal and Regulatory Provisions

Taking into account these technological advancements and the consequent spread of election campaigns in the digital sphere, the Election Commission of India (ECI) has time and again reiterated that the Model Code of Conduct (MCC) applies to campaigning on online platforms as well. Further, paid political advertisements during the election period on all media channels—television, radio and print, as well as social media platforms—require pre-certification from the ECI or the designated officer before dissemination. However, several of these regulations are frequently circumvented by political parties, which use surrogate advertisers to mask political content while also hiding the amount of money spent on campaigning.

Over the years, the ECI has also issued advisories to political parties, candidates and star campaigners, warning that violations of the MCC, including through surrogate advertisements, would invite notices and action. It has also clarified that the 48-hour ‘silence period’ under Section 126 of the Representation of the People Act, 1951, applies to social media platforms as well.

Some of the major tech companies entered into a ‘Voluntary Code of Conduct’ with the ECI before the 2019 general elections, which remained applicable for the 2024 elections as well. However, several investigations suggest that the pre-certification of political ads, as well as other provisions of the code, was not being followed. There is also no transparency on how this Code is implemented or how this channel is used.

The ECI has used some of these provisions to selectively take down the social media content of certain political parties for allegedly violating the MCC (The Indian Express, April 17, 2024). Aside from this selective targeting of political parties and the issuing of advisories, the ECI has made no substantial effort to tackle surrogate advertisements, hate speech and misleading political advertisements on social media.

Civil Society Advocacy

Civil society organisations have time and again appealed to the ECI to monitor online content as part of the MCC’s extension to digital platforms, and to the tech platforms themselves to self-regulate and monitor their content. However, the responses from both have been less than satisfactory.


Before the April 2019 national elections, a group of civil society organisations and activists made a representation to the ECI, demonstrating the need to uphold and defend the integrity of the elections by safeguarding them from the misuse of social media and digital platforms.

Organisations such as Common Cause, the Internet Freedom Foundation (IFF), the Association for Democratic Reforms (ADR) and the Free Software Movement of India (FSMI), to name a few, were signatories to the letter, along with concerned citizens, including former public servants and chief election commissioners. The letter included an action plan with six suggestions, such as asking the ECI to make it mandatory for political parties to disclose their official handles on all major platforms and appealing to the ECI to monitor the online spending of political parties on election campaigns, and not just the spending of candidates.

In January 2022, several human rights and civil society organisations, joined by whistleblowers Frances Haugen and Sophie Zhang and former Facebook Vice President Brian Boland, called on Facebook to release the India Human Rights Impact Assessment (HRIA) and address the grave concerns about the company’s human rights record in India (Real Facebook Oversight Board, 2022). The company had commissioned an independent assessment in 2019 to evaluate Meta’s role in spreading hate speech and incitement to violence on its platforms, but published only snippets from the India report and refused to publish the full India HRIA (Brown and Bajoria, 2022). The signatories to the letter to Facebook included Amnesty International, India Civil Watch International, Human Rights Watch and the Real Facebook Oversight Board, among over 20 other organisations.

On April 8, 2024, Common Cause, along with 11 other civil society organisations including the Internet Freedom Foundation (IFF), Association for Democratic Reforms (ADR) and Mazdoor Kisan Shakti Sangathan (MKSS), wrote to the ECI with an urgent appeal to uphold the integrity of the upcoming elections and hold political parties, candidates and digital platforms accountable to the voters. The letter highlighted some major concerns, namely the need to regulate online campaigning and surrogate advertisements, the dangers posed by emerging technologies such as deepfakes in influencing voter perceptions, the inadequacies of the voluntary code of conduct and the use of facial recognition and video surveillance of voters.

Some of the important suggestions to the ECI included the scrutiny of expenditure on surrogate advertising and targeted online campaigns by political actors, measures to increase the accountability of political actors who deploy generative AI with the intent of influencing voter perceptions and political narratives, and the initiation of a transparent and participatory process to arrive at an MCC for digital platforms.

In early 2023, a group of organisations from across the globe came together to form what eventually came to be known as the ‘Global Coalition for Tech Justice’, a growing movement to ensure that Big Tech plays its role in protecting elections and citizens’ rights and freedoms across the world. Common Cause is part of the coalition; other steering group members include Digital Action and India Civil Watch International. On April 16, 2024, the Coalition organised a public event to put the spotlight on tech platforms’ failures to protect people and democracy during elections in the first quarter of the election megacycle. The speakers discussed Big Tech’s failures ahead of India’s election, drawing on evidence from their investigations.

References

  • Access Now and Global Witness (April 2024). “Votes will not be counted”: Indian election disinformation ads and YouTube. Available at: https://bit.ly/4cSrKjr
  • Brown, D. and Bajoria, J. (21st July 2022). Meta and Hate Speech in India. Human Rights Watch. Available at: https://bit.ly/4cWvssq
  • Ekō, India Civil Watch International and The London Story (2024). Slander, Lies and Incitement: India’s Million-Dollar Election Meme Network. Available at: https://bit.ly/3LDIr6i
  • Election Commission of India (1st February 2024). Handbook on Media Matters for CEOs & DEOs: Edition- 1 Feb 2024. Available at: https://bit.ly/4ccTqyt
  • Express News Service (17th April 2024). On EC orders, X takes down posts of parties & leaders, but disagrees. New Delhi, The Indian Express. Available at: https://bit.ly/4ddhL8a
  • Frenkel, S. and Alba, D. (23rd October 2021). In India, Facebook Grapples with an Amplified Version of Its Problems. The New York Times. Available at: https://bit.ly/4cVuX1u
  • Kantar and Internet and Mobile Association of India (IAMAI) (2024). Internet in India 2023. Available at: https://bit.ly/3ycgShi
  • Kumar, A. (4th April 2023). Exclusive: Network of shadow Facebook pages spending crores on ads to target Oppn are connected to BJP. Alt News. Available at: https://bit.ly/4fhQfrW
  • Letter by civil society and human rights organisations addressed to Facebook Director of Human Rights Miranda Sissons dated 19th January 2022. Available at: https://bit.ly/3YfIxbu
  • Letter to the Election Commission of India on ‘Safeguarding Democracy from Digital Platforms’ dated 6th April 2024. Available at: https://bit.ly/46kxdNn
  • Letter addressed to Shri Rajiv Kumar, Chief Election Commissioner, Shri Gyanesh Kumar, Election Commissioner and Shri (Dr.) Sukhbir Singh Sandhu, Election Commissioner on ‘Technology accountability and digital platforms’ dated 8th April 2024. Available at: https://bit.ly/4bWIvst
  • Lokniti-Centre for the Study of Developing Societies (CSDS) (2019). Social Media & Political Behaviour. Available at: https://bit.ly/4d5S0Xq
  • Madung, O. and Open Source Research and Investigations (27th February 2024). Platforms, Promises and Politics. Available at: https://bit.ly/3YzuXAf
  • Newman, N., Fletcher, R., Robertson, C.T., Eddy, K. and Nielsen, R.K. (2022). Digital News Report 2022. Reuters Institute for the Study of Journalism. pp. 134-135. Available at: https://bit.ly/46iGjds
  • Press Information Bureau Delhi (1st March 2024). Ahead of General Elections 2024, ECI warns political parties to maintain decorum in public campaigning; conveys stern action against direct or indirect MCC violations. Press Information Bureau. Available at: https://bit.ly/46gElKM
  • Real Facebook Oversight Board (19th January 2022). Civil Society, Advocates, Whistleblowers to Facebook: Release the India Human Rights Report, Without Redaction or Delay. Medium. Available at: https://bit.ly/3Wgn4wA
  • Sambhav, K., Ranganathan, N. and Jalihal, S. (15th March 2022). Inside Facebook and BJP’s world of ghost advertisers. Available at: https://bit.ly/4db5AIQ
  • Sambhav, K. and Ranganathan, N. (16th March 2022). Facebook charged BJP less for India election ads than others. Al Jazeera. Available at: https://bit.ly/3SkU0Dc
  • Zakrzewski, C., De Vynck, G., Masih, N. and Mahtani, S. (24th October 2021). How Facebook neglected the rest of the world, fueling hate speech and violence in India. The Washington Post. Available at: https://bit.ly/3WztWq7
  • (1.) Cambridge Analytica was a British consulting firm which harvested, through a third-party app on Facebook, personal data belonging to millions of Facebook users in the 2010s without their consent; the data was then used to provide analytical assistance to Donald Trump’s campaign before the 2016 US Presidential election. The firm was also accused of interfering with the Brexit referendum by influencing “Leave” voters.
  • (2.) On January 6, 2021, rioters supporting Donald Trump attacked the US Capitol in Washington DC with the goal of stopping the certification of Joe Biden’s election. Five people were killed, including one police officer who was beaten by rioters. The riots were allegedly organised on social media, with President Trump himself urging supporters to come to DC in a series of social media posts.

