VELES, a small factory town in Macedonia, came into the global spotlight during the 2016 US presidential elections, albeit for all the wrong reasons: it emerged as an epicentre of political disinformation targeting American voters ahead of the polls.
At times, the hoax news stories emanating from Veles even trumped those from credible journalism outlets. From ‘Pope Francis Endorses Trump’ to ‘ISIS Calls for Muslim Voters to Vote for Hillary’, the imaginative Macedonian ‘content writers’ apparently gave 19 leading American journalism outlets a run for their money.
Studies suggest that in the months leading up to the elections, the 20 top-performing hoax election stories generated more engagement on Facebook than the 20 top-performing stories from major credible outlets, including the New York Times and the Washington Post, combined. Many of these hoax stories came from Veles.
Political disinformation isn’t produced merely for political gain. The hoax stories from Veles were crafted to trigger emotions, garner traffic and generate click-based revenue, and that revenue ran into the millions.
Mirko Ceselkoski, the ‘mentor’ of the youth behind this operation, proudly claimed that more than 1,200 of his ‘students’ were earning upwards of $10m a month. One can only imagine the profits that social media companies themselves made from the traffic generated by these hoax stories.
The influx of organised disinformation into the 2016 US election amply demonstrates the potential impact of MDM (misinformation, disinformation and mal-information) on elections in general. MDM can also be massively profitable for most parties involved. Similar trends have been noted in Pakistan.
The revelations of the former ‘troll-in-chief’, along with investigations by journalists and researchers that unpack hate campaigns and identify the political actors behind them, point to a heavy potential influx of organised political MDM in the run-up to the 2024 elections, especially given the factors that make Pakistan’s population susceptible to hateful MDM.
The Reuters Institute for the Study of Journalism has found that “influencers” are overtaking credible journalists as news sources, that trust in traditional media is declining, and that more people are turning to TikTok for news.
Recent studies by Media Matters for Democracy have made similar findings: 64 per cent of surveyed news consumers said they use ‘social media platforms’ as their go-to ‘trusted medium’ for news. This shift towards influencer-based content could substantially contribute to creating a conducive environment for political MDM.
Firstly, in the context of journalism, the vast majority of influencer-based content in Pakistan is, at best, political commentary. However, the distinction between commentary or opinion-making and journalism is often conveniently blurred, making it harder for consumers to identify MDM, especially when it comes from a trusted ‘influencer’.
Secondly, the practice of engaging social media influencers and troll networks for social media campaigns, unlike political advertising, has zero transparency.
‘Paid influencers’ masquerade as political supporters, creating a classic echo-chamber effect, furthering polarisation, regurgitating political MDM, and presenting a distorted worldview to consumers. Add our existing cultural, religious and political prejudices to this mix and we have a recipe for disaster.
To top it all, whistleblowers and researchers have produced ample evidence that ‘hate sells’ in online spaces. The Facebook Papers, a trove of internal Facebook documents made public by whistleblower Frances Haugen, reveal that the platform has failed to deal with hate speech and incitement, and that the fallout is most harmful in countries with pronounced fault lines and conflicts.
In India, for instance, “Facebook has been selective in curbing hate speech, misinformation and inflammatory posts, particularly anti-Muslim content”. Failures to act aside, the very practice of serving personalised timelines based on individual consumption patterns, an intrinsic characteristic of AI-based recommendation algorithms, can create a distorted worldview even when the content in question consists of credible journalism.
Among the potential responses to counter MDM, the worst idea is ‘regulating or criminalising fake news’, whereby a ‘law’ to deal with misinformation and its associated harms is touted as a one-stop solution to curb all forms of MDM.
A similar thought process is reflected in the Prevention of Electronic Crimes Act (Peca) Amendment Ordinance, 2022, promulgated by the PTI government to curb free speech under the garb of ‘combating fake news’, and in the proposed Pakistan Media Development Authority, a legal framework meant to centrally regulate all forms of media.
Any law enacted specifically to hold information practitioners accountable cannot function without substantially harming free speech. More importantly, Pakistan already has many legal provisions to deal with the harms associated with MDM, most of which have a history of being abused, including criminal defamation under the Pakistan Penal Code and Section 20 of Peca.
A response that could work is ‘inoculating’ the masses against misinformation, combined with community-based ‘proactive’ fact-checking. Inoculation theory “explains how an attitude or belief can be made resistant to persuasion or influence”, much as a biological body becomes resistant to a viral contagion.
The concept can be used to inoculate the population against a particular kind of MDM, such as gendered disinformation, by pre-exposing them to ‘weakened doses’ of misinformation and the techniques that go into its production and distribution. Researchers have also been experimenting with a ‘broad-spectrum’ vaccine for MDM through gamification, putting users in the shoes of misinformation producers and encouraging critical reflection on the tactics used, such as impersonation and fake imagery designed to trigger emotions.
The inoculation approach has had a high success rate. Well-designed, targeted media and information literacy campaigns in all major local languages, along with the translation of existing gamified tools, such as ‘Fake It To Make It’ or ‘Bad News’, and their mass distribution through popular platforms, including TikTok, could make a substantial difference.
However, an intervention at this scale would require substantial resources that perhaps only governments can pull together. Civil society organisations can help design and distribute such content effectively, but they can only play a supporting role. It might sound clichéd, but the responsibility for inoculating the masses against electoral disinformation ahead of the 2024 election lies with the ECP and the caretaker governments. Are they doing enough?
The writer is a media strategist and trainer, and founder of Media Matters for Democracy, a media development organisation.
Published in Dawn, October 14th, 2023