Irfan Khan

• Experts say changes to platform ‘propping up’ fake news
• ‘Verified’ users lead discourse, rake in more eyeballs than news accounts

KARACHI: Truth is the first casualty in war. While the nature of conflicts keeps evolving, the accuracy of this military maxim has stood the test of time. In the past, the sources of propaganda and misinformation were limited to newspapers, radio and TV, but since the advent of social media, the cesspool of misinformation has grown murkier than ever.

Like the conflicts before it, the Hamas attack on Israel on Oct 7 and the subsequent bombing by Israeli forces in Gaza led to a barrage of misinformation on social media, particularly X, formerly known as Twitter.

Examples of these claims include the alleged beheading of babies by Hamas attackers, the use of “crisis actors” to stage casualties, AI-generated images purporting to show explosions and air strikes, and visuals recycled from past conflicts, all of which have been posted repeatedly despite multiple fact-checks.

The staggering pace at which misinformation is disseminated has left the news media lagging behind since the start of the conflict. By the time outlets put out fact-checks and detailed analyses, the narrative had already been set by X accounts sharing unverified information.

“[I]t’s been really difficult to sift through what is actually genuine footage from what’s been going on in Israel and Gaza, and what is either clickbait or unrelated footage or something that is being shared for clicks, engagement or any sort of nefarious intent,” BBC’s Shayan Sardarizadeh told Reuters Institute.

Analysis of such claims on X shows that the explosion of disinformation on the platform follows a meticulous pattern, and those disseminating it are using the algorithm to their advantage. The result: they are getting eyeballs.

A study by the University of Washington’s Center for an Informed Public identified seven X accounts that have dominated the discourse by posting huge amounts of content.

These accounts included Visegrád 24, Mario Nawfal, OSINTdefender, The Spectator Index, War Monitor, Collin Rugg and Censored Men. The study dubbed these accounts the “new elites” as they “exercise disproportionate power and influence” over what audiences on X read and watch.

The study compared their activity to the six most followed news accounts: CNN Breaking News, CNN, NYTimes, BBC Breaking News, BBCWorld, and Reuters.

The researchers collected posts from these accounts containing specific key terms (e.g., “Gaza,” “Israel,” and “Hamas”) with at least 500 likes, published between Oct 7 and 10.

Across the three days, the news accounts published 298 posts, which got 112 million views (an average of 376,000 per tweet).

By contrast, the “new elites” published 1,834 posts, raking in a staggering 1.6 billion views (an average of 872,000 per tweet).

The mind-boggling discrepancy in these numbers becomes more apparent when we compare the number of followers these two sets of accounts have. By mid-October, the six news accounts had a cumulative 298.1m followers, whereas the seven “new elites” had only 6.74m in total.

Yet their posts drew more than 10 times as many views.

These accounts not only gained more views; their follower counts also grew faster than those of the news accounts. From around 6.74m in the first week of the Gaza conflict, their combined following jumped by nearly 898,000 to 7.63m by Nov 10. In the same period, the six news accounts gained around 600,000 followers.
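
For readers who want to check the arithmetic behind these comparisons, the short sketch below recomputes the per-post averages and the views-to-followers mismatch. It is a minimal illustration in Python that uses only the aggregate figures cited in this article, not the underlying post-level data from the University of Washington study.

```python
# Back-of-the-envelope check using only the aggregate figures reported above.
news_accounts = {"posts": 298, "views": 112_000_000, "followers": 298_100_000}
new_elites = {"posts": 1_834, "views": 1_600_000_000, "followers": 6_740_000}

for label, acc in (("six news accounts", news_accounts),
                   ("seven 'new elites'", new_elites)):
    # Average views per post, as quoted in the article (~376,000 vs ~872,000)
    print(f"{label}: {acc['views'] / acc['posts']:,.0f} average views per post")

# The 'new elites' drew far more total views despite a far smaller following.
print(f"views ratio: {new_elites['views'] / news_accounts['views']:.1f}x")
print(f"followers ratio: {news_accounts['followers'] / new_elites['followers']:.1f}x")
```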

The method to the madness

While social media disinformation is in no way a new problem, the recent changes to X have certainly exacerbated the issue. Those who are leading the discourse on the platform are aware of this and use it to their advantage.

Paid X users — who have a blue check on their accounts — have been found responsible for more than 70 per cent of the “most viral false or unsubstantiated claims” spread on the platform regarding the Gaza conflict.

A US-based disinformation monitoring agency, NewsGuard, analysed the 250 most-engaged posts (measured by the number of likes, reposts, replies, and bookmarks) carrying false or unsubstantiated claims during the first week of the conflict (Oct 7 to 14). Of them, 186 or 74pc were posted by verified accounts.

The seven accounts identified by the University of Washington are all verified. Paying for X gives a user access to features not available to their unpaid peers, and it also comes with free promotion.

Earlier this year, X owner Elon Musk said the platform would promote posts from verified users by placing them on the ‘For You’ timeline, where a user can also see posts from accounts they don’t follow.

These verified accounts post information sourced from media outlets and other publications. However, their original posts only give the source’s name and not the link to it. This is because the X algorithm demotes posts with links to an external source. In Mr Musk’s words, this is being done to “optimise time spent on X”.

X has also removed headlines from link previews, making them hard to distinguish from standard photos unless you’re looking closely. Days before the Hamas attack, X also disabled the option to report general misinformation on the platform.

Add to this equation the monetary incentives for verified accounts, which are paid out based on engagement, and it appears as if the spread of misinformation is actively rewarded on the platform.

Eliot Higgins, founder of the investigative group Bellingcat, said in a recent interview that Mr Musk’s alterations to X have created an environment that is ripe for the spread of disinformation and misinformation.

“It’s honestly the worst I’ve ever seen in any situation I’ve experienced, by a significant factor,” Mr Higgins said.

The OSINT ‘researchers’

OSINT, or open-source intelligence, refers to the use of openly available sources to find information about a person, event, place or happening.

According to digital investigation expert Giancarlo Fiorella, an open-source investigation includes the use of any openly available information on the internet: videos, images and posts on social media, published government documents, satellite imagery, platforms that track vessels and aircraft in real time, and more.

The extensive work involved in an open-source investigation, and the specialised skills and tools it requires, lend it credibility. It makes use of information that needs digital sleuthing, something not everyone has the skill or willingness to carry out.

Open-source research becomes even more vital during a conflict, when verified information is sparse and partisan narratives are being propagated.

In this conflict, that credibility has eroded massively, as accounts with OSINT in their names and bios have posted inaccurate findings that were later refuted.

“Except a handful of consistently objective OSINT accounts, the outbreak of IDF-Hamas conflict has witnessed an increase in partisan reporting by some renowned accounts which keep track of developments using flight and vessel tracking tools and open-source satellite imagery,” said Zaki Khalid, a Rawalpindi-based intelligence analyst and consultant.

Take the example of the strike on Gaza’s Ahli Arab Hospital on Oct 17. Within hours of the strike, social media platforms, particularly X, were flooded with contradictory claims on what caused the strike.

The accounts that blamed Hamas used Al Jazeera’s live footage showing a barrage of rockets being fired from Gaza, along with another video showing a projectile exploding in the air over Gaza, followed seconds later by an explosion on the ground.

Such premature conclusions were due to the “demands of a fast news cycle”, which are at odds with the “slow nature of open-source research”, said Mr Fiorella.

“[T]he whole world wanted answers immediately, and the open-source evidence just wasn’t there to give those answers immediately,” he said.

For him, the answer to “who was responsible” is still elusive.

After a detailed analysis of the available footage, The New York Times reached the same conclusion, saying its findings did not answer “what actually did cause the Al-Ahli Arab Hospital blast, or who is responsible”.

The divergent conclusions reached by multiple media outlets also failed to satisfy those seeking answers in black and white. The Associated Press blamed an “astray” rocket fired from Gaza, while Al Jazeera disputed the claim.

According to Mr Higgins of Bellingcat, when people don’t get definitive answers, “they seek out those who will provide them — generally based on political opinions rather than evidence.”

Published in Dawn, November 13th, 2023
