Audio leaks purportedly from the Prime Minister’s Office have set tongues wagging across the political divide and, quite reasonably, raised questions about the security of the very office that arguably determines the country’s fate.

The recordings, which contain what sound very much like the voices of high-ranking current and former government officials, are seen by many as a peek into the political machinations that take place away from the public eye.

The PML-N has yet to deny the audios, instead terming the leak a “serious lapse”. But the most recent recordings, which feature the alleged voices of Imran Khan, Asad Umar and Shireen Mazari, have been dismissed by the PTI chief as a PML-N-developed fake and by Mazari as “cut and paste”.

So could the audios be deepfakes?

“There is always a possibility,” said Rafay Baloch, a cybersecurity researcher and white hat hacker, adding that digital forensics could determine whether the recordings are genuine.

“However, it hasn’t reached a point where deepfake audios in all languages can be produced without a margin of error,” he told Dawn.com.

“Hence, all the famous deepfake videos you see will have a voice actor mimicking the voice of the individual being impersonated.”
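To illustrate the kind of check a digital forensics examination might start with, here is a minimal sketch in Python of a splice detector that flags abrupt spectral jumps of the sort crude “cut and paste” edits can leave behind. It assumes the librosa and numpy libraries are installed; the file name, the z-score threshold and the spectral-flux heuristic are all illustrative assumptions, not the workflow Baloch describes.

```python
# A toy forensic check: flag abrupt spectral discontinuities that can
# indicate splices ("cut and paste") in a recording. Real audio forensics
# relies on far more robust methods (compression traces, electrical
# network frequency analysis, trained ML detectors).
import numpy as np
import librosa  # assumed installed: pip install librosa


def flag_possible_splices(path, threshold=3.5):
    """Return timestamps (in seconds) where the spectrum jumps sharply.

    `threshold` is an illustrative z-score cutoff, not a calibrated value.
    """
    y, sr = librosa.load(path, sr=None, mono=True)
    # Magnitude spectrogram: rows are frequency bins, columns are frames.
    S = np.abs(librosa.stft(y, n_fft=2048, hop_length=512))
    # Spectral flux: how much the spectrum changes between adjacent frames.
    flux = np.sqrt((np.diff(S, axis=1) ** 2).sum(axis=0))
    # Standardise and flag frames whose change is an outlier.
    z = (flux - flux.mean()) / (flux.std() + 1e-9)
    frames = np.where(z > threshold)[0]
    return librosa.frames_to_time(frames, sr=sr, hop_length=512)


if __name__ == "__main__":
    # "leaked_clip.wav" is a hypothetical file name for illustration.
    for t in flag_possible_splices("leaked_clip.wav"):
        print(f"Suspicious spectral jump near {t:.2f}s")
```

A clean result from a heuristic like this proves nothing on its own: a careful edit with crossfades would slip past it, which is why professional examiners combine many independent signals before reaching a conclusion.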

In the video above, Dawn.com takes a deep dive into what deepfakes are, their potential for danger and how they can be combated.

So what is a deepfake, anyway?

A combination of “deep learning” and “fake”, deepfakes are hyper-realistic videos digitally manipulated to show people saying and doing things that they never actually did.

They are difficult to detect, since they use real footage, can have authentic-sounding audio, and are optimised to spread quickly on social media, making it easy for viewers to assume that the video they are watching is genuine.

And how dangerous is it?

Criminals have already used deepfake voices to imitate executives and dupe employees into transferring money to them.

A study describes the technology as a “major threat to our society”, warning that “various political players, including political agitators, hacktivists, terrorists, and foreign states can use deepfakes in disinformation campaigns to manipulate public opinion and undermine confidence in a given country’s institutions”.

Another threat comes from the technology’s potential to be used to harass and blackmail women, whose likenesses are frequently exploited in such videos. As of last year, there were 85,000 deepfakes circulating online, 90 per cent of which depicted non-consensual porn featuring women.

But there are also legitimate uses of deepfake technology. It is employed in educational media and digital communications, games and entertainment, social and healthcare applications, material science, and business fields such as fashion and personalised e-commerce.

More than anything, a healthy level of scepticism can help. Cyberattack investigator Giacopuzzi says his work has ultimately left him convinced that in today’s world, “we need to question everything”.

It’s good advice. We should question everything, particularly when it comes from social media.
