Audio leaks purportedly from the Prime Minister's Office have set tongues wagging across the political divide and, quite reasonably, raised questions about the security of the office that arguably determines the country's fate.

The recordings, which contain what sound very much like the voices of high-ranking current and former government officials, are seen by many as a peek into the political machinations that take place away from the public eye.

The PML-N has yet to deny the authenticity of the audios, instead terming the leak a "serious lapse". The most recent recordings, which feature the alleged voices of Imran Khan, Asad Umar and Shireen Mazari, have been dismissed as a PML-N-developed fake by the PTI chief and as "cut and paste" by Mazari.

So could the audios be deepfakes?

“There is always a possibility,” said Rafay Baloch, a cybersecurity researcher and white hat hacker, adding that it could be determined through digital forensics.

“However, it hasn’t reached a point where deepfake audios in all languages can be produced without a margin of error,” he told Dawn.com.

“Hence, all the famous deepfake videos you see will have a voice actor mimicking the voice of the individual being impersonated.”

In the video above, Dawn.com takes a deep dive into what deepfakes are, their potential for danger and how they can be combated.

So what is a deepfake, anyway?

A combination of “deep learning” and “fake”, deepfakes are hyper-realistic videos digitally manipulated to show people saying and doing things that they never actually did.

They are difficult to detect, as they use real footage, can have authentic-sounding audio, and are optimised to spread quickly on social media. And it's easy for viewers to assume that the video they're watching is genuine.

And how dangerous is it?

Deepfake voices have been used by criminals imitating executives to dupe employees into transferring money to them.

A study describes the technology as a "major threat to our society", warning that "various political players, including political agitators, hacktivists, terrorists, and foreign states can use deepfakes in disinformation campaigns to manipulate public opinion and undermine confidence in a given country's institutions".

Another threat comes from the technology's potential to be used to harass and blackmail women, whose likenesses are frequently exploited in such videos. As of last year, there were 85,000 deepfakes circulating online, 90 per cent of which depicted non-consensual porn featuring women.

But there are also legitimate uses of deepfake technology. It is employed in educational media and digital communications, games and entertainment, social and healthcare applications, material science, and various business fields such as fashion and personalised e-commerce.

More than anything, a healthy level of scepticism can help. Cyberattack investigator Giacopuzzi says his work has ultimately left him convinced that in today’s world, “we need to question everything”.

It’s good advice. We should question everything, particularly when it comes from social media.
