VIENNA: The world should establish a set of rules to regulate AI weapons while they’re still in their infancy, a global conference said on Tuesday, calling the issue an “Oppenheimer moment” for our time.

Like gunpowder and the atomic bomb, artificial intelligence (AI) has the capacity to revolutionise warfare, analysts say, making human disputes unimaginably different — and a lot more deadly.

“This is our generation’s ‘Oppenheimer moment’ where geopolitical tensions threaten to lead a major scientific breakthrough down a very dangerous path for the future of humanity,” read the summary at the end of the two-day conference in Vienna.

US physicist Robert Oppenheimer helped invent nuclear weapons during World War II. Austria organised and hosted the two-day conference in Vienna, which brought together some 1,000 participants, including political leaders, experts and members of civil society, from more than 140 countries.

A final statement said the group “affirms our strong commitment to work with urgency and with all interested stakeholders for an international legal instrument to regulate autonomous weapons systems”.

“We have a responsibility to act and to put in place the rules that we need to protect humanity... Human control must prevail in the use of force”, said the summary, which is to be sent to the UN secretary general.

Using AI, all sorts of weapons can be transformed into autonomous systems, thanks to sophisticated sensors governed by algorithms that allow a computer to “see”. This would enable them to locate, select and attack human targets — or targets containing human beings — without human intervention.

Most such weapons are still at the idea or prototype stage, but Russia’s war in Ukraine has offered a glimpse of their potential. Remotely piloted drones are not new, but they are becoming increasingly independent and are being used by both sides.

“Autonomous weapons systems will soon fill the world’s battlefields,” Austrian Foreign Minister Alexander Schallenberg said on Monday when opening the conference.

He warned that now was the “time to agree on international rules and norms to ensure human control”. Austria, a neutral country keen to promote disarmament in international forums, in 2023 introduced the first UN resolution to regulate autonomous weapons systems, which was supported by 164 states.

‘Uncorrectable errors’

A Vienna-based privacy campaign group said it would file a complaint against ChatGPT in Austria, claiming the “hallucinating” flagship AI tool has invented wrong answers that creator OpenAI cannot correct.

NOYB (“None of Your Business”) said there was no way to guarantee the programme provided accurate information. “ChatGPT keeps hallucinating — and not even OpenAI can stop it,” the group said in a statement.

The company has openly acknowledged that it cannot correct inaccurate information produced by its generative AI tool, and has failed to explain where the data comes from or what ChatGPT stores about individuals, the group said.

Published in Dawn, May 1st, 2024
