VIENNA: The world should establish a set of rules to regulate AI weapons while they’re still in their infancy, a global conference urged on Tuesday, calling the issue this generation’s “Oppenheimer moment”.

Like gunpowder and the atomic bomb, artificial intelligence (AI) has the capacity to revolutionise warfare, analysts say, making human disputes unimaginably different — and a lot more deadly.

“This is our generation’s ‘Oppenheimer moment’ where geopolitical tensions threaten to lead a major scientific breakthrough down a very dangerous path for the future of humanity,” read the summary at the end of the two-day conference in Vienna.

US physicist Robert Oppenheimer helped invent nuclear weapons during World War II. Austria organised and hosted the two-day conference in Vienna, which brought together some 1,000 participants, including political leaders, experts and members of civil society, from more than 140 countries.

A final statement said the group “affirms our strong commitment to work with urgency and with all interested stakeholders for an international legal instrument to regulate autonomous weapons systems”.

“We have a responsibility to act and to put in place the rules that we need to protect humanity... Human control must prevail in the use of force”, said the summary, which is to be sent to the UN secretary general.

Using AI, all sorts of weapons can be transformed into autonomous systems, thanks to sophisticated sensors governed by algorithms that allow a computer to “see”. This will enable the locating, selecting and attacking of human targets — or targets containing human beings — without human intervention.

Most weapons are still in the idea or prototype stages, but Russia’s war in Ukraine has offered a glimpse of their potential. Remotely piloted drones are not new, but they are becoming increasingly independent and are being used by both sides.

“Autonomous weapons systems will soon fill the world’s battlefields,” Austrian Foreign Minister Alexander Schallenberg said on Monday when opening the conference.

He warned now was the “time to agree on international rules and norms to ensure human control”. Austria, a neutral country keen to promote disarmament in international forums, in 2023 introduced the first UN resolution to regulate autonomous weapons systems, which was supported by 164 states.

‘Uncorrectable errors’

A Vienna-based privacy campaign group said it would file a complaint against ChatGPT in Austria, claiming the flagship AI tool “hallucinates”, inventing wrong answers that creator OpenAI cannot correct.

NOYB (“None of Your Business”) said there was no way to guarantee the programme provided accurate information. “ChatGPT keeps hallucinating — and not even OpenAI can stop it,” the group said in a statement.

The company has openly acknowledged it cannot correct inaccurate information produced by its generative AI tool and has failed to explain where the data comes from and what ChatGPT stores about individuals, said the group.

Published in Dawn, May 1st, 2024
