Regulators have warned that AI is facing its ‘Oppenheimer moment,’ as nations are deploying ‘slaughter bots’ on battlefields.
The statements were made at a conference in Vienna Monday by Austrian Foreign Minister Alexander Schallenberg, who said: ‘This is the Oppenheimer Moment of our generation.
‘Now is the time to agree on international rules and norms.’
The reference was to J. Robert Oppenheimer, who helped develop the atomic bomb in 1945 and, after seeing his creation's capacity to destroy the world, went on to advocate for controls on the spread of nuclear arms.
The conference comes in the wake of an explosive report that the Israeli military outsourced life-and-death bombing target decisions to an AI dubbed ‘Lavender’ — news that the United Nations Secretary-General said left him ‘deeply troubled.’
‘This is the Oppenheimer Moment of our generation,’ warned Austrian Foreign Minister Alexander Schallenberg (above), whose government hosted a two-day conference on restricting AI in warzones. ‘Now is the time to agree on international rules and norms,’ he said
During his opening remarks Monday, Schallenberg described artificial intelligence (AI) as the most significant advancement in warfare since the invention of gunpowder more than a millennium ago. Above left, the Thermonator, one Ohio-based firm's $9,420 flame-throwing robot dog
At this week's conference, one former AI investor for Google's parent company worried, 'Silicon Valley's incentives might not be aligned with the rest of humanity.'
During his opening remarks, Schallenberg described AI as the most significant advancement in warfare since the invention of gunpowder more than a millennium ago.
The only difference, he continued, is that AI is even more dangerous.
‘At least let us make sure that the most profound and far-reaching decision — who lives and who dies — remains in the hands of humans and not of machines,’ Schallenberg said.
The Austrian minister argued that the world needs to 'ensure human control,' given the troubling trend of military AI software replacing human beings in the decision-making process.
The statements come just weeks after it emerged that the Israeli army has been using an AI system to populate its 'kill list' of alleged Hamas terrorists, leading to the deaths of women and children.
A report from +972 magazine cited six Israeli intelligence officers, who admitted to using an AI called ‘Lavender’ to classify as many as 37,000 Palestinians as suspected militants — marking these people and their homes as acceptable targets for air strikes.
Civilian, military and technology leaders from over 100 countries convened Monday in Vienna (above) in an effort to prevent, as physicist Anthony Aguirre put it, ‘the future of slaughter bots’
Costa Rica's foreign minister, Arnoldo André Tinoco, voiced his concern at the conference that AI-powered weapons of war will soon be deployed by terrorists and other non-state actors, a development that will require a new legal framework. Above, a US Reaper drone
Lavender was trained on data from Israeli intelligence’s decades-long surveillance of Palestinian populations, using the digital footprints of known militants as a model for what signal to look for in the noise, according to the report.
But the technology has also been added to drones used in the Ukraine war, helping them seek out targets that are unloading ammunition without human guidance.
Austria's top disarmament official, Alexander Kmentt, who organized Monday's conference, cautioned that traditional 'arms control' treaties would not work for software like AI.
‘We’re not talking about a single weapons system but a combination of dual-use technologies,’ Kmentt said. ‘A classical approach to arms control doesn’t work.’
Kmentt argued that existing legal tools, such as export controls and humanitarian law, would address the crisis already underway better and faster than waiting to craft a new 'magnum opus' treaty.
Costa Rica's foreign minister, Arnoldo André Tinoco, also voiced his concern that AI-powered weapons of war will soon be deployed by terrorists and other non-state actors, a development that will require a new legal framework.
‘The easy availability of autonomous weapons removes limitations that ensured only a few could enter the arms race,’ he said.
‘Now students with a 3-D printer and basic programming knowledge can make drones with the capacity to cause widespread casualties. Autonomous weapons systems have forever changed the concept of international stability.’