Threat of 'killer robots' in Ukraine increasingly realistic, warns expert

A Ukrainian drone operator near Bakhmut in the Donetsk Region of Ukraine. Ukraine has described fully autonomous killer robots as a "logical and inevitable next step". (Getty Images)

A campaign group calling for a global treaty regulating the use of autonomous weapons systems has warned they pose an "increasingly realistic" threat to society.

Autonomous weapons systems - more commonly known as 'killer robots' - are potentially already being used on the frontline, while an AI arms race unfolds on the battlefield in Ukraine.

Countries including the US, China and Russia are believed to be investing heavily in the technology, with the focus on such machines increasingly a cause for concern.

In November 2023, a historic UN General Assembly resolution on autonomous weapons​​ systems called for an internationally recognised and legally binding treaty to regulate their use. In August this year, the Stop Killer Robots campaign group called for the international community to urgently address the emerging tech following a UN report on the issue that branded such machines "politically unacceptable and morally repugnant".

The Pope has also lent a cautionary voice, saying earlier this year that "no machine should ever choose to take the life of a human being".

Killer robots can range from armed drones to ‘loitering munitions’ that attack people entering certain areas.

Not only do 'killer robots' already exist, they have attacked people.

In 2020, troops fighting for the interim Libyan government may have launched a Turkish Kargu-2 drone which attacked retreating rebels, according to a UN report (although the manufacturer has claimed it is not capable of such action).

AI is already being used in some of Ukraine's long-range drone strikes, too, targeting military facilities and oil refineries hundreds of kilometres inside Russia.

One Ukrainian official, speaking anonymously, told Reuters in July that the attacks sometimes involve a swarm of about 20 drones using a form of AI which maintains some human oversight to help spot targets or threats.

Ukraine’s deputy prime minister and technology chief Mykhailo Fedorov has described fully autonomous killer robots as a "logical and inevitable next step".

A 2019 march against killer robots in Berlin. (Getty Images)

Peter Asaro, spokesperson for the organisation Stop Killer Robots, told Yahoo News that there are grey areas around what makes a ‘killer robot’ and that even systems where humans still nominally make the decisions could cause issues.

"An autonomous weapon, or killer robot, is any weapon system that automates the selection and engagement of targets," he said. "These systems are usually not self-contained, and are typically a distributed system of sensors, data processing and a weapons platform working together to identify and attack targets.

"Even decision support systems that automatically search for and identify targets for human operators are a concern, as the operators may not be able or willing to second-guess the machine."

"The threat is increasingly realistic," Asaro said. "Especially given events that have unfolded in Ukraine over the past year. We have seen the real significance of drones in a lot of different kinds of applications. Most drones are not autonomous weapons, but I think it is fair to say that there are drones starting to be used there that have some type of autonomous targeting capability."

In 2017, leaders in the technology sector including Elon Musk wrote to the UN calling for autonomous weapons to be banned.

The proposal suggested that autonomous weapons could be banned under laws similar to those that ban chemical weapons. The group warned of a ‘third revolution in warfare’ - with the first two having been the invention of gunpowder and of nuclear weapons.

Pope Francis has spoken out against autonomous weapons (Getty Images)

In November last year, the UN General Assembly adopted the first-ever resolution on autonomous weapons, stressing the “urgent need for the international community to address the challenges and concerns raised by autonomous weapons systems". It added that "an algorithm must not be in full control of decisions involving killing or harming humans".

The UN's advance report published in August warned of "widespread concern" that such weapons systems have the potential to "change warfare significantly and may strain or even erode existing legal frameworks".

UN Secretary-General Antonio Guterres is quoted in the report as saying that "the autonomous targeting of humans by machines is a moral line that must not be crossed".

"There is a real possibility for a treaty," Asaro said. "The global public is in broad agreement that we really do not want to live in a world where machines are given the authority to decide who lives and dies, and that we should ban and regulate these things.

"That is what has happened with other weapons that we thought were inevitable, that we thought were hi-tech and great. We once thought chemical weapons were going to be so much better than bullets and bombs, because they were clean and efficient, and would result in bloodless war. Then after World War One, when the public saw what chemical weapons did, the global public said, 'We don't really want that. We're going to ban these weapons. They're terrible and appalling'.

"Hopefully we can preemptively ban autonomous weapons before they develop into something that terrible, and before any militaries become overly reliant on them. And that is the real goal."

Autonomous weapons systems also tend to be cheaper and easier to field than training troops to carry out the same tasks.

The US made clear its intentions in a speech by Deputy Secretary of Defense Kathleen Hicks in August 2023.

"To stay ahead, we’re going to create a new state of the art—just as America has before—leveraging attritable, autonomous systems in all domains—which are less expensive, put fewer people in the line of fire, and can be changed, updated, or improved with substantially shorter lead times.

"With smart people, smart concepts and smart technology, our military will be more nimble, with uplift and urgency from the commercial sector.

"So now is the time to take all-domain, attritable autonomy to the next level: to produce and deliver capabilities to warfighters at the volume and velocity required to deter aggression, or win if we're forced to fight," she said.

However, conflicts such as Ukraine are putting pressure on leaders and soldiers to improvise increasingly autonomous weapons, warns Asaro.

"When you are in a war, especially one that threatens your sovereignty or independence, you become desperate for anything you can get your hands on that might help," he said. "You are willing to try a lot of new things, and you are willing to push those boundaries further than you might otherwise in peacetime or conflicts that weren't so desperate for any kind of weapon."

In the UK, the government was warned last year to proceed with caution on AI in autonomous weapons.

A report by the House of Lords Artificial Intelligence in Weapon Systems Committee, published in December 2023, said that while the Government had sought to be "ambitious, safe, responsible" in its application of artificial intelligence (AI) in defence, aspiration had not lived up to reality.

It urged the government to "lead by example" in international engagement on the regulation of the weapons and "ensure human control at all stages of an AWS’s lifecycle".

If international law does not establish that human control over weapons systems is required, hostile regimes could use autonomous weapons against their own citizens or to start unjust wars.

There’s also a real possibility of terrorists using autonomous systems as weapons of mass destruction.

"There are also questions of the destabilisation of global and regional power, and potential arms races," warned Asaro. "Autonomous systems can be hacked and taken over and turned against others, so they pose an enormous cybersecurity risk. By requiring and ensuring human control over lethal decision making, we can avoid a whole range of serious problems."