
Making the drone distinction: autonomous robot awareness

As global powers venture further into the arena of modern weaponry, Professor Noel Sharkey, Chairman of the International Committee for Robot Arms Control, explains to Editor Lauren Smith why greater awareness of autonomous robotic weapons is needed…

Military technology has played a significant role in armed conflict for decades, but in recent years the growth in technologies used to wage war has become even more apparent. One area in particular is the use of drones – or unmanned aerial vehicles (UAVs). Noel Sharkey, Professor of Robotics and Artificial Intelligence at the University of Sheffield in the UK, is now leading a campaign for tighter regulation of increasingly autonomous robotic weapons, or 'killer robots'.

When Sharkey first started researching military robots in 2005, there was little international discussion about the regulation of robotic arms – much less any formal or ethical guidelines – leading him to co-found the International Committee for Robot Arms Control (ICRAC) in 2009, originally with a committee of four people that has since expanded to 21. Fully autonomous weapons, of the sort being developed by a number of key defence players, would undoubtedly contravene international humanitarian law, as Sharkey explains.

"The cornerstone of the Geneva Convention, and of all the laws of war, is the principle of distinction," he explains. "The principle of distinction means that the military must be able to discriminate between a combatant and a civilian, or anybody who is ‘hors de combat’, such as a wounded or mentally ill soldier. There are many people, even in combat, who it is inappropriate to kill and weapons must be able to be used properly in these circumstances. There’s no way any kind of computer system we have at the moment can make that kind of discrimination."


Attack and explosion

By way of example, Sharkey cites weapons such as the Israeli Harpy, a loitering rocket-fired munition that resembles a drone. "It hovers in the air for some time searching for radar signals," he describes, "and when it finds one, it consults a database. If the radar signature is not recognised, the presumption is that it is an enemy radar, connected to an anti-aircraft installation. This can then lead to attack and explosion.

"So, you could say that it can discriminate between an enemy or a friendly installation, but it does not conform to the principle of distinction because it misses more subtleties. It assumes that an unrecognised radar is connected to an enemy installation – I know of no incidents where there have been mishaps, but the radar could also be on the roof of a hospital or a school and the Harpy would not be able to recognise that. It would not be able to distinguish a civilian building. The point is that it can make discrimination, but it isn’t compliant with the principle of distinction."

Another key element of the rules of war is the principle of proportionality. This means that when an attack is launched there may be some probability of civilian casualties, but the expected civilian deaths, injuries and other collateral damage must not be excessive in relation to the military advantage anticipated. Sharkey believes that no current robot system has the level of reasoning and human judgement needed to conform to that, nor does he expect this to change in the near future – hence the need to ensure compulsory human engagement in the process.

He is adamant that international legislation must be put in place to change the current course: to prevent autonomous weapons that, once launched, can select and attack targets without further human intervention.

"The kind of thing we are worried about, in the shortest term, is that there are 76 countries, which we know about, that have drone programmes, and there may well be more with the technology under way," says Sharkey.  "Not all of them have armed drones, but most are pushing towards getting them.

"The worry is that as the US develops autonomous drones, it will initiate an arms race for other people to get them. It’s very blinkered of the US to think they will be the only country to pursue this. The US is not talking about one individual drone conducting strikes, but swarms of them for air-to-air combat and more – and once multiple countries have them functioning, I’m concerned about lowering the threshold for going into conflict."

No troops on the ground

He believes that the use of existing drones has already lowered this threshold, as a recent example from the US illustrates. "Usually, within 60 days of initiating a conflict, the President must ask Congress' permission to continue, under the War Powers Resolution. When Obama committed drones to Libya this didn't happen, and the stated reason was that no US troops were involved on the ground and there was no chance of casualties. This means that drones can be committed at any time.

"With autonomous robots, there are no people involved at all once they have been launched, so conflict could be started and if both sides use autonomous drones, there are no body bags coming back to deter each home country; normally a major deterrent from conflict. It may also trigger unintended hostilities in other regions and overall I believe it will destabilise world security."

Another major problem, as Sharkey sees it, is that when two systems that were never designed to work together interact, their combined behaviour is entirely unpredictable. "If two groups of autonomous robots were to meet, each with particular strategies and actions built into their programmes," he says, "it is entirely unpredictable how they will interact. I'm not concerned about a Terminator-style scenario with robots turning on us – my problem is not super-smart robots, it is really stupid robots."

Hopeware, not software

Supporters of more autonomous systems may suggest that humans are more likely to introduce errors than well-programmed robotics, but Sharkey is adamant that there is absolutely no evidence to support this.

"People who are defending this, who want money for research or profit from manufacturing, are saying that at some point in time robots will be much more accurate than people, for example robots could respond much quicker to a sniper. But this is what can be called hopeware, instead of software. In other words, you’re making promises about things that we don’t know to be the case," he says. 

"To illustrate this, if someone fires a rifle, there is a flash and a robot could be programmed to turn very quickly and fire into that flash, but the question is, is it really a sniper? Or is it a goatherd shooting a wolf, or is it a firework, set off by terrorists to deliberately throw the robot to get it to kill civilians? It’s easy to set up scenarios in a laboratory or structured environment, but we need to be thinking more about real world, unstructured and unpredictable environments. The enemy can adapt and that’s why you need a human in the loop to be responsible."

Another principle of the laws of war is that there must always be a person who is accountable. Since a robot obviously can’t be, it raises the complex question of where accountability would lie. 

"For a missile, or any kind of weapon, the responsibility lies with the commander who sent it into the mission. With an autonomous robot, there are so many other responsibilities: software coding errors, computing malfunctions, the impact on function of being hit by a bullet, all of the components made by many different companies, and many other variables. It’s essential that someone is held accountable and it may not be fair for a commander to shoulder all of it."

Cyberattack

Sharkey doesn't believe that an increase in automated drones would necessarily leave countries more open to cyberattack in warfare – cyberwar, he suggests, is already happening on a massive scale. He does believe, however, that more attention must be paid to verifying systems and minimising potential failures, one avenue of which may come through cyberattack.

"Once you have the appropriate codes and access, you could potentially cyberattack any drones. Lots of countries are penetrating US security constantly and getting access to the control boards for robots, for example, which they can then reconstruct and work out how to hack them. A major worry with an autonomous drone is that someone could hack into it, take it over and use it for other purposes."

Just last year, a professor in Texas showed that he could spoof a drone by building a ground station that sent it a stronger GPS signal than the one it was ordinarily receiving, making it believe it was somewhere other than its true location. The demonstration showed how such an approach could be used to crash a drone into a building while it was still under the impression that it was in a safe area, such as over the ocean.
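In outline, the attack works because an unauthenticated GPS receiver trusts the strongest plausible signal it hears. The Python sketch below is a deliberate simplification (real receivers track many satellites and signal properties, and the coordinates here are invented), but it captures the mechanism.

```python
# Simplified sketch of the spoofing demonstration. Real receivers track
# multiple satellite signals; this reduces the idea to a single rule:
# the receiver navigates by whichever signal is strongest.

def select_position(signals):
    """signals: list of (source, signal_strength, reported_position)."""
    source, _, position = max(signals, key=lambda s: s[1])
    return source, position

genuine = ("satellite", 0.3, (30.0, -97.0))        # true position
spoofed = ("ground station", 0.9, (29.5, -96.5))   # attacker's fake fix

source, position = select_position([genuine, spoofed])
print(f"Drone now navigates by the {source}: believes it is at {position}")
# The drone "knows" it is somewhere safe while actually being steered
# elsewhere -- and with no pilot watching, nobody notices in time.
```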

"The more autonomous the technology becomes, the longer it takes to recognise any problems," he explains. "This links back to potential issues with error or espionage in the supply chain. If something odd starts to happen in a manned fighter plane, the pilot will immediately recognise it and take steps to address it. A man-in-the-loop drone makes this a little harder, but there will still be someone there looking through a camera who can take action. With complete automation, it could be quite some time longer before any spoofing or hijacking would even be noticed and it could be too late to prevent disastrous consequences."

A vital distinction

Sharkey is far from scaremongering about the use of robotics for military applications; rather, he highlights that greater awareness of the potential limitations of such technologies is needed across society – from researchers to policymakers and the general public.

"I’ve worked in autonomous robots for many years and we are not trying to ban them, even in the military," he clarifies. "That would be impossible. They are already being used for so many things, from sewage work to farming and environmental research. Autonomous robots are often a very good thing and we’re not looking to stop that. They can do so much of the dull or dangerous and dirty work that people can’t or don’t want to do."

But the distinction is vital. "What we're trying to do, in simple terms, is draw a hard line prohibiting the delegation of the decision to kill to a robot. You can build autonomous robots, but don't develop them to be able to kill humans on their own."


Professor Noel Sharkey
Chairman
International Committee for Robot Arms Control

http://icrac.net

