Case for armed robot laws is mounting

21 Nov

Israel’s unmanned armed Guardium vehicle still has a human in the loop, for now.

Human Rights Watch has released a new report whose title is pretty much self-explanatory: Losing Humanity: The Case Against Killer Robots. In it, the advocacy group argues for a ban on fully autonomous armed machines, fearing that their development will ultimately result in a Terminator-like situation where robots end up killing innocent humans.

The group believes such machines are only a few decades away, according to a statement:

Fully autonomous weapons do not yet exist, and major powers, including the United States, have not made a decision to deploy them. But high-tech militaries are developing or have already deployed precursors that illustrate the push toward greater autonomy for machines on the battlefield. The United States is a leader in this technological development. Several other countries – including China, Germany, Israel, South Korea, Russia, and the United Kingdom – have also been involved. Many experts predict that full autonomy for weapons could be achieved in 20 to 30 years, and some think even sooner.

On that last point, the group’s estimate is probably way off, with full autonomy likely to come much sooner. Armed flying drones have been taking to the skies in Afghanistan and Iraq for the better part of a decade, while Israel is currently using armed ground robots such as the Guardium, likely in its current conflict in Gaza. In each case there’s a human operator in the loop, but that’s likely to change soon.

One of the flaws in Human Rights Watch’s argument is its belief that robots have no way of distinguishing between enemies and civilians. As Noel Sharkey, a professor of artificial intelligence and robotics at the University of Sheffield, tells Time magazine, “It would be impossible to tell the difference between a little girl pointing an ice cream at a robot, or someone pointing a rifle at it.”

It’s actually not that difficult, with even commercially available image-recognition software, such as Google’s photo search, starting to achieve accurate differentiation. One of the startups I visited in Israel last month, AnyClip, is doing a variation on this sort of thing, letting movies be searched for specific items. Here’s what typing in RPG (as in rocket-propelled grenade) brings up. If the Israel Defense Forces isn’t looking into what AnyClip and similar companies are doing, it’s not doing its job.
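To make the distinction concrete, here’s a toy sketch of the underlying idea: an object is reduced to a feature vector and matched against labeled prototypes. Every feature name, number, and label below is invented for illustration only; real recognition systems like the ones mentioned above learn their features from image data rather than using handcrafted ones.

```python
# Toy sketch of object classification by nearest-prototype matching.
# All feature values and labels are invented for illustration; real
# systems extract features automatically with image-recognition models.
import math

# Hypothetical feature vector: (elongation, metallic_reflectance, rigid_pose)
PROTOTYPES = {
    "rifle":     (0.9, 0.8, 0.9),
    "ice cream": (0.3, 0.1, 0.2),
}

def classify(features):
    """Return the label of the prototype closest to the feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(PROTOTYPES, key=lambda label: dist(features, PROTOTYPES[label]))

print(classify((0.85, 0.75, 0.8)))  # near the rifle prototype -> "rifle"
print(classify((0.25, 0.05, 0.3)))  # near the other prototype -> "ice cream"
```

The hard part in practice isn’t the matching step shown here but producing reliable features from a noisy camera feed, which is exactly what the image-recognition software above is getting better at.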

Meanwhile, the Pentagon’s mad-science wing – the Defense Advanced Research Projects Agency – is putting a priority on the development of so-called threat-recognition systems, and its program is already bearing fruit. With this sort of technology destined for robots, it’s a safe bet that militaries will be pushing the envelope with armed autonomous machines much faster than Human Rights Watch believes.

The group isn’t wrong in calling for limits on this sort of thing. An outright ban, however, is unlikely to work since different governments have different needs at different levels of urgency. Some will develop such killer robots regardless of whether the international community frowns on it.

One thing I learned in Israel, which is a hotbed of military robot development, is that the country suffers from a collective feeling of being surrounded and outnumbered by generally hostile neighbours. If fully robotic soldiers can even those odds somewhat, it will be hard to sway the country from that path.

International rules outlining accountability, however, would be a better place to start. The U.S. and Israeli governments are already trying to sidestep responsibility for the damage existing drone strikes are doing, so if humans are indeed going to be pulled out of the loop – which seems inevitable – there need to be clear rules about who is ultimately responsible for any resulting deaths.

War is about to go to the next level, but that doesn’t mean some of the fundamental old rules shouldn’t still apply.


Posted by on November 21, 2012 in israel, robots, u.s., uav, war


5 responses to “Case for armed robot laws is mounting”

  1. Marc Venot

    November 21, 2012 at 12:34 am

    “Science without conscience is but the ruin of the soul.” Rabelais
    Isaac Asimov wrote the three laws robots must follow, but of course the temptation to trespass is huge when you have enough lobbying power.

  2. Justin Amirkhani

    November 21, 2012 at 12:37 am

    I for one welcome our evil robot overlords. God made man and we made him extinct. It’s time we continue the pattern.

  3. The Great Antagonizer

    November 21, 2012 at 1:26 am

    I want one!

  4. russellmcormond

    November 21, 2012 at 2:30 pm

    Taking the horrors out of war will only enable them to become more horrific. Remote-control weapons have already de-humanized state-sanctioned murder and made it much easier to do, and to forget that the target isn’t a computer-generated graphic but a real person who isn’t a player in some video game directed by incompetent politicians.

  5. Torontoworker

    November 21, 2012 at 6:17 pm

    This is what you get by employing the video game generation instead of men who have been in battle, have been shot at – have seen their pals blown up and KNOW the price of failure, the waste of humans on the battlefield. Now we have these young men, many of whom learned how to drop munitions from 20 thousand feet in an F-16, who ‘graduate’ (budget cuts) to sitting in a trailer in Las Vegas firing Hellfire missiles from drones at SUVs on behalf of ‘someone’ (CIA?) who orders them to ‘waste’ the vehicle and all who ride inside it. Do they KNOW these people are threats? Is everyone inside the vehicle a terrorist? They don’t know and don’t care, as they have the ‘sucks to be you’ attitude.

    Not for one minute do they see any parallels between flying 757s into buildings and firing missiles at houses and vehicles without knowing who is in the street at that moment or in the vehicles. Execution without warrant. They high-five and shout out when they ‘nail’ the vehicles – until, of course, a 60 Minutes film crew shows their joy at this ‘special’ type of gaming and they are told to refrain from showing joy, well, at least when cameras are around.

    Now we are even going to take the suspect due diligence that we are told is done by the military when they kill people and transfer that decision to artificial intelligence? Oh great, software will determine who lives and whose bodies become shredded by depleted-uranium sabot rounds. Who do we arrest when it all goes wrong and a bus filled with school children is evaporated because the bus driver didn’t understand commands to stop or panicked? Do we arrest the beta testers? The hardware designers? How about these weapons companies, along with their shareholders, be forced to bury all the innocent victims of any software ‘anomalies’ themselves – march the programmers and CEOs right out into the villages with shovels. Let them SEE what real death looks like. Let them SMELL death and what they have caused. Let them use their imagination – their own families on the receiving end of the infrared sensors and fuzzy-logic programs being executed, in the words of RoboCop, with extreme prejudice.
