Live from TEDGlobal

How about we *not* make killer robots: Daniel Suarez at TEDGlobal 2013


Photo: James Duncan Davidson

Daniel Suarez, author of technology thrillers including, most recently, Kill Decision, has been thinking about drones for some time. But standing onstage at TEDGlobal 2013, he makes clear he’s not here to talk fiction. “I’m here to talk about very real autonomous combat drones,” he says, adding that he doesn’t mean remotely piloted drones such as the Predator or Reaper, where a human being still makes the combat decisions. Instead, he’s talking about “fully autonomous robotic weapons which make lethal decisions about people on their own.” The technical term for this, he explains, is “lethal autonomy,” and the fact that there’s a technical term for it at all should tell us this is something worth minding.


Photo: James Duncan Davidson

“As we migrate lethal decision-making from humans to software, we risk not only taking the humanity out of war but also changing our social landscape entirely,” says Suarez. “The way humans resolve conflict shapes our social landscape.” He spins through a quick history lesson of combat innovations, from the armor worn by a knight on horseback, to gunpowder and the cannon, to the nation-state, by which time leaders were forced both to rely on and share power with the people. Autonomous robotic weapons, he says, are just such a step. “But by requiring very few people to go to war, they risk re-centralizing power into very few hands, possibly reversing a five-century trend towards democracy.” Knowing this, we must take decisive steps to preserve our democratic institutions — and soon. Suarez details three powerful factors driving the ever-faster development of autonomous drones:

1. Visual Overload

In 2004, the U.S. drone fleet produced 71 hours of surveillance video for analysis. By 2011, that figure had reached 300,000 hours annually (more than 34 hours of new footage for every hour of the year), and it is only going to grow. Wide-area camera systems such as Gorgon Stare produce so much footage that no human could possibly review it all. “That means we’ll have to program visual intelligence software to review it; and that means very soon drones will tell humans what to look at, not the other way around,” says Suarez. You can feel the weight of this thought sink in around the room.

2. Electronic Warfare

In 2011, a U.S. RQ-170 Sentinel drone had its GPS signal confused by a “spoofing attack,” and the aircraft was captured. Moving forward, drones will be programmed so that such interference cannot happen. “They will know their objective and they will react to circumstances without human guidance,” says Suarez. Drones won’t rely on outside guidance, in other words, but will make their own decisions.

3. Plausible Deniability

In a world where cyber-espionage is no longer confined to science fiction, who knows who does what, where, or who knocked out which company, and when. “Sifting through the wreckage of a suicide drone attack, it’ll be very difficult to say who sent that weapon,” says Suarez soberly, to glum murmurs from the audience. “This raises the very real possibility of anonymous war. This could tilt the geopolitical balance on its head and make it very difficult for a nation to turn its firepower against an attacker. That could shift the balance away from defense and toward offense, making military action a viable option for not only small nations but for private enterprise, powerful individuals. It could create a landscape of rival warlords, undermining the rule of law and civil society.” Gulp.

Suarez isn’t done with the bad news for those living in the developed world. Not only do they have no advantage over those in developing nations, they might actually be at a disadvantage. Big data, he says, makes those of us living in the West vulnerable: perfect targets for autonomous weapons. Where a marketer might use that data to send you personalized product samples or services, a repressive government could use it to eliminate political opposition. “Popular movements agitating for change could be detected early and their leaders eliminated before their ideas achieve critical mass,” says Suarez. You can hear that proverbial pin drop; the audience is both spellbound and open-mouthed.


Photo: James Duncan Davidson

Now the call to action. “We need an international treaty on robotic weapons. In particular, a ban on the development and deployment of robotic weapons,” says Suarez, pointing out that treaties already exist for biological and nuclear weapons, and that robotic weapons might well be just as dangerous. (Do also see the Campaign to Stop Killer Robots, sponsored by former TED speaker Jody Williams.) “We need an international legal framework for robotic weapons, and we need it before there is a devastating attack or incident which causes nations to rush to adopt weapons before thinking through the consequences.”

The danger is that drones “concentrate too much power in too few hands. They would imperil democracy itself,” he says. “No robot should have an expectation of privacy in a public place.” To that end, they should be stamped with an identifying marker in the factory. An international treaty would also help us take advantage of any benefits from autonomous vehicles while preserving civil society. “Let’s not succumb to the temptation of automated war,” he concludes. “Let’s make sure killer robots remain fiction.”

Daniel Suarez’s talk is now available for viewing. Watch it here »