Drones, war, science fiction, cybercrime. A conversation with an expert in the future of warfare

Nov 20, 2013

In 2009, political scientist P.W. Singer gave a TED Talk in which he predicted that the growing use of robots in war would change the realities of combat. Four years and thousands of US drone strikes later, he talks to journalist Matthew Power; an edited version of their conversation follows.

Matthew Power: In a recent article you discussed how science fiction writers like H.G. Wells and Arthur Conan Doyle predicted technologies that then became part of our world. And now you’re working on a novel. Do you see sci-fi today predicting developments on a further horizon than we’re talking about?

P.W. Singer: The interplay between science fiction and the real world is a force that has been there for centuries. At one point, it ran through writers like H.G. Wells, because the novel was the main vector for entertainment. Then we moved on to movies and TV shows — think of how powerful Star Trek was in influencing where technology would head next. Now it’s gaming. It’s like what happened in those great old episodes of Star Trek, where they envisioned something futuristic like a handheld communicator, and then someone watching in a lab would see it and say, “I’ll make that real.” The same is now true for gaming. I was a consultant for the video game Call of Duty: Black Ops II, and I worked on a drone concept for the game, a quadcopter called Charlene. Now defense contractors are trying to make Charlene real. So it flips the relationship. Previously, the military would research and develop something and then spin it out to the civilian sector. Now the military is faced with the challenge of how to spin in technology.

And of course the pace of development, on both the military and civilian sides, is extraordinarily fast. In your 2009 TED Talk you described the Predator as the “Model T of drone technology.”

Drone technology has gone from being something that’s abnormal to the new normal. It’s gone from, “Hey, there’s this new thing you might not have heard about called a Predator drone” to “Actually, guess what, it’s out of date.” There are many parallels between drones and what happened with flying machines. First they were science fiction. Then they became science reality, the airplane. Then they were adapted for war. Now, as with the plane coming out of World War I, you see these roles moving over to the civilian side in terms of using the drone for observation or surveillance, whether it’s law enforcement using it, or journalism, or search-and-rescue. Drones have gotten smarter, able to do more on their own, which in turn makes them easier to use and opens them up to more users. So you’re now seeing kids flying them. Go back to airplanes. Tens of thousands of airplanes were flying in WWI, but it wasn’t until 1919 that civilians came up with the idea of using an airplane to move cargo back and forth. That led to the multi-trillion-dollar air transport industry, and it came out of someone crossing innovation with profit-seeking. As exciting as a drone for search-and-rescue is, or environmental monitoring, or filming your kids playing soccer, that’s still doing surveillance. It’s what comes next, using it in ways that no one’s yet imagined, that’s where the boom will be.

You have made the point that as a new technology like the drone evolves, Moore’s Law and Murphy’s Law play out at the same time. What do you see as the cautionary tales from this robotic revolution? What things do we have to be careful of?

There are two ways to illustrate how much of a game-changer, how much of a disruptive technology, drones are. One is the vast array of new capabilities that we didn’t imagine were possible a generation earlier. The other is the new political, legal and ethical dilemmas that we have to figure out. And none of those fall into the normal partisan debate. That’s perhaps the best illustration of something new today: it doesn’t fall along a clean Republican/Democrat or right/left line. On the domestic side, we have all the new kinds of laws that are needed in areas like privacy. And it is not just laws that are needed, but norms of behavior as well. That is, some of our new problems may require new law. Others may require policy. Others may require a new accepted way of doing business. This will affect everything from our privacy to legal jurisdictions to insurance. That’s one of the fundamental questions to figure out as you open up the airspace to drones, or as you start to use something like robotic cars. It’s not just “Can the car drive?” It’s “Who’s responsible for it?”

On the military side, you find this in the questions drones are pushing about the laws of war. How do we deal with weaponry where the human role is moving not just geographically but also chronologically? Where the decision might take place thousands of miles away, and maybe even days, weeks, months or years before the weapon is used? That is what’s happening with robotics, and also with cyber. With Stuxnet, the cyberweapon that sabotaged Iranian nuclear research, for example, the “shoot decision” was made months before the weapon “fired.”

Speaking of cyber, your next book, Cybersecurity and Cyberwar: What Everyone Needs to Know, explores some of the most-discussed technological anxieties of our day. How did you approach the topic?

Cyber is an issue that keeps popping up in the news again and again, and I think we all know it’s important, and yet the vast majority of us don’t understand anything about it. The IT crowd seems to be the only group that understands it, and yet it affects all of us, whether we care about politics or military or law or business or journalism, or even just “How do I protect my kids online?” And so the format of this book is to walk through the top 50 questions that people have, or should have: “How does the internet actually work, and what does that mean for cybersecurity?”; “Who is this group Anonymous?”; “Cyberterrorism: is that a real thing or not?”; “Is there a cyber-industrial complex?” and so forth. These technologies are very disruptive, and they’re raising deep questions.

Edward Snowden’s massive leak of classified NSA documents is perhaps the biggest news story of the year. What is your take on his actions and the revelations that have been made about the NSA’s surveillance and espionage programs?

The way to think about it is basically this: what’s legal is not always what’s smart, and vice versa. And that definitely applies to some of the things the NSA was doing. The problem with the Snowden revelations, and why we have to wrestle with them, is that they encompass so many different things. Some parts are justifiable, thoughtful and smart, the normal course of events for what an espionage agency does, such as spying on your enemies. Then there are other parts that are legally questionable, such as how much data was collected on US citizens. Whether they were in the spirit of the law or the letter of the law is another issue, but clearly they pushed the envelope.

And then there are the parts that were unstrategic, which is a nice way of saying shortsighted and even stupid, like spying on close allies. The double legacy of this is: one, it’s hollowed out America’s ability to operate effectively in shaping the future of the internet itself in the way we would hope. That has huge long-term consequences. And two, it’s been and will be a hammer blow to American technology companies. One recent estimate, for example, put the cloud computing industry’s potential losses at $36 billion worth of business because of this.

You recently worked on a project for the Pentagon called NeXTech, exploring the ways emerging technologies will affect the global strategic picture for the United States. One of the exercises conducted was an “ethics war game.” How does that work?

An ethics war game sounds like an oxymoron, but essentially we brought together three groups. The first was ethicists and philosophers, primarily from university settings. The second group was those working in the law: military lawyers, human rights activists and international lawyers. The third group was policy folks, people who work within the bureaucracy. We had them consider the various ways these new technologies might be used in war, crime, terrorism, etc., and asked, “Okay, what do you think is right? What do you think is wrong? What do you think is allowable? What is not?”

What did you discover through the exercise?

What was fascinating in that setting was that what was ethical was not always aligned with what was viewed as legal under the old laws, and that wasn’t always aligned with what was seen as the right policy to undertake. It was revealing to see those disputes play out. And it was not just limited to robotics. We were looking at other technologies: cyber, human performance modification, directed energy (lasers), 3D printing. Each of these brings its own looming challenges. We are entering a time of massive disruption. There are huge possibilities — technical, political, legal, ethical — but also huge perils. That’s one of the challenges for an organization like TED. On one hand it wants to raise awareness of all the new things that are possible in the world, but it also has to be sure to avoid triumphalism, or an immature notion that technology is somehow the solution to all our problems. Whether it is a stone or a drone, technology is merely a tool that can be used for both good and bad purposes.

Clearly we are entering a brave new world with our technology. Do you feel that our ethical and moral consciousness has kept pace?

Technological change moves at an exponential pace. Unfortunately, government moves at a glacial pace. So it’s not just that they’re disconnected; the distance between them is growing wider and wider. The illustration in the world of cyber would be this: as late as 2001, the director of the FBI did not have a computer, did not have an email account. 2001! But that is not the example that is most telling. We found out that, as late as 2013, the Secretary of the Department of Homeland Security, the agency ostensibly in charge of cybersecurity, did not use email, and not out of a sense of it being insecure. She just didn’t think email was useful. As late as 2013, one of the Supreme Court justices revealed that eight of the nine justices don’t use email or computers. So the institution in charge of securing us in cyberspace is headed by someone who, simply put, looks at the technology a lot like my grandmother looked at VCRs. And the institution that will ultimately decide many of the most important legal and ethical dilemmas looks at it the way I look at my son’s iPad. There’s so much change happening, yet the institutions, and the individuals at the top of them who have decision-making power, have a tough time keeping up with it.

What are your thoughts on the psychological impacts on those conducting this new variety of technological warfare, where they are often distanced from the battles they’re fighting?

Humankind has been at war, in terms of recorded history, for over 5,000 years, and probably longer than that. Yet it wasn’t until just a few decades ago that we recognized the existence of post-traumatic stress disorder. And even though we now recognize it, we still don’t understand it. We still don’t understand its dynamics or the best ways to treat it. And that’s in terms of the “normal” type of war. So to think that we can have a handle on the same thing playing out in this new kind of remote, distanced war would be folly.

So how is PTSD different for, say, drone pilots?

What we have at best are theories and some very early studies. When you get to the treatment side, is it really PTSD or not? Is it from seeing trauma? Seeing it is, in some ways, the same as experiencing it. The drone pilot and operator may actually see more of the trauma than, say, a bomber pilot in the past. A bomber pilot drops a bomb and flies away. The drone pilot watches the target up close for hours, even days, beforehand, and sees what happens. Another issue may be the fact that they see all that, and then they walk outside the box and they’re in America. They’re not at war. Twenty minutes later they’re sitting at the dinner table with their kids. They are dealing with all the pressures of home and war simultaneously, in a way that someone physically in a war zone doesn’t.

It’s amazing that none of these technologies can actually prevent us from engaging in war, but it seems that’s your broader point: that these are flaws in humanity that go beyond any kind of technology, that conflict is an eternal thing.

These are amazing technologies. We’re doing incredible, wonderful things. You can’t help but get excited about this. It’s the height of human creativity, maybe even creating a new species. But at the end of the day, if we’re really honest with ourselves, we’re doing it to get better at killing one another. And I don’t see any changes with that. The hows, the whos, the wheres, even the whens of war are changing, but the fundamental whys remain the same.

P.W. Singer is the author of Wired for War, which explores the evolution of robotics in armed conflict; Children at War, about the spreading use of child soldiers; and Cybersecurity and Cyberwar: What Everyone Needs to Know. He is a senior fellow at the Brookings Institution, where he directs the Center for 21st Century Security and Intelligence. Matthew Power was a freelance journalist and contributing editor at Harper’s Magazine whose work included “Confessions of a Drone Warrior.” He died in March 2014.