Opinion

More drones, more robots, more wars

By Bernd Debusmann
January 31, 2012

Sometime in the next three decades, the U.S. military will be able to field robots that can make life-and-death decisions, operating without human supervision thanks to software and superfast computers.

But the technology to get to that point is running far ahead of considerations of the ethics of robotic warfare.

Or, as Peter Singer, a Brookings Institution scholar who has written widely on military robots, has put it — technology grows at an exponential pace, human institutions at a linear, if not glacial, pace. That echoes an observation by the late science fiction writer Isaac Asimov that “science gathers knowledge faster than society gathers wisdom.”

The subject merits debate after the January 26 announcement that the Pentagon is planning to trim America’s armed forces by 100,000 while boosting the global fleet of armed drones, America’s most effective tool for the targeted killing of anti-American militants. So far, the drones are remotely operated, by pilots on bases in the United States.

But for a glimpse of how U.S. military thinkers see the future of the drone program, an 82-page report by the Air Force is recommended reading. Entitled “Unmanned Aircraft Systems Flight Plan 2009-2047”, it says that “advances in AI (Artificial Intelligence) will enable systems to make combat decisions and act within legal and policy constraints without necessarily requiring human input.”

Rather than just supporting humans in what the military calls the OODA loop (for observe, orient, decide, and act), drones will be able to “fully participate” in each step of the process. Humans will no longer be “in the loop” but “on the loop” — able to veto decisions taken by the flying robot — if time permits in the split-second environment of combat.

While they make more headlines than other systems, drones are just part of an American inventory that has grown explosively over the past decade and includes ground-based robots whose tasks range from defusing improvised explosive devices and shooting down incoming artillery shells to evacuating wounded soldiers. From virtually zero, the drone fleet grew to more than 7,500 and ground-based robots to an estimated 15,000.

“Authorizing a machine to make lethal combat decisions is contingent upon political and military leaders resolving legal and ethical questions,” the paper states. “Ethical decisions and policy decisions must take place in the near term in order to guide the development of future capabilities, rather than allowing the development to take its own path.”

In other words, let’s sort out ethics and policies before letting the robotics genie fully out of the bottle. It’s a point made with increasing alarm by a number of civilian scientists, robotics experts and ethicists who fear, among other things, that sending more robots and fewer humans into wars will make starting them easier.

REMOVING BARRIERS TO WAR

“We possess a technology that removes the last political barriers to war,” Singer, author of Wired for War, wrote in an essay in the New York Times this month. “The strongest appeal of unmanned systems is that we don’t have to send someone’s son or daughter into harm’s way. But when politicians can avoid the political consequences of the condolence letter — and the impact that military casualties have on voters and on the news media — they no longer treat the previously weighty matters of war and peace the same way.”

This is a view shared by the International Committee for Robot Arms Control (ICRAC), a group formed in 2009 to press for an international debate on the regulation and control of armed military robots. ICRAC believes that the robotics revolution of warfare deserves the kind of debate that led to treaties on the use of poison gas or the ban on landmines.

None of the questions that prompted the formation of the group have been answered. For example: who would be accountable if an autonomous robot killed civilians? The manufacturer? The field commander in whose area the robot operates? The programmers who wrote the software? The procurement officer? The president?

The Geneva-based International Committee of the Red Cross has begun looking into the implications of robots in war, but those favoring more regulation should not expect support from the administration of Barack Obama, who has presided over a dramatic increase in the number of drone strikes on targets in Pakistan since he took office in 2009.

That campaign, run by the Central Intelligence Agency (CIA) rather than the military, killed dozens of al Qaeda fighters and other militants using the rugged mountains on the Pakistani side of the border with Afghanistan as a safe haven. The strikes also killed civilians and stoked anti-American hatred in a country of 180 million that is of strategic importance to the United States. There has been similar blow-back in Yemen and Somalia.

This is one of the reasons why some prominent experts on military robots favor slowing the pace of development. In December, philosopher Patrick Lin of the California Polytechnic State University ended a briefing to CIA officials with a line robotic warfare enthusiasts might do well to remember:

“Integrating ethics may be more cautious and less agile than a ‘do first, think later’ (or worse ‘do first, apologize later’) approach but it helps us win the moral high ground – perhaps the most strategic of battlefields.”

PHOTO: U.S. Air Force First Lieutenant Zachary Goff (L), and Chris Allen, a student from Ohio State University, operate the control console to run a test flight of a drone at the Micro Air Vehicles lab at Wright Patterson Air Force Base in Dayton, Ohio, July 11, 2011. REUTERS/Skip Peterson

Comments
8 comments so far

Why not replace politicians with robots? Robots will make better decisions.

Posted by middleterm

I was born in 1977 and I have begun to take technological progress for granted. My concern is not where this will leave us in 10 years, but rather in 100 years or 1,000 years. As much as many see technology as the answer to all problems, my experience is that it is at the same time the breeder of new problems. Where would we be today if the Cuban Missile Crisis had ended differently?

Posted by BidnisMan

@BidnisMan Or, more importantly, where would we be if operational decisions during the Cuban Missile Crisis had been made by robots?

Posted by Mavvvy

Once it gets to the point where robots can build and manage themselves under the direction of a few sociopaths at the top (and they’re all sociopaths at the top), things will get pretty nasty. Then again, America only votes for the pro-war types, so there you have it: choose your own adventure. More war, coming right up.

Posted by hahax

The writer…ahem…drones on about drones without asking critical and honest questions, such as: why are drones preferred? Would warfare without drones be any better than the current situation?

Here is a reality which Bernd never seems to face: Wars exist for reasons. We can all certainly deplore war. Violent death is by all reasonable and civilized consensus a horrible thing to be strenuously avoided. We can take mild comfort that the total world wars of the past have not (yet) been repeated. As much as we might wish that humans move beyond coercing and killing one another, it will inevitably continue for the foreseeable future.

A question I would like Bernd to answer (even if only to himself) is this: Has there ever been a just cause to wage war? Even the world’s chief pacifist, the Dalai Lama, has admitted that it was necessary to defeat the Axis during WW2. Other wars are not as obvious, but to any who are not blind, the right of a people to defend themselves from conquest and attack is self-evident. Bernd, you have obviously been scarred by your experiences with bloodshed. Anyone would be. Yet a generation of English and American politicians who’d also been scarred by their experiences in “The Great War” failed to adequately anticipate the rise and nature of Nazi Germany, leading to over 70 million preventable deaths. This is where absolutism in the name of peace leads, Bernd. Fancy slogans such as “You cannot simultaneously prevent and prepare for war” are wrong. Einstein was also wrong about quantum mechanics. Genius that he was, his word is not gospel.

To return to the question of autonomous decision-making drones in combat: while it is certainly appropriate to question the potential for problems and to seek to develop protocols, it is myopic in the extreme to refuse to consider their use, presumably in the name of humanity. I read Singer’s book the week it came out, years ago. He raises many valid points of concern. But he acknowledges, as Bernd does not, that drones are no different from any other invention of war. Furthermore, the term “artificial intelligence” is perhaps the most abused in the lexicon, summoning images of the Terminator or the Matrix. Artificial intelligence has not been achieved except in the most primitive lab-confined instances. All programs are simply sets of instructions given by humans, no more. Singer makes this perfectly clear in his book.

The last point I would like to counter is the idea that Americans will be more willing to wage war if their sons and daughters are not at risk in drone warfare. That is only half true, and another name for a half truth is a lie. Americans have certainly proven their willingness to allow their soldiers to be killed. Probably no other first-world country today could have sustained the American casualty rates in Iraq for years and yet remained in such a needless conflict. Americans had little historical, cultural or even strategic commitment to Iraq. Popular enthusiasm for the conflict peaked at 70 percent at the start of the invasion and plummeted within months until almost no Americans thought it was worth it, yet they continued to accept thousands of dead and maimed year after year after year.

Although innocent civilians certainly have been killed in drone attacks, if one makes an honest comparison with how many would have been killed using pre-drone methods, it becomes obvious that drone use thus far has been a less lethal option for innocent civilians and a more lethal option against enemy combatants. And yes, America has killed people it otherwise might not have been able to kill, such as the al Qaeda leadership in Swat and Yemen. To anyone who is truly informed of the danger these men posed to innocent civilians, that can only be seen as saving more lives.

This might involve acknowledging that America’s battle with mass-murdering terrorists is legitimate, however. I sense that this is, after all, the biggest hurdle for Bernd and the like-minded. Despite all the suicide attacks, bombings and open declarations of war on the part of fundamentalist Islamic jihadists, many in the West are so committed to their own hatred of America that they actually believe that if we simply stopped defending ourselves we wouldn’t need to defend ourselves anymore. I am absolutely willing to engage in harsh criticism of America’s many abject failings; however, the struggle against al Qaeda is not imaginary, and truly massive numbers of innocent people’s lives hang in the balance.

Posted by VoltairesGripe

“Advances in AI (Artificial Intelligence) will enable systems to make combat decisions and act within legal and policy constraints without necessarily requiring human input.”

Everything the Nazi regime did was legal to the Nazi regime.

They shouldn’t slow it down – they should kill the program. Won’t local police forces want the same capability or the nearest affordable spinoff?

I could never really stand that movie Robocop. They wouldn’t let that poor piece of flesh die in two(?) sequels? If the fans didn’t get the point after the first one there is no hope for them.

Posted by paintcan

Welcome to the movies. Looks like it won’t be God taking us out this time, LOL, but a pissed-off toaster.

“May you die quickly. (Beep) Your food is now ready.”

We as a race deserve whatever horror we unleash on ourselves. Don’t say you’re innocent; you did nothing to stop them.

Posted by AWR66
