P.W. Singer on the Ethics of New Military Technologies

Peter Warren Singer is Senior Fellow and Director of the 21st Century Defense Initiative at the Brookings Institution. He is the youngest scholar named Senior Fellow in Brookings' 90-year history. In 2005, CNN named him to its "New Guard" list of the Next Generation of Newsmakers. Singer has also been recognized by the Financial Times as "Guru of the Week," its designation for the thinker who most influenced the world that week, and by Slate Magazine for "Quote of the Day." In his personal capacity, Singer served as coordinator of the Obama '08 campaign's defense policy task force.

His first book, Corporate Warriors: The Rise of the Privatized Military Industry, pioneered the study of the new industry of private companies providing military services for hire, an issue that soon became important with the use and abuse of these companies in Iraq. His next book, Children at War, explored the rise of another new force in modern warfare: child soldier groups. Dr. Singer's "fascinating" (New York Post) and "landmark" (Newsweek) work was the first book to comprehensively explore the compelling and tragic rise of child soldier groups and was recognized with the 2006 Robert F. Kennedy Memorial Book of the Year Award.

His third book, Wired for War, looks at the implications of robotics and other new technologies for war, politics, ethics, and law in the 21st century. Described as "an exhaustively researched book, enlivened by examples from popular culture" by the Associated Press and as "awesome" by Jon Stewart of The Daily Show, Wired for War made the New York Times non-fiction bestseller list in its first week of release. It has already been featured in the video game Metal Gear Solid 4: Guns of the Patriots, as well as in presentations to audiences ranging from the Air Force Institute of Technology to the National Student Leadership Conference.

Prior to his current position, Dr. Singer was the founding Director of the Project on U.S. Policy Towards the Islamic World in the Saban Center at Brookings. He has also worked for the Belfer Center for Science and International Affairs at Harvard University, the Balkans Task Force in the U.S. Department of Defense, and the International Peace Academy. Singer received his Ph.D. in Government from Harvard University and a BA from the Woodrow Wilson School of Public and International Affairs at Princeton University.

TRANSCRIPT

Question: Do we need to re-evaluate the ethics of war?

Singer:    One of the challenges of these new technologies is not just that we're living in denial over what's going on, but that we're often sort of overwhelmed by them. Among the groups that I met with for the book were people at the Red Cross and at Human Rights Watch. There's an interesting thing that someone at the International Red Cross said to me about this. He said, "You know what, there's so much bad going on in the world right now, how can we waste time thinking about the laws that surround these new things like unmanned systems and robotics?" And it's a completely valid answer. There is a ton of bad going on in the world. But, for me, what's interesting is that you could have said the exact same thing in 1939 or 1944: there's so much bad going on in the world, why should we waste time trying to figure out what's going on with this new thing called atomic bombs? That's just, you know, fantasy. That's just science fiction. So the result is we weren't ready for it.

A broader question among the ethical issues that surround these systems is whether they make war crimes more or less likely. That is, some people think you can put morality into these machines. You can put ethical codes into them, sort of equivalent to Isaac Asimov's Three Laws, and so you could have, as one scientist put it, more ethical killing devices that would follow rules that maybe soldiers wouldn't. So, for example, a lot of war crimes happen when a soldier's buddy is killed and rage takes over and they lash out against local civilians. Robots don't have emotions, so you wouldn't have that happen. They can also be a lot more discriminating. A soldier has to burst into a room, figure out in a microsecond who is a threat and who is not, and shoot or not shoot, and sometimes they make mistakes. A robot can go into a room and take its time figuring out who is the threat, and you don't have to worry about the person behind it dying.

That's true, but machines can't be moral by the very definition of morality. They don't have empathy. They don't have a sense of guilt. So to a machine, an 80-year-old grandmother in a wheelchair is just the same as a T-80 tank except for a couple of 1s and 0s that are different in the programming language, and that's got to be disturbing to us in a certain way.
