Who's in the Video
Peter Warren Singer is Senior Fellow and Director of the 21st Century Defense Initiative at the Brookings Institution. He is the youngest scholar named Senior Fellow in Brookings' 90-year history.[…]

The author speaks to the absence of morality in a fully mechanized war

Question: Do we need to re-evaluate the ethics of war?

Singer: One of the challenges of these new technologies is not just that we’re living in denial over what’s going on, but that we’re often sort of overwhelmed by them. Among the groups that I met with for the book were people at the Red Cross and at Human Rights Watch. There’s an interesting thing that someone at the International Red Cross said to me about this. He said, “You know what, there’s so much bad going on in the world right now, how can we waste time thinking about the laws that surround these new things like unmanned systems and robotics?” And it’s a completely valid answer. There are a ton of bad things going on in the world. But, for me, what’s interesting is you could have said the exact same thing in 1939 or 1944. There’s so much bad going on in the world, why should we waste time trying to figure out what’s going on with this new thing called atomic bombs? That’s just, you know, fantasy. That’s just science fiction. So the result is we weren’t ready for it.

A broader question among the ethical issues that surround these systems is whether they make war crimes more or less likely. That is, some people think you can put morality into these machines. You can put ethical codes into them, sort of the equivalent of Isaac Asimov’s Three Laws, and so you could have, as one scientist put it, “more ethical killing devices” that would follow rules that maybe soldiers wouldn’t. So, for example, a lot of war crimes happen when a soldier’s buddy is killed and rage takes over and they lash out against local civilians. Robots don’t have emotions, so you wouldn’t have that happen. They can also be a lot more discriminating. A soldier has to burst into a room and, in a microsecond, figure out who is or isn’t a threat and whether to shoot, and sometimes they make mistakes. A robot can go into a room, take its time, and figure out who’s the threat, and you don’t have to worry about the person behind it dying.

That’s true, but machines can’t be moral by the very definition of morality. They don’t have empathy. They don’t have a sense of guilt. So to a machine, an 80-year-old grandmother in a wheelchair is just the same as a T-80 tank except for a couple of 1s and 0s that are different in the programming language, and that’s got to be disturbing to us in a certain way.

