Who's in the Video
Jerry Kaplan is widely known in the computer industry as a serial entrepreneur, inventor, scientist, and author. He is currently a Fellow at The Stanford Center for Legal Informatics.

Just like automated vehicles, robots and advanced AI will require new sets of laws to define the extent of owner liability and accountability. Creating these laws will require an important ethical discussion: Who is at fault when a robot misbehaves? According to author Jerry Kaplan, there is precedent for creating legal codes and consequences that apply to one kind of entity but not to others. Take, for example, the fact that criminal charges can be brought against corporations rather than the people operating beneath the corporate shell. Similarly, we can develop laws that would allow robots and their programming to stand trial.

Jerry Kaplan: There’s a whole other set of issues about how robots should be treated under the law. Now the obvious knee-jerk reaction is, well, you own a robot and you’re responsible for everything that it does. But as these devices become much more autonomous, it’s not at all clear that that’s really the right answer or a good answer. You go out and buy a great new robot and you send it down the street to pick you up a Frappuccino at Starbucks, and maybe it’s accidental, but it’s standing at the corner and it happens to bump some kid into traffic, and a car runs the kid over. The police are going to come and arrest you for this action. Do you really feel that you’re as responsible as you would be if you had gone like this and pushed that kid into traffic? I would argue no, you don’t. So we’re going to need new kinds of laws that deal with the consequences of well-intentioned autonomous actions that robots take. Now, interestingly enough, there are a number of historical precedents for this. You might say, well, how can you hold a robot responsible for its behavior? You really can, actually, and let me point out a couple of things.

The first is one most people don’t realize: corporations can commit criminal acts independent of the people in the corporation. In the Deepwater Horizon Gulf Coast accident, as an example, BP was charged with criminal violations even though people in the corporation were not necessarily charged with those same criminal violations. And rightfully so. So how do we punish a corporation? We punish a corporation by interfering with its ability to achieve its stated goal: impose huge fines, as they did in that particular case. You can make the company go out of business. You can revoke its license to operate, which is a death penalty for a corporation. You can have it monitored, as they do in antitrust cases with many companies; IBM and Microsoft, I think, have had monitors to make sure they’re abiding by certain kinds of behavioral standards. Well, that same kind of activity can apply to a robot. You don’t have to put a robot in jail, but you can interfere with what it’s trying to do. And if these robots are adaptable, logical, and learning, they’ll say, well, I get it, you know: I can’t do that, because my goal is to accomplish something in particular, and if I take this particular action, that’s actually going to work against my interest in accomplishing it.

So rehabilitation and modification of robot behavior, just as with a corporation, is much more logical than you might think. Now, another interesting historical precedent: prior to the Civil War, there was a separate set of laws that applied to slaves. They were called the slave codes. And slaves were property. But interestingly enough, the slave owners were only held liable under certain conditions for the actions of their slaves. The slaves themselves were punished if they committed crimes. And so we have a historical precedent for the kinds of ways in which we can sort this out, so that you are not in constant fear that your robot is going to bump into somebody and you’re going to go to jail for 20 years for negligent homicide or whatever it might be.
