Michael Schrage
Research Fellow, MIT Center for Digital Business

The U.S. Government Wants Apple to Unlock Its iPhone — so Do the World's Authoritarian Regimes


In the dispute over whether Apple should be forced to unlock the San Bernardino shooter's iPhone, there is ultimately no satisfactory decision. Either law enforcement will be prevented from collecting essential information or a Pandora's Box will be opened, possibly allowing the world's harsher governments to collect their citizens' private data at will. The dilemma is captured by the famous legal phrase "hard cases make bad law."

Michael Schrage

Michael Schrage examines the various roles of models, prototypes, and simulations as collaborative media for innovation risk management. He has served as an advisor on innovation issues and investments to major firms, including Mars, Procter & Gamble, Google, Intel, BT, Siemens, NASDAQ, IBM, and Alcoa. In addition, Schrage has advised segments of the national security community on cyber conflict and cybersecurity issues. He has presented workshops on design experimentation and innovation risk for businesses, organizations, and executive education programs worldwide. Along with running summer workshops on future technologies for the Pentagon's Office of Net Assessment, he has served on the technical advisory committee of MIT's Lincoln Laboratory. In collaboration with the Center for Strategic and International Studies (CSIS), Schrage helped launch a series of workshops sponsored by the Department of Defense on federal complex systems procurement. In 2007, he served as a judge for the Industrial Designers Society of America's global International Design Excellence Awards.


 

Transcript

Michael Schrage: There’s a very famous phrase in the legal community — hard cases make bad law. And the circumstances that Apple and the FBI and the Justice Department find themselves in — certainly not by design; it’s a horrible tragedy that led to it — are a wonderful example of a very hard case. You have, without question, somebody who has done an evil, evil, murderous thing, and they have used a device that contains information that might be not just marginally but extraordinarily useful to law enforcement all over the world, certainly to the United States, in either solving aspects of this crime or preventing future atrocities from occurring. No question. But Apple — and this happens to be an Apple device — has designed its devices so that people can protect their information. And now a federal judge has ordered Apple to help crack the phone and gain access to that information.

And Apple, in a very interesting letter from its CEO Tim Cook, has said, in effect: we are rejecting the judge’s order to help crack this phone. Why? Because the problem that emerges is whether people have any reasonable expectation of privacy when they use technologies on networks. By acceding to the judge’s request and the FBI’s request, a message would be sent to people all over the world — China, Europe, Latin America, the U.S. — that if a judge — a Chinese judge, a Brazilian judge, a Russian judge — says of that thing you’ve encrypted on your device, "we want access to it," then access will be granted. It would basically mean you have no privacy under a rule of law — and let’s be very blunt here: Chinese and Russian rule-of-law standards are different from American or British or German rule-of-law standards. Nobody could count on their devices to protect them in any other circumstances. To my mind, that is the definition of a hard case. I am extraordinarily sympathetic to Apple. I’m extraordinarily sympathetic to the FBI and the Justice Department. I am even more sympathetic to the families of the people who were hurt and killed in that terrorist attack. But the reality is this is one of those circumstances where there is no good answer. And whatever answer is chosen is the wrong one.


