Walter Isaacson on Alan Turing and the Imitation Game
Biographer Walter Isaacson talks British mathematician Alan Turing, the inventor of the Turing Test and one of the subjects of Isaacson's newest book, "The Innovators."
Walter Isaacson is a renowned biographer, CEO of the Aspen Institute, and previously the chairman of CNN and managing editor of TIME magazine. He is the author of Einstein: His Life and Universe, The Wise Men: Six Friends and the World They Made, Benjamin Franklin: An American Life, Steve Jobs, and most recently The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution.
Walter Isaacson: It’s great to trace things back to Alan Turing. You know, he’s in Bletchley Park, England. He had come up with the concept of the universal computing machine, but then he has to help put it into practice to break the German wartime code. So he comes up with a device called the Bombe, and then Colossus, and these are machines that can break the code, and he starts thinking about the difference between human imagination and machine intelligence. And it goes back to what he calls Lady Lovelace’s objection. It goes back to Ada Lovelace a hundred years earlier, who had said machines will be able to do everything except think. And so Turing comes up with what he calls the imitation game. Now we call it the Turing test, in which he tries to figure out how you would tell the difference between a human and a machine. How would you know the machine’s not intelligent? He says, well, put a human and a machine in different rooms, send them questions, and if after a while you can’t tell which one’s the machine and which one’s the human, then it makes no sense to say the machine isn’t thinking.
Now you can have philosophical arguments about whether or not that’s a good test, but ever since then—it’s been about 65 years since he came up with that concept—we’ve been trying to invent machines that would pass the Turing test, or the imitation game. Every now and then you read about a machine that can do conversational gambits and maybe confuse a person for five minutes or so, and sort of try to pass the Turing test. But surprisingly, we’ve found it very difficult to build machines that can really carry on a conversation and be confused with a human. You can usually tell the machine from the human. A different way of looking at how the computer age evolved is Ada Lovelace’s way, which is that computers and humans will evolve symbiotically. They’ll be partners. We will get more intimately connected to our machines; the machines will amplify our intelligence, and our creativity will amplify what the machines can do. And we don’t need to try to create robots that’ll work without us. It’s kind of cooler to create this partnership of humans and technology, or, as she put it, the humanities and engineering. So those are really the two schools of thought in computer programming. And every now and then you hear people say the singularity’s coming, or we’re about to get to the age of artificial intelligence and machine learning. And I suspect it may come, but it’s always about 20 years away. And in the meantime it’s the Ada Lovelace vision rather than the Alan Turing vision that’s winning out: the vision of machines that connect to us more intimately, rather than machines that replace us and don’t need us anymore.
Directed/Produced by Jonathan Fowler, Elizabeth Rodd, and Dillon Fitton
Biographer Walter Isaacson compares Alan Turing's computing philosophy with that of Ada Lovelace a hundred years prior. Turing, the subject of the new film "The Imitation Game," is also featured prominently in Isaacson's new book "The Innovators."
What is human dignity? Here's a primer, told through 200 years of great essays, lectures, and novels.
- Human dignity means that each of our lives has an unimpeachable value simply because we are human, and therefore we are deserving of a baseline level of respect.
- That baseline requires more than the absence of violence, discrimination, and authoritarianism. It means giving individuals the freedom to pursue their own happiness and purpose.
- We look at incredible writings from the last 200 years that illustrate the push for human dignity with regard to slavery, equality, communism, free speech, and education.
The inherent worth of all human beings<p>Human dignity is the inherent worth of each individual human being. Recognizing human dignity means respecting human beings' special value—value that sets us apart from other animals; value that is intrinsic and cannot be lost.</p> <p>Liberalism—the broad political philosophy that organizes society around liberty, justice, and equality—is rooted in the idea of human dignity. Liberalism assumes each of our lives, plans, and preferences have some unimpeachable value, not because of any objective evaluation or contribution to a greater good, but simply because they belong to a human being. We are human, and therefore deserving of a baseline level of respect. </p> <p>Because so many of us take human dignity for granted—just a fact of our humanness—it's usually only when someone's dignity is ignored or violated that we feel compelled to talk about it. </p> <p>But human dignity means more than the absence of violence, discrimination, and authoritarianism. It means giving individuals the freedom to pursue their own happiness and purpose—a freedom that can be hampered by restrictive social institutions or the tyranny of the majority. The liberal ideal of the good society is not just peaceful but also pluralistic: It is a society in which we respect others' right to think and live differently than we do.</p>
From the 19th century to today<p>With <a href="https://books.google.com/ngrams/graph?year_start=1800&year_end=2019&content=human+dignity&corpus=26&smoothing=3&direct_url=t1%3B%2Chuman%20dignity%3B%2Cc0" target="_blank" rel="noopener noreferrer">Google Books Ngram Viewer</a>, we can chart mentions of human dignity from 1800-2019.</p><img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNDg0ODU0My9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTY1MTUwMzE4MX0.bu0D_0uQuyNLyJjfRESNhu7twkJ5nxu8pQtfa1w3hZs/img.png?width=980" id="7ef38" class="rm-shortcode" data-rm-shortcode-id="9974c7bef3812fcb36858f325889e3c6" data-rm-shortcode-name="rebelmouse-image" />
American novelist, writer, playwright, poet, essayist and civil rights activist James Baldwin at his home in Saint-Paul-de-Vence, southern France, on November 6, 1979.
Credit: Ralph Gatti/AFP via Getty Images
The future of dignity<p>Around the world, people are still working toward the full and equal recognition of human dignity. Every year, new speeches and writings help us understand what dignity is—not only what it looks like when dignity is violated but also what it looks like when dignity is honored. In his posthumous essay, Congressman Lewis wrote, "When historians pick up their pens to write the story of the 21st century, let them say that it was your generation who laid down the heavy burdens of hate at last and that peace finally triumphed over violence, aggression and war."</p> <p>The more we talk about human dignity, the better we understand it. And the sooner we can make progress toward a shared vision of peace, freedom, and mutual respect for all. </p>
We’ve mapped a million previously undiscovered galaxies beyond the Milky Way. Take the virtual tour here.
See the most detailed survey of the southern sky ever carried out using radio waves.
Astronomers have mapped about a million previously undiscovered galaxies beyond the Milky Way, in the most detailed survey of the southern sky ever carried out using radio waves.
A new study shows our planet is much closer to the supermassive black hole at the galaxy's center than previously estimated.
Credit: NAOJ<p><em>Arrows on this map show position and velocity data for the 224 objects utilized to model the Milky Way Galaxy. The solid black lines point to the positions of the spiral arms of the Galaxy. Colors reflect groups of objects that are part of the same arm, while the background is a simulation image.</em></p>
With just a few strategic tweaks, the Nazis could have won one of World War II's most decisive battles.
- The Battle of Britain is widely recognized as one of the most significant battles that occurred during World War II. It marked the first major victory of the Allied forces and shifted the tide of the war.
- Historians, however, have long debated the deciding factor in the British victory and German defeat.
- A new mathematical model took into account numerous alternative tactics that the Germans could have employed and found that just two tweaks stood between them and victory over Britain.
Two strategic blunders<p>Now, historians and mathematicians from York St. John University have collaborated to produce <a href="http://www-users.york.ac.uk/~nm15/bootstrapBoB%20AAMS.docx" target="_blank">a statistical model (docx download)</a> capable of calculating the likely outcomes of the Battle of Britain had the circumstances been different. </p><p>Would the German war effort have fared better had they not bombed Britain at all? What if Hitler had begun his bombing campaign earlier, even by just a few weeks? What if they had focused their targets on RAF airfields for the entire course of the battle? Using a statistical technique called weighted bootstrapping, the researchers studied these and other alternatives.</p><p>"The weighted bootstrap technique allowed us to model alternative campaigns in which the Luftwaffe prolongs or contracts the different phases of the battle and varies its targets," said co-author Dr. Jaime Wood in a <a href="https://www.york.ac.uk/news-and-events/news/2020/research/mathematicians-battle-britain-what-if-scenarios/" target="_blank">statement</a>. Based on the different strategic decisions that the German forces could have made, the researchers' model enabled them to estimate the probability that the events of a given day of fighting would or would not occur.</p><p>"The Luftwaffe would only have been able to make the necessary bases in France available to launch an air attack on Britain in June at the earliest, so our alternative campaign brings forward the air campaign by three weeks," continued Wood. "We tested the impact of this and the other counterfactuals by varying the probabilities with which we choose individual days."</p><p>Ultimately, two strategic tweaks shifted the odds significantly in the Germans' favor. 
Had the German forces started their campaign earlier in the year and had they consistently targeted RAF airfields, an Allied victory would have been extremely unlikely.</p><p>Say the odds of a British victory in the real-world Battle of Britain stood at 50-50 (there's no real way of knowing what the actual odds were, so we'll just have to select an arbitrary figure). If this were the case, changing the start date of the campaign and focusing only on airfields would have reduced British chances at victory to just 10 percent. Even if a British victory stood at 98 percent, these changes would have cut those chances down to just 34 percent.</p>
A tool for understanding history<p>This technique, said co-author Niall Mackay, "demonstrates just how finely-balanced the outcomes of some of the biggest moments of history were. Even when we use the actual days' events of the battle, make a small change of timing or emphasis to the arrangement of those days and things might have turned out very differently."</p><p>The researchers also claimed that their technique could be applied to other uncertain historical events. "Weighted bootstrapping can provide a natural and intuitive tool for historians to investigate unrealized possibilities, informing historical controversies and debates," said Mackay.</p><p>Using this technique, researchers can evaluate other what-ifs and gain insight into how differently influential events could have turned out if only the slightest things had changed. For now, at least, we can all be thankful that Hitler underestimated Britain's grit.</p>
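The weighted-bootstrap idea described above can be sketched in a few lines of Python. This is a toy illustration, not the York researchers' actual model: the per-day outcomes, the weighting factor, and the majority-wins rule are all invented assumptions, chosen only to show how up-weighting certain days changes a resampled campaign's outcome.

```python
import random

# Hypothetical per-day outcomes: 1 = a day that favored the RAF,
# 0 = a day that favored the Luftwaffe. Illustrative values only.
actual_days = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]

def bootstrap_campaign(days, weights, n_trials=10_000, seed=42):
    """Resample the campaign's days with replacement, weighting each day
    by how likely it is to occur under a counterfactual scenario, and
    return the fraction of simulated campaigns the RAF 'wins' (here,
    a simple majority of favorable days)."""
    rng = random.Random(seed)
    n_days = len(days)
    wins = 0
    for _ in range(n_trials):
        sample = rng.choices(days, weights=weights, k=n_days)
        if sum(sample) > n_days / 2:
            wins += 1
    return wins / n_trials

# Baseline: every historical day equally likely to be drawn.
baseline = bootstrap_campaign(actual_days, weights=[1] * len(actual_days))

# Counterfactual: up-weight the days that went badly for the RAF,
# mimicking a Luftwaffe that presses its advantage (e.g. sustained
# airfield attacks). The factor of 3 is arbitrary.
counterfactual = bootstrap_campaign(
    actual_days, weights=[1 if d else 3 for d in actual_days]
)

# Up-weighting the Luftwaffe's good days lowers the simulated RAF win rate.
```

The design choice here mirrors the quote from Dr. Wood: instead of inventing new events, the method reshuffles the battle's own days, "varying the probabilities with which we choose individual days" to explore alternative campaigns.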