Howard Gardner is a developmental psychologist and the John H. and Elisabeth A. Hobbs Professor of Cognition and Education at the Harvard Graduate School of Education. He holds positions as Adjunct Professor of Psychology at Harvard University and Senior Director of Harvard Project Zero.
Among numerous honors, Gardner received a MacArthur Prize Fellowship in 1981. In 1990, he was the first American to receive the University of Louisville's Grawemeyer Award in Education, and in 2000 he received a Fellowship from the John Simon Guggenheim Memorial Foundation. In 2005 and again in 2008 he was selected by Foreign Policy and Prospect magazines as one of the 100 most influential public intellectuals in the world. He has received honorary degrees from twenty-two colleges and universities, including institutions in Ireland, Italy, Israel, and Chile.
The author of over twenty books translated into twenty-seven languages, and several hundred articles, Gardner is best known in educational circles for his theory of multiple intelligences, a critique of the notion that there exists but a single human intelligence that can be assessed by standard psychometric instruments. During the past twenty-five years, he and colleagues at Project Zero have been working on the design of performance-based assessments, education for understanding, and the use of multiple intelligences to achieve more personalized curriculum, instruction, and assessment. In the mid-1990s, Gardner and his colleagues launched The GoodWork Project. "GoodWork" is work that is excellent in quality, personally engaging, and exhibits a sense of responsibility with respect to implications and applications. Researchers have examined how individuals who wish to carry out good work succeed in doing so during a time when conditions are changing very quickly, market forces are very powerful, and our sense of time and space is being radically altered by technologies such as the web. Gardner and colleagues have also studied curricula. Among his books are The Disciplined Mind: Beyond Facts and Standardized Tests, The K-12 Education that Every Child Deserves (Penguin Putnam, 2000), Intelligence Reframed (Basic Books, 2000), Good Work: When Excellence and Ethics Meet (Basic Books, 2001), Changing Minds: The Art and Science of Changing Our Own and Other People's Minds (Harvard Business School Press, 2004), and Making Good: How Young People Cope with Moral Dilemmas at Work (Harvard University Press, 2004; with Wendy Fischman, Becca Solomon, and Deborah Greenspan). These books are available through the Project Zero eBookstore.
Currently Gardner continues to direct the GoodWork project, which is concentrating on issues of ethics with secondary and college students. In addition, he co-directs the GoodPlay and Trust projects; a major current interest is the way in which ethics are being affected by the new digital media.
In 2006 Gardner published Multiple Intelligences: New Horizons, The Development and Education of the Mind, and Howard Gardner Under Fire. In Howard Gardner Under Fire, Gardner's work is examined critically; the book includes a lengthy autobiography and a complete bibliography. In the spring of 2007, Five Minds for the Future was published by Harvard Business School Press. Responsibility at Work, which Gardner edited, was published in the summer of 2007.
Question: Is our culture biased towards one type of intelligence over another?
Howard Gardner: Well, the theory claims we all have these eight intelligences, and people are different from one another in their profile of intelligences, and there's no necessary link between one intelligence and another. It's also based on the assumption that we wouldn't have these intelligences if they hadn't been valuable in human evolution. An example I like to use is that we developed the naturalist intelligence so we knew what to eat and what not to eat, to be able to pay attention to which animals to run away from and which animals to hunt, and of course which plants to eat and which ones to avoid. There's a reason why we're sensitive to the world of nature.
Now most of us, particularly people who watch this, go to supermarkets and don't have to know anything about the wild. But I think the neural networks which evolved to help us get around the savannahs of East Africa 50,000 years ago are now being used for consumer society: we decide which shoes to buy and which car to buy, and we're looking at the same kinds of things our ancestors did, but we're doing it by walking through the mall rather than running through the savannah hoping we won't get eaten by some kind of creature.
As history unfolds, as cultures evolve, of course the intelligences which they value change. I would say that until a hundred years ago, if you wanted to have higher education, linguistic intelligence was important. I teach at Harvard, and 125 or 150 years ago the entrance exams were Latin, Greek, and Hebrew. If, for example, you were dyslexic, that would be very difficult, because it would be hard for you to learn those languages, which are basically written languages. People don't speak ancient Greek; you only meet it on the page.
Over the last century, clearly the logical-mathematical intelligence is something we pay a lot of attention to, and the linguistic intelligence is a little bit more of an option. But once one looks at the world of occupations, we have hundreds of occupations, and I think the reason that Dan Goleman's work on social and emotional intelligence has gotten so much attention is that while your IQ, which is sort of language and logic, will get you behind the desk, if you don't know how to deal with people, if you don't know how to read yourself, you're going to end up just staying at that desk forever, or eventually being asked to make room for somebody who does have social or emotional intelligence.
When the singularity occurs and the machines are smarter than we are, then it’s the artistic kinds of intelligence or intelligence used artistically to be more precise, which will come to the fore.
Question: Is the theory of Multiple Intelligences reformist?
Howard Gardner: I think you can talk about reformism in two senses. One is that it's clear that when I developed this theory in the late 1970s, I was trying to reform the way psychologists and other people think about intelligence. So certainly I had an iconoclastic or reformist inclination there. I was kind of surprised the psychologists didn't all line up in a row and say, "You're right. We've been wrong for 100 years." That's somewhat facetious, but I was surprised at how much interest there was within the educational world, and there I would say I gradually switched from simply saying this is how I think the mind is organized and how it has developed, to thinking that maybe we should do things differently in education because of the theory.
Then really in the last 15 years, I think I've become much more reformist, because I have been concerned about the ethical dimensions of our society. That doesn't grow in any natural way out of multiple intelligences theory. If I look at it somewhat autobiographically, as a young person I was very much involved with music. I was a serious pianist, and while I never thought about a career in music, music was and has been very important to me. Then when I got to college I became interested in the art forms, and then I spent a year in England as a Fellow and really immersed myself in drama. It's great to go to the theater in London, and the art galleries, and I sort of expanded my artistic horizons. Then when I went to graduate school in psychology, I was stunned at how the arts were never mentioned.
To be a developed person cognitively meant to be a scientist and to think scientifically. We could speculate about why that's so, but the first serious book I wrote was called The Arts and Human Development. What I said in that book, this was in the early '70s, is that all of developmental psychology has treated science as the apotheosis of human development. Yet science is a modern Western invention, and we might well never have invented it had we not had Galileo and Copernicus and Newton. On the other hand, the arts exist in just about every society, and they're very important. So can we conceptualize development in terms of the arts as well as the sciences?
Question: Is intelligence determined more by nurture or nature?
Howard Gardner: Well, as you probably know, as the viewers probably know, nowadays nobody takes extreme positions on that issue. Maybe someday the press will learn not to take extreme positions on the issue. And I certainly believe that every intelligence has genetic components; how else would it exist? And every intelligence has a certain heritability; that's the technical term for how much of the variation in a population has to do with who your biological grandparents were. Grandparents rather than parents, because four people give you a better sample of the gene pool than two.
We don't know what the heritability is of most intelligences, but from a lot of research we know that, on average, human traits are about 0.5 heritable. That means that genes make a big contribution, but so do parents, culture, the media, peers, and so on.
I guess I never put it this way before, but maybe what I would say is that the intelligences you favor are probably ones where you have a genetic predisposition, but how you use those intelligences is going to be overwhelmingly determined by the culture in which you were born, your parents and what they value, whether you get along with your parents, and that kind of thing. So the deployment of intelligences is probably largely a nurture factor. But if, say, the Bach family had a lot of genes going for it in the music area, it was probably pretty likely they were going to end up being musicians, even if they had been, so to speak, separated at birth and raised in another kind of family.
Question: If there are so many ways to be smart, what does it mean to be "stupid"?
Howard Gardner: The first thing I would say is that life isn't fair, and some people are going to be strong in a lot of intelligences and some people aren't. I think of the intelligences as a set of computers. If you wanted to summarize my theory in a sentence: we used to think there was just one general computer in here, so if you were good at one thing, you'd be good at everything, and if you were lousy at one thing, you'd be lousy at everything. Smart across the board, or stupid across the board. The step I took, which I would call an advance, is to say that you can be very smart with language, average with music, and lousy at understanding other people, or vice versa. There's no necessary correlation between them.
I think stupid has two very different connotations. One is that your computer isn’t very good. For example, I’m not biologically very good spatially, but the truth is with a map and a position determiner and some special attention to the environment I can do perfectly well, but I suppose if there were a test of spatial intelligence I wouldn’t do very well.
So, one meaning of stupid is that it takes you a long time to do what people who are smarter in that intelligence do quickly. I'm very musical; especially when I was younger, if I heard something once, not only could I remember it, I couldn't forget it. So that's smart in kind of a technical sense.
But the other sense of stupid, which I think is much more important, is: how do you go about leading your life? Do you know what you're trying to do? Can you achieve it? When you make a mistake, do you make the same mistake again? Or do you simply stay stuck in a rut? That has to do with your own understanding of yourself and what you're trying to achieve; what I call intrapersonal intelligence. I would much rather have somebody who was stupid in the first sense but had a good sense of how to negotiate their way through life, than somebody who had the computers going full blast but kept knocking their head against the wall.
I make fun of Mensa—I don’t know a great deal about Mensa, that’s the high IQ group—but I say, "To get into Mensa, you have to have a high IQ, and once you get in, you spend your time congratulating people who are in Mensa with you." To me that’s a pretty stupid way to spend your life.
Recorded On: September 3, 2009
The psychologist argues that different periods in history have shown biases towards different types of intelligence, and that this bias will continue to shift with time.
These Roman Emperors were infamous for their debauchery and cruelty.
- Roman Emperors were known for their excesses and violent behavior.
- From Caligula to Elagabalus, the emperors exercised total power in the service of their often-strange desires.
- Most of these emperors met violent ends themselves.
We rightfully complain about many of our politicians and leaders today, but historically speaking, humanity has seen much worse. Arguably no set of rulers has been as debauched, ingenious in their cruelty, and prone to excess as the Roman Emperors.
While this list is certainly not exhaustive, here are seven Roman rulers who were perhaps the worst of the worst in what was one of the largest empires that ever existed, lasting for over a thousand years.
Officially known as Gaius (Gaius Caesar Augustus Germanicus), Caligula was the third Roman Emperor, ruling from 37 to 41 AD. He acquired the nickname "Caligula" (meaning "little [soldier's] boot") from his father's soldiers during a campaign.
While recognized for some positive measures in the early days of his rule, he became famous throughout the ages as an absolutely insane emperor, who killed anyone when it pleased him, spent exorbitantly, was obsessed with perverse sex, and proclaimed himself to be a living god.
Caligula gives his horse Incitatus a drink during a banquet. Credit: An engraving by Persichini from a drawing by Pinelli, from "The History of the Roman Emperors" from Augustus to Constantine, by Jean Baptiste Louis Crevier. 1836.
Among his litany of misdeeds, according to the accounts of Caligula's contemporaries Philo of Alexandria and Seneca the Younger, he slept with whomever he wanted, brazenly taking other men's wives (even on their wedding nights) and publicly talking about it.
He also had an insatiable thirst for blood, killing for mere amusement. Once, as the historian Suetonius reports, when the bridge across the sea at Puteoli was being blessed, he had a number of spectators who were there to inspect it thrown into the water. When some tried to cling to the ships' rudders, Caligula had them dislodged with hooks and oars so they would drown. On another occasion, he got so bored that he had his guards throw a whole section of the audience into the arena during the intermission, so they would be eaten by wild beasts. He also allegedly executed two consuls who forgot his birthday.
Suetonius relayed further atrocities of the mad emperor's character, writing that Caligula "frequently had trials by torture held in his presence while he was eating or otherwise enjoying himself; and kept an expert headsman in readiness to decapitate the prisoners brought in from gaol." One particular form of torture associated with Caligula involved having people sawed in half.
He caused mass starvation and purposefully wasted money and resources, like making his troops stage fake battles just for theater. If that wasn't enough, he turned his palace into a brothel and was accused of incest with his sisters, Agrippina the Younger, Drusilla, and Livilla, whom he also prostituted to other men. Perhaps most famously, he was planning to appoint his favorite horse Incitatus a consul and went as far as making the horse into a priest.
In early 41 AD, Caligula was assassinated by a conspiracy of Praetorian Guard officers, senators, and other members of the court.
Fully named Nero Claudius Caesar, Nero ruled from 54 to 68 AD and was arguably an even worse madman than his uncle Caligula. He had his step-brother Britannicus killed, his wife Octavia executed, and his mother Agrippina stabbed and murdered. He personally kicked his wife Poppaea to death while she was pregnant with his child, a horrific action the Roman historian Tacitus depicted as "a casual outburst of rage."
He spent exorbitantly and built a 100-foot-tall bronze statue of himself called the Colossus Neronis.
He is also remembered for being strangely obsessed with music. He sang and played the lyre, although it's not likely he really fiddled as Rome burned in what is a popular myth about this crazed tyrant. As misplaced retribution for the fire which burned down a sizable portion of Rome in the year 64, he executed scores of early Christians, some of them outfitted in animal skins and brutalized by dogs, with others burned at the stake.
He died by suicide.
Roman Emperor Nero in the burning ruins of Rome, July 64 AD. Credit: From an original painting by S.J. Ferris. (Photo by Kean Collection / Getty Images)
Like some of his counterparts, Commodus (a.k.a. Lucius Aelius Aurelius Commodus) thought he was a god — in his case, a reincarnation of the Greek demigod Hercules. Ruling from 176 to 192 AD, he was also known for his debauched ways and strange stunts that seemed designed to affirm his divine status. Numerous statues around the empire showed him as Hercules, a warrior who fought both men and beasts. He fought hundreds of exotic animals in an arena like a gladiator, confusing and terrifying his subjects. Once, he killed 100 lions in a single day.
Emperor Commodus (Joaquin Phoenix) questions the loyalty of his sister Lucilla (Connie Nielsen) in DreamWorks Pictures' and Universal Pictures' Oscar-winning drama "Gladiator," directed by Ridley Scott. Credit: Photo by Getty Images
His burning desire to kill living creatures as a gladiator during the New Year's Day celebrations in 193 AD brought about his demise. After Commodus shot hundreds of animals with arrows and javelins every morning as part of the Plebeian Games leading up to New Year's, his fitness coach, aptly named Narcissus, choked the emperor to death in his bath.
Officially named Marcus Aurelius Antoninus II, Elagabalus's nickname comes from his priesthood in the cult of the Syrian god Elagabal. Ruling as emperor from 218 to 222 AD, he was so devoted to the cult, which he tried to spread in Rome, that he had himself circumcised to prove his dedication. He further offended the religious sensitivities of his compatriots by essentially replacing the main Roman god Jupiter with Elagabal as the chief deity. In another nod to his convictions, he installed on Palatine Hill a cone-like fetish made of black stone as a symbol of the Syrian sun god Sol Invictus Elagabalus.
His sexual proclivities were also not well received at the time. He was likely transgender (wearing makeup and wigs), had five marriages, and was quite open about his male lovers. According to the Roman historian (and the emperor's contemporary) Cassius Dio, Elagabalus prostituted himself in brothels and taverns and was one of the first historical figures on record to seek sex reassignment surgery.
He was eventually murdered in 222 in an assassination plot engineered by his own grandmother Julia Maesa.
Emperor for just eight months, from April 19th to December 20th of the year 69 AD, Vitellius made some key administrative contributions to the empire but is ultimately remembered as a cruel glutton. He was described by Suetonius as overly fond of eating and drinking, to the point where he would eat at banquets four times a day while sending out the Roman navy to get him rare foods. He also had little social grace, inviting himself over to the houses of different noblemen to eat at their banquets, too.
Vitellius dragged through the streets of Rome. Credit: Georges Rochegrosse, 1883.
He was also quite vicious and reportedly either had his own mother starved to death or approved a poison with which she committed suicide.
Vitellius was ultimately murdered in brutal fashion by supporters of the rival emperor Vespasian, who dragged him through Rome's streets, then likely beheaded him and threw his body into the Tiber river. "Yet I was once your emperor," were supposedly his last words, wrote historian Cassius Dio.
Marcus Aurelius Antoninus I ruled Rome on his own from 211 to 217 AD (having previously co-ruled with his father Septimius Severus from 198). "Caracalla" was his nickname, referencing a hooded coat from Gaul that he brought into Roman fashion.
He started off his rise to individual power by murdering his younger brother Geta, who was named co-heir by their father. Caracalla's bloodthirsty tyranny didn't stop there. He wiped out Geta's supporters and was known to execute any opponents to his or Roman rule. For instance, he slaughtered up to 20,000 citizens of Alexandria after a local theatrical satire dared to mock him.
Geta Dying in His Mother's Arms. Credit: Jacques Pajou (1766-1828)
One of the positive outcomes of his rule was the Edict of Caracalla, which gave Roman citizenship to all free men in the empire. He was also known for building gigantic baths.
Like others on this list, Caracalla met a brutal end, being assassinated by army officers, including the Praetorian prefect Opellius Macrinus, who installed himself as the next emperor.
As the second emperor, Tiberius (ruling from 14 to 37 AD) is known for a number of accomplishments, especially his military exploits. He was one of the Roman Empire's most successful generals, conquering Pannonia, Dalmatia, Raetia, and parts of Germania.
He was also remembered by his contemporaries as a rather sullen, perverse, and angry man. In the chapter on his life from The Lives of the Twelve Caesars, the historian Suetonius writes that Tiberius was disliked from an early age for his dour personality, even by his own family.
"Orgy of the Times of Tiberius on Capri." Painting by Henryk Siemiradzki, 1881.
Suetonius also paints a damning picture of Tiberius after he retreated from public life to the island of Capri. His years on the island would put Jeffrey Epstein to shame. A horrendous pedophile, Tiberius had a reputation for "depravities that one can hardly bear to tell or be told, let alone believe," Suetonius wrote, describing how "in Capri's woods and groves he arranged a number of nooks of venery where boys and girls got up as Pans and nymphs solicited outside bowers and grottoes: people openly called this 'the old goat's garden,' punning on the island's name."
There's much, much more — far too salacious and, frankly, disgusting to repeat here. For the intrepid or morbidly curious reader, here's a link for more information.
After he died, Tiberius was fittingly succeeded in emperorship by his grandnephew and adopted grandson Caligula.
A 19th-century surveying mistake kept lumberjacks away from what is now Minnesota's largest patch of old-growth trees.
- In 1882, Josias R. King made a mess of mapping Coddington Lake, making it larger than it actually is.
- For decades, Minnesota loggers left the local trees alone, thinking they were under water.
- Today, the area is one of the last remaining patches of old-growth forest in the state.
Vanishingly rare, but it exists: a patch of Minnesota forest untouched by the logger's axe. Credit: Dan Alosso on Substack, licensed under CC-BY-SA
The trees here tower a hundred feet above the forest floor — a ceiling as high as in prehistory and vanishingly rare today. That's because no logger's axe has ever touched these woods.
Pillars of the green cathedral
As you walk among the giant pillars of this green cathedral, you might think you're among the redwood trees of California. But those are some 1,500 miles (2,400 km) away. No, these are the red and white pines of the "Lost Forty" in Minnesota. This is the largest single surviving patch of old-growth forest in the state and a fair stretch beyond. And it's all thanks to a surveying error.
Despite its name, the Lost Forty Scientific and Natural Area (SNA) is actually 144 acres (0.58 km2) in total. Still, it's an easily overlooked part of the Chippewa National Forest, which sprawls across 666,000 acres (2,700 km2) of north-central Minnesota. And that – being easily overlooked – is kind of this area's superpower.
In the 1820s, when European-Americans arrived in what is now Minnesota, they found about 20 million acres (80,000 km2) of prairie and 30 million acres (120,000 km2) of forest. Two centuries on, both ecosystems largely have been depleted. Fewer than 100,000 acres (400 km2) of natural prairie remain, and fewer than 18 million acres (73,000 km2) of forest.
And today's woods are different. They're not just younger; the original pine stands have been harvested and largely replaced with aspen and birch.
To the moon and back
White pine especially was in heavy demand during the lumbering boom that had Minnesota in its grip by the 1840s — a boom driven by an insatiable demand for building materials and supercharged by the steam that powered the saws and the rails that transported the goods to market.
The two decades flanking the turn of the 20th century were the golden age of lumbering in Minnesota. At any given time, 20,000 lumberjacks were at work in the woods, a further 20,000 in the sawmills, and another 20,000 in other lumber-related industries.
Production peaked in the year 1900, with over 2.3 billion board-feet (5.4 million m3) of lumber harvested from the state's forests. That was enough to build 600,000 two-story houses or a boardwalk nine feet (2.7 m) wide, circling Earth along the equator. From then on, yields declined, albeit slightly at first. By 1910, however, the first lumber operations started packing up and moving on to the Pacific Northwest and elsewhere.
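A quick back-of-the-envelope check shows the boardwalk comparison roughly holds up, provided one assumes 2-inch-thick planking (the plank thickness is my assumption; the article doesn't specify one):

```python
# Sanity-check the 1900 harvest comparison using the article's figures.
# One board-foot is 1 ft x 1 ft x 1 in of lumber.
BOARD_FEET = 2.3e9        # lumber harvested in Minnesota in 1900
WALK_WIDTH_FT = 9         # boardwalk width quoted in the article
PLANK_THICKNESS_IN = 2    # assumed plank thickness
EQUATOR_MILES = 24_901    # Earth's equatorial circumference

# At 2 in thick, each board-foot covers half a square foot of walkway.
walk_area_sqft = BOARD_FEET / PLANK_THICKNESS_IN
walk_length_miles = walk_area_sqft / WALK_WIDTH_FT / 5280

print(f"{walk_length_miles:,.0f} miles vs. equator {EQUATOR_MILES:,} miles")
```

The result comes out within a few percent of the equator's length, so the comparison is plausible for sturdy 2-inch planks; with 1-inch boards the walkway would circle the globe nearly twice.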
Minnesota's era of Big Timber symbolically came to an end with the closure of the Virginia and Rainy Lake Lumber Company in 1929. At that time, a century's worth of lumbering in Minnesota had produced 68 billion board-feet (160 million m3) of pine — enough to fill a line of boxcars all the way to the moon and halfway back again.
Now spool back a few decades. It's 1882, and the Public Land Survey is measuring, mapping, and quantifying the wilderness of northern Minnesota — and its as yet unharvested north woods. Setting out from the small settlement of Grand Rapids, Josias Redgate King leads a three-man survey team 40 miles north, into the backwoods.
Mapping error becomes cartographic fact
Their job, specifically, is to chart the area between Moose and Coddington Lakes. And they mess up. Perhaps it's the lousy November weather, the desolate swampy terrain, or both. But they make a serious mistake: their survey stretches Coddington Lake half a mile further northwest than it actually exists. As happens surprisingly often with mapping mistakes, the error becomes cartographic fact, undisputed for decades.
The area is marked on all maps as being under water and is therefore excluded from the considerations of logging companies. Only in 1960 is the area re-surveyed and the error corrected. But by then, as we have seen, Big Timber has moved on from the Gopher State.
Map of the "Lost Forty" SNA (top right). Bordering it on the south is the Chippewa National Forest Unique Biological Area. Credit: Minnesota Department of Natural Resources
Incidentally, Josias R. King was more than the mismapper of Coddington Lake. He has another, and rather better, claim to fame. When the Civil War broke out, Minnesota was the first state to offer volunteers to fight for the Union. At Fort Snelling, Mr. King rushed to the front of a line of men waiting to sign up.
So it was said, with some justification, that he was the first volunteer for the Union in all of the country. During the war, he attained the rank of lieutenant colonel. After, he returned to his civilian job, surveying. Because of his credentials as the Union's first volunteer, he was asked to pose for the face of the bronze soldier on the Civil War monument which was unveiled at St. Paul's Summit Park in 1903.
The loggers' loss is nature's gain
But back to the Lost Forty. The loggers' loss — hence the name — is actually nature's gain. The SNA's crowning glory, literally, is nearly 32 acres of designated old-growth red pine and white pine forest, in two stands, partially extending into the Chippewa National Forest proper. (In fact, much of the mismapped area seems to fall within the Chippewa National Forest Unique Biological Area adjacent to the Lost Forty.) Old-growth forests represent less than 2 percent — and designated old-growth forests less than 0.25 percent — of all of Minnesota's forests.
The oldest pine trees in the Lost Forty are between 300 and 400 years old, close to their maximum natural life span of up to 500 years. Similar pines in other parts of the National Forest are harvested at between 80 and 150 years for pulp and lumber. As a result, the pines in the Lost Forty are not only taller than most of the surrounding woods but also thicker, with diameters between 22 and 48 inches (55 to 122 cm). One of the biggest has a circumference of 115 inches (2.9 m).
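A quick conversion confirms that the circumference and diameter figures quoted above are mutually consistent:

```python
import math

# The article gives trunk diameters of 22-48 inches and one tree's
# circumference of 115 inches. Diameter = circumference / pi.
circumference_in = 115
diameter_in = circumference_in / math.pi

print(f"diameter ~ {diameter_in:.1f} in ({diameter_in * 2.54:.0f} cm)")
```

A 115-inch circumference corresponds to a diameter of roughly 36.6 inches, comfortably within the 22-48 inch range.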
With their craggy bark, massive trunks, and dizzying height, these trees look like the ancient beings they are. And they exist in a cluster the size of which is unique for the Midwest. There's nothing lost about these trees; in fact, it's rather the reverse. Perhaps the area should more precisely be called the "Last Forty."
At 52 feet, only half as high as an old-growth white pine: Josias R. King's likeness atop the Soldier's Monument in Summit Park, St. Paul.Credit: Library of Congress
Get a good look at the Lost Forty in this video of the local hiking trail.
Strange Maps #1084
Got a strange map? Let me know at email@example.com.
New studies stretch the boundaries of physics, achieving quantum entanglement in larger systems.
- New experiments with vibrating drums push the boundaries of quantum mechanics.
- Two teams of physicists create quantum entanglement in larger systems.
- Critics question whether the study gets around the famous Heisenberg uncertainty principle.
Recently published research pushes the boundaries of key concepts in quantum mechanics. Studies from two different teams used tiny drums to show that quantum entanglement, an effect generally linked to subatomic particles, can also be applied to much larger macroscopic systems. One of the teams also claims to have found a way to evade the Heisenberg uncertainty principle.
One question that the scientists were hoping to answer pertained to whether larger systems can exhibit quantum entanglement in the same way as microscopic ones. Quantum mechanics proposes that two objects can become "entangled," whereby the properties of one object, such as position or velocity, can become connected to those of the other.
An experiment performed at the U.S. National Institute of Standards and Technology in Boulder, Colorado, led by physicist Shlomi Kotler and his colleagues, showed that a pair of vibrating aluminum membranes, each about 10 micrometers long, can be made to vibrate in sync in such a way that they can be described as quantum entangled. Kotler's team amplified the signal from their devices to "see" the entanglement much more clearly. Measurements of their positions and velocities returned the same numbers, indicating that the membranes were indeed entangled.
Tiny aluminium membranes used by Kotler's team. Credit: Florent Lecoq and Shlomi Kotler/NIST
Evading the Heisenberg uncertainty principle?
Another experiment with quantum drums — each one-fifth the width of a human hair — by a team led by Prof. Mika Sillanpää at Aalto University in Finland, attempted to find out what happens in the region between quantum and non-quantum behavior. Like the other researchers, they achieved quantum entanglement for larger objects, but they also probed a possible way around the Heisenberg uncertainty principle.
The team's theoretical model was developed by Dr. Matt Woolley of the University of New South Wales. Photons in the microwave frequency were employed to create a synchronized vibrating pattern as well as to gauge the positions of the drums. The scientists managed to make the drums vibrate in opposite phases to each other, achieving "collective quantum motion."
The study's lead author, Dr. Laure Mercier de Lepinay, said: "In this situation, the quantum uncertainty of the drums' motion is canceled if the two drums are treated as one quantum-mechanical entity."
This effect allowed the team to measure both the positions and the momenta of the virtual drumheads at the same time. "One of the drums responds to all the forces of the other drum in the opposing way, kind of with a negative mass," Sillanpää explained.
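The idea that joint uncertainty can cancel when two drums are treated as one entity can be illustrated with a rough intuition pump. The NumPy sketch below is a classical toy of EPR-style correlations, not a quantum simulation, and every variable name is illustrative: each drum's readings look noisy on their own, yet the joint variables x1 − x2 and p1 + p2 are nearly noise-free.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
eps = 0.05  # residual "squeezed" noise on the joint variables

# Positions share a common fluctuation, so x1 - x2 nearly cancels.
common = rng.normal(size=n)
x1 = common + eps * rng.normal(size=n)
x2 = common + eps * rng.normal(size=n)

# Momenta are anti-correlated, so p1 + p2 nearly cancels.
q = rng.normal(size=n)
p1 = q + eps * rng.normal(size=n)
p2 = -q + eps * rng.normal(size=n)

# Each drum alone has a large spread...
print(np.std(x1), np.std(p1))
# ...but the collective variables are far quieter.
print(np.std(x1 - x2), np.std(p1 + p2))
```

In the real experiment the analogous squeezing happens in genuinely quantum variables, which is why the result is notable; the toy only shows how correlations between two noisy systems can make a joint quantity sharp.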
Theoretically, this should not be possible under the Heisenberg uncertainty principle, one of the most well-known tenets of quantum mechanics. Proposed in the 1920s by Werner Heisenberg, the principle says that in the quantum world, where particles also act like waves, there is an inherent uncertainty in measuring both the position and the momentum of a particle at the same time. The more precisely you measure one variable, the greater the uncertainty in the other. In other words, it is not possible to simultaneously pinpoint the exact values of a particle's position and momentum.
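The trade-off Heisenberg described is usually written compactly as:

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```

where \(\Delta x\) and \(\Delta p\) are the standard deviations of position and momentum, and \(\hbar\) is the reduced Planck constant. Shrinking \(\Delta x\) forces \(\Delta p\) to grow, and vice versa; the product can never fall below \(\hbar/2\) for a single particle.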
Heisenberg's Uncertainty Principle Explained. Credit: Veritasium / Youtube.com
Astrophysicist Adam Frank, a Big Think contributor known for the 13.8 podcast, called this "a really fascinating paper as it shows that it's possible to make larger entangled systems which behave like a single quantum object. But because we're looking at a single quantum object, the measurement doesn't really seem to me to be 'getting around' the uncertainty principle, as we know that in entangled systems an observation of one part constrains the behavior of other parts."
Ethan Siegel, also an astrophysicist, commented, "The main achievement of this latest work is that they have created a macroscopic system where two components are successfully quantum mechanically entangled across large length scales and with large masses. But there is no fundamental evasion of the Heisenberg uncertainty principle here; each individual component is exactly as uncertain as the rules of quantum physics predict. While it's important to explore the relationship between quantum entanglement and the different components of the systems, including what happens when you treat both components together as a single system, nothing that's been demonstrated in this research negates Heisenberg's most important contribution to physics."

The papers, published in the journal Science, could help create new generations of ultra-sensitive measuring devices and quantum computers.