Are geniuses real? The neuroscience and myths of visionaries
Labeling thinkers like Albert Einstein and Steve Jobs as "other" may be stifling humanity's creative potential.
TIM SANDERS: There are myths of creativity, and these myths are usually propagated by people who have romantic notions about heroes, romantic notions about eureka moments. And these myths of creativity keep people from collaborating and cause them to be lone wolves. And the research says it causes them to fail. So let me talk a little bit about those myths of creativity. In the world of sales and marketing, I battle against three myths. Myth number one: the lone inventor. This is very dangerous because there is no such thing as a lone inventor. As a matter of fact, there's a lot of historical research that has debunked Einstein as a lone genius. Specifically in terms of inventions: Henry Ford, not a lone inventor. Classic example: Thomas Edison. In the invention community, Thomas Edison is a brand. It stands for 14 people. Yes, there was a figurehead named Thomas Edison. His name is on 10,000 patents. He did not invent a single thing. He marshaled people together, knew how to spot innovations, and put people together into a creative soup, if you will. Here's a classic example: Steve Jobs. You ask the average person, say a millennial who uses a lot of Apple technology, "Who's one of the greatest inventors of our time?" They'll say Steve Jobs. Steve Jobs once said, "I never created anything. All I did was notice patterns and put people together to finish projects." So think about it. If he doesn't have Wozniak, there is no original Apple, right? If he doesn't have Ive, there is no iPod. If he doesn't have Tony Fadell, there is no iPhone. And the list goes on and on.
A good friend of mine, David Burkus, wrote a really wonderful book about the myths of genius. And he was telling me that it's a romantic notion. And I remember when I first read this research years ago, no lone inventor, it did kind of hurt my feelings. I'm a musician in my past. I thought I wrote a lot of songs, but according to the research, I never wrote a song alone. I always collaborated with somebody; the song that actually made it to the record and onto the radio had 15 to 50 hands on it. When I talked to David, I said, "When I read your research, it kind of hurt my feelings." And he goes, "It's a romantic notion because we want to be heroes." We want to be as empowered as Ayn Rand. We want to think that we're the Fountainhead. So this is how we tell the story. But until you believe that genius is a team sport, you will never give up control. And this is the problem for a lot of people in sales. They don't want to cede any level of control over their process to somebody outside the sales world, because they don't value those voices enough. But the research is clear on this. The Miller Heiman Institute researched the difference between good and great, what they call world-class organizations: they win, they sell 20% more than their nearest competitor. The only thing they have in common is that they've broken this myth, and they understand that every deal is about rapid problem solving and no one person can solve the problem on their own.
Quickly, the other two myths of creativity must also be dispelled. The second is the eureka moment. There is no such thing as a big idea that changes the world. I know this is another one of those hurtful but, based on empirical research, very true points. There are little ideas that combine with other little ideas and improve into game-changing ideas. And I've experienced this personally through someone I consider one of the authorities on creativity, Ed Catmull, president of Pixar. I remember standing backstage a few years ago just gushing to him about John Lasseter, his VP of creative. I'm like, "He's the bomb." This is the guy behind "Wall-E." This is the guy who wrote the script for "Toy Story," telling the story from the toys' point of view. And I remember Catmull looked at me and cut me off, not to dismiss Lasseter. He said, "Toy Story was a problematic idea from the start. Make an entire full-length feature film inside a computer. Do you understand how hard that is from a rendering-time standpoint? Make the characters as human as humans. They didn't even have facial-controller technology at the time for this. And tell the story from the toys' point of view when we've never historically had a toy have any narrative for us to draw on." And Catmull explained that nine months in, they shut the film down. After a meeting with Disney, they called it Black Friday. And then Catmull said something to me that shattered the myth of the eureka moment. Catmull said, "Toy Story, the movie you saw, was a thousand problems solved." It was like a bolt of lightning. I was like, I get it. When you do a million-dollar deal in an ad agency, it's not a big idea, it's a hundred problems solved. Eighty of them are inside your agency. As you move through every level of that sale, you get obstacles in front of you.
And what this means is that, if we no longer depend on the big idea to fall out of the sky and change the world, we meet more, we think more, we research more. We settle for small pieces of progress that add up to momentum.
Finally, the third myth of genius or creativity that must be shattered if you want to be more collaborative is the myth of the expert. Now, I believe in involving people in a dealstorm who we think are experts on the problem space. But notice, I don't want experts on the solution space, because most of the great solutions to vexing problems come from the edges of a domain, from people who don't know what they don't know. So they're not limited by the false constraints that hold back people who are in the middle of the subject. The way I like to think about it is this: if a fish could respond, and you walked up to a fish in a fishbowl and asked it, "How's the water?" the fish would look at you, puzzled, and ask, "What's water?" And that's the problem with experts. People who are so steeped in a domain don't have the expansive perspective that allows them to recognize patterns and convergence, and every invention, every solution is really about pattern or convergence recognition. And so it's really important for us to follow the following mantra in collaboration: ideas can come from anywhere. As a matter of fact, in "Dealstorming" that is one of my four key ground rules. It's just as important as "stay on agenda" and "don't distract the person next to you." Because the problem with the myth of the expert is that it leads to "not invented here," the dismissal of good ideas. So you're in a dealstorm meeting with someone who's on the edge of the domain. I'll give you an example. Someone out of finance who generally handles something as mundane as revenue recognition comes to one of your meetings, because you have a problem related to how you recognize the revenue of this deal. And you're in the middle of a conversation about packaging, and they come up with a really novel way to think about how it's billed.
You might look at that person and say, "You don't know anything about billing and sales. You're just a revenue recognition analyst. We know billing and sales." You're about to shut him down for the rest of the meeting. And what you don't understand is that he may have an educational background, he may have had previous jobs, he may have a significant other who is steeped in billing expertise. And he's drawing upon all of that. The minute you tell someone, "Only experts can weigh in with ideas," everyone who's not an expert stops contributing, and in my experience, that breaks down collaboration.
HEATHER BERLIN: I think a really big part of what it means to be a genius is to have a great deal of creative or novel thinking: making novel associations between ideas, having a lot of pattern detection. So it's not just about collecting a bunch of data and knowing a lot of facts, but making novel connections between ideas. And I think what we wanna look at is, for example, what is the neural correlate of something like divergent thinking, or thinking outside the box? Having novel associations between ideas; that's the kind of thing that we can begin to measure.
CARL ZIMMER: So how can you measure something like that?
BERLIN: So it's been actually quite a problem how to quantify this, not just genius, but let's say creativity. We're breaking it down. Particularly what I'm interested in is improvisation. So when people are being spontaneously creative and what we can--
ZIMMER: Why is that important to you? What does that get at?
BERLIN: So, I think that a lot of what's happening in the brain is happening outside of awareness. And when our sort of conscious brain is highly active, it's kind of suppressing a lot of what's going on outside of awareness. And sometimes when people are being creative, they say it almost feels like things are coming from outside of them, when they're in this sort of flow state. And we're starting to understand a little bit more about that state. And it seems to be that when people are being creative in the moment, the part of their brain that has to do with their sense of self, with self-awareness, self-consciousness, is turned down. It's called the dorsolateral prefrontal cortex.
ZIMMER: Where's that?
BERLIN: It's sort of like right here, it's part of the prefrontal cortex on the lateral side.
ZIMMER: So you can actually see that change, like the activity in there is changing?
BERLIN: The studies all seem to show that, for example, when a jazz musician is improvising compared to when he plays a memorized piece, or when a rapper is doing a freestyle rap compared to a memorized rap, there's a similar pattern of activation across the improvising rappers and the improvising jazz musicians. They have decreased activation in that dorsolateral prefrontal cortex, which has to do with self-awareness, with monitoring your ongoing behavior and making sure it conforms with social norms. But they also have increased activation in a part of the brain called the medial prefrontal cortex, which is sorta like right here, if you go straight back a little bit. That is turned up, and it has to do with the internal generation of ideas; it's coming from within, it's stimulus independent. So in that state, you're having this sort of free flow of unfiltered information coming from within that's not being inhibited by the dorsolateral prefrontal cortex. You don't have to worry about, "How do people think about me?" And that free flow of information allows the novel associations to be made. A similar pattern of brain activation happens during dreams, or during daydreaming, or types of meditation or hypnosis, where you lose your sense of self and time and place, and it allows the filter to come off, so that novel associations are okay. Dreams don't all make sense, but that's where the creativity comes in. So that's why I'm interested in that state, to see what happens in people when they're in it. Because I think that's a big part of what is involved with genius.
JOY HIRSCH: Well, I'm not so sure that the quality of genius isn't a continuum: a continuum of creativity, a continuum of Yankee ingenuity. I think all of us as humans are sort of endowed with the need to make things better, to invent things, to go beyond the borders. We're all pioneers, we're all fascinated with a frontier. I mean, why do we think we need to go to the moon or to Mars? It's because we're human and we wanna know what's on the other side. And it's so ingrained in us that I think that genius is just an extreme version of that, but it represents us as humans in a very fundamental way. And I think that we have to think about brains in the context of our society. One of the things about genius, I think, is that it's not just an individual or just a brain. It's about opportunity. It's about somebody who is given the pathway to actually make a contribution. Think of the musicians most of us would consider geniuses: Bach, Beethoven, Mozart. These are people who were put in positions that allowed them to be creative. The creative spirit comes with many things other than just a brain; I think it comes with opportunity, it comes with resources, it comes with attitude. Again, I like the idea of not thinking of it as something that targets an individual and separates them, but as something that joins us together, as a quality that belongs to all of us.
ZIMMER: Because it is true that when people talk about geniuses they are other, they're almost freakish.
HIRSCH: Exactly, and I think that that attitude really deters people from taking the risk, but it's a double-edged sword. The genius term is often associated with the person that really changes the way we think. It could be something that didn't exist before that changes the course of our progress in some fundamental way. So that person, by his or her nature stands out and is different. And yet all of us are different in our creative sphere. And by incorporating the creative person into the mainstream, it might be a way to encourage more creativity.
- Revolutionary ideas and culture-shifting inventions are often credited to specific individuals, but how often do these "geniuses" actually operate in creative silos?
- Tim Sanders, former chief strategy officer at Yahoo, argues that there are three myths getting in the way of innovative ideas and productive collaborations: the myths of the expert, the eureka moment, and the "lone inventor."
- More than an innate quality reserved for an elite group, neuroscientist Heather Berlin and neurobiologist Joy Hirsch explain how creativity looks in the brain, and how given opportunity, resources, and attitude, we can all be like Bach, Beethoven, and Steve Jobs.
Gain-of-function mutation research may help predict the next pandemic — or, critics argue, cause one.
This article was originally published on our sister site, Freethink.
"I was intrigued," says Ron Fouchier, in his rich, Dutch-accented English, "in how little things could kill large animals and humans."
It's late evening in Rotterdam as darkness slowly drapes our Skype conversation.
This fascination led the silver-haired virologist to venture into controversial gain-of-function mutation research — work by scientists that adds abilities to pathogens, including experiments that focus on SARS and MERS, the coronavirus cousins of the COVID-19 agent.
If we are to avoid another influenza pandemic, we will need to understand the kinds of flu viruses that could cause it. Gain-of-function mutation research can help us with that, says Fouchier, by telling us what kind of mutations might allow a virus to jump across species or evolve into more virulent strains. It could help us prepare and, in doing so, save lives.
Many of his scientific peers, however, disagree; they say his experiments are not worth the risks they pose to society.
A virus and a firestorm
The Dutch virologist, based at Erasmus Medical Center in Rotterdam, caused a firestorm of controversy about a decade ago, when he and Yoshihiro Kawaoka at the University of Wisconsin-Madison announced, in two separate experiments, that they had successfully mutated H5N1, a strain of bird flu, to pass through the air between ferrets. Ferrets are considered the best flu models because their respiratory systems react to the flu much like ours do.
The mutations that gave the virus its ability to be airborne transmissible are gain-of-function (GOF) mutations. GOF research is when scientists purposefully cause mutations that give viruses new abilities in an attempt to better understand the pathogen. In Fouchier's experiments, his team wanted to see whether H5N1 could be made airborne transmissible, so that potentially dangerous strains could be caught early and new treatments and vaccines developed ahead of time.
The problem is: their mutated H5N1 could also cause a pandemic if it ever left the lab. In Science magazine, Fouchier himself called it "probably one of the most dangerous viruses you can make."
Just three special traits
Recreated 1918 influenza virions. Credit: Cynthia Goldsmith / CDC / Dr. Terrence Tumpey / Public domain via Wikipedia
For H5N1, Fouchier identified five mutations that could cause three special traits needed to trigger an avian flu to become airborne in mammals. Those traits are (1) the ability to attach to cells of the throat and nose, (2) the ability to survive the colder temperatures found in those places, and (3) the ability to survive in adverse environments.
A minimum of three mutations may be all that's needed for a virus in the wild to make the leap through the air in mammals. If it does, it could spread. Fast.
Fouchier calculates the odds of this happening to be fairly low, for any given virus. Each mutation has the potential to cripple the virus on its own. They need to be perfectly aligned for the flu to jump. But these mutations can — and do — happen.
"In 2013, a new virus popped up in China," says Fouchier. "H7N9."
H7N9 is another kind of avian flu, like H5N1. The CDC considers it the most likely flu strain to cause a pandemic. In the human outbreaks that occurred between 2013 and 2015, it killed a staggering 39% of known cases; if H7N9 were to have all five of the gain-of-function mutations Fouchier had identified in his work with H5N1, it could make COVID-19 look like a kitten in comparison.
H7N9 had three of those mutations in 2013.
Gain-of-function mutation: creating our fears to (possibly) prevent them
Flu viruses are basically eight pieces of RNA wrapped up in a ball. To create the gain-of-function mutations, the research used a DNA template for each piece, called a plasmid. Making a single mutation in the plasmid is easy, Fouchier says, and it's commonly done in genetics labs.
If you insert all eight plasmids into a mammalian cell, they hijack the cell's machinery to create flu virus RNA.
"Now you can start to assemble a new virus particle in that cell," Fouchier says.
One infected cell is enough to grow many new virus particles — from one to a thousand to a million; viruses are replication machines. And because they mutate so readily during replication, the new viruses have to be checked to make sure they carry only the mutations the lab caused.
The virus then goes into the ferrets, passing through them to generate new viruses until, by the 10th generation, it infects ferrets through the air. By analyzing the virus's genes in each generation, the researchers can figure out exactly which five mutations lead to H5N1 bird flu becoming airborne between ferrets.
And, potentially, people.
"This work should never have been done"
The potential for the modified H5N1 strain to cause a human pandemic if it ever slipped out of containment has sparked sharp criticism and no shortage of controversy. Rutgers molecular biologist Richard Ebright summed up the far end of the opposition when he told Science that the research "should never have been done."
"When I first heard about the experiments that make highly pathogenic avian influenza transmissible," says Philip Dormitzer, vice president and chief scientific officer of viral vaccines at Pfizer, "I was interested in the science but concerned about the risks of both the viruses themselves and of the consequences of the reaction to the experiments."
In 2014, in response to researchers' fears and some lab incidents, the federal government imposed a moratorium on all GOF research, freezing the work.
Some scientists believe gain-of-function mutation experiments could be extremely valuable in understanding the potential risks we face from wild influenza strains, but only if they are done right. Dormitzer says that a careful and thoughtful examination of the issue could lead to processes that make gain-of-function mutation research with viruses safer.
But in the meantime, the moratorium stifled some research into influenzas — and coronaviruses.
The National Academy of Sciences whipped up some new guidelines, and in December of 2017, the call went out: GOF studies could apply to be funded again. A panel formed by Health and Human Services (HHS) would review applications and decide which studies to fund.
As of right now, only Kawaoka and Fouchier's studies have been approved, getting the green light last winter. They are resuming where they left off.
Pandora's locks: how to contain gain-of-function flu
Here's the thing: the work is indeed potentially dangerous. But there are layers upon layers of safety measures at both Fouchier's and Kawaoka's labs.
"You really need to think about it like an onion," says Rebecca Moritz of the University of Wisconsin-Madison. Moritz is the select agent responsible for Kawaoka's lab. Her job is to ensure that all safety standards are met and that protocols are created and drilled; basically, she's there to prevent viruses from escaping. And this virus has some extra-special considerations.
The specific H5N1 strain Kawaoka's lab uses is on a list called the Federal Select Agent Program. Pathogens on this list need to meet special safety considerations. The GOF experiments have even more stringent guidelines because the research is deemed "dual-use research of concern."
"Dual-use research of concern is legitimate research that could potentially be used for nefarious purposes," Moritz says. At one time, there was debate over whether Fouchier and Kawaoka's work should even be published.
While the insights they found would help scientists, they could also be used to create bioweapons. The papers had to pass through a review by the U.S. National Science Board for Biosecurity, but they were eventually published.
Intentional biowarfare and terrorism aside, the gain-of-function mutation flu must be contained even from accidents. At Wisconsin, that begins with the building itself. The labs are specially designed to be able to contain pathogens (BSL-3 agricultural, for you Inside Baseball types).
They are essentially airtight cement bunkers, negatively pressurized so that air will only flow into the lab in case of a breach, keeping the viruses pushed in. And all air entering or leaving the lab passes through multiple HEPA filters.
Inside the lab, researchers wear special protective equipment, including respirators. Anyone entering or leaving the lab must go through an intricate dance involving stripping and putting on various articles of clothing and passing through showers and decontamination.
And the most dangerous parts of the experiment are performed inside primary containment: for example, a biocontainment cabinet, which acts like an extra high-security box inside the already highly secure lab (kind of like the radiation glove box Homer Simpson works in during the opening credits).
The Federal Select Agent program can come and inspect you at any time with no warning, Moritz says. At the bare minimum, the whole thing gets shaken down every three years.
There are numerous potential dangers — a vial of virus gets dropped; a needle prick; a ferret bite — but Moritz is confident that the safety measures and guidelines will prevent any catastrophe.
"The institution and many people behind the institution are working to make sure this research can be done safely and securely," Moritz says.
No human harm has come of the work yet, but the potential for it is real.
"Nature will continue to do this"
They were dead on the beaches.
In the spring of 2014, another type of bird flu, H10N7, swept through the harbor seal population of northern Europe. Starting in Sweden, the virus moved south and west, across Denmark, Germany, and the Netherlands. It is estimated that 10% of the entire seal population was killed.
The virus's evolution could be tracked through time and space, Fouchier says, as it progressed down the coast. Natural selection pushed through gain-of-function mutations in the seals, similarly to how H5N1 evolved to better jump between ferrets in his lab — his lab which, at the time, was shuttered.
"We did our work in the lab," Fouchier says, with a high level of safety and security. "But the same thing was happening on the beach here in the Netherlands. And so you can tell me to stop doing this research, but nature will continue to do this day in, day out."
Critics argue that the knowledge gained from the experiments is either non-existent or not worth the risk; Fouchier argues that GOF experiments are the only way to learn crucial information on what makes a flu virus a pandemic candidate.
"If these three traits could be caused by hundreds of combinations of five mutations, then that increases the risk of these things happening in nature immensely," Fouchier says.
"With something as crucial as flu, we need to investigate everything that we can," Fouchier says, hoping to find "a new Achilles' heel of the flu that we can use to stop the impact of it."
From "mutilated males" to "wandering wombs," dodgy science affects how we view the female body still today.
- The history of medicine and biology often has been embarrassingly wrong when it comes to female anatomy and was surprisingly resistant to progress.
- Aristotle and the ancient Greeks are much to blame for the mistaken notion of women as cold, passive, and little more than a "mutilated man."
- Thanks to this dubious science, and the likes of Sigmund Freud, we live today with a legacy that judges women according to antiquated biology and psychology.
The story of medicine has not been particularly kind to women. Not only was little anatomical or scientific research done on women or on women-specific issues, doctors often treated them differently.
Even today, women are up to ten times more likely to have their symptoms explained away as being psychological or psychosomatic than men. Worryingly, women are 50 percent more likely to be misdiagnosed after a heart attack, and drugs designed for "everyone" are actually much less effective (for pain) or too effective (for sleeping) in women.
Are these differences real or imagined? And what can the history of female medicine teach us about where we are today?
A mutilated male
Aristotle is rightly considered one of the greatest minds of all time and is recognized as the founding father of many disciplines, including biology. He was one of the most rigorous and comprehensive scientists and field researchers the world had known. He categorized a large number of species based on a wide range of traits, such as movement, longevity, and sensory capacity. His views on women, then, stemmed from what he thought of as good, proper study. The problem is that he got pretty much all of it wrong.
According to Aristotle, during pregnancy, it was the man who, alone, contributed the all-important "form" of a fetus (that is, its defining nature and personality), whereas the woman provided only the matter (that is, the environment and sustenance to grow the fetus, which was provided by the menstrual blood).
From this, Aristotle extrapolated all sorts of dubious conclusions. He ventured that the man was superior, active, and dominant, and the woman inferior, passive, and submissive. As such, the woman's role was to nurture children, run a household, and be silent and obedient — political and cultural manifestations of dodgy biology. If women did not provide a child's form and nature, how important could they really be?
Given this passivity, Aristotle argued that the woman must be associated with other passive things, like being cold and slow. The man, being dynamic and energetic, must be hot and fast. From this, Aristotle concluded that any defects or problems in childbirth can only be due to the sluggishness of the female womb. Even the positive biological aspects of being female, such as greater longevity, were put down to this cold rigidity — a lack of metabolism and spirit. Most notorious of all, since Aristotle believed that female children were themselves the result of an incomplete and underdeveloped gestation, women were simply "mutilated males" whose mothers' cold wombs had overpowered the warm, vital, male sperm.
Aristotle can still be counted as a great mind, but when it came to women, his ideas have not aged well, not least in how much they negatively influenced what came after. Given that his works were seen as the authority well into the 16th century, he left quite the pernicious legacy.
A wandering womb
But, how much can we really blame Aristotle? Without the aid of modern scientific equipment, physicians and biologists were left to guess about female anatomy. Unfortunately, the damage was done, and Aristotle's ideas of a troublesome uterus became so mainstream that they led to one of the more bizarre ideas in medical history: the wandering womb.
The "wandering womb" is the idea that the womb is actually some kind of roaming parasite in the body, possibly even a separate organism. According to this theory, after a woman menstruates, her womb becomes hot and dry and so becomes extra mobile. It is transformed into a voracious hunter. The womb will dart from organ to organ, seeking to steal its moisture and other vital fluids. This parasitic behavior caused all sorts of (female only) illnesses.
If a woman had asthma, the womb was leeching the lungs. Stomach aches, it was in the gut. And if it attacked the heart (which the ancients thought was the source of our thoughts), then it would cause all manner of mental health issues. In fact, the Greek word for womb is "hystera," and so when we call someone (often a woman) hysterical, we are saying that their womb is causing mischief.
The "solutions" or "remedies" for a wandering womb were as strange as the theory. Since the womb was supposed to be attracted to sweet smells, placing flowers or perfumes around the vagina would "lure" it down. On the flip side, if you smoked noxious substances or ate disgusting foods, it would "repel" the womb away. By using all manner of smells, you could make the womb move wherever you wanted.
The oddest "remedy" — and most male-centric of all — is that, since the wandering womb was said to be caused by heat and dryness, a good solution would be male semen, which was thought of as cooling and wet. And so, the ancient and highly inaccurate myth was born that sex could cure a woman of her "hysteria."
A lingering problem
We live today with the legacy of this kind of thinking. Freud was much taken with the idea of "hysteria," and although he did accept that men could be subject to it as well, he believed it was overwhelmingly a female problem caused by female biology. The woman, for Freud, is mostly defined by her "sexual function." What Freud calls "normal femininity" (the preferred and best outcome) is defined by passivity. A woman's ideal development is one which moves from being active and "phallic" to passive and vaginal.
Nowadays, Freud and Aristotle's legacy lies in just how easily women are defined by their sexuality. Given that men and women are equally dependent on their biology, it is curious how much more often women are reduced to theirs. The idea that women are more emotional than men, or more enslaved to their hormones, is still a depressingly familiar trope. It is an idea that goes back to the Greeks.
If we think biology is important to who we are (as it most certainly is), we ought to make sure that the biology is as good and accurate as it can be.
A global survey shows the majority of countries favor Android over iPhone.
- When Android was launched soon after Apple's own iPhone, Steve Jobs threatened to "destroy" it.
- Ever since, and across the world, the rivalry between both systems has animated users.
- Now the results are in: worldwide, consumers clearly prefer one side — and it's not Steve Jobs'.
A woman on her phone in Havana, Cuba. Mobile phones have become ubiquitous the world over — and so has the divide between Android and iPhone users. Credit: Yamil Lage / AFP via Getty Images.
Us versus them: it's the archetypal binary. It makes the world understandable by dividing it into two competing halves: labor against capital, West against East, men against women.
These maps are the first to show the dividing lines between one of the world's more recent binaries: Android vs. Apple. Published by Electronics Hub, they are based on a qualitative analysis of almost 350,000 tweets worldwide that presented positive, neutral, and negative attitudes toward Android and/or Apple.
Steve Jobs wanted to go "thermonuclear"
Feelings between Android and Apple users were pretty tribal from the get-go. It was Steve Jobs himself who said, when Google rolled out Android a mere ten months after Apple launched the iPhone, "I'm going to destroy Android, because it's a stolen product. I'm willing to go thermonuclear war on this."
Buying a phone is like picking a side in the eternal feud between the Hatfields and the McCoys. Each choice automatically comes with a built-in arsenal of arguments against the other.
If you are an iPhone person, you appreciate the sleekness and simplicity of its design, and you are horrified by the confusing mess that is the Android operating system. If you are an Android aficionado, you pity the iPhone user, a captive of an overly expensive closed ecosystem, designed to extract money from its users.
Even without resorting to those extremes, many of us will recognize which side of the dividing line we are on. Like the American Civil War, that line runs through families and groups of friends — but that would be a bit confusing to chart geographically. To un-muddle the information, these maps zoom out to the state and country level.
If the contest is based on the number of countries, Android wins. In all, 74 of the 142 countries surveyed prefer Android (in green on the map). Only 65 favor Apple (colored grey). Among the countries that picked a side, that works out to roughly 53/47 — which may not sound like a decisive vote, but an even slimmer 52/48 was good enough for Boris Johnson to get Brexit done (after he got breakfast done, of course).
And yes, math-heads: 74 plus 65 is three short of 142. Belarus, Fiji, and Peru (in yellow on the map) could not decide which side to support in the Global Phone War.
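For fellow math-heads, the tally above can be sanity-checked in a few lines of Python. The figures are the ones reported in the article (from the Electronics Hub maps); the code is just illustrative arithmetic:

```python
# Sanity check of the country tallies reported above.
# All figures come from the Electronics Hub survey as quoted in the article.
android_countries = 74   # countries preferring Android (green on the map)
apple_countries = 65     # countries preferring Apple (grey)
undecided = 3            # Belarus, Fiji, and Peru (yellow)

total_surveyed = android_countries + apple_countries + undecided
margin = android_countries - apple_countries

print(total_surveyed)  # 142 countries surveyed in all
print(margin)          # Android leads by 9 countries
```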
What about the United States, home of both the Android and the iPhone? Another victory for the former, albeit a slightly narrower one: 30.16 percent of the tweets about Android were positive versus just 29.03 percent of the ones about Apple.
United States: Texas surrounded!
Credit: Electronics Hub
There can be only one winner per state, though, and that leads to this preponderance of Android logos. Frankly, it's a relief to see a map showing a visceral divide within the United States that is not the coasts versus the heartland.
- Apple dominates in 19 states: a solid Midwestern bloc, another of states surrounding Texas, the Dakotas and California, plus North Carolina, New Hampshire, and Rhode Island.
- And that's it. The other 32 are the United States of Android. You can drive from Seattle to Miami without straying into iPhone territory. But no stopovers in Dallas or Houston – both are behind enemy lines!
North America: strongly leaning toward Android
Credit: Electronics Hub
Only eight of North America's 21 countries surveyed fall into the Apple category.
- The U.S. and Canada lean Android, while Mexico goes for the iPhone.
- In Central America, too, Android wins hands down, 5-2.
Europe: Big Five divided
Credit: Electronics Hub
In Europe, Apple wins, with 20 countries preferring the iPhone, 17 going for Android, and Belarus sitting on the fence.
- Of Western Europe's Big Five markets, three (UK, Germany, Spain) are pro-Android, and two (France, Italy) are pro-Apple.
- Czechia and Slovakia are an Apple island in the Android sea that is Central Europe. Glad to see there is still something the divorcees can agree on.
South America: almost even
Credit: Electronics Hub
In South America, the divide is almost even.
- Five countries prefer Android, four Apple, and one is undecided.
- In Peru, both Android- and Apple-related tweets were 25 percent positive.
Africa: watch out for Huawei
Credit: Electronics Hub
In Africa, Android wins by 17 countries versus Apple's 15.
- There's a solid Android bloc running from South Africa via DR Congo all the way to Ethiopia.
- iPhone countries are scattered throughout the north (Algeria), west (Guinea), east (Somalia), and south (Namibia).
Huawei — increasingly popular across the continent — could soon dramatically change the picture in Africa. Its phones currently still run on Android, but the Chinese manufacturer has just launched its own operating system, called Harmony.
Middle East: Iran vs. Saudi Arabia (again)
Credit: Electronics Hub
In the Middle East and Central Asia, Android wins 8 countries to Apple's 6.
- But it's complicated. One Turkish tweeter wondered why iPhones seem more popular in the Asian half of Istanbul, while Android phones prevail in the European part of the city.
- The phone divide matches up with the region's main geopolitical one: Iran prefers Android, Saudi Arabia the iPhone.
Asia-Pacific: Apple on the periphery
Credit: Electronics Hub
Another wafer-thin majority for Android in the Asia-Pacific region: 13 countries versus 12 for Apple — and one abstention (Fiji).
- The two giants of the Asian mainland, India and China, are both Android countries. Apple countries are on the periphery.
- And if India is Android, its rival Pakistan must be Apple. Same with North and South Korea.
Experts point out that the two operating systems are becoming more alike with every new generation, and suggest this could eventually resolve the conflict. But as any student of human behavior will confirm: smaller differences only exacerbate the rivalry between the two camps.
Maps taken from Electronics Hub, reproduced with kind permission.
Strange Maps #1096
Got a strange map? Let me know at email@example.com.
People tend to reflexively assume that fun events – like vacations – will go by really quickly.
For many people, summer vacation can't come soon enough – especially for the half of Americans who canceled their summer plans last year due to the pandemic.
But when a vacation approaches, do you ever get the feeling that it's almost over before it starts?
If so, you're not alone.
In some recent studies Gabriela Tonietto, Sam Maglio, Eric VanEpps and I conducted, we found that about half of the people we surveyed indicated that their upcoming weekend trip felt like it would end as soon as it started.
This feeling can have a ripple effect. It can change the way trips are planned – you might, for example, be less likely to schedule extra activities. At the same time, you might be more likely to splurge on an expensive dinner because you want to make the best of the little time you think you have.
Where does this tendency come from? And can it be avoided?
Not all events are created equal
When people look forward to something, they usually want it to happen as soon as possible and last as long as possible.
We first explored the effect of this attitude in the context of Thanksgiving.
We chose Thanksgiving because almost everyone in the U.S. celebrates it, but not everyone looks forward to it. Some people love the annual family get-together. Others – whether it's the stress of cooking, the tedium of cleaning or the anxiety of dealing with family drama – dread it.
So on the Monday before Thanksgiving in 2019, we surveyed 510 people online and asked them to tell us whether they were looking forward to the holiday. Then we asked them how far away it seemed, and how long they felt it would last. We had them move a 100-point slider – 0 meaning very short and 100 meaning very long – to a location that reflected their feelings.
As we suspected, the more participants looked forward to their Thanksgiving festivities, the farther away it seemed and the shorter it felt. Ironically, longing for something seems to shrink its duration in the mind's eye.
Winding the mind's clock
Most people believe the idiom "time flies when you're having fun," and research has, indeed, shown that when time seems to pass by quickly, people assume the task must have been engaging and enjoyable.
We reasoned that people might be over-applying their assumption about the relationship between time and fun when judging the duration of events yet to happen.
As a result, people tend to reflexively assume that fun events – like vacations – will go by really quickly. Meanwhile, pining for something can make the time leading up to the event seem to drag. With the event's beginning pushed farther away in their minds and its end pulled closer, our participants anticipated that something they looked forward to would feel as if it had almost no duration at all.
In another study, we asked participants to imagine going on a weekend trip that they expected to be either fun or terrible. We then asked them how far away the start and end of this trip felt, using a similar 0-to-100 scale. Forty-six percent of participants evaluated the positive weekend as feeling like it had no duration at all: They marked the beginning and the end of the vacation at virtually the same location on the slider scale.
Thinking in hours and days
Our goal was to show how these two judgments of an event – the fact that it simultaneously seems farther away and is assumed to last for less time – can nearly eliminate the event's duration in the mind's eye.
We reasoned that if we didn't explicitly highlight these two separate pieces – and instead directly asked participants about the duration of the event – a smaller portion of them would indicate virtually no duration for something they looked forward to.
We tested this theory in another study, in which we told participants that they would watch two five-minute-long videos back-to-back. We described the second video as either humorous or boring, and then asked them how long they thought each video would feel like it lasted.
We found that participants still predicted that the funny video would feel shorter and farther away than the boring one. But we also found that they believed it would last a bit longer than participants in the earlier studies had indicated.
This finding gives us a way to overcome this biased perception: focus on the actual duration. Because in this study, participants directly reported how long the funny video would last – and not the perceived distance of its beginning and its end – they were far less likely to assume it would be over just as it started.
While it sounds trivial and obvious, we often rely on our subjective feelings – not objective measures of time – when deciding how long a period of time will feel and how to best use it.
So when looking forward to much-anticipated events like vacations, it's important to remind yourself just how many days it will last.
You'll get more out of the experience – and, hopefully, put yourself in a better position to take advantage of the time you do have.