Social Media's Dark Side: How Connectivity Uprooted Our Self-Worth
We used to use technology. Now technology uses us. Silicon Valley ethicist Tristan Harris explains how the attention economy hijacked our self-worth for profit.
Tristan Harris is a design thinker, philosopher and entrepreneur.
Called the “closest thing Silicon Valley has to a conscience” by The Atlantic, Tristan Harris was a Design Ethicist at Google and is now a leader in Time Well Spent, a movement to align technology with our humanity. Time Well Spent aims to heighten consumer awareness about how technology shapes our minds, empower consumers with better ways to use technology and change business incentives and design practices to align with humanity’s best interest.
Tristan is an avid researcher of what influences human behavior, beliefs and interpersonal dynamics, drawing on insights from sleight of hand magic and hypnosis to cults and behavioral economics. Currently he is developing a framework for ethical influence, especially as it relates to the moral responsibility of technology companies.
His work has been featured on PBS NewsHour, The Atlantic Magazine, ReCode, TED, 1843 Economist Magazine, Wired, NYTimes, Der Spiegel, NY Review of Books, Rue89 and more.
Previously, Tristan was CEO of Apture, which Google acquired in 2011. Apture enabled millions of users to get instant, on-the-fly explanations across a publisher network of a billion page views per month.
Tristan holds several patents from his work at Apple, Wikia, Apture and Google. He graduated from Stanford University with a degree in Computer Science, focused on Human Computer Interaction, while dabbling in behavioral economics, social psychology, behavior change and habit formation in Professor BJ Fogg’s Stanford Persuasive Technology lab. He was rated #16 in Inc Magazine’s Top 30 Entrepreneurs Under 30 in 2009.
You can read his most popular essay: How Technology Hijacks People’s Minds – from a Magician and Google’s Design Ethicist.
Tristan Harris: Well, there's a really common misconception that technology is neutral and it's up to us to just choose how to use it.
And so we're sitting there and we're scrolling and we find ourselves in this kind of wormhole and then we say, “Oh man, like, I should really have more self-control." And that's partially true, but what we forget when we talk about it that way is that there's a thousand engineers on the other side of the screen whose job it was to get my finger to do that the next time. And there's this whole playbook of techniques that they use to get us to keep using the software more.
Was design always this manipulative? It wasn't always this way. In fact, back in the 1970s and the early '80s at Xerox PARC when Steve Jobs first went over and saw the graphical user interface, the way people talked about computers and what computers were supposed to be was a “bicycle for our minds” that, here we are, you take a human being and they have a certain set of capacities and capabilities, and then you give them a bicycle and they can go to all these new distances, they're empowered to go to these brand-new places and to do these new things, to have these new capacities.
And that's always been the philosophy of people who make technology: how do we create bicycles for our minds to do and empower us to feel and access more?
Now, when the first iPhone was introduced it was also the philosophy of the technology; how do we empower people to do something more? And in those days it wasn't manipulative because there was no competition for attention. Photoshop wasn't trying to maximize how much attention it took from you—it didn't measure its success that way.
And the Internet overall had been, in the very beginning, not designed to maximize attention, it was just putting things out there, putting things out there, creating these message boards.
It wasn't designed with this whole persuasive psychology that emerged later. What happened is that the attention economy and this race for attention got more and more competitive, and the more competitive it got to get people's attention on, let's say a news website, the more they need to add these design principles, these more manipulative design tactics as ways of holding onto your attention.
And so YouTube goes from being a more neutral, honest tool of just, “Here's a video,” to, “Oh, do you want to see these other videos? And do you want to auto-play the next video? And here's some notifications…”
These products start to look and feel more like media that's about maximizing consumption and less like bicycles for our minds.
And I think that's such a subtle and important thing to recognize, is that more and more of technology is really not on our team to help us spend our time the way we want to, it's more on the team of maximizing how much time we spend on the screen.
Right now, the only way to succeed in the app store is by proving you're really good at getting people's attention. And so by making attention the currency of success it forces all of these good-hearted, good-intentioned people who make apps or make media sites to do things they don't want to do just because they have to get attention. You have meditation apps that have to send you all these notifications and add streaks and do all this game-y stuff just to get you to use it the most, as opposed to having the phone itself recognize that there's parts of your life where meditation might want to be at the top of the list for you, because you're defining that, and so that meditation app could win for the right reasons, not for the wrong reasons.
So we always worry about new technologies when they first appear. When people started bringing out the newspapers on the subway people worried, “Oh my god, people are going to stop talking to each other on the subway!” And then TV showed up and people worried, “Oh my god, people are going to spend all this time at home!” And the radio—we always worry, and then somehow it seems to turn out okay. Back in the 1970s people were at home on the telephone and calling each other all the time, we thought, “Oh the kids these days, what are they doing to their minds?” So now it's tempting to say, “Well now the kids these days they're on Snapchat, and therefore we survived all those other technology transitions, nothing really bad happened, so maybe it's all okay, this is just kids being kids in a new way.” And I want to talk about why that's not true and why this is different.
What's different is that, let's say—let's take that telephone example: in the 1970s if someone picked up a telephone to go call their friend and they spent time gossiping on the telephone, we could do that, that's fine. But the telephone wasn't updated every day with new manipulative design principles to be better and better at seducing you into calling your friends.
So what's different with Snapchat is that there are a thousand engineers every single day who work on the product to actually find new ways not to just get you to use it but to kind of tap you both on the shoulder and try to get you into a reciprocity relationship where you owe someone else a response. In fact they have a feature in Snapchat called Snapstreaks. And Snapstreaks count the number of days in a row you've sent a message back-and-forth with someone. So if I have a best friend and we've been chatting every day for a hundred days I see a little fireball and the number 100 next to it. And what that does is, they've just created something now I don't want to lose because I have a streak going, I've got a hundred days and now if I don't send them a message tomorrow I'm going to lose the whole thing.
And when you're a kid actually this has a really big impact on you. I'm not making this up. There's a woman named Emily Weinstein at Harvard who studied the effects of Snapchat and Snapstreaks on kids by accident—it emerged in her interviews.
And she found out that kids, when they go on vacation they have to give their password to up to five other friends to actually have them send messages while they're gone on vacation because they're so worried about losing their Snapstreaks. In fact a lot of kids, they wake up and they see these 30 contacts with different streaks and they have to just take pictures of floors and ceilings just to kind of get through all these Snapstreaks so they don't lose any of them.
So we have to ask, when Snapchat is designing a feature like Snapstreaks, are they doing that because they most want to help kids empower them to live their lives or are they doing that because it's really good at getting their attention?
And when we're parents and we see kids using it this way we have to recognize there's something very different going on here. This was not true in the 1970s when we had a neutral telephone that we would choose to use when we wanted to use it. There are now a thousand people behind our new telephone, which is to say, Snapchat, that's being designed and updated every day to be more compelling at addicting and holding onto people's attention. And it's not good for us.
I think we can all feel it. We become more and more on the treadmill to the number of likes or feedback we get, basically, from social media and start tying our own approval, our own self-worth to how much attention we get from other people. I mean, even for me I notice that if I post something it does affect me whether or not I have a lot of likes or few likes.
And it's hard, if you think about it, to really get to the sense of, “My self-worth is completely independent of that.” That's a subtle thing to hold, and developmentally, children are more vulnerable to their self-worth being externalized like this.
And the problem is that there's always been, as well, ways of externalizing our self-worth in terms of how much money we have in our bank account or in terms of how many friends we have, but now the externalization of our self-worth is controlled by a handful of companies whose goals are different than our goals.
They're not evil companies, they just have the different goal of maximizing attention, and our goals are not that. But the problem is that their goals become our goals. This is what's actually so dangerous, is that their goals of engaging us the most by having us care about likes become our goals. We actually wake up in the morning as sovereign human beings and we start caring about the number of likes we got, as if that's our goal in life. That becomes our goal. And it's as if we've been infected, it's as if they've drilled a hole in the back of our head and now they've injected the virus and now we walk around searching for feedback using social media. And they won, if that happens.
And again, it's not because they're evil but they're in a different game, they're trying to maximize attention. But we have to ask a much deeper question, which is: what do we want in our lives and what is our self-worth actually tied to? And maybe it's being virtuous or being a good friend or caring about what matters or living by our values.
There's a whole bunch of things that we can define for ourselves, and I think the less people have defined their own values, the more vulnerable they are to someone else coming in and giving them their values. And to fix that we're going to need to talk about different kinds of metrics, different ways of measuring success and monetizing success. So instead of making more money the more time we get, we instead make more money the more we've helped you in your life.
In the 1970s, at the dawn of personal computers, people like Steve Jobs and the scientists at Xerox PARC talked about computers as "bicycles for our mind". Sure, someone was going to make big money selling these hardware units, but the intention was at heart quite pure; computers would give our minds wheels to go farther than ever before. Our capabilities would be augmented by technology, and we would become smarter and more capable. That ethos has not really stuck, and today we find ourselves in a Pavlovian relationship with push notifications, incapacitated by the multi-directional pull on our attention spans.
We've made it through every new technological wave—newspapers, radio, TV, laptops, cell phones—without the social decay that was widely prophesied, but there's something different about smartphones loaded with apps living in the palm of our hand, says tech ethicist Tristan Harris. It would be a mistake not to recognize how, this time, it really is different. Companies today are not more evil than they were in the 1970s; what's changed is the environment they operate in: the attention economy, where the currency is your eyeballs on their product, for as long as possible—precious exposure that can be sold to advertisers. Unlike the neutral technology we once used, and could walk away from, today's technology uses us. Behind every app—Facebook, Twitter, Snapchat—are 1,000 software designers working every day to update and find new psychological levers to keep you hooked to this product. The most powerful development has been that of 'likes', public feedback that externalized our self-worth onto a score card (this has reached new heights with Snapchat's streaks, which, as research by Emily Weinstein at Harvard has shown, put extreme stress on kids and adolescents). "These products start to look and feel more like media that's about maximizing consumption and less like bicycles for our minds," says Harris. Is it too late to do something about the attention economy? To find out more about Tristan Harris, head to tristanharris.com.
From "mutilated males" to "wandering wombs," dodgy science affects how we view the female body still today.
- The history of medicine and biology often has been embarrassingly wrong when it comes to female anatomy and was surprisingly resistant to progress.
- Aristotle and the ancient Greeks are much to blame for the mistaken notion of women as cold, passive, and little more than a "mutilated man."
- Thanks to this dubious science, and the likes of Sigmund Freud, we live today with a legacy that judges women according to antiquated biology and psychology.
The story of medicine has not been particularly kind to women. Not only was little anatomical or scientific research done on women or on women-specific issues, but doctors also often treated them differently.
Even today, women are up to ten times more likely to have their symptoms explained away as being psychological or psychosomatic than men. Worryingly, women are 50 percent more likely to be misdiagnosed after a heart attack, and drugs designed for "everyone" are actually much less effective (for pain) or too effective (for sleeping) in women.
Are these differences real or imagined? And what can the history of female medicine teach us about where we are today?
A mutilated male
Aristotle is rightly considered one of the greatest minds of all time and is recognized as the founding father of many disciplines, including biology. He was one of the most rigorous and comprehensive scientists and field researchers the world had known. He categorized a large number of species based on a wide range of traits, such as movement, longevity, and sensory capacity. His views on women, then, stemmed from what he thought of as good, proper study. The problem is that he got pretty much all of it wrong.
According to Aristotle, during pregnancy, it was the man who, alone, contributed the all-important "form" of a fetus (that is, its defining nature and personality), whereas the woman provided only the matter (that is, the environment and sustenance to grow the fetus, which was provided by the menstrual blood).
From this, Aristotle extrapolated all sorts of dubious conclusions. He ventured that the man was superior, active, and dominant, and the woman inferior, passive, and submissive. As such, the woman's role was to nurture children, run a household, and be silent and obedient — political and cultural manifestations of dodgy biology. If women did not provide a child's form and nature, how important could they really be?
Given this passivity, Aristotle argued that the woman must be associated with other passive things, like being cold and slow. The man, being dynamic and energetic, must be hot and fast. From this, Aristotle concluded that any defects or problems in childbirth can only be due to the sluggishness of the female womb. Even the positive biological aspects of being female, such as greater longevity, were put down to this cold rigidity — a lack of metabolism and spirit. Most notorious of all, since Aristotle believed that female children were themselves the result of an incomplete and underdeveloped gestation, women were simply "mutilated males" whose mothers' cold wombs had overpowered the warm, vital, male sperm.
Aristotle can still be counted as a great mind, but when it came to women, his ideas have not aged well in just how far they negatively influenced what came after. Given that his works were seen as the authority well into the 16th century, he left quite the pernicious legacy.
A wandering womb
But, how much can we really blame Aristotle? Without the aid of modern scientific equipment, physicians and biologists were left to guess about female anatomy. Unfortunately, the damage was done, and Aristotle's ideas of a troublesome uterus became so mainstream that they led to one of the more bizarre ideas in medical history: the wandering womb.
The "wandering womb" is the idea that the womb is actually some kind of roaming parasite in the body, possibly even a separate organism. According to this theory, after a woman menstruates, her womb becomes hot and dry and so becomes extra mobile. It is transformed into a voracious hunter. The womb will dart from organ to organ, seeking to steal its moisture and other vital fluids. This parasitic behavior caused all sorts of (female only) illnesses.
If a woman had asthma, the womb was leeching the lungs. Stomach aches, it was in the gut. And if it attacked the heart (which the ancients thought was the source of our thoughts), then it would cause all manner of mental health issues. In fact, the Greek word for womb is "hystera," and so when we call someone (often a woman) hysterical, we are saying that their womb is causing mischief.
The "solutions" or "remedies" for a wandering womb were as strange as the theory. Since the womb was supposed to be attracted to sweet smells, placing flowers or perfumes around the vagina would "lure" it down. On the flip side, if you smoked noxious substances or ate disgusting foods, it would "repel" the womb away. By using all manner of smells, you could make the womb move wherever you wanted.
The oddest "remedy" — and most male-centric of all — is that, since the wandering womb was said to be caused by heat and dryness, a good solution would be male semen, which was thought of as cooling and wet. And so, the ancient and highly inaccurate myth was born that sex could cure a woman of her "hysteria."
A lingering problem
We live today with the legacy of this kind of thinking. Freud was much taken with the idea of "hysteria," and although he did accept that men could be subject to it as well, he believed it was overwhelmingly a female problem caused by female biology. The woman, for Freud, is mostly defined by her "sexual function." What Freud calls "normal femininity" (the preferred and best outcome) is defined by passivity. A woman's ideal development is one which moves from being active and "phallic" to passive and vaginal.
Nowadays, Freud and Aristotle's legacy lies in just how easily women are defined by their sexuality. Given that men and women are both equally dependent on their biology, it is curious how much more often women are reduced to theirs. The idea that women are more emotional or slaves to their hormones than men is still a depressingly familiar trope. It is an idea that goes back to the Greeks.
If we think biology is important to who we are (as it most certainly is), we ought to make sure that the biology is as good and accurate as it can be.
People tend to reflexively assume that fun events – like vacations – will go by really quickly.
For many people, summer vacation can't come soon enough – especially for the half of Americans who canceled their summer plans last year due to the pandemic.
But when a vacation approaches, do you ever get the feeling that it's almost over before it starts?
If so, you're not alone.
In some recent studies Gabriela Tonietto, Sam Maglio, Eric VanEpps and I conducted, we found that about half of the people we surveyed indicated that their upcoming weekend trip felt like it would end as soon as it started.
This feeling can have a ripple effect. It can change the way trips are planned – you might, for example, be less likely to schedule extra activities. At the same time, you might be more likely to splurge on an expensive dinner because you want to make the best of the little time you think you have.
Where does this tendency come from? And can it be avoided?
Not all events are created equal
When people look forward to something, they usually want it to happen as soon as possible and last as long as possible.
We first explored the effect of this attitude in the context of Thanksgiving.
We chose Thanksgiving because almost everyone in the U.S. celebrates it, but not everyone looks forward to it. Some people love the annual family get-together. Others – whether it's the stress of cooking, the tedium of cleaning or the anxiety of dealing with family drama – dread it.
So on the Monday before Thanksgiving in 2019, we surveyed 510 people online and asked them to tell us whether they were looking forward to the holiday. Then we asked them how far away it seemed, and how long they felt it would last. We had them move a 100-point slider – 0 meaning very short and 100 meaning very long – to a location that reflected their feelings.
As we suspected, the more participants looked forward to their Thanksgiving festivities, the farther away it seemed and the shorter it felt. Ironically, longing for something seems to shrink its duration in the mind's eye.
Winding the mind's clock
Most people believe the idiom “time flies when you're having fun," and research has, indeed, shown that when time seems to pass by quickly, people assume the task must have been engaging and enjoyable.
We reasoned that people might be over-applying their assumption about the relationship between time and fun when judging the duration of events yet to happen.
As a result, people tend to reflexively assume that fun events – like vacations – will go by really quickly. Meanwhile, pining for something can make the time leading up to the event seem to drag. The combination of its beginning pushed farther away in their minds – with its end pulled closer – resulted in our participants' anticipating that something they looked forward to would feel as if it had almost no duration at all.
In another study, we asked participants to imagine going on a weekend trip that they either expected to be fun or terrible. We then asked them how far away the start and end of this trip felt, using a similar 0 to 100 scale. 46% of participants evaluated the positive weekend as feeling like it had no duration at all: They marked the beginning and the end of the vacation at virtually the same location when using the slider scale.
Thinking in hours and days
Our goal was to show how these two judgments of an event – the fact that it simultaneously seems farther away and is assumed to last for less time – can nearly eliminate the event's duration in the mind's eye.
We reasoned that if we didn't explicitly highlight these two separate pieces – and instead directly asked them about the duration of the event – a smaller portion of people would indicate virtually no duration for something they looked forward to.
We tested this theory in another study, in which we told participants that they would watch two five-minute-long videos back-to-back. We described the second video as either humorous or boring, and then asked them how long they thought each video would feel like it lasted.
We found that the participants predicted that the funny video would still feel shorter and farther away than the boring one. But we also found that they believed it would last a bit longer than participants in the earlier studies had indicated.
This finding gives us a way to overcome this biased perception: focus on the actual duration. Because in this study, participants directly reported how long the funny video would last – and not the perceived distance of its beginning and its end – they were far less likely to assume it would be over just as it started.
While it sounds trivial and obvious, we often rely on our subjective feelings – not objective measures of time – when deciding how long a period of time will feel and how to best use it.
So when looking forward to much-anticipated events like vacations, it's important to remind yourself just how many days it will last.
You'll get more out of the experience – and, hopefully, put yourself in a better position to take advantage of the time you do have.
Inventions with revolutionary potential made by a mysterious aerospace engineer for the U.S. Navy come to light.
- U.S. Navy holds patents for enigmatic inventions by aerospace engineer Dr. Salvatore Pais.
- Pais came up with technology that can "engineer" reality, devising an ultrafast craft, a fusion reactor, and more.
- While mostly theoretical at this point, the inventions could transform energy, space, and military sectors.
The U.S. Navy controls patents for some futuristic and outlandish technologies, some of which, dubbed "the UFO patents," came to light recently. Of particular note are inventions by the somewhat mysterious Dr. Salvatore Cezar Pais, whose tech claims to be able to "engineer reality." His slate of highly ambitious, borderline sci-fi designs meant for use by the U.S. government range from gravitational wave generators and compact fusion reactors to next-gen hybrid aerospace-underwater crafts with revolutionary propulsion systems, and beyond.
Of course, the existence of patents does not mean these technologies have actually been created, but there is evidence that some demonstrations of operability have been successfully carried out. As investigated and reported by The War Zone, a possible reason why some of the patents may have been taken on by the Navy is that the Chinese military may also be developing similar advanced gadgets.
Among Dr. Pais's patents are designs, approved in 2018, for an aerospace-underwater craft of incredible speed and maneuverability. This cone-shaped vehicle can potentially fly just as well anywhere it may be, whether in air, water or space, without leaving any heat signatures. It can achieve this by creating a quantum vacuum around itself with a very dense polarized energy field. This vacuum would allow it to repel any molecule the craft comes in contact with, no matter the medium. Manipulating "quantum field fluctuations in the local vacuum energy state" would help reduce the craft's inertia. The polarized vacuum would dramatically decrease any elemental resistance and lead to "extreme speeds," claims the paper.
Not only that, if the vacuum-creating technology can be engineered, we'd also be able to "engineer the fabric of our reality at the most fundamental level," states the patent. This would lead to major advancements in aerospace propulsion and generating power. Not to mention other reality-changing outcomes that come to mind.
Among Pais's other patents are inventions that stem from similar thinking, outlining pieces of technology necessary to make his creations come to fruition. His paper presented in 2019, titled "Room Temperature Superconducting System for Use on a Hybrid Aerospace Undersea Craft," proposes a system that can achieve superconductivity at room temperatures. This would become "a highly disruptive technology, capable of a total paradigm change in Science and Technology," conveys Pais.
High frequency gravitational wave generator.
Credit: Dr. Salvatore Pais
Another invention devised by Pais is an electromagnetic field generator that could generate "an impenetrable defensive shield to sea and land as well as space-based military and civilian assets." This shield could protect from threats like anti-ship ballistic missiles, cruise missiles that evade radar, coronal mass ejections, military satellites, and even asteroids.
Dr. Pais's ideas center around the phenomenon he dubbed "The Pais Effect". He referred to it in his writings as the "controlled motion of electrically charged matter (from solid to plasma) via accelerated spin and/or accelerated vibration under rapid (yet smooth) acceleration-deceleration-acceleration transients." In less jargon-heavy terms, Pais claims to have figured out how to spin electromagnetic fields in order to contain a fusion reaction – an accomplishment that would lead to a tremendous change in power consumption and an abundance of energy.
According to his bio in a recently published paper on a new Plasma Compression Fusion Device, which could transform energy production, Dr. Pais is a mechanical and aerospace engineer working at the Naval Air Warfare Center Aircraft Division (NAWCAD), which is headquartered in Patuxent River, Maryland. Holding a Ph.D. from Case Western Reserve University in Cleveland, Ohio, Pais was a NASA Research Fellow and worked with Northrop Grumman Aerospace Systems. His current Department of Defense work involves his "advanced knowledge of theory, analysis, and modern experimental and computational methods in aerodynamics, along with an understanding of air-vehicle and missile design, especially in the domain of hypersonic power plant and vehicle design." He also has expert knowledge of electrooptics, emerging quantum technologies (laser power generation in particular), high-energy electromagnetic field generation, and the "breakthrough field of room temperature superconductivity, as related to advanced field propulsion."
Suffice it to say, with such a list of research credentials that would make Nikola Tesla proud, Dr. Pais seems well-positioned to carry out groundbreaking work.
A craft using an inertial mass reduction device.
Credit: Salvatore Pais
The patents won't necessarily lead to these technologies ever seeing the light of day. The research has its share of detractors and nonbelievers among other scientists, who think the amount of energy required for the fields described by Pais and his ideas on electromagnetic propulsions are well beyond the scope of current tech and are nearly impossible. Yet investigators at The War Zone found comments from Navy officials that indicate the inventions are being looked at seriously enough, and some tests are taking place.
If you'd like to read through Pais's patents yourself, check them out here.
Laser Augmented Turbojet Propulsion System
Credit: Dr. Salvatore Pais
A global survey shows the majority of countries favor Android over iPhone.
- When Android was launched soon after Apple's own iPhone, Steve Jobs threatened to "destroy" it.
- Ever since, and across the world, the rivalry between both systems has animated users.
- Now the results are in: worldwide, consumers clearly prefer one side — and it's not Steve Jobs'.
A woman on her phone in Havana, Cuba. Mobile phones have become ubiquitous the world over — and so has the divide between Android and iPhone users. Credit: Yamil Lage / AFP via Getty Images.
Us versus them: it's the archetypal binary. It makes the world understandable by dividing it into two competing halves: labor against capital, West against East, men against women.
These maps are the first to show the dividing lines between one of the world's more recent binaries: Android vs. Apple. Published by Electronics Hub, they are based on a qualitative analysis of almost 350,000 tweets worldwide that presented positive, neutral, and negative attitudes toward Android and/or Apple.
Steve Jobs wanted to go "thermonuclear"
Feelings between Android and Apple were pretty tribal from the get-go. It was Steve Jobs himself who said, when Google rolled out Android a mere ten months after Apple launched the iPhone, "I'm going to destroy Android, because it's a stolen product. I'm willing to go thermonuclear war on this."
Buying a phone is like picking a side in the eternal feud between the Hatfields and the McCoys. Each choice for one side automatically comes with a built-in arsenal of arguments against the other.
If you are an iPhone person, you appreciate the sleekness and simplicity of its design, and you are horrified by the confusing mess that is the Android operating system. If you are an Android aficionado, you pity the iPhone user, a captive of an overly expensive closed ecosystem, designed to extract money from its users.
Even without resorting to those extremes, many of us will recognize which side of the dividing line we are on. Like the American Civil War, that line runs through families and groups of friends — which would make it confusing to chart geographically. To un-muddle the information, these maps zoom out to state and country level.
If the contest is based on the number of countries, Android wins. In all, 74 of the 142 countries surveyed prefer Android (in green on the map). Only 65 favor Apple (colored grey). That's roughly a 53/47 split, which may not sound like a decisive vote, but a similar margin (52/48) was good enough for Boris Johnson to get Brexit done (after he got breakfast done, of course).
And yes, math-heads: 74 plus 65 is three short of 142. Belarus, Fiji, and Peru (in yellow on the map) could not decide which side to support in the Global Phone War.
What about the United States, home of both the Android and the iPhone? Another victory for the former, albeit a slightly narrower one: 30.16 percent of the tweets about Android were positive versus just 29.03 percent of the ones about Apple.
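The scoring logic described above — whichever brand draws the higher share of positive tweets takes the region, with exact ties left undecided — can be sketched in a few lines. This is a minimal illustration, not the methodology Electronics Hub actually used; the function name and the tie handling are assumptions, while the U.S. and Peru figures come from the article.

```python
def winner(pos_android: float, pos_apple: float) -> str:
    """Return the preferred brand for a region, given the percentage of
    positive tweets about each; an exact tie counts as 'undecided'
    (as with Belarus, Fiji, and Peru on the world map)."""
    if pos_android > pos_apple:
        return "Android"
    if pos_apple > pos_android:
        return "Apple"
    return "undecided"

# United States: 30.16% positive for Android vs. 29.03% for Apple
print(winner(30.16, 29.03))  # Android
# Peru: both sides at exactly 25% positive
print(winner(25.0, 25.0))    # undecided
```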
United States: Texas surrounded!
Credit: Electronics Hub
There can be only one winner per state, though, and that leads to this preponderance of Android logos. Frankly, it's a relief to see a map showing a visceral divide within the United States that is not the coasts versus the heartland.
- Apple dominates in 19 states: a solid Midwestern bloc, another of states surrounding Texas, the Dakotas and California, plus North Carolina, New Hampshire, and Rhode Island.
- And that's it. The other 32 are the United States of Android. You can drive from Seattle to Miami without straying into iPhone territory. But no stopovers in Dallas or Houston – both are behind enemy lines!
North America: strongly leaning toward Android
Credit: Electronics Hub
Only eight of North America's 21 countries surveyed fall into the Apple category.
- The U.S. and Canada lean Android, while Mexico goes for the iPhone.
- Central America is divided, but here too Android wins hands down, 5-2.
Europe: Big Five divided
Credit: Electronics Hub
In Europe, Apple wins, with 20 countries preferring the iPhone, 17 going for Android, and Belarus sitting on the fence.
- Of Western Europe's Big Five markets, three (UK, Germany, Spain) are pro-Android, and two (France, Italy) are pro-Apple.
- Czechia and Slovakia are an Apple island in the Android sea that is Central Europe. Glad to see there is still something the divorcees can agree on.
South America: almost even
Credit: Electronics Hub
In South America, the divide is almost even.
- Five countries prefer Android, four Apple, and one is undecided.
- In Peru, both Android- and Apple-related tweets were 25 percent positive.
Africa: watch out for Huawei
Credit: Electronics Hub
In Africa, Android wins with 17 countries to Apple's 15.
- There's a solid Android bloc running from South Africa via DR Congo all the way to Ethiopia.
- iPhone countries are scattered throughout the north (Algeria), west (Guinea), east (Somalia), and south (Namibia).
Huawei — increasingly popular across the continent — could soon dramatically change the picture in Africa. Its phones currently still run on Android, but the Chinese manufacturer has just launched its own operating system, called Harmony.
Middle East: Iran vs. Saudi Arabia (again)
Credit: Electronics Hub
In the Middle East and Central Asia, Android wins 8 countries to Apple's 6.
- But it's complicated. One Turkish tweeter wondered how it is that iPhones seem more popular in the Asian half of Istanbul, while Android phones prevailed in the European part of the city.
- The phone divide matches up with the region's main geopolitical one: Iran prefers Android, Saudi Arabia the iPhone.
Asia-Pacific: Apple on the periphery
Credit: Electronics Hub
Another wafer-thin majority for Android in the Asia-Pacific region: 13 countries versus 12 for Apple — and one abstention (Fiji).
- The two giants of the Asian mainland, India and China, are both Android countries. Apple countries are on the periphery.
- And if India is Android, its rival Pakistan must be Apple. Same with North and South Korea.
Experts note that the two operating systems become more alike with every new generation, and suggest this convergence could eventually resolve the conflict. But as any student of human behavior will confirm: smaller differences only exacerbate the rivalry between camps.
Maps taken from Electronics Hub, reproduced with kind permission.
Strange Maps #1096
Got a strange map? Let me know at firstname.lastname@example.org.