Reclaim Reality, Relationships, and Your Attention Span from Your Devices
Your future happiness and success will depend on the double-edged sword of embracing new technology to stay connected, and being smart enough to unplug at the right time.
Adam Alter is an Associate Professor of Marketing at New York University’s Stern School of Business, with an affiliated appointment in the New York University Psychology Department.
Adam is the author of the New York Times bestseller, Drunk Tank Pink: And Other Unexpected Forces That Shape How We Think, Feel, and Behave, which examines how features of the world shape our thoughts and feelings beyond our control. He has also written for the New York Times, New Yorker, Atlantic, WIRED, Slate, Huffington Post, and Popular Science, among other publications. Adam has shared his ideas at the Cannes Lions Festival of Creativity, and with dozens of companies, including Google, Microsoft, Anheuser Busch, Prudential, and Fidelity, and with several design and ad agencies around the world. He is working on his second book, which asks why so many people today are addicted to so many behaviors, from incessant smart phone and internet usage to video game playing and online shopping.
Adam’s academic research focuses on judgment and decision-making and social psychology, with a particular interest in the sometimes surprising effects of subtle cues in the environment on human cognition and behavior. His research has been published widely in academic journals, and featured in dozens of TV, radio and print outlets around the world.
He received his Bachelor of Science (Honors Class 1, University Medal) in Psychology from the University of New South Wales and his M.A. and Ph.D. in Psychology from Princeton University, where he held the Charlotte Elizabeth Procter Honorific Dissertation Fellowship and a Fellowship in the Woodrow Wilson Society of Scholars.
Adam Alter: Young people today, in particular, but even adults, don't have a tolerance for boredom—at all. There's some research looking at how our attention spans have changed across time, and there's some evidence that they've shrunk by about 33 percent since the year 2000. One reason for that is that we interact with devices so much of the time, and they don't demand anything of you. They are deliverers. They bring things to you. You don't need an attention span. If you're reading a book and you have a lapse in attention for even a couple of minutes, by the time you get to the end of the page you'll realize that you haven't really been paying attention for the last half page. That's a problem, and you have to return to where you were. That requires willed and directed attention.
That's just not true of smart phones. They are constantly competing for our attention. Every app, every social media platform, pretty much everything you encounter on a smart phone is designed to give you what you need. They are competing for you instead of you competing for whatever else is going on in the world.
And what that means is you don't need to have much of an attention span. And we've seen this in a lot of different respects, even beyond smart phones themselves: the way we interact with email, for example. When an email arrives in the workplace, it takes us on average about six seconds to open it. And every time you check an email you spend about 25 minutes getting back to the zone of engagement you were in before you checked it. So what's happening here is we just don't have much attention for the things that we're doing because we're constantly distracted; we're pulled away to do things like check emails or quickly refresh a Twitter feed or an Instagram feed. And as a result we really don't need an attention span. We don't need to be as engaged as we used to be, and we can still get by in the world.
I think we have a lot to be worried about with respect to the evolution of tech and the way it engages us. We are far more engaged with tech today than we were ten years ago. And when we look back ten years from now I think we're going to look at Facebook, Instagram, Twitter as relics—and as primitive relics to be totally honest. The degree of engagement we'll have with things like virtual reality tech and virtual reality platforms will far exceed anything you see now. If I'm sitting with you at a table and we're having a conversation and there's a phone upside down on the table next to us, just for the presence of that phone, the connection we form between us will be diminished.
And if a phone can do that turned upside down because of all the things that it implies—that there's a whole world out there—imagine at any moment in time you have to choose between the real world with all its messiness, with all its complexity, with all its imperfections, and this perfect virtual world.
When virtual reality tech is really advanced and you can basically go anywhere at any time to speak to whomever you like, it's going to be really hard to resist the desire to leave the real world, the here and now, the world that's social and stop yourself from always retreating to that virtual world. So I do think there's a big concern.
I also think it's a concern from the business perspective that businesses make money for every moment of attention they can grab. And they are trying to basically weaponize whatever content they put out so that it's harder and harder for us to resist. A perfect example of this is clickbait on, for example, Facebook. Clickbait is something that's relatively new that didn't really exist ten years ago.
And there are a couple of techniques that clickbait purveyors have developed over the years; the sorts of headlines they use are almost impossible to resist, they hit a whole lot of psychological notes that we can't not respond to.
They basically open a loop, they open a cliffhanger and you have to know what the end of the story is so you click on that button. The other thing that they do is they give you these pictures that are kind of half filled so the thumbnail picture doesn't spell out the whole issue. So say they say, “You won't believe what this person did next!”, they'll show you the person with a surprised expression on his or her face but you won't be able to see the rest of the image, you have to click on the image to go to the next page.
And then you'll get there and it will be a slideshow with 17 different slides, each click providing some income to the company that's releasing it. So I think that's more sophisticated than what we saw five years ago, and we're only going to continue in that direction. It's hard to predict what, in ten years, we'll be competing with, but certainly there'll be a lot of competition for our attention and it will probably be quite successful at luring us away from the here and now.
I think about our current time as being at the bottom of a very steep, long, tall hill and we're all moving up this hill and at the moment we think of ourselves as very advanced. There's this illusion known as the end-of-history illusion, it's the idea that where we are right now feels incredibly advanced no matter where we are in the tech world, no matter where we are in the evolution of tech. I think of Twitter and Instagram and Facebook as pretty advanced, but in ten years we're not going to feel that way. In 20 years we're certainly not going to feel that way; we're going to look at the first social media platforms as relics, as curiosities. And I think what's really going to change in the next few years if you talk to experts in the virtual reality industry is that they say between four and five years from now everyone will own virtual reality goggles in the same way as we now all pretty much own smart phones. And once you have access to those goggles, at any moment in time you could leave this real imperfect world to go to a perfect virtual world. It's really hard to see why you would resist that at any moment in time. I mean if humans are basically constantly roaming the earth trying to be as happy as possible, hoovering up well-being, it seems almost rational that you'll leave the here and now for the well-being that you could acquire by leaving and going to that virtual world.
So I think that's a pretty serious concern. I think it's a concern because the social world—the real social world—can only exist and encourage us to thrive when we continue to be a part of it. If we leave it for long periods of time, I don't know if this has ever happened to you, but if you're unwell for a few days and you're home in bed, when you return to the social world there's a sort of awkwardness to the first communications you have. That's how humans are, we basically need to exercise that social muscle across time. If we don't do that, if it withers away, if we leave for the virtual world for enough time there's a pretty good chance that the way we communicate as a species will change. And I'm especially concerned about that for kids. So there are certain critical periods in child development where kids basically acquire the social skills that they'll take through their lives. And what they need to do is they need a lot of feedback, pretty rapid high fidelity feedback. If I'm a child and I take a toy from another child, I need to see what that does to that other child. I need to see his face scrunch up. I need to see him be upset. I need to see him lash out. I need to see that really fast so I learn that's not the right way to do things.
But if I'm interacting mainly through screens I don't get that rapid feedback and so I never really hone the skills that I'm going to need when I deal with people face to face in the real world. And I think that's a big concern because if you take the whole screen generation en masse, the kids who are born today, I think a lot of them may not hone those skills the way we have in the past so when it comes to being in the workplace, to being real people forming romantic relationships and forming friendships, I could imagine that being quite difficult for them in a way that just came naturally to us when we existed in a different world as kids, in a world that was largely face to face.
It's really hard to say whether unplugging in the long run will be beneficial, whether there'll be some sort of competitive advantage to not being engrossed in the tech world. I think we already see some advantages in some spheres. If you aren't buried in your phone, if you aren't buried in games you have more time to do other things that I think make you successful. You have more time to hone your social skills, you have more time to do useful work, you have more time to form deep, meaningful relationships with other people. I think all of that helps you to get ahead. At the same time being engaged with tech to some extent is necessary in this world. You need it for travel, you need it to be a member of the workplace, you need it to communicate.
I think disengaging completely is not the answer. I think the skill in the long run will be striking the right balance. A lot of people use the language of environmentalism, the idea that we need to form a sustainable relationship with tech.
And I think the people who manage to do that over time are the ones who will have that competitive advantage. They'll be plugged in at the right times but unplugged at the times when it's more beneficial to have face to face relationships and interactions.
I don't know what that will look like now. Already today many of us are plugged in for the eight hours when we're at work and then for three additional hours in front of our screens, our phones, and then for some of us additional hours in front of the TV. That's a lot of time. That's 12, 13 hours during the day when we are awake. That doesn't leave a lot of time to do things like exercise, engaging with nature, talking to people face to face, being creative and thoughtful, reading books and so on. So I think we're leaving a lot of things behind by being overly engaged with tech.
We shouldn't disengage completely. I think the skill over the next number of years into decades will be learning how to manage our relationship with tech in a sustainable way. And the people who do that best in the long run will be the most successful people and the happiest people.
There is a psychological self-deception called the end-of-history illusion, which refers to the feeling that—no matter where you are in the evolution of technology—your time seems incredibly advanced. However, Adam Alter reminds us that the trajectory of progress keeps rising, and what we think is cutting-edge now—Snapchat, Facebook, the iPhone 8, the iPhone 12—will seem laughably primitive in ten years. It's what we'll have in this new world that concerns Alter. He cites experts who predict that most of us will own VR goggles in the next five years, and if the success of clickbait and its irresistible effect on our psychology is any indication, the fully immersive alternative realities of VR will shake the foundations of our minds, relationships, and attention spans (which are already kaput). As we're lured into a life on the digital plane by corporations—who make money from every second they can capture our attention—virtual reality may threaten reality itself. Those of us who have known a life without it will have a slight advantage in managing its control over our behavior, but Alter raises concerns for children, who won't come at this technology pre-equipped and skeptical enough to see the intentions behind such lures—and what might be lost if we don't know how to disconnect. Adam Alter is the author of Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked.
A clever new study definitively measures how long it takes for quantum particles to pass through a barrier.
- Quantum particles can tunnel through seemingly impassable barriers, popping up on the other side.
- Quantum tunneling is not a new discovery, but there's a lot that's unknown about it.
- By super-cooling rubidium particles, researchers use their spinning as a magnetic timer.
When it comes to weird behavior, there's nothing quite like the quantum world. On top of that world-class head scratcher entanglement, there's also quantum tunneling — the mysterious process in which particles somehow find their way through what should be impenetrable barriers.
Exactly why or even how quantum tunneling happens is unknown: Do particles just pop over to the other side instantaneously in the same way entangled particles interact? Or do they progressively tunnel through? Previous research has been conflicting.
That quantum tunneling occurs has not been a matter of debate since it was discovered in the 1920s. When IBM famously wrote their name on a nickel substrate using 35 xenon atoms, they used a scanning tunneling microscope to see what they were doing. And tunnel diodes are fast-switching semiconductors that derive their negative resistance from quantum tunneling.
Nonetheless, "Quantum tunneling is one of the most puzzling of quantum phenomena," says Aephraim Steinberg of the Quantum Information Science Program at the Canadian Institute for Advanced Research in Toronto to Live Science. Speaking with Scientific American, he explains, "It's as though the particle dug a tunnel under the hill and appeared on the other side."
Steinberg is a co-author of a study just published in the journal Nature that presents a series of clever experiments that allowed researchers to measure the amount of time it takes tunneling particles to find their way through a barrier. "And it is fantastic that we're now able to actually study it in this way."
Frozen rubidium atoms
One of the difficulties in ascertaining the time it takes for tunneling to occur is knowing precisely when it's begun and when it's finished. The authors of the new study solved this by devising a system based on particles' precession.
Subatomic particles all have magnetic qualities, and in an external magnetic field their spins wobble, or "precess," like a spinning top. With this in mind, the authors of the study decided to construct a barrier with a magnetic field, causing any particles passing through it to precess as they did so. They wouldn't precess before entering the field or after leaving it, so by observing and timing the duration of the particles' precession, the researchers could definitively identify the length of time it took them to tunnel through the barrier.
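The timing trick described above is just a proportionality: the spin precesses at a fixed (Larmor) rate only while the atom is inside the pseudo-magnetic barrier, so the dwell time is the measured precession angle divided by that rate. A minimal sketch, using hypothetical numbers rather than the paper's actual values:

```python
import math

def tunneling_time(precession_angle_rad: float, larmor_frequency_hz: float) -> float:
    """Infer the time spent inside the barrier from the net precession angle.

    The spin precesses only while the atom sits in the pseudo-magnetic
    field, so angle = 2*pi*f*t, which gives t = angle / (2*pi*f).
    """
    return precession_angle_rad / (2 * math.pi * larmor_frequency_hz)

# Hypothetical illustration: a 1 kHz Larmor frequency and a measured
# precession of ~3.83 radians would imply a dwell time of ~0.61 ms,
# the figure reported later in the article.
t = tunneling_time(3.833, 1000.0)  # ~6.1e-4 seconds
```

The design choice is what makes the experiment clean: because precession starts and stops at the barrier's edges, there is no ambiguity about when the "clock" is running.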
To construct their barrier, the scientists cooled about 8,000 rubidium atoms to a billionth of a degree above absolute zero. In this state, they form a Bose-Einstein condensate, often called the fifth state of matter. Atoms in a condensate slow down and clump together rather than flying around independently at high speeds. (We've written before about a Bose-Einstein experiment in space.)
Using a laser, the researchers pushed about 2,000 rubidium atoms together into a barrier about 1.3 micrometers thick, endowing it with a pseudo-magnetic field. Compared to a single rubidium atom, this is a very thick wall—comparable to a wall half a mile thick if you yourself were a foot thick.
With the wall prepared, a second laser nudged individual rubidium atoms toward it. Most of the atoms simply bounced off the barrier, but about 3% of them went right through as hoped. Precise measurement of their precession produced the result: It took them 0.61 milliseconds to get through.
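A roughly 3 percent transmission rate is the kind of number the textbook WKB estimate for a rectangular barrier produces: transmission falls off exponentially with barrier width. A rough sketch with illustrative numbers (the decay constant here is chosen to match the reported rate, not taken from the paper):

```python
import math

def transmission_probability(kappa_per_m: float, width_m: float) -> float:
    """WKB estimate for tunneling through a rectangular barrier:
    T ~ exp(-2 * kappa * L), where the decay constant kappa grows as the
    particle's energy falls further below the barrier height."""
    return math.exp(-2 * kappa_per_m * width_m)

# Illustrative only: a decay constant of ~1.35e6 per meter across the
# 1.3-micrometer barrier gives roughly the ~3% transmission observed.
p = transmission_probability(1.35e6, 1.3e-6)  # ~0.03
```

The exponential dependence on width is also why the researchers' planned follow-up with a thicker barrier is interesting: even a modest increase should sharply cut how many atoms make it through.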
Reactions to the study
Scientists not involved in the research find its results compelling.
"This is a beautiful experiment," according to Igor Litvinyuk of Griffith University in Australia. "Just to do it is a heroic effort." Drew Alton of Augustana University in South Dakota tells Live Science, "The experiment is a breathtaking technical achievement."
What makes the researchers' results so exceptional is their unambiguity. Says Chad Orzel at Union College in New York, "Their experiment is ingeniously constructed to make it difficult to interpret as anything other than what they say." He calls the research, "one of the best examples you'll see of a thought experiment made real." Litvinyuk agrees: "I see no holes in this."
As for the researchers themselves, enhancements to their experimental apparatus are underway to help them learn more. "We're working on a new measurement where we make the barrier thicker," Steinberg said. There's also the interesting question of whether or not that 0.61-millisecond trip occurs at a steady rate: "It will be very interesting to see if the atoms' speed is constant or not."
So far, 30 student teams have entered the Indy Autonomous Challenge, scheduled for October 2021.
- The Indy Autonomous Challenge will task student teams with developing self-driving software for race cars.
- The competition requires cars to complete 20 laps within 25 minutes, meaning cars would need to average about 110 mph.
- The organizers say they hope to advance the field of driverless cars and "inspire the next generation of STEM talent."
Indy Autonomous Challenge<p>Completing the race in 25 minutes means the cars will need to average about 110 miles per hour. So, while the race may end up being a bit slower than a typical Indy 500 competition, in which winners average speeds of over 160 mph, it's still set to be the fastest autonomous race featuring full-size cars.</p><p style="margin-left: 20px;">"There is no human redundancy there," Matt Peak, managing director for Energy Systems Network, a nonprofit that develops technology for the automation and energy sectors, told the <a href="https://www.post-gazette.com/business/tech-news/2020/06/01/Indy-Autonomous-Challenge-Indy-500-Indianapolis-Motor-Speedway-Ansys-Aptiv-self-driving-cars/stories/202005280137" target="_blank">Pittsburgh Post-Gazette</a>. "Either your car makes this happen or smash into the wall you go."</p>
Illustration of the Indy Autonomous Challenge
<p>The Indy Autonomous Challenge <a href="https://www.indyautonomouschallenge.com/rules" target="_blank">describes</a> itself as a "past-the-post" competition, which "refers to a binary, objective, measurable performance rather than a subjective evaluation, judgement, or recognition."</p><p>This competition design was inspired by the 2004 DARPA Grand Challenge, which tasked teams with developing driverless cars and sending them along a 150-mile route in Southern California for a chance to win $1 million. But that prize went unclaimed, because within a few hours after starting, all the vehicles had suffered some kind of critical failure.</p>
Indianapolis Motor Speedway
<p>One factor that could prevent a similar outcome in the upcoming race is the ability to test-run cars on a virtual racetrack. The simulation software company Ansys Inc. has already developed a model of the Indianapolis Motor Speedway on which teams will test their algorithms as part of a series of qualifying rounds.</p><p style="margin-left: 20px;">"We can create, with physics, multiple real-life scenarios that are reflective of the real world," Ansys President Ajei Gopal told <a href="https://www.wsj.com/articles/autonomous-vehicles-to-race-at-indianapolis-motor-speedway-11595237401?mod=e2tw" target="_blank">The Wall Street Journal</a>. "We can use that to train the AI, so it starts to come up to speed."</p><p>Still, the race could reveal that self-driving cars aren't quite ready to race at speeds of over 110 mph. After all, regular self-driving cars already face enough logistical and technical roadblocks, including <a href="https://www.bbc.com/news/technology-53349313#:~:text=Tesla%20will%20be%20able%20to,no%20driver%20input%2C%20he%20said." target="_blank">crumbling infrastructure, communication issues</a> and the <a href="https://bigthink.com/paul-ratner/would-you-ride-in-a-car-thats-programmed-to-kill-you" target="_self">fateful moral decisions driverless cars will have to make in split seconds</a>.</p>But the Indy Autonomous Challenge <a href="https://static1.squarespace.com/static/5da73021d0636f4ec706fa0a/t/5dc0680c41954d4ef41ec2b2/1572890638793/Indy+Autonomous+Challenge+Ruleset+-+v5NOV2019+%282%29.pdf" target="_blank">says</a> its main goal is to advance the industry, by challenging "students around the world to imagine, invent, and prove a new generation of automated vehicle (AV) software and inspire the next generation of STEM talent."
Health officials in China reported that a man was infected with bubonic plague, the infectious disease that caused the Black Death.
- The case was reported in the city of Bayannur, which has issued a level-three plague prevention warning.
- Modern antibiotics can effectively treat bubonic plague, which spreads mainly by fleas.
- Chinese health officials are also monitoring a newly discovered type of swine flu that has the potential to develop into a pandemic virus.
Bacteria under microscope
<p>Today, bubonic plague can be treated effectively with antibiotics.</p><p style="margin-left: 20px;">"Unlike in the 14th century, we now have an understanding of how this disease is transmitted," Dr. Shanthi Kappagoda, an infectious disease physician at Stanford Health Care, told <a href="https://www.healthline.com/health-news/seriously-dont-worry-about-the-plague#Heres-how-the-plague-spreads" target="_blank">Healthline</a>. "We know how to prevent it — avoid handling sick or dead animals in areas where there is transmission. We are also able to treat patients who are infected with effective antibiotics, and can give antibiotics to people who may have been exposed to the bacteria [and] prevent them [from] getting sick."</p>
This plague patient is displaying a swollen, ruptured inguinal lymph node, or buboe.
<p>Still, hundreds of people develop bubonic plague every year. In the U.S., a handful of cases occur annually, particularly in New Mexico, Arizona and Colorado, <a href="https://www.cdc.gov/plague/faq/index.html" target="_blank">where habitats allow the bacteria to spread more easily among wild rodent populations</a>. But these cases are very rare, mainly because you need to be in close contact with rodents in order to get infected. And though plague can spread from human to human, this <a href="https://www.healthline.com/health-news/seriously-dont-worry-about-the-plague#Heres-how-the-plague-spreads" target="_blank">only occurs with pneumonic plague</a>, and transmission is also rare.</p>
A new swine flu in China<p>Last week, researchers in China also reported another public health concern: a new virus that has "all the essential hallmarks" of a pandemic virus.<br></p><p>In a paper published in the <a href="https://www.pnas.org/content/early/2020/06/23/1921186117" target="_blank">Proceedings of the National Academy of Sciences</a>, researchers say the virus was discovered in pigs in China, and it descended from the H1N1 virus, commonly called "swine flu." That virus was able to transmit from human to human, and it killed an estimated 151,700 to 575,400 people worldwide from 2009 to 2010, according to the Centers for Disease Control and Prevention.</p>There's no evidence showing that the new virus can spread from person to person. But the researchers did find that 10 percent of swine workers had been infected by the virus, called G4 reassortant EA H1N1. This level of infectivity raises concerns, because it "greatly enhances the opportunity for virus adaptation in humans and raises concerns for the possible generation of pandemic viruses," the researchers wrote.
A new Harvard study finds that the language you use affects patient outcome.
- A study at Harvard's McLean Hospital claims that using the language of chemical imbalances worsens patient outcomes.
- Though psychiatry has largely abandoned DSM categories, professor Joseph E Davis writes that the field continues to strive for a "brain-based diagnostic system."
- Chemical explanations of mental health appear to benefit pharmaceutical companies far more than patients.
Challenging the Chemical Imbalance Theory of Mental Disorders: Robert Whitaker, Journalist<span style="display:block;position:relative;padding-top:56.25%;" class="rm-shortcode" data-rm-shortcode-id="41699c8c2cb2aee9271a36646e0bee7d"><iframe type="lazy-iframe" data-runner-src="https://www.youtube.com/embed/-8BDC7i8Yyw?rel=0" width="100%" height="auto" frameborder="0" scrolling="no" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></span><p>This is a far cry from Howard Rusk's 1947 NY Times editorial calling for mental health disorders to be treated similarly to physical disease (such as diabetes and cancer). This mindset—not attributable to Rusk alone; he was merely relaying the psychiatric currency of the time—has dominated the field for decades: mental anguish is a genetic and/or chemical-deficiency disorder that must be treated pharmacologically.</p><p>Even as psychiatry untethered from DSM categories, the field still used chemistry to validate its existence. Psychotherapy, arguably the most efficient means for managing much of our anxiety and depression, is time- and labor-intensive. Counseling requires an empathetic and wizened ear to guide the patient to do the work. Ingesting a pill to do that work for you is more seductive, and easier. As Davis writes, even though the industry abandoned the DSM, it continues to strive for a "brain-based diagnostic system." </p><p>That language has infiltrated public consciousness. The team at McLean surveyed 279 patients seeking acute treatment for depression. As they note, the causes of psychological distress have constantly shifted over the millennia: humoral imbalance in the ancient world; spiritual possession in medieval times; early childhood experiences around the time of Freud; maladaptive thought patterns dominant in the latter half of last century.
While the team found that psychosocial explanations remain popular, biogenetic explanations (such as the chemical imbalance theory) are becoming more prominent. </p><p>Interestingly, the 80 people Davis interviewed for his book predominantly relied on biogenetic explanations. Instead of doctors diagnosing patients, as you might expect, they increasingly serve to confirm what patients come in suspecting. Patients arrive at medical offices confident in their self-diagnoses. They believe a pill is the best course of treatment, largely because they saw an advertisement or listened to a friend. Doctors too often oblige without further curiosity as to the reasons for their distress. </p>
Image: Illustration Forest / Shutterstock<p>While medicalizing mental health softens the stigma of depression—if a disorder is inheritable, it was never really your fault—it also disempowers the patient. The team at McLean writes,</p><p style="margin-left: 20px;">"More recent studies indicate that participants who are told that their depression is caused by a chemical imbalance or genetic abnormality expect to have depression for a longer period, report more depressive symptoms, and feel they have less control over their negative emotions."</p><p>Davis points out the language used by direct-to-consumer advertising prevalent in America. Doctors, media, and advertising agencies converge around common messages, such as everyday blues is a "real medical condition," everyone is susceptible to clinical depression, and drugs correct underlying somatic conditions that you never consciously control. He continues,</p><p style="margin-left: 20px;">"Your inner life and evaluative stance are of marginal, if any, relevance; counseling or psychotherapy aimed at self-insight would serve little purpose." </p><p>The McLean team discovered a similar phenomenon: patients expect little from psychotherapy and a lot from pills. When depression is treated as the result of an internal and immutable essence instead of environmental conditions, behavioral changes are not expected to make much difference. Chemistry rules the popular imagination.</p>