Does digital technology make students stupid?
Conventional wisdom holds that "screen time" disrupts mental development, but research hints at a more complicated relationship between our minds and digital technology.
- Worry over test scores has led many to blame digital technology for waning educational achievement.
- New studies show that the persistent effects of "screen time" are not yet understood and may be short-lived.
- Many experts argue the best approach is to teach students the strategic and selective use of digital technology.
We've been here before. When books were the fresh new tech, Socrates believed they would spread an epidemic of forgetfulness. Nearly two millennia later, aristocrats fretted that the printing press would lead to mental overload among the masses. Then parents worried that calculators would handicap arithmetic skills and that e-mail would prove more harmful to IQ than pot.
Now, there's a new mind-mushing invention on the scene: digital technology.
According to a PBS poll, 53 percent of people believe that technology is making us dumber. Polling more than a thousand experts, Elon University's Imagining the Internet Center and the Pew Internet Project found that 42 percent believed "the hyperconnected brain is shallow" and maintains "an unhealthy dependence on the Internet and mobile devices." And Nicholas Carr's Pulitzer Prize finalist book, The Shallows: How the Internet Is Changing the Way We Think, Read and Remember, says it right in the title.
But the worry over digital technology's place in the classroom isn't just the latest flare-up of mob technophobia. It's fueled by high-profile events that have coincided with the mass adoption of digital tech among students, creating a strong association in the public mind.
Digital technology enters the classroom
Consider Finland. At the beginning of the century, Finland's education system gained renown as the best in the world. It was a top performer in the 2000 Program for International Student Assessment (PISA), scoring high in math and science and number one in reading. Educators flocked to the country to uncover its secret pedagogic spice.
But between 2006 and 2012, the country's scores fell sharply while other top performers remained steady. Several theories have been proffered for the trend reversal, among them the increased adoption of "screen time" technology.
As educator and policy adviser Pasi Sahlberg told the Washington Post, Finnish girls outperform boys in reading, mathematics, and science. Finland is the only OECD country where girls outperform boys in the latter two subjects.
Girls generally read for pleasure more than boys, and PISA test questions lean heavily on reading comprehension. As such, the appearance of digital technologies among school-aged children may have "accelerated this trend" — with boys' diminishing reading skills anchoring their test scores down.
Sahlberg further posits that increased time spent on the internet for media and socializing may make it harder to concentrate on complex issues, such as those found in math and science.
Another high-profile example comes from the United States, where technology's introduction into the classroom has been met with mixed results. As reported by the New York Times, Kansas students have staged sit-ins and walkouts to protest the use of the Summit Learning platform. Meanwhile, a Connecticut school district has suspended use of the same digital education system.
A personalized learning system backed by Mark Zuckerberg and Priscilla Chan, Summit Learning uses online tools to generate customized education aimed at promoting self-directed learning. However, some students have found the screen-focused lessons isolating and anxiety-inducing, while parents worry over the effects an untested system will have on their children's mental development.
We the Parents, a parental organization that opposes mass-customized learning, believes systems like Summit are risky given their lack of proven efficacy. In a letter to the Indiana Area School District board, one member spelled out their concerns over Summit, including the argument that screen-based education removes children from the interpersonal connections that facilitate proper learning.
The letter states: "But lack of evidence does not give us 'a pass' to proceed without caution, and the truth is we have many clues that do not bode well when it comes to heavy uses of technology and our children's educational or socioemotional wellbeing." It adds: "there is no real way to assess his learning outcomes until this little experiment on our children is measured later, after the damage has been done."
In other words, we are social learners, not digital ones.
Can we determine digital technology's lingering effects?
Examples like these have primed the popular imagination to distrust digital technology's role in developing and maintaining mental acuity. But some recent studies have complicated the issue.
"There have been so many books and articles about how we may be relying so much on technology that we are losing some of our cognitive abilities ... but it hasn't been well studied. I can count on one hand the number of people studying the lingering effects of smartphone usage," Peter Frost, a professor of psychology at Southern New Hampshire University, told the Concord Monitor.
To analyze those lingering effects, Frost designed a study. First, he and his team examined college students' phone usage and short-term cognitive abilities. They found that more smartphone usage correlated negatively with social problem-solving, but positively with the ability to make observations and judge the credibility of information.
He then assigned 50 undergrads to use their phones for less than two hours a day, while another group of 50 was assigned to more than five hours a day. At the one-week mark, the high-use students showed a diminished ability to interpret and analyze data. But at the four-week mark, that difference disappeared.
"The findings of this study suggest that, even in the rare cases where smartphones might alter cognition, this effect is likely transitory [and that] the mechanism by which smartphones initiate this temporary change remains an open question," Frost writes.
Another study, reported on in New Scientist, found that children who interacted with screens developed fine motor skills earlier, and found no evidence that screen time interfered with developmental milestones like learning to walk and talk.
"[Digital technologies offer] unprecedented power, but there are still many important questions about these maddening, valuable devices that we have been unable to answer. What is clear, however, is that many initial reactions have been more knee-jerk than evidence-based," writes New Scientist consultant Douglas Heaven.
But you may have noticed something missing: causal links.
While the adoption of digital technology predates Finland's score drops, there's no direct evidence of cause and effect. Another possible explanation offered by Sahlberg is Finland's post-2008 economic hardship. And although Summit Learning touts a collaboration with Harvard researchers, it has not let researchers study its specific platform.
Looking to the studies, we stumble into a chicken-and-egg problem. Do the students with improved judgment bolster such skills with their phones, or are students with such abilities more prone to high usage? Does the phone help toddlers practice fine motor skills, or do more advanced children simply reach for the digital technology sooner?
Learning in the face of uncertainty
In many ways, researchers studying digital technology's effects on students face the same barriers as nutritionists. Whether the diet in question is digital or nutritive, it's difficult to persuade people to change their lives substantially over a protracted period of time. How many people do you know who would freely renounce all digital technology in the name of science? Or parents who would assign their child to a digital regimen whose deleterious effects are unknown?
And even should people agree, they can't be put in a lab for years to prove they stuck to the program. Our digital-laced reality means variables will creep into the data, and researchers end up relying on surveys to gather results.
None of this is to say that science can't ultimately provide evidence-based answers; just that such evidence is tricky to suss out and that digital technology is new and changing rapidly.
In the face of such uncertainty, many experts argue we should avoid the indiscriminate adoption of digital technology. Instead, our approach should be one of intention, only adopting the technologies we need to achieve a desired outcome.
This is the philosophy espoused by Cal Newport in his book Digital Minimalism, Douglas Rushkoff's Team Human podcast, and websites like the Tech Edvocate. Some developers are also adopting this philosophy, such as the digital-learning platform Cerego.
Cerego's adaptive-learning tools are designed to nurture learning and long-term retention. Students engage with the platform for cognitive work, but the lessons are spaced out to give their minds time to consolidate the information and to allow for non-digital learning experiences. The goal is to build stronger neural connections with the information, and approach it from multiple angles.
This approach stands in contrast to other digital systems, which profit through each point of engagement and so distract with continuous notifications designed to keep you on the platform.
"If I offered you an ax, you could use it as a tool of incredible destruction, or it could be a great benefit to you," Lewis said in an interview. "It's all about finding the right tool for the right mission. But remember: you wield the ax, not anybody else."
In a case study with Arizona State University's Global Freshman Academy, astronomy and health-and-wellness students who used Cerego and completed all the course sets scored better than students who did not, suggesting improved retention of foundational knowledge. (Though, in keeping with our theme, these results are correlative.)
And we've been here before. When calculators became widespread in elementary schools, parents and pundits worried that they would irrevocably harm the students' ability to learn mathematics. But math teachers chose to integrate them into the classroom with intentionality. Today, they teach students the "selective and strategic use" of calculators, improving not only math skills but reasoning and problem-solving skills in general.
As the evidence on digital technology continues to accumulate, it seems the best approach is to consider these tools neither salubrious nor harmful. As such, the question shouldn't be whether they make students stupid. It's whether we are employing them in ways that deter or promote mentally engaging activities.
Why mega-eruptions like the ones that covered North America in ash are the least of your worries.
- The supervolcano under Yellowstone produced three massive eruptions over the past few million years.
- Each eruption covered much of what is now the western United States in an ash layer several feet deep.
- The last eruption was 640,000 years ago, but that doesn't mean the next eruption is overdue.
The end of the world as we know it
Panoramic view of Yellowstone National Park
Image: Heinrich Berann for the National Park Service – public domain
Of the many freak ways to shuffle off this mortal coil – lightning strikes, shark bites, falling pianos – here's one you can safely scratch off your worry list: an outbreak of the Yellowstone supervolcano.
As the map below shows, previous eruptions at Yellowstone were so massive that the ash fall covered most of what is now the western United States. A similar event today would not only claim countless lives directly, but also create enough subsidiary disruption to kill off global civilisation as we know it. A relatively recent eruption of the Toba supervolcano in Indonesia may have come close to killing off the human species (see further below).
However, just because a scenario is grim does not mean that it is likely (insert topical political joke here). In this case, the doom mongers claiming an eruption is 'overdue' are wrong. Yellowstone is not a library book or an oil change. Just because the previous mega-eruption happened long ago doesn't mean the next one is imminent.
Ash beds of North America
Ash beds deposited by major volcanic eruptions in North America.
Image: USGS – public domain
This map shows the location of the Yellowstone plateau and the ash beds deposited by its three most recent major outbreaks, plus two other eruptions – one similarly massive, the other the most recent one in North America.
The Huckleberry Ridge eruption occurred 2.1 million years ago. It ejected 2,450 km3 (588 cubic miles) of material, making it the largest known eruption in Yellowstone's history and in fact the largest eruption in North America in the past few million years.
This is the oldest of the three most recent caldera-forming eruptions of the Yellowstone hotspot. It created the Island Park Caldera, which lies partially in Yellowstone National Park, Wyoming and westward into Idaho. Ash from this eruption covered an area from southern California to North Dakota, and southern Idaho to northern Texas.
About 1.3 million years ago, the Mesa Falls eruption ejected 280 km3 (67 cubic miles) of material and created the Henry's Fork Caldera, located in Idaho, west of Yellowstone.
It was the smallest of the three major Yellowstone eruptions, both in terms of material ejected and area covered: 'only' most of present-day Wyoming, Colorado, Kansas and Nebraska, and about half of South Dakota.
The Lava Creek eruption was the most recent major eruption of Yellowstone: about 640,000 years ago. It was the second-largest eruption in North America in the past few million years, creating the Yellowstone Caldera.
It ejected only about 1,000 km3 (240 cubic miles) of material, i.e. less than half of the Huckleberry Ridge eruption. However, its debris is spread out over a significantly wider area: basically, Huckleberry Ridge plus larger slices of both Canada and Mexico, plus most of Texas, Louisiana, Arkansas, and Missouri.
The Long Valley eruption occurred about 760,000 years ago. It was centered on southern California, where it created the Long Valley Caldera, and spewed out 580 km3 (139 cubic miles) of material. This makes it North America's third-largest eruption of the past few million years.
The material ejected by this eruption is known as the Bishop ash bed, and covers the central and western parts of the Lava Creek ash bed.
Mount St Helens
The eruption of Mount St Helens in 1980 was the deadliest and most destructive volcanic event in U.S. history: it created a mile-wide crater, killed 57 people, and caused economic damage in the neighborhood of $1 billion.
Yet by Yellowstone standards, it was tiny: Mount St Helens only ejected 0.25 km3 (0.06 cubic miles) of material, most of the ash settling in a relatively narrow band across Washington State and Idaho. By comparison, the Lava Creek eruption left a large swathe of North America in up to two metres of debris.
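The scale gap is easier to grasp as a back-of-the-envelope calculation. The short sketch below uses only the volumes quoted in this article (the dictionary and script are illustrative, not from the source) to express each eruption as a multiple of the 1980 Mount St Helens event:

```python
# Eruption volumes as quoted in the article, in cubic kilometres.
volumes_km3 = {
    "Huckleberry Ridge": 2450,
    "Lava Creek": 1000,
    "Long Valley": 580,
    "Mesa Falls": 280,
    "Mount St Helens (1980)": 0.25,
}

st_helens = volumes_km3["Mount St Helens (1980)"]
for name, volume in volumes_km3.items():
    # Express each event as a multiple of the 1980 eruption.
    print(f"{name}: {volume:,} km3 = {volume / st_helens:,.0f}x Mount St Helens")
```

On these figures, even the smallest of the three Yellowstone events (Mesa Falls) is over a thousand times the size of the 1980 eruption.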
The difference between quakes and volcanoes
The volume of dense rock equivalent (DRE) ejected by the Huckleberry Ridge event dwarfs all other North American eruptions. It is itself overshadowed by the DRE ejected at the most recent eruption at Toba (present-day Indonesia). This was one of the largest known eruptions ever and a relatively recent one: only 75,000 years ago. It is thought to have caused a global volcanic winter which lasted up to a decade and may be responsible for the bottleneck in human evolution: around that time, the total human population suddenly and drastically plummeted to between 1,000 and 10,000 breeding pairs.
Image: USGS – public domain
So, what are the chances of something that massive happening anytime soon? The aforementioned mongers of doom often claim that major eruptions occur at intervals of 600,000 years and point out that the last one was 640,000 years ago. Except that (a) the first interval was about 200,000 years longer, (b) two intervals is not a lot to base a prediction on, and (c) those intervals don't really mean anything anyway. Not in the case of volcanic eruptions, at least.
Earthquakes can be 'overdue' because the stress on fault lines is built up consistently over long periods, which means quakes can be predicted with a relative degree of accuracy. But this is not how volcanoes behave. They do not accumulate magma at constant rates. And the subterranean pressure that causes the magma to erupt does not follow a schedule.
What's more, previous super-eruptions do not necessarily imply future ones. Scientists are not convinced that there ever will be another big eruption at Yellowstone. Smaller eruptions, however, are much likelier. Since the Lava Creek eruption, there have been about 30 smaller outbreaks at Yellowstone, the last lava flow being about 70,000 years ago.
As for the immediate future (give or take a century): the magma chamber beneath Yellowstone is only 5 percent to 15 percent molten. Most scientists agree that is as un-alarming as it sounds, and that it's statistically more sensible to worry about death by lightning, shark, or piano.
Strange Maps #1041
Got a strange map? Let me know at firstname.lastname@example.org.
How imagining the worst case scenario can help calm anxiety.
- Stoicism is the philosophy that nothing about the world is good or bad in itself, and that we have control over both our judgments and our reactions to things.
- It is hardest to control our reactions to the things that come unexpectedly.
- By meditating every day on the "worst case scenario," we can take the sting out of the worst that life can throw our way.
Are you a worrier? Do you imagine nightmare scenarios and then get worked up and anxious about them? Does your mind get caught in a horrible spiral of catastrophizing over even the smallest of things? Worrying, particularly imagining the worst case scenario, seems to be a natural part of being human and comes easily to a lot of us. It's awful, perhaps even dangerous, when we do it.
But, there might just be an ancient wisdom that can help. It involves reframing this attitude for the better, and it comes from Stoicism. It's called "premeditation," and it could be the most useful trick we can learn.
Broadly speaking, Stoicism is the philosophy of choosing your judgments. Stoics believe that there is nothing about the universe that can be called good or bad, valuable or valueless, in itself. It's we who add these values to things. As Shakespeare's Hamlet says, "There is nothing either good or bad, but thinking makes it so." Our minds color the things we encounter as being "good" or "bad," and given that we control our minds, we therefore have control over all of our negative feelings.
Put another way, Stoicism maintains that there's a gap between our experience of an event and our judgment of it. For instance, if someone calls you a smelly goat, you have an opportunity, however small and hard it might be, to pause and ask yourself, "How will I judge this?" What's more, you can even ask, "How will I respond?" We have power over which thoughts we entertain and the final say on our actions. Today, Stoicism has influenced and finds modern expression in the hugely effective "cognitive behavioral therapy."
Helping you practice Stoicism. Credit: Robyn Beck via Getty Images
One of the principal figures of Roman Stoicism was the statesman Seneca, who argued that the unexpected and unforeseen blows of life are the hardest to master. The shock of a misfortune can strip away the power we have to choose our reaction. For instance, being burglarized feels so horrible because we had felt so safe at home. A stomach ache, out of the blue, is harder than a stitch thirty minutes into a run. A sudden bang makes us jump, but a firework makes us smile. Blows that come out of nowhere hurt more than hardships we saw coming.
What could possibly go wrong?
So, how can we resolve this? Seneca suggests a Stoic technique called "premeditatio malorum" or "premeditation." At the start of every day, we ought to take time to indulge our anxious and catastrophizing mind. We should "rehearse in the mind: exile, torture, war, shipwreck." We should meditate on the worst things that could happen: your partner will leave you, your boss will fire you, your house will burn down. Maybe, even, you'll die.
This might sound depressing, but the important thing is that we do not stop there.
The Stoic also rehearses how they will react to these things as they come up. For instance, another Stoic (and Roman Emperor) Marcus Aurelius asks us to imagine all the mean, rude, selfish, and boorish people we'll come across today. Then, in our heads, we script how we'll respond when we meet them. We can shrug off their meanness, smile at their rudeness, and refuse to be "implicated in what is degrading." Thus prepared, we take control again of our reactions and behavior.
The Stoics cast themselves into the darkest and most desperate of conditions but then realize that they can and will endure. With premeditation, the Stoic is prepared and has the mental vigor necessary to take the blow on the chin and say, "Yep, I can deal with this."
Catastrophizing as a method of mental inoculation
Seneca wrote: "In times of peace, the soldier carries out maneuvers." This is also true of premeditation, which acts as the war room or training ground. The agonizing cut of the unexpected is blunted by preparedness. We can prepare the mind for whatever trials may come, in just the same way we can prepare the body for some endurance activity. The world can throw nothing as bad as that which our minds have already imagined.
Stoicism teaches us to embrace our worrying mind but to embrace it as a kind of inoculation. With a frown over breakfast, try to spend five minutes of your day deliberately catastrophizing. Get your anti-anxiety battle plan ready and then face the world.
A study on charity finds that reminding people how nice it feels to give yields better results than appealing to altruism.
- A study finds asking for donations by appealing to the donor's self-interest may result in more money than appealing to their better nature.
- Those who received an appeal to self-interest were both more likely to give and gave more than those in the control group.
- The effect was most pronounced for those who hadn't given before.
Even the best charities with the longest records of doing great fundraising work have to spend some time making sure that the next donation checks will keep coming in. One way to do this is by showing potential donors all the good things the charity did over the previous year. But there may be a better way.
A new study by researchers in the United States and Australia suggests that appealing to the benefits people will receive themselves after a donation nudges them to donate more money than appealing to the greater good.
How to get people to give away free money
The postcards that were sent to different study subjects. The one on the left highlighted benefits to the self, while the one on the right highlighted benefits to others. Credit: List et al. / Nature Human Behaviour
The study, published in Nature Human Behaviour, utilized the Pick.Click.Give program in Alaska. This program allows Alaska residents who qualify for dividends from the Alaska Permanent Fund, a yearly payment ranging from $800 to $2,000 in recent years, to donate a portion of it to various in-state non-profit organizations.
The researchers randomly assigned households to either a control group or to receive a postcard in the mail encouraging them to donate a portion of their dividend to charity. That postcard could come in one of two forms, either highlighting the benefits to others or the benefits to themselves.
Those who got the postcard touting self-benefits were 6.6 percent more likely to give than those in the control group and gave 23 percent more on average. Those getting the benefits-to-others postcard were slightly more likely to give than those receiving no postcard, but their donations were no larger.
Additionally, the researchers were able to break the subject list down into a "warm list" of those who had given at least once before in the last two years and a "cold list" of those who had not. Those on the warm list, who were already giving, saw only minor increases in their likelihood to donate after getting a postcard in the mail compared to those on the cold list.
Additionally, the researchers found that warm-list subjects who received the self-interest postcard gave 11 percent more than warm-list subjects in the control group. Amazingly, among cold-list subjects, those who received a self-interest postcard gave 39 percent more.
These are substantial improvements. At the end of the study, the authors point out, "If we had sent the benefits to self message to all households in the state, aggregate contributions would have increased by nearly US$600,000."
To put this into perspective, in 2017 the total donations to the program were roughly $2,700,000.
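To make that perspective concrete, a quick calculation (illustrative only, using the two figures quoted above) shows how large the authors' projected uplift is relative to the program's 2017 totals:

```python
# Figures quoted in the article (US dollars).
projected_uplift = 600_000   # authors' estimate if every household got the self-benefit card
total_2017 = 2_700_000       # approximate total Pick.Click.Give donations in 2017

# Relative size of the projected increase.
relative_increase = projected_uplift / total_2017
print(f"Projected uplift: {relative_increase:.0%} of 2017 totals")  # → about 22%
```

In other words, a change of postcard wording alone would amount to roughly a fifth of the program's annual donations.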
Is altruism dead?
Are all actions inherently self-interested? Thankfully, no. The study focuses entirely on effective ways to increase charitable donations above current levels. It doesn't deny that some people give out of pure altruism; it simply shows that an appeal to self-interest is effective. Plenty of people were giving before this study took place who didn't need a postcard as encouragement. It is also possible that some people donated part of their dividend check to a charity that does not work with Pick.Click.Give and went uncounted here.
It is also important to note that Pick.Click.Give does not provide services but instead gives money to a wide variety of organizations that do. Those organizations operate in fields from animal rescue to job training to public broadcasting. The authors note that it is possible that a more specific appeal to the benefits others will receive from a donation might prove more effective than the generic and all-inclusive "Make Alaska Better For Everyone" appeal that they used.
In an ideal world, charity is its own reward. In ours, it might help to remind somebody how warm and fuzzy they'll feel after donating to your cause.