Digital Natives Do Not Exist, Claims New Paper
A paper argues that the younger generation is no better at technology and multitasking than older people.
People born after 1984 are considered “digital natives” - those to whom digital technology should be second nature. It’s that kid who knows more about computers than you ever did. But taken as a whole, digital natives might be a myth, argues a paper published in Teaching and Teacher Education. Students who grew up in the digital world are not better at information skills simply because they were born into such an era. The study also presents evidence that these supposed “digital natives” are no better at multitasking than anyone else. In fact, assuming that they are may harm their education.
The term “digital native” was popularized by the author Marc Prensky in a 2001 article. He differentiated between “digital natives” - those who are “native speakers of the digital language of computers, video games and the Internet” - and “digital immigrants” - those who were not born into the digital world but adopted the technology later. The natives are supposed to have unique characteristics that set them completely apart from the digital immigrants.
Authors Paul A. Kirschner from the Open University of the Netherlands in Heerlen and Belgian Pedro De Bruyckere say no such distinction really exists. They cite a growing number of international studies that show how students born after 1984 do not have any deeper knowledge of technology. The knowledge they have is often limited and consists of having basic office suite skills, emailing, text messaging, Facebooking and surfing the Internet. And the tech they use for learning and socialization is also not very expansive. They do not necessarily recognize the advanced functionality of the applications they use and need to be significantly trained to use the technology properly for learning and problem-solving. When using technology for learning, the “natives” mainly resort to passively consuming information.
The paper’s authors also conclude that there is little scientific proof that digital natives can successfully do many things at once in a way that sets them apart from previous generations. For example, reading text messages during a lecture carries the cognitive cost of not being fully focused on the class. Similarly, a 2010 study cited by the researchers found that high-intensity Facebook users did not master course content well and had significantly lower GPAs.
Being comfortable with digital technology does not imply special multitasking prowess. At best, the supposed “natives” may be good at “task-switching” - the ability to quickly switch between different tasks. Multitasking, by and large, is a myth.
The researchers argue that in education policy, in particular, it is imperative not to assume by default that the next generation is more digitally savvy, and to change the curriculum accordingly. The authors cite a 2011 EU Kids Online report that found “children knowing more than their parents has been exaggerated”. In fact, treating kids as digital natives might deprive them of the support they actually need to develop necessary digital skills. What the authors advocate instead is teaching the importance of focus and eliminating the negative effects of multitasking.