Why schools should not teach general critical-thinking skills
Schools have become captivated by the idea that students must learn a set of generalised critical-thinking skills to flourish in the contemporary world.
At the heart of the air-traffic controller's job is a cognitive ability called 'situational awareness' that involves 'the continuous extraction of environmental information [and the] integration of this information with prior knowledge to form a coherent mental picture'. Vast amounts of fluid information must be held in the mind and, under extreme pressure, life-or-death decisions are made across rotating 24-hour work schedules. So stressful and mentally demanding is the job that, in most countries, air-traffic controllers are eligible for early retirement. In the United States, they must retire at 56 without exception.
In the 1960s, an interesting series of experiments was done on air-traffic controllers' mental capacities. Researchers wanted to explore whether the controllers had a generally enhanced ability to 'keep track of a number of things at once' and whether that skill could be applied to other situations. After observing them at their work, researchers gave the air-traffic controllers a set of generic memory-based tasks with shapes and colours. The extraordinary thing was that, when tested on these skills outside their own area of expertise, the air-traffic controllers did no better than anyone else. Their remarkably sophisticated cognitive abilities did not translate beyond their professional area.
Since the early 1980s, however, schools have become ever more captivated by the idea that students must learn a set of generalised thinking skills to flourish in the contemporary world – and especially in the contemporary job market. Variously called '21st-century learning skills' or 'critical thinking', the aim is to equip students with a set of general problem-solving approaches that can be applied to any given domain; these are lauded by business leaders as an essential set of dispositions for the 21st century. Naturally, we want children and graduates to have a set of all-purpose cognitive tools with which to navigate their way through the world. It's a shame, then, that we've failed to apply any critical thinking to the question of whether any such thing can be taught.
As the 1960s studies on air-traffic controllers suggested, to be good in a specific domain you need to know a lot about it: it's not easy to translate those skills to other areas. This is even more so with the kinds of complex and specialised knowledge that accompany much professional expertise: as later studies found, the more complex the domain, the more important domain-specific knowledge becomes. This non-translatability of cognitive skill is well-established in psychological research and has been replicated many times. Other studies, for example, have shown that the ability to remember long strings of digits doesn't transfer to the ability to remember long strings of letters. Surely we're not surprised to hear this, for we all know people who are 'clever' in their professional lives yet who often seem to make stupid decisions in their personal lives.
In almost every arena, the higher the skill level, the more specific the expertise is likely to become. In a football team, for example, there are different 'domains' or positions: goalkeeper, defender, attacker. Within those, there are further categories: centre-back, full-back, holding midfielder, attacking midfielder, striker. Now, it might be fine for a bunch of amateurs, playing a friendly game, to move positions. But, at a professional level, if you put a left-back in a striker's position or a central midfielder in goal, the players would be lost. For them to make excellent, split-second decisions, and to enact robust and effective strategies, they need thousands of specific mental models – and thousands of hours of practice to create those models – all of which are specific and exclusive to a position.
Of course, critical thinking is an essential part of a student's mental equipment. However, it cannot be detached from context. Teaching students generic 'thinking skills' separate from the rest of their curriculum is meaningless and ineffective. As the American educationalist Daniel Willingham puts it:
[I]f you remind a student to 'look at an issue from multiple perspectives' often enough, he will learn that he ought to do so, but if he doesn't know much about an issue, he can't think about it from multiple perspectives … critical thinking (as well as scientific thinking and other domain-based thinking) is not a skill. There is not a set of critical thinking skills that can be acquired and deployed regardless of context.
This detachment of cognitive ideals from contextual knowledge is not confined to the learning of critical thinking. Some schools laud themselves for placing '21st-century learning skills' at the heart of their mission. It's even been suggested that some of these nebulous skills are now as important as literacy and should be afforded the same status. An example is the brain-training games that claim to help kids become smarter, more alert and able to learn faster. However, recent research has shown that brain-training games are really only good for one thing – getting good at brain-training games. The claim that they offer students a general set of problem-solving skills was recently debunked by a study that reviewed more than 130 papers, which concluded:
[W]e know of no evidence for broad-based improvement in cognition, academic achievement, professional performance, and/or social competencies that derives from decontextualised practice of cognitive skills devoid of domain-specific content.
The same goes for teaching 'dispositions' such as the 'growth mindset' (focusing on will and effort as opposed to inherent talent) or 'grit' (determination in the face of obstacles). It's not clear that these dispositions can be taught, and there's no evidence that teaching them outside a specific subject matter has any effect.
Instead of teaching generic critical-thinking skills, we ought to focus on subject-specific critical-thinking skills that seek to broaden a student's individual subject knowledge and unlock the unique, intricate mysteries of each subject. For example, if a student of literature knows that Mary Shelley's mother died shortly after Mary was born and that Shelley herself lost a number of children in infancy, that student's appreciation of Victor Frankenstein's obsession with creating life from death, and of the language used to describe it, is far richer than it would be without that knowledge. A physics student investigating why two planes behave differently in flight might know how to 'think critically' through the scientific method but, without solid knowledge of contingent factors such as outside air temperature and a bank of previous case studies to draw upon, the student will struggle to know which hypothesis to focus on and which variables to discount.
As Willingham writes: 'Thought processes are intertwined with what is being thought about.' Students need to be given real and significant things from the world to think with and about, if teachers want to influence how they do that thinking.
So much for rest in peace.
- Australian scientists found that bodies kept moving for 17 months after being pronounced dead.
- Researchers used camera technology to capture images of the movement at 30-minute intervals every day.
- This study could help better identify time of death.
We're learning more new things about death every day. Much has been said and theorized about the great divide between life and the Great Beyond. While every person and every culture has their own philosophies and unique ideas on the subject, we're beginning to learn a lot of new scientific facts about the deceased corporeal form.
An Australian scientist has found that human bodies move for more than a year after being pronounced dead. These findings could have implications for fields as diverse as pathology and criminology.
Dead bodies keep moving
Researcher Alyson Wilson studied and photographed the movements of corpses over a 17-month timeframe. She recently told Agence France-Presse about the shocking details of her discovery.
Reportedly, she and her team trained a camera on a corpse at the Australian Facility for Taphonomic Experimental Research (AFTER) for 17 months, taking images every 30 minutes during the day. Across that entire 17-month duration, the corpse continually moved.
"What we found was that the arms were significantly moving, so that arms that started off down beside the body ended up out to the side of the body," Wilson said.
The researchers expected some kind of movement during the very early stages of decomposition, but Wilson explained that the corpse's continual movement completely surprised the team:
"We think the movements relate to the process of decomposition, as the body mummifies and the ligaments dry out."
In one of the studies, arms that had been next to the body eventually ended up splayed out at its sides.
The team's subject was one of the bodies stored at the "body farm," which sits on the outskirts of Sydney. (Wilson took a flight every month to check in on the cadaver.) Her findings were recently published in the journal Forensic Science International: Synergy.
Implications of the study
The researchers believe that understanding these post-mortem movements and the rate of decomposition could help better estimate the time of death. Police, for example, could benefit from this, as they'd be able to assign a timeframe to a missing-person case and link it with an unidentified corpse. According to the team:
"Understanding decomposition rates for a human donor in the Australian environment is important for police, forensic anthropologists, and pathologists for the estimation of PMI to assist with the identification of unknown victims, as well as the investigation of criminal activity."
While scientists haven't found any evidence of necromancy, the discovery remains a curious new insight into what happens to the body after we die.
Metal-like materials have been discovered in a very strange place.
- Bristle worms are odd-looking, spiky, segmented worms with super-strong jaws.
- Researchers have discovered that the jaws contain metal.
- It appears that biological processes could one day be used to manufacture metals.
Bristle worms, also known as polychaetes, have been around for an estimated 500 million years. Scientists believe that the super-resilient worms have survived five mass extinctions, and there are some 10,000 species of them.
Be glad if you haven't encountered a bristle worm. Getting stung by one is an extremely itchy affair, as people who own saltwater aquariums can tell you after they've accidentally touched a bristle worm that hitchhiked into a tank aboard a live rock.
Bristle worms are typically one to six inches long when found in a tank, but capable of growing up to 24 inches long. All polychaetes have a segmented body, with each segment possessing a pair of legs, or parapodia, covered in tiny bristles. ("Polychaete" is Greek for "much hair.") The parapodia and their bristles can shoot outward to snag prey, which is then transferred to a bristle worm's eversible mouth.
The jaws of one bristle worm — Platynereis dumerilii — are super-tough, virtually unbreakable. It turns out, according to a new study from researchers at the Technical University of Vienna, this strength is due to metal atoms.
Metals, not minerals
Fireworm, a type of bristle worm. Credit: prilfish / Flickr
This is pretty unusual. The study's senior author Christian Hellmich explains: "The materials that vertebrates are made of are well researched. Bones, for example, are very hierarchically structured: There are organic and mineral parts, tiny structures are combined to form larger structures, which in turn form even larger structures."
The bristle worm jaw, by contrast, replaces the minerals from which other creatures' bones are built with atoms of magnesium and zinc arranged in a super-strong structure. It's this structure that is key. "On its own," he says, "the fact that there are metal atoms in the bristle worm jaw does not explain its excellent material properties."
Just deformable enough
What makes conventional metal so strong is not just its atoms but the interactions between the atoms and the ways in which they slide against each other. The sliding allows for a small amount of elastoplastic deformation when pressure is applied, endowing metals with just enough malleability not to break, crack, or shatter.
Co-author Florian Raible of Max Perutz Labs surmises, "The construction principle that has made bristle worm jaws so successful apparently originated about 500 million years ago."
Raible explains, "The metal ions are incorporated directly into the protein chains and then ensure that different protein chains are held together." This leads to the creation of three-dimensional shapes the bristle worm can pack together into a structure that's just malleable enough to withstand a significant amount of force.
"It is precisely this combination," says the study's lead author Luis Zelaya-Lainez, "of high strength and deformability that is normally characteristic of metals."
So the bristle worm jaw is both metal-like and yet not. As Zelaya-Lainez puts it, "Here we are dealing with a completely different material, but interestingly, the metal atoms still provide strength and deformability there, just like in a piece of metal."
Observing the creation of a metal-like material from biological processes is a bit of a surprise and may suggest new approaches to materials development. "Biology could serve as inspiration here," says Hellmich, "for completely new kinds of materials. Perhaps it is even possible to produce high-performance materials in a biological way — much more efficiently and environmentally friendly than we manage today."
Dealing with rudeness can nudge you toward cognitive errors.
- Anchoring is a common bias that makes people fixate on one piece of data.
- A study showed that those who experienced rudeness were more likely to anchor themselves to bad data.
- In some simulations with medical students, this effect led to higher mortality rates.
Cognitive biases are funny little things. Everyone has them, nobody likes to admit it, and they can range from minor to severe depending on the situation. Biases can be influenced by factors as subtle as our mood or various personality traits.
A new study soon to be published in the Journal of Applied Psychology suggests that experiencing rudeness can be added to the list. More disturbingly, the study's findings suggest that it is a strong enough effect to impact how medical professionals diagnose patients.
Life hack: don't be rude to your doctor
The team of researchers behind the project tested to see if participants could be influenced by the common anchoring bias, defined by the researchers as "the tendency to rely too heavily or fixate on one piece of information when making judgments and decisions." Most people have experienced it. One of its more common forms involves being given a particular value, say in negotiations on price, which then becomes the center of reasoning even when reason would suggest that number should be ignored.
It can also pop up in medicine. As co-author Dr. Trevor Foulk explains, "If you go into the doctor and say 'I think I'm having a heart attack,' that can become an anchor and the doctor may get fixated on that diagnosis, even if you're just having indigestion. If doctors don't move off anchors enough, they'll start treating the wrong thing."
Lots of things can make somebody more or less likely to anchor themselves to an idea. The authors of the study, who have published several papers on the effects of rudeness, decided to see whether rudeness could also cause people to stumble into cognitive errors. Past research suggested that exposure to rudeness can limit people's perspective – perhaps anchoring them.
In the first version of the study, medical students were given a hypothetical patient to treat and access to information on their condition alongside an (incorrect) suggestion on what the condition was. This served as the anchor. In some versions of the tests, the students overheard two doctors arguing rudely before diagnosing the patient. Later variations switched the diagnosis test for business negotiations or workplace tasks while maintaining the exposure to rudeness.
Across all iterations of the test, those exposed to rudeness were more likely to anchor themselves to the initial, incorrect suggestion despite the availability of evidence against it. This effect was less significant for study participants who scored higher on a test of how broad a perspective they tended to take. The disposition of these participants, who answered in the affirmative to questions like, "Before criticizing somebody, I try to imagine how I would feel if I were in his/her place," effectively negated the narrowing effects of rudeness.
What this means for you and your healthcare
The effects of anchoring when a medical diagnosis is on the line can be substantial. Dr. Foulk explains that, in some simulations, exposure to rudeness can raise the mortality rate as doctors fixate on the wrong problems.
The authors of the study suggest that managers take a keener interest in ensuring civility in workplaces and giving employees the tools they need to avoid judgment errors after dealing with rudeness. These steps could help prevent anchoring.
Also, you might consider being nicer to people.