Metal-like materials have been discovered in a very strange place.
- Bristle worms are odd-looking, spiky, segmented worms with super-strong jaws.
- Researchers have discovered that the jaws contain metal.
- It appears that biological processes could one day be used to manufacture metals.
Bristle worms, also known as polychaetes, have been around for an estimated 500 million years. Scientists believe that the super-resilient creatures have survived five mass extinctions, and there are some 10,000 species of them.
Be glad if you haven't encountered a bristle worm. Getting stung by one is an extremely itchy affair, as people who own saltwater aquariums can tell you after they've accidentally touched a bristle worm that hitchhiked into a tank aboard a live rock.
Bristle worms are typically one to six inches long when found in a tank, but they are capable of growing up to 24 inches long. All polychaetes have a segmented body, with each segment possessing a pair of legs, or parapodia, covered in tiny bristles. ("Polychaete" is Greek for "much hair.") The parapodia and their bristles can shoot outward to snag prey, which is then transferred to a bristle worm's eversible mouth.
The jaws of one bristle worm — Platynereis dumerilii — are super-tough, virtually unbreakable. It turns out, according to a new study from researchers at the Technical University of Vienna, this strength is due to metal atoms.
Metals, not minerals
Fireworm, a type of bristle worm. (Credit: prilfish / Flickr)
This is pretty unusual. The study's senior author Christian Hellmich explains: "The materials that vertebrates are made of are well researched. Bones, for example, are very hierarchically structured: There are organic and mineral parts, tiny structures are combined to form larger structures, which in turn form even larger structures."
The bristle worm jaw, by contrast, replaces the minerals from which other creatures' bones are built with atoms of magnesium and zinc arranged in a super-strong structure. It's this structure that is key. "On its own," he says, "the fact that there are metal atoms in the bristle worm jaw does not explain its excellent material properties."
Just deformable enough
What makes conventional metal so strong is not just its atoms but the interactions between the atoms and the ways in which they slide against each other. The sliding allows for a small amount of elastoplastic deformation when pressure is applied, endowing metals with just enough malleability not to break, crack, or shatter.
Co-author Florian Raible of Max Perutz Labs surmises, "The construction principle that has made bristle worm jaws so successful apparently originated about 500 million years ago."
Raible explains, "The metal ions are incorporated directly into the protein chains and then ensure that different protein chains are held together." This leads to the creation of three-dimensional shapes the bristle worm can pack together into a structure that's just malleable enough to withstand a significant amount of force.
"It is precisely this combination," says the study's lead author Luis Zelaya-Lainez, "of high strength and deformability that is normally characteristic of metals."
So the bristle worm jaw is both metal-like and yet not. As Zelaya-Lainez puts it, "Here we are dealing with a completely different material, but interestingly, the metal atoms still provide strength and deformability there, just like in a piece of metal."
Observing the creation of a metal-like material from biological processes is a bit of a surprise and may suggest new approaches to materials development. "Biology could serve as inspiration here," says Hellmich, "for completely new kinds of materials. Perhaps it is even possible to produce high-performance materials in a biological way — much more efficiently and environmentally friendly than we manage today."
Dealing with rudeness can nudge you toward cognitive errors.
- Anchoring is a common bias that makes people fixate on one piece of data.
- A study showed that those who experienced rudeness were more likely to anchor themselves to bad data.
- In some simulations with medical students, this effect led to higher mortality rates.
Cognitive biases are funny little things. Everyone has them, nobody likes to admit it, and they can range from minor to severe depending on the situation. Biases can be influenced by factors as subtle as our mood or various personality traits.
A new study soon to be published in the Journal of Applied Psychology suggests that experiencing rudeness can be added to the list. More disturbingly, the study's findings suggest that it is a strong enough effect to impact how medical professionals diagnose patients.
Life hack: don't be rude to your doctor
The team of researchers behind the project tested to see if participants could be influenced by the common anchoring bias, defined by the researchers as "the tendency to rely too heavily or fixate on one piece of information when making judgments and decisions." Most people have experienced it. One of its more common forms involves being given a particular value, say in negotiations on price, which then becomes the center of reasoning even when reason would suggest that number should be ignored.
It can also pop up in medicine. As co-author Dr. Trevor Foulk explains, "If you go into the doctor and say 'I think I'm having a heart attack,' that can become an anchor and the doctor may get fixated on that diagnosis, even if you're just having indigestion. If doctors don't move off anchors enough, they'll start treating the wrong thing."
Lots of things can make somebody more or less likely to anchor themselves to an idea. The authors of the study, who have several papers on the effects of rudeness, decided to see if that could also cause people to stumble into cognitive errors. Past research suggested that exposure to rudeness can limit people's perspective — perhaps anchoring them.
In the first version of the study, medical students were given a hypothetical patient to treat and access to information on their condition alongside an (incorrect) suggestion on what the condition was. This served as the anchor. In some versions of the tests, the students overheard two doctors arguing rudely before diagnosing the patient. Later variations switched the diagnosis test for business negotiations or workplace tasks while maintaining the exposure to rudeness.
Across all iterations of the test, those exposed to rudeness were more likely to anchor themselves to the initial, incorrect suggestion despite the availability of evidence against it. The effect was weaker for study participants who scored higher on a test of how wide a perspective they tended to have. The disposition of these participants, who answered in the affirmative to questions like, "Before criticizing somebody, I try to imagine how I would feel if I were in his/her place," effectively negated the narrowing effects of rudeness.
What this means for you and your healthcare
The effects of anchoring when a medical diagnosis is on the line can be substantial. Dr. Foulk explains that, in some simulations, exposure to rudeness can raise the mortality rate as doctors fixate on the wrong problems.
The authors of the study suggest that managers take a keener interest in ensuring civility in workplaces and giving employees the tools they need to avoid judgment errors after dealing with rudeness. These steps could help prevent anchoring.
Also, you might consider being nicer to people.
The distances between the stars are so vast that they can make your brain melt. Take for example the Voyager 1 probe, which has been traveling at 35,000 miles per hour for more than 40 years and was the first human object to cross into interstellar space. That sounds wonderful except, at its current speed, it will still take another 40,000 years to cross the typical distance between stars.
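A quick back-of-the-envelope check puts those numbers in perspective (a sketch: the 35,000 mph cruise speed is from the text above; the speed of light in miles per hour is a standard constant):

```python
# Back-of-the-envelope check of Voyager 1's interstellar travel time.
LIGHT_SPEED_MPH = 670_616_629  # speed of light in miles per hour
voyager_mph = 35_000           # cruise speed quoted in the text

# Light covers one light-year per year, so Voyager needs this many
# years to cover the same distance:
years_per_light_year = LIGHT_SPEED_MPH / voyager_mph
print(f"{years_per_light_year:,.0f} years per light-year")

# A gap of about two light-years between neighboring stars therefore
# takes roughly the 40,000 years quoted above.
print(f"{2 * years_per_light_year:,.0f} years for two light-years")
```

At roughly 19,000 years per light-year, even the shortest interstellar hops are far beyond a human lifetime.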
Worse still, if you are thinking about interstellar travel, nature provides a hard limit on acceleration and speed. As Einstein showed, it's impossible to accelerate any massive object beyond the speed of light. Since the galaxy is more than 100,000 light-years across, if you are traveling at less than light speed, then most interstellar distances would take more than a human lifetime to cross. If the known laws of physics hold, then it seems a galaxy-spanning human civilization is impossible.
Unless of course you can build a warp drive.
Ah, the warp drive, that darling of science fiction plot devices. So, what about a warp drive? Is that even really a thing?
Let's start with the "warping" part of a warp drive. Without doubt, Albert Einstein's theory of general relativity ("GR") represents space and time as a 4-dimensional "fabric" that can be stretched and bent and folded. Gravitational waves, representing ripples in the fabric of spacetime, have now been directly observed. So, yes, spacetime can be warped. The warping part of a warp drive usually means distorting the shape of spacetime so that two distant locations can be brought close together — and you somehow "jump" between them.
This was a basic idea in science fiction long before Star Trek popularized the name "warp drive." But until 1994, it had remained science fiction, meaning there was no science behind it. That year, Miguel Alcubierre wrote down a solution to the basic equations of GR that represented a region that compressed spacetime ahead of it and expanded spacetime behind to create a kind of traveling warp bubble. This was really good news for warp drive fans.
The problems with a warp drive
There were some problems though. Most important was that this "Alcubierre drive" required lots of "exotic matter" or "negative energy" to work. Unfortunately, there's no such thing. These are things theorists dreamed up to stick into the GR equations in order to do cool things like make stable open wormholes or functioning warp drives.
Researchers have also raised other concerns about an Alcubierre drive — like how it would violate quantum mechanics, or how, upon arrival at your destination, it would destroy everything in front of the ship in an apocalyptic flash of radiation.
Warp drives: A new hope
Recently, however, there seemed to be good news on the warp drive front with the publication this April of a new paper by Alexey Bobrick and Gianni Martire entitled "Introducing Physical Warp Drives." The good thing about the Bobrick and Martire paper was that it was extremely clear about the meaning of a warp drive.
Understanding the equations of GR means understanding what's on either side of the equals sign. On one side, there is the shape of spacetime, and on the other, there is the configuration of matter-energy. The traditional route with these equations is to start with a configuration of matter-energy and see what shape of spacetime it produces. But you can also go the other way around and assume the shape of spacetime you want (like a warp bubble) and determine what kind of configuration of matter-energy you will need (even if that matter-energy is the dream stuff of negative energy).
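The equation being described here is Einstein's field equations, shown below in a standard form (the explicit formula is not in the original text, but it is the canonical statement of the two-sided relationship just described):

```latex
% Geometry (left-hand side) = matter-energy (right-hand side)
% G_{\mu\nu}: Einstein tensor, encoding the shape/curvature of spacetime
% T_{\mu\nu}: stress-energy tensor, encoding the matter-energy configuration
G_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
```

The traditional route fixes \(T_{\mu\nu}\) and solves for the geometry; Alcubierre's move was to fix the geometry (a warp bubble) and read off whatever \(T_{\mu\nu}\) it demands, even if that turns out to be negative energy.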
What Bobrick and Martire did was step back and look at the problem more generally. They showed how all warp drives are composed of three regions: an interior spacetime called the passenger space; a shell of material, with either positive or negative energy, called the warping region; and an outside that, far enough away, looks like normal unwarped spacetime. In this way, they could see exactly what was and was not possible for any kind of warp drive. (Watch this lovely explainer by Sabine Hossenfelder for more details.) They even showed that you could use good old normal matter to create a warp drive that, while it moved slower than light speed, produced a passenger area where time flowed at a different rate than in the outside spacetime. So even though it was a sub-light-speed device, it was still an actual warp drive that could use normal matter.
That was the good news.
The bad news was that this clear vision also showed them a real problem with the "drive" part of the Alcubierre drive. First of all, it still needed negative energy to work, so that bummer remains. But worse, Bobrick and Martire reaffirmed a basic understanding of relativity and saw that there was no way to accelerate an Alcubierre drive past light speed. Sure, you could just assume that you started with something moving faster than light, and the Alcubierre drive with its negative energy shell would make sense. But crossing the speed of light barrier was still prohibited.
So, in the end, the Star Trek version of the warp drive is still not a thing. I know this may bum you out if you were hoping to build that version of the Enterprise sometime soon (as I was). But don't be too despondent. The Bobrick and Martire paper really did make headway. As the authors put it in the end:
"One of the main conclusions of our study is that warp drives are simpler and much less mysterious objects than the broader literature has suggested."
That really is progress.
Being mortal makes life so much sweeter.
- Since the beginning of time, humans have fantasized over and quested for "eternal life."
- Lobsters and a kind of jellyfish offer us clues about what immortality might look like in the natural world.
- Evolution does not lend itself easily to longevity, and philosophy might suggest that life is more precious without immortality.
One of the oldest pieces of epic literature we have is known as the Epic of Gilgamesh. It's easy to get lost in all the ancient mythology — talking animals and heroic battles — but at its heart lies one of the most fundamental and universal quests of all time: the search for immortality. It's all about Gilgamesh wanting to live forever.
From Mesopotamian poetry to Indiana Jones and the Last Crusade, from golden apples to the philosopher's stone, humans, everywhere, have wanted and sought after eternal life.
And yet, perhaps the secret to immortality is not as elusive as we might think. Rather than holy objects or science fiction, we need only look to the animal world to see how nature, that most magical of places, might be able to answer one of the oldest questions there is.
If you ever find yourself at Red Lobster or about to munch into a lobster roll, take a moment to consider that you might just be eating a clue to perpetual youth. To see why, we have to know a tiny bit about aging.
As you get older, it's impossible not to notice how everything creaks a little more, how easy jobs now require great effort, and how hangovers are no longer a laughing matter. Our bodies are designed to degrade and wear away. This deterioration, known as "senescence" in biology, occurs at the cellular level. It's when the cells in our body stop dividing, yet remain in our body, active and alive. We need our cells to divide so that we can grow and repair. For instance, when we cut ourselves or lift weights in the gym, it is cell division that replaces and rebuilds the damage done. But, over time, our cells just stop dividing. They stay around to do the best they can, but like the macroscopic humans they make up, cells get slower and more error-prone — and so, we age.
But not lobsters. In normal cell division, the shields at the ends of our chromosomes — called telomeres — are remade a bit shorter with each division, and so become a bit less effective at protecting our DNA. When this reaches a certain point, the cell enters senescence and stops dividing. It won't self-destruct but will just carry on and wallow as it is. Lobsters, though, have a special enzyme (unsurprisingly called telomerase) which makes sure that their cells' telomeres remain as long and brilliant as they've always been. Their cells never enter senescence, and so a lobster just won't age.
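The shorten-then-senesce logic can be caricatured in a few lines of code. This is a toy model with made-up numbers, purely to illustrate the mechanism just described; real telomere biology is far messier:

```python
# Toy model of telomere shortening and senescence (illustrative
# numbers only, not real biology).
def divisions_until_senescence(telomere_units=60, loss_per_division=1,
                               threshold=10, telomerase=False):
    """Count cell divisions before the telomere cap drops below the
    senescence threshold. With telomerase active (the "lobster" case),
    the cap is restored after every division and the cell never
    senesces, so we stop at 10,000 divisions to keep the loop finite."""
    divisions = 0
    while telomere_units >= threshold and divisions < 10_000:
        divisions += 1
        telomere_units -= loss_per_division
        if telomerase:
            telomere_units += loss_per_division  # cap fully restored
    return divisions

print(divisions_until_senescence())                 # finite count
print(divisions_until_senescence(telomerase=True))  # hits the safety cap
```

With telomerase off, the cell runs out of cap after a fixed number of divisions; with it on, the only limit is the artificial safety cap in the loop.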
However, what evolution giveth with one hand, it taketh away with the other. As crustaceans, lobsters wear their skeletons on the outside, and having a constantly growing body means they are always outgrowing their exoskeletal homes. They need to abandon their old shells and regrow new ones all the time. This, of course, requires huge reserves of energy, and once a lobster reaches a certain size, it simply cannot consume enough calories to build the shell equivalent of a mansion. Lobsters do not die from old age but from exhaustion (as well as disease and New England fishermen).
The jellyfish that reverses its life cycle
Although lobsters might not have perfected immortality, perhaps there's something to learn.
But there's another animal that does even better than the lobster, and it's the only creature recognized to be properly immortal. That's the jellyfish known as Turritopsis dohrnii. These jellyfish are tiny — about the size of a fly at their biggest — but they've mastered one ridiculous trick: they can reverse their life cycle.
An embryonic jellyfish starts as a fertilized egg before hooking onto some kind of surface to grow up. In this stage, it will stretch out to look like any other jellyfish. Eventually, it will break away from this surface to become a mature, fully developed jellyfish, which is in turn ready to reproduce. So far, so normal.
Yet Turritopsis dohrnii does something remarkable. When things get tough — like when the environment becomes hostile or there's a conspicuous absence of food — they can change back to one of the earlier stages in their life cycle. It's like a frog becoming a tadpole or a fly becoming a maggot. It's the human equivalent of a mature adult saying, "Right, I've had enough of this job, that mortgage, this stress, and that anxiety, so I'm going to turn back into a toddler." Or, it's like an old man deciding to become a fetus again, for one more round.
Obviously, a fingernail-sized jellyfish is not immortal in the sense we'd probably want the word to mean. They're as squishable and digestible as any animal. But their ability to change back to earlier forms of life, ones which are better adapted to certain environments or to places where there are fewer food sources, means that they could, in theory, go on forever.
Why do we want to live forever?
Although the quest for immortality is as old as humanity itself, it's surprisingly hard to find across the diverse natural world. Truth be told, evolution doesn't care about how long we live, so long as we live long enough to pass on our genes and to make sure our children are vaguely looked after. Anything more than that is redundant, and evolution doesn't have much time for needless longevity.
The more philosophical question, though, is why do we want to live forever? We're all prone to existential anguish, and we all, at least some of the time, fear death. We don't want to leave our loved ones behind, we want to finish our projects, and we much prefer the known life to an unknown afterlife. Yet, death serves a purpose. As the German philosopher Martin Heidegger argued, death is what gives meaning to life.
Having an end makes the journey worthwhile. It's fair to say that playing a game is only fun because it doesn't go on forever, a play will always need its curtain call, and a word only makes sense at its last letter. As philosophy and religion have repeated throughout the ages: memento mori, or "remember you'll die."
Being mortal in this world makes life so much sweeter, which is surely why lobsters and tiny jellyfish have such ennui.

Jonny Thomson teaches philosophy in Oxford. He runs a popular Instagram account called Mini Philosophy (@philosophyminis). His first book is Mini Philosophy: A Small Book of Big Ideas.
Quantum theory has weird implications. Trying to explain them just makes things weirder.
- The weirdness of quantum theory flies in the face of what we experience in our everyday lives.
- Quantum weirdness quickly created a split in the physics community, each side championed by a giant: Albert Einstein and Niels Bohr.
- As two recent books espousing opposing views show, the debate still rages on nearly a century afterward. Each "resolution" comes with a high price tag.
Albert Einstein and Niels Bohr, two giants of 20th century science, espoused very different worldviews.
To Einstein, the world was ultimately rational. Things had to make sense. They should be quantifiable and expressible through a logical chain of cause-and-effect interactions, from what we experience in our everyday lives all the way to the depths of reality. To Bohr, we had no right to expect any such order or rationality. Nature, at its deepest level, need not follow any of our expectations of well-behaved determinism. Things could be weird and non-deterministic, so long as they become more like what we expect as we travel from the world of atoms to our world of trees, frogs, and cars. Bohr divided the world into two realms: the familiar classical world and the unfamiliar quantum world. They should be complementary to one another but with very different properties.
The two scientists spent decades arguing about the impact of quantum physics on the nature of reality. Each had groups of physicists as followers, all of them giants of their own. Einstein's group of quantum weirdness deniers included quantum physics pioneers Max Planck, Louis de Broglie, and Erwin Schrödinger, while Bohr's group had Werner Heisenberg (of uncertainty principle fame), Max Born, Wolfgang Pauli, and Paul Dirac.
Almost a century afterward, the debate rages on.
Einstein vs. Bohr, Redux
Two books — one authored by Sean Carroll and published last fall and another published very recently and authored by Carlo Rovelli — perfectly illustrate how current leading physicists still cannot come to terms with the nature of quantum reality. The opposing positions still echo, albeit with many modern twists and experimental updates, the original Einstein-Bohr debate.
I summarized the ongoing dispute in my book The Island of Knowledge: Are the equations of quantum physics a computational tool that we use to make sense of the results of experiments (Bohr), or are they supposed to be a realistic representation of quantum reality (Einstein)? In other words, are the equations of quantum theory the way things really are or just a useful map?
Einstein believed that quantum theory, as it stood in the 1930s and 1940s, was an incomplete description of the world of the very small. There had to be an underlying level of reality, still unknown to us, that made sense of all its weirdness. De Broglie and, later, David Bohm, proposed an extension of the quantum theory known as hidden variable theory that tried to fill in the gap. It was a brilliant attempt to appease the urge Einstein and his followers had for an orderly natural world, predictable and reasonable. The price — and every attempt to deal with the problem of figuring out quantum theory has a price tag — was that the entire universe had to participate in determining the behavior of every single electron and all other quantum particles, implying the existence of a strange cosmic order.
Later, in the 1960s, physicist John Bell proved a theorem that put such ideas to the test. A series of remarkable experiments starting in the 1970s and still ongoing have essentially disproved the de Broglie-Bohm hypothesis, at least if we restrict their ideas to what one would call "reasonable," that is, theories that have local interactions and causes. Omnipresence — what physicists call nonlocality — is a hard pill to swallow in physics.
Yet, the quantum phenomenon of superposition insists on keeping things weird. Here's one way to picture quantum superposition. In a kind of psychedelic dream state, imagine that you had a magical walk-in closet filled with identical shirts, the only difference between them being their color. What's magical about this closet? Well, as you enter this closet, you split into identical copies of yourself, each wearing a shirt of a different color. There is a you wearing a blue shirt, another a red, another a white, etc., all happily coexisting. But as soon as you step out of the closet or someone or something opens the door, only one you emerges, wearing a single shirt. Inside the closet, you are in a superposition state with your other selves. But in the "real" world, the one where others see you, only one copy of you exists, wearing a single shirt. The question is whether the inside superposition of the many yous is as real as the one you that emerges outside.
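The closet analogy maps directly onto how quantum states work numerically: a superposition assigns an amplitude to each outcome, and a measurement returns exactly one outcome, with probability given by the squared amplitude (the Born rule). Here is a minimal sketch with made-up amplitudes for three shirt colors:

```python
import random

# A superposition over "shirt colors": each color gets a real amplitude.
# Measurement (stepping out of the closet) yields exactly one color,
# with probability |amplitude|^2 (the Born rule).
amplitudes = {"blue": 0.8, "red": 0.36, "white": 0.48}
probabilities = {color: a**2 for color, a in amplitudes.items()}
assert abs(sum(probabilities.values()) - 1.0) < 1e-9  # state is normalized

def measure():
    """Collapse the superposition: return a single shirt color."""
    colors, weights = zip(*probabilities.items())
    return random.choices(colors, weights=weights)[0]

print(measure())  # one definite color per measurement
```

Inside the "closet," all three colors coexist in the state; every call to `measure()` yields just one. The interpretational debate is precisely over what the pre-measurement superposition amounts to physically.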
The (modern version of the) Einstein team would say yes. The equations of quantum physics must be taken as the real description of what's going on, and if they predict superposition, so be it. The so-called wave function that describes this superposition is an essential part of physical reality. This point is most dramatically exposed by the many-worlds interpretation of quantum physics, espoused in Carroll's book. For this interpretation, reality is even weirder: the closet has many doors, each to a different universe. Once you step out, all of your copies step out together, each into a parallel universe. So, if I happen to see you wearing a blue shirt in this universe, in another, I'll see you wearing a red one. The price tag for the many-worlds interpretation is to accept the existence of an uncountable number of non-communicating parallel universes that enact all possibilities from a superposition state. In a parallel universe, there was no COVID-19 pandemic. Not too comforting.
Bohr's team would say: take things as they are. If you stepped out of the closet and someone saw you wearing a shirt of a given color, then that is the one. Period. The weirdness of your many superposing selves remains hidden in the quantum closet. Rovelli defends his version of this worldview, called the relational interpretation, in which events are defined by the interactions between the objects involved, be they observers or not. In this example, the color of your shirt is the property at stake, and when I see it, I become entangled with this specific shirt of yours. It could have been another color, but it wasn't. As Rovelli puts it, "Entanglement… is the manifestation of one object to another, in the course of an interaction, in which the properties of the objects become actual." The price to pay here is to give up the hope of ever truly understanding what goes on in the quantum world. What we measure is what we get and all we can say about it.
What should we believe?
Both Carroll and Rovelli are master expositors of science to the general public, with Rovelli being the more lyrical of the pair.
There is no resolution to be expected, of course. I, for one, am more inclined to Bohr's worldview and thus to Rovelli's, although the interpretation I am most sympathetic to, called QBism, is not properly explained in either book. It is much closer in spirit to Rovelli's, in that relations are essential, but it places the observer on center stage, given that information is what matters in the end. (Although, as Rovelli acknowledges, information is a loaded word.)
We create theories as maps for us human observers to make sense of reality. But in the excitement of research, we tend to forget the simple fact that theories and models are not nature but our representations of nature. Unless we nurture hopes that our theories are really how the world is (the Einstein camp) and not how we humans describe it (the Bohr camp), why should we expect much more than this?
Maybe eyes really are windows into the soul — or at least into the brain, as a new study finds.
- Researchers find a correlation between pupil size and differences in cognitive ability.
- The larger the pupil, the higher the intelligence.
- The explanation for why this happens lies within the brain, but more research is needed.
What can you tell by looking into someone's eyes? You can spot a glint of humor, signs of tiredness, or maybe that they don't like something or someone.
But outside of assessing an emotional state, a person's eyes may also provide clues about their intelligence, suggests new research. A study carried out at the Georgia Institute of Technology shows that pupil size is "closely related" to differences in intelligence between individuals.
The scientists found that larger pupils may be connected to higher intelligence, as demonstrated by tests that gauged reasoning skills, memory, and attention. In fact, the researchers claim that the relationship between intelligence and pupil size is so pronounced that it showed up in their previous two studies as well and can be spotted with the naked eye, without any additional scientific instruments. You should be able to tell who scored the highest or the lowest on the cognitive tests just by looking at them, say the researchers.
The pupil-IQ link
The connection was first noticed across memory tasks, looking at pupil dilations as signs of mental effort. The studies involved more than 500 people aged 18 to 35 from the Atlanta area. The subjects' pupil sizes were measured by eye trackers, which use a camera and a computer to capture light reflecting off the pupil and cornea. As the scientists explained in Scientific American, pupil diameters range from two to eight millimeters. To determine average pupil size, they took measurements of the pupils at rest when the participants were staring at a blank screen for a few minutes.
Another part of the experiment involved having the subjects take a series of cognitive tests that evaluated "fluid intelligence" (the ability to reason when confronted with new problems), "working memory capacity" (how well people could remember information over time), and "attention control" (the ability to keep focusing attention even while being distracted). An example of the latter involves a test that attempts to divert a person's focus from a disappearing letter by showing a flickering asterisk on another part of the screen. If a person pays too much attention to the asterisk, they might miss the letter.
The conclusions of the research were that having a larger baseline pupil size was related to greater fluid intelligence, having more attention control, and even greater working memory capacity, although to a smaller extent. In an email exchange with Big Think, author Jason Tsukahara pointed out, "It is important to consider that what we find is a correlation — which should not be confused with causation."
The researchers also found that pupil size seemed to decrease with age. Older people had more constricted pupils, but when the scientists standardized for age, the pupil-size-to-intelligence connection remained.
Why are pupils linked to intelligence?
The connection between pupil size and IQ likely resides within the brain. Pupil size has been previously connected to the locus coeruleus, a part of the brain that's responsible for synthesizing the hormone and neurotransmitter norepinephrine (noradrenaline), which mobilizes the brain and body for action. Activity in the locus coeruleus affects our perception, attention, memory, and learning processes.
As the authors explain, this region of the brain "also helps maintain a healthy organization of brain activity so that distant brain regions can work together to accomplish challenging tasks and goals." Because it is so important, loss of function in the locus coeruleus has been linked to conditions like Alzheimer's disease, Parkinson's, clinical depression, and attention deficit hyperactivity disorder (ADHD).
The researchers hypothesize that people who have larger pupils while in a restful state, like staring at a blank computer screen, have "greater regulation of activity by the locus coeruleus." This leads to better cognitive performance. More research is necessary, however, to truly understand why having larger pupils is related to higher intelligence.
In an email to Big Think, Tsukahara shared, "If I had to speculate, I would say that it is people with greater fluid intelligence that develop larger pupils, but again at this point we only have correlational data."
Do other scientists believe this?
As the scientists point out in the beginning of their paper, their conclusions are controversial and, so far, other researchers haven't been able to duplicate their results. The research team addresses this criticism by explaining that other studies had methodological issues and examined only memory capacity but not fluid intelligence, which is what they measured.