A unique brain signal may be the key to human intelligence
Scientists exploring human neurons directly have learned some remarkable things.
- Most research on human brains is actually performed on rodent brains, on the assumption that the findings also apply to us.
- An unusual study looked at recently resected human brain tissue that turned out to contain some big surprises.
- Human neurons' unexpected electrical signals and their behavior shed new light on human intelligence.
Though progress is being made, our brains remain organs of many mysteries. Among these are the exact workings of neurons, with some 86 billion of them in the human brain. Neurons are interconnected in complicated, labyrinthine networks across which they exchange information in the form of electrical signals. We know that signals exit an individual neuron through a fiber called an axon, and also that signals are received by each neuron through input fibers called dendrites.
Understanding the electrical capabilities of dendrites in particular — which, after all, may be receiving signals from countless other neurons at any given moment — is fundamental to deciphering neurons' communication. It may surprise you to learn, though, that much of what we assume about human neurons is based on observations of rodent dendrites — there's just not a lot of fresh, still-functional human brain tissue available for thorough examination.
For a new study published January 3 in the journal Science, however, scientists got a rare chance to explore some neurons from the outer layer of human brains, and they discovered startling dendrite behaviors that may be unique to humans, and may even help explain how our billions of neurons process the massive amount of information they exchange.
A puzzle, solved?
Image source: gritsalak karalak/Shutterstock
Electrical signals weaken with distance, and that poses a riddle to those seeking to understand the human brain: Human dendrites are known to be about twice as long as rodent dendrites, which means that a signal traversing a human dendrite could arrive at its destination much weaker than one traveling a rodent's much shorter dendrite. Says paper co-author biologist Matthew Larkum of Humboldt University in Berlin, speaking to LiveScience, "If there was no change in the electrical properties between rodents and people, then that would mean that, in the humans, the same synaptic inputs would be quite a bit less powerful." The only way around this would be if the signals exchanged in our brains are not the same as those in a rodent's — which is exactly what the study's authors found, and another strike against the value of extrapolating animal research to humans.
The researchers worked with brain tissue sliced away for therapeutic reasons from the brains of tumor and epilepsy patients. The neurons came from the disproportionately thick layers 2 and 3 of the cerebral cortex, a feature distinctive to humans. In these layers reside incredibly dense neuronal networks.
Without blood-borne oxygen, though, such cells last only about two days, so Larkum's lab had no choice but to work around the clock during that window to get the most information from the samples. "You get the tissue very infrequently, so you've just got to work with what's in front of you," says Larkum. The team made holes in the dendrites into which they could insert glass pipettes. Through these, they sent ions to stimulate the dendrites, allowing the scientists to observe their electrical behavior.
In rodents, two types of electrical spikes have been observed in dendrites: a short, one-millisecond spike in response to sodium, and spikes lasting 50 to 100 times longer in response to calcium.
In the human dendrites, only one type of behavior was observed: super-short spikes occurring in rapid succession, one after the other. This suggests to the researchers that human neurons are "distinctly more excitable" than rodent neurons, allowing signals to successfully traverse our longer dendrites.
In addition, the human neuronal spikes — though they behaved somewhat like rodent spikes prompted by the introduction of sodium — were found to be generated by calcium, essentially the opposite of what happens in rodents.
An even bigger surprise
Image source: bluebay/Shutterstock
The study also reports a second major finding. Looking to better understand how the brain utilizes these spikes, the team programmed computer models based on their findings. (The brain slices they'd examined could not, of course, be put back together and switched on somehow.)
The scientists constructed virtual neuronal networks, each of whose neurons could be stimulated at thousands of points along its dendrites, to see how each handled so many input signals. Previous, non-human research has suggested that neurons add these inputs together, holding onto them until the number of excitatory input signals exceeds the number of inhibitory signals, at which point the neuron fires the sum of them from its axon out into the network.
However, this isn't what Larkum's team observed in their model. Neurons' output was inverse to their inputs: The more excitatory signals they received, the less likely they were to fire off. Each had a seeming "sweet spot" when it came to input strength.
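The two pictures of dendritic integration described above can be contrasted in a toy sketch. This is purely illustrative and is not the study's actual model; the window values (`low`, `high`) are made-up assumptions standing in for the "sweet spot" behavior.

```python
def classic_neuron(excitatory: int, inhibitory: int) -> bool:
    """Classic 'sum and threshold' picture: the neuron fires once
    excitatory inputs outnumber inhibitory ones."""
    return excitatory > inhibitory


def sweet_spot_neuron(excitatory: int, inhibitory: int,
                      low: int = 3, high: int = 6) -> bool:
    """Crude stand-in for the modeled human neurons: the neuron
    fires only within a window of net input strength.  Too little
    drive and it stays quiet; too much and it is suppressed rather
    than firing harder.  The window bounds here are arbitrary."""
    net = excitatory - inhibitory
    return low <= net <= high


# With very strong input, the classic neuron fires but the
# "sweet spot" neuron is suppressed.
print(classic_neuron(10, 2), sweet_spot_neuron(10, 2))  # True False
```

The point of the contrast: in the summation picture, output grows with input, while in the window picture, the strongest inputs can actually silence the neuron — the inverse relationship Larkum's team observed.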
What the researchers believe is going on is that dendrites and neurons may be smarter than previously suspected, processing input information as it arrives. Mayank Mehta of UC Los Angeles, who was not involved in the research, tells LiveScience, "It doesn't look like the cell is just adding things up — it's also throwing things away." This could mean each neuron is assessing the value of each signal to the network and discarding "noise." It may also be that different neurons are optimized for different signals and thus tasks.
Much in the way that octopuses distribute decision-making across a decentralized nervous system, the implication of the new research is that, at least in humans, it's not just the neuronal network that's smart, it's all of the individual neurons it contains. This would constitute exactly the kind of computational super-charging one would hope to find somewhere in the amazing human brain.
Evolution doesn't clean up after itself very well.
- An evolutionary biologist got people swapping ideas about our lingering vestigia.
- Basically, this is the stuff that served some evolutionary purpose at some point, but now is kind of, well, extra.
- Here are the six traits that inaugurated the fun.
The plica semilunaris<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NjgwMS9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTYxMTgyMzg1NX0.ZY8qmhtoZfbRMAqrNnmbgyk7GLabglx_9lBq3PKcy7g/img.png?width=980" id="99882" class="rm-shortcode" data-rm-shortcode-id="68e8758894b0359c6ef61b2c158832b2" data-rm-shortcode-name="rebelmouse-image" />
The human eye in alarming detail. Image source: Henry Gray / Wikimedia commons<p>At the inner corner of each eye, closest to the nasal ridge, is that little pink thing (which is probably what most of us call it), properly known as the caruncula. Next to it is the plica semilunaris, and it's what's left of a third eyelid that used to — ready for this? — blink horizontally. It's thought to have offered protection for our eyes, and some birds, reptiles, and fish still have such a membrane.</p>
Palmaris longus<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NjgwNy9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYzMzQ1NjUwMn0.dVor41tO_NeLkGY9Tx46SwqhSVaA8HZQmQAp532xLxA/img.jpg?width=980" id="879be" class="rm-shortcode" data-rm-shortcode-id="970e9c15f3c3d846dde05e2b2c6ebf12" data-rm-shortcode-name="rebelmouse-image" />
Palmaris longus muscle. Image source: Wikimedia commons<p>We don't have much need these days, at least most of us, to navigate from tree branch to tree branch. Still, about 86 percent of us retain the wrist muscle that used to help us do it. To see if you have it, place the back of your hand on a flat surface and touch your thumb to your pinkie. If a tendon becomes visible in your wrist, that's the palmaris longus. If you don't have one, consider yourself more evolved (just joking).</p>
Darwin's tubercle<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NjgxMi9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTY0ODUyNjA1MX0.8RuU-OSRf92wQpaPPJtvFreOVvicEwn39_jnbegiUOk/img.jpg?width=980" id="687a0" class="rm-shortcode" data-rm-shortcode-id="b38a957408940673ccc744f0f6828d18" data-rm-shortcode-name="rebelmouse-image" />
Darwin's tubercle. Image source: Wikimedia commons<p>Yes, maybe the shell of your ear does feel like a dried apricot. Maybe not. But there's a small bump on that swirly structure, a remnant from a time when our ears could swivel toward interesting sounds. These days, we just turn our heads, but there it is.</p>
Goosebumps<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NzMxNC9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYyNzEyNTc2Nn0.aVMa5fsKgiabW5vkr7BOvm2pmNKbLJF_50bwvd4aRo4/img.jpg?width=980" id="d8420" class="rm-shortcode" data-rm-shortcode-id="f735418322b34382dcd882299c9ccc48" data-rm-shortcode-name="rebelmouse-image" />
Goosebumps. Photo credit: Tyler Olson via Shutterstock<p>It's not entirely clear what purpose made goosebumps worth retaining evolutionarily, but there are two circumstances in which they appear: fear and cold. For fear, they may have been a way of making body hair stand up so we'd appear larger to predators, much the way a cat's tail puffs up — numerous creatures exaggerate their size when threatened. In the cold, they may have trapped additional heat for warmth.</p>
Tailbone<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NzMxNi9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYxMDMzMDc3N30.p9BEtkf3-PV3EtDSQMUGUeopsimiCHUagx97P4f8IBw/img.jpg?width=980" id="e8ab8" class="rm-shortcode" data-rm-shortcode-id="0063ce99bdd22fbebe1279244b87935c" data-rm-shortcode-name="rebelmouse-image" />
Coccyx. Image source: decade3d-anatomy online via Shutterstock<p>Way back, we had tails that probably helped us balance upright and were useful for moving through trees. We still have the stump of one as embryos, from weeks 4–6, and then the body mostly dissolves it during weeks 6–8. What's left is the coccyx.</p>
The palmar grasp reflex<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xOTA5NzMyMC9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYzNjY0MDY5NX0.OSwReKLmNZkbAS12-AvRaxgCM7zyukjQUaG4vmhxTtM/img.jpg?width=980" id="8804c" class="rm-shortcode" data-rm-shortcode-id="45469ca5ee5f43433a782f7d4ac0a440" data-rm-shortcode-name="rebelmouse-image" />
Palmar reflex activated! Photo credit: Raul Luna on Flickr<p>You've probably seen how non-human primate babies grab onto their parents' hands to be carried around. We used to do this, too. Even now, if you touch your finger to a baby's palm, or touch the sole of their foot, the palmar grasp reflex will cause the hand or foot to try to close around your finger.</p>
Other people's suggestions<p>Amir's followers dove right in, offering both cool and questionable additions to her list. </p>
Fangs?<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Lower mouth plate behind your teeth. Some have protruding bone under the skin which is a throw back to large fangs. Almost like an upsidedown Sabre Tooth.</p>— neil crud (@neilcrud66) <a href="https://twitter.com/neilcrud66/status/1085606005000601600?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Hiccups<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Sure: <a href="https://t.co/DjMZB1XidG">https://t.co/DjMZB1XidG</a></p>— Stephen Roughley (@SteBobRoughley) <a href="https://twitter.com/SteBobRoughley/status/1085529239556968448?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Hypnic jerk as you fall asleep<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">What about when you “jump” just as you’re drifting off to sleep, I heard that was a reflex to prevent falling from heights.</p>— Bann face (@thebanns) <a href="https://twitter.com/thebanns/status/1085554171879788545?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script> <p>This thing, often called the "alpha jerk" as you drop into alpha sleep, is properly called the hypnic jerk. It may actually be a carryover from our arboreal days. The <a href="https://www.livescience.com/39225-why-people-twitch-falling-asleep.html" target="_blank">hypothesis</a> is that you suddenly jerk awake to avoid falling out of your tree.</p>
Nails screeching on a blackboard response?<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Everyone hate the sound of fingernails on a blackboard. It's _speculated_ that this is a vestigial wiring in our head, because the sound is similar to the shrill warning call of a chimp. <a href="https://t.co/ReyZBy6XNN">https://t.co/ReyZBy6XNN</a></p>— Pet Rock (@eclogiter) <a href="https://twitter.com/eclogiter/status/1085587006258888706?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Ear hair<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Ok what is Hair in the ears for? I think cuz as we get older it filters out the BS.</p>— Sarah21 (@mimix3) <a href="https://twitter.com/mimix3/status/1085684393593561088?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Nervous laughter<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr">You may be onto something. Tooth-bearing with the jaw clenched is generally recognized as a signal of submission or non-threatening in primates. Involuntary smiling or laughing in tense situations might have signaled that you weren’t a threat.</p>— Jager Tusk (@JagerTusk) <a href="https://twitter.com/JagerTusk/status/1085316201104912384?ref_src=twsrc%5Etfw">January 15, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Um, yipes.<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Sometimes it feels like my big toe should be on the side of my foot, was that ever a thing?</p>— B033? K@($ (@whimbrel17) <a href="https://twitter.com/whimbrel17/status/1085559016011563009?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
So far, 30 student teams have entered the Indy Autonomous Challenge, scheduled for October 2021.
- The Indy Autonomous Challenge will task student teams with developing self-driving software for race cars.
- The competition requires cars to complete 20 laps within 25 minutes, meaning cars would need to average about 110 mph.
- The organizers say they hope to advance the field of driverless cars and "inspire the next generation of STEM talent."
Indy Autonomous Challenge<p>Completing the race in 25 minutes means the cars will need to average about 110 miles per hour. So, while the race may end up being a bit slower than a typical Indy 500 competition, in which winners average speeds of over 160 mph, it's still set to be the fastest autonomous race featuring full-size cars.</p><p style="margin-left: 20px;">"There is no human redundancy there," Matt Peak, managing director for Energy Systems Network, a nonprofit that develops technology for the automation and energy sectors, told the <a href="https://www.post-gazette.com/business/tech-news/2020/06/01/Indy-Autonomous-Challenge-Indy-500-Indianapolis-Motor-Speedway-Ansys-Aptiv-self-driving-cars/stories/202005280137" target="_blank">Pittsburgh Post-Gazette</a>. "Either your car makes this happen or smash into the wall you go."</p>
Illustration of the Indy Autonomous Challenge
Indy Autonomous Challenge<p>The Indy Autonomous Challenge <a href="https://www.indyautonomouschallenge.com/rules" target="_blank">describes</a> itself as a "past-the-post" competition, which "refers to a binary, objective, measurable performance rather than a subjective evaluation, judgement, or recognition."</p><p>This competition design was inspired by the 2004 DARPA Grand Challenge, which tasked teams with developing driverless cars and sending them along a 150-mile route in Southern California for a chance to win $1 million. But that prize went unclaimed, because within a few hours after starting, all the vehicles had suffered some kind of critical failure.</p>
Indianapolis Motor Speedway
Indy Autonomous Challenge<p>One factor that could prevent a similar outcome in the upcoming race is the ability to test-run cars on a virtual racetrack. The simulation software company Ansys Inc. has already developed a model of the Indianapolis Motor Speedway on which teams will test their algorithms as part of a series of qualifying rounds.</p><p style="margin-left: 20px;">"We can create, with physics, multiple real-life scenarios that are reflective of the real world," Ansys President Ajei Gopal told <a href="https://www.wsj.com/articles/autonomous-vehicles-to-race-at-indianapolis-motor-speedway-11595237401?mod=e2tw" target="_blank">The Wall Street Journal</a>. "We can use that to train the AI, so it starts to come up to speed."</p><p>Still, the race could reveal that self-driving cars aren't quite ready to race at speeds of over 110 mph. After all, regular self-driving cars already face enough logistical and technical roadblocks, including <a href="https://www.bbc.com/news/technology-53349313#:~:text=Tesla%20will%20be%20able%20to,no%20driver%20input%2C%20he%20said." target="_blank">crumbling infrastructure, communication issues</a> and the <a href="https://bigthink.com/paul-ratner/would-you-ride-in-a-car-thats-programmed-to-kill-you" target="_self">fateful moral decisions driverless cars will have to make in split seconds</a>.</p>But the Indy Autonomous Challenge <a href="https://static1.squarespace.com/static/5da73021d0636f4ec706fa0a/t/5dc0680c41954d4ef41ec2b2/1572890638793/Indy+Autonomous+Challenge+Ruleset+-+v5NOV2019+%282%29.pdf" target="_blank">says</a> its main goal is to advance the industry, by challenging "students around the world to imagine, invent, and prove a new generation of automated vehicle (AV) software and inspire the next generation of STEM talent."
A new Harvard study finds that the language you use affects patient outcome.
- A study at Harvard's McLean Hospital claims that using the language of chemical imbalances worsens patient outcomes.
- Though psychiatry has largely abandoned DSM categories, professor Joseph E. Davis writes that the field continues to strive for a "brain-based diagnostic system."
- Chemical explanations of mental health appear to benefit pharmaceutical companies far more than patients.
Challenging the Chemical Imbalance Theory of Mental Disorders: Robert Whitaker, Journalist<span style="display:block;position:relative;padding-top:56.25%;" class="rm-shortcode" data-rm-shortcode-id="41699c8c2cb2aee9271a36646e0bee7d"><iframe type="lazy-iframe" data-runner-src="https://www.youtube.com/embed/-8BDC7i8Yyw?rel=0" width="100%" height="auto" frameborder="0" scrolling="no" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></span><p>This is a far cry from Howard Rusk's 1947 NY Times editorial calling for mental health disorders to be treated similarly to physical diseases (such as diabetes and cancer). This mindset—not attributable to Rusk alone; he was merely relaying the psychiatric currency of the time—has dominated the field for decades: mental anguish is a genetic and/or chemical-deficiency disorder that must be treated pharmacologically.</p><p>Even as psychiatry untethered itself from DSM categories, the field still used chemistry to validate its existence. Psychotherapy, arguably the most effective means for managing much of our anxiety and depression, is time- and labor-intensive. Counseling requires an empathetic and seasoned ear to guide the patient to do the work. Ingesting a pill to do that work for you is more seductive, and easier. As Davis writes, even though the industry abandoned the DSM, it continues to strive for a "brain-based diagnostic system."</p><p>That language has infiltrated public consciousness. The team at McLean surveyed 279 patients seeking acute treatment for depression. As they note, the causes of psychological distress have constantly shifted over the millennia: humoral imbalance in the ancient world; spiritual possession in medieval times; early childhood experiences around the time of Freud; maladaptive thought patterns dominant in the latter half of last century.
While the team found that psychosocial explanations remain popular, biogenetic explanations (such as the chemical imbalance theory) are becoming more prominent. </p><p>Interestingly, the 80 people Davis interviewed for his book predominantly relied on biogenetic explanations. Instead of doctors diagnosing patients, as you might expect, they increasingly serve to confirm what patients come in suspecting. Patients arrive at medical offices confident in their self-diagnoses. They believe a pill is the best course of treatment, largely because they saw an advertisement or listened to a friend. Doctors too often oblige without further curiosity as to the reasons for their distress. </p>
Image: Illustration Forest / Shutterstock<p>While medicalizing mental health softens the stigma of depression—if a disorder is inheritable, it was never really your fault—it also disempowers the patient. The team at McLean writes,</p><p style="margin-left: 20px;">"More recent studies indicate that participants who are told that their depression is caused by a chemical imbalance or genetic abnormality expect to have depression for a longer period, report more depressive symptoms, and feel they have less control over their negative emotions."</p><p>Davis points out the language used by direct-to-consumer advertising prevalent in America. Doctors, media, and advertising agencies converge around common messages, such as everyday blues is a "real medical condition," everyone is susceptible to clinical depression, and drugs correct underlying somatic conditions that you never consciously control. He continues,</p><p style="margin-left: 20px;">"Your inner life and evaluative stance are of marginal, if any, relevance; counseling or psychotherapy aimed at self-insight would serve little purpose." </p><p>The McLean team discovered a similar phenomenon: patients expect little from psychotherapy and a lot from pills. When depression is treated as the result of an internal and immutable essence instead of environmental conditions, behavioral changes are not expected to make much difference. Chemistry rules the popular imagination.</p>