To the brain, reading computer code is not the same as reading language
Reading code activates a general-purpose brain network, but not language-processing centers.
In some ways, learning to program a computer is like learning a new language: it requires learning new symbols and terms, which must be organized correctly to instruct the computer what to do. Computer code must also be clear enough that other programmers can read and understand it.
In spite of those similarities, MIT neuroscientists have found that reading computer code does not activate the regions of the brain that are involved in language processing. Instead, it activates a distributed network called the multiple demand network, which is also recruited for complex cognitive tasks such as solving math problems or crossword puzzles.
However, although reading computer code activates the multiple demand network, it appears to rely more on different parts of the network than math or logic problems do, suggesting that coding does not precisely replicate the cognitive demands of mathematics either.
"Understanding computer code seems to be its own thing. It's not the same as language, and it's not the same as math and logic," says Anna Ivanova, an MIT graduate student and the lead author of the study.
Evelina Fedorenko, the Frederick A. and Carole J. Middleton Career Development Associate Professor of Neuroscience and a member of the McGovern Institute for Brain Research, is the senior author of the paper, which appears today in eLife. Researchers from MIT's Computer Science and Artificial Intelligence Laboratory and Tufts University were also involved in the study.
Language and cognition
A major focus of Fedorenko's research is the relationship between language and other cognitive functions. In particular, she has been studying the question of whether other functions rely on the brain's language network, which includes Broca's area and other regions in the left hemisphere of the brain. In previous work, her lab has shown that music and math do not appear to activate this language network.
"Here, we were interested in exploring the relationship between language and computer programming, partially because computer programming is such a new invention that we know that there couldn't be any hardwired mechanisms that make us good programmers," Ivanova says.
There are two schools of thought regarding how the brain learns to code, she says. One holds that in order to be good at programming, you must be good at math. The other suggests that because of the parallels between coding and language, language skills might be more relevant. To shed light on this issue, the researchers set out to study whether brain activity patterns while reading computer code would overlap with language-related brain activity.
The two programming languages that the researchers focused on in this study are known for their readability — Python and ScratchJr, a visual programming language designed for children age 5 and older. The subjects in the study were all young adults proficient in the language they were being tested on. While the programmers lay in a functional magnetic resonance imaging (fMRI) scanner, the researchers showed them snippets of code and asked them to predict what action the code would produce.
The researchers saw little to no response to code in the language regions of the brain. Instead, they found that the coding task mainly activated the so-called multiple demand network. This network, whose activity is spread throughout the frontal and parietal lobes of the brain, is typically recruited for tasks that require holding many pieces of information in mind at once, and is responsible for our ability to perform a wide variety of mental tasks.
"It does pretty much anything that's cognitively challenging, that makes you think hard," Ivanova says.
Previous studies have shown that math and logic problems seem to rely mainly on the multiple demand regions in the left hemisphere, while tasks that involve spatial navigation activate the right hemisphere more than the left. The MIT team found that reading computer code appears to activate both the left and right sides of the multiple demand network, and ScratchJr activated the right side slightly more than the left. This finding goes against the hypothesis that math and coding rely on the same brain mechanisms.
Effects of experience
The researchers say that while they didn't identify any regions that appear to be exclusively devoted to programming, such specialized brain activity might develop in people who have much more coding experience.
"It's possible that if you take people who are professional programmers, who have spent 30 or 40 years coding in a particular language, you may start seeing some specialization, or some crystallization of parts of the multiple demand system," Fedorenko says. "In people who are familiar with coding and can efficiently do these tasks, but have had relatively limited experience, it just doesn't seem like you see any specialization yet."
In a companion paper appearing in the same issue of eLife, a team of researchers from Johns Hopkins University also reported that solving code problems activates the multiple demand network rather than the language regions.
The findings suggest there isn't a definitive answer to whether coding should be taught as a math-based skill or a language-based skill. In part, that's because learning to program may draw on both language and multiple demand systems, even if — once learned — programming doesn't rely on the language regions, the researchers say.
"There have been claims from both camps — it has to be together with math, it has to be together with language," Ivanova says. "But it looks like computer science educators will have to develop their own approaches for teaching code most effectively."
The research was funded by the National Science Foundation, the Department of the Brain and Cognitive Sciences at MIT, and the McGovern Institute for Brain Research.
While legalization has benefits, a new study suggests it may have one big drawback.
- A new study finds that rates of marijuana use and addiction have gone up in states that have recently legalized the drug.
- The problem was most severe for those over the age of 26, with cases of addiction rising by over a third.
- The findings complicate the debate around legalization.
Cannabis use disorder, also known as CUD or cannabis/marijuana addiction, is a psychological disorder described in the DSM-5 as "the continued use of cannabis despite clinically significant impairment." This includes people who cannot cut down on their usage despite wanting to, who keep using even though it severely impairs their ability to function, or who put themselves in danger to secure access to the drug.
While an understanding that marijuana can be addictive has existed for some time, and the image of the pothead who smokes so much they can hardly function is prevalent in our society, the effects of legalization on addiction rates have somehow gone understudied until now. Importantly, previous studies had failed to consider usage rates amongst populations over the age of 25.
The new study, published in JAMA Psychiatry, focused on self-reported data on monthly drug use in four states where marijuana is now legal (Colorado, Washington, Alaska, and Oregon), collected both before and after the drug was legalized in each state, and compared it to data from states that have not yet legalized.
The data gave insights into the drug use habits of the respondents, specifically whether they had smoked at all in the last month, how frequently they used the drug, and whether they had ever had issues with how much they were using. The researchers ultimately considered the responses of 505,796 individuals.
The increase in cannabis usage they found was considerable. The number of respondents over the age of 26 who claimed to have used the drug in the last month went up by 23% compared with their counterparts in states that have yet to legalize. Abuse of the drug by this group rose by 37%.
Teen usage rose by 25%, and addiction rates rose as well. This increase was small, though, and the authors have suggested it may be due to an unknown factor. The rate of usage or abuse for respondents between the ages of 18 and 25 did not increase at all.
After breaking the results down by demographics, the primary finding held: adults over the age of 26 are using marijuana more often when it is legalized, and they are starting to use it too much.
The grain of salt
As in any study where findings are self-reported, the exact numbers you see here should be taken with a grain of salt. They could be slightly higher or lower. As this study relies on people self-reporting their usage of a drug that is still illegal in many places, it is very possible that the apparent spike in addiction rates is caused by more accurate reporting, as people who live in an area where pot is still illegal may be less likely to report smoking it every day.
And it should be repeated a thousand times over that correlation and causation are not the same thing. There could be some unknown factor causing these increases in each case.
Despite these qualifications, the study is still useful in giving us a general sense of what may happen in states that have yet to legalize.
What does this mean for society and drug users?
While claims of "reefer madness" are greatly exaggerated, marijuana has several well established and thoroughly studied side effects. While occasional use isn't terribly harmful, addiction can be. Lead author Magdalena Cerdá of New York University explains in the study that heavy marijuana use is associated with "psychological and physical health concerns, lower educational attainment, decline in social class, unemployment, and motor vehicle crashes."
A substantial increase in the number of people who are addicted to the stuff will incur costs to society down the line.
To put the numbers in perspective, the 37% increase in problematic usage means that the share of adults smoking too much went from 0.9% to 1.23% of the population responding to the survey. That makes it far less prevalent than problems with alcohol, which affected around 6% of all Americans in 2018.
Recently, Big Think's Philip Perry wrote a piece about how legalization could improve the health of millions by allowing the government to regulate the purity of commercially sold marijuana. This remains true. However, it must be weighed against the findings of this study, which suggests that at least some of these health gains will be wiped out by increased addiction rates.
What does this mean for legalization efforts?
The legalization steamroller will undoubtedly keep rolling along. While health concerns are one factor in the debate over marijuana, it is only one of many. In Illinois, where I live, weed will become legal on January 1st of 2020. The legalization campaign and legislation were more concerned with issues of social justice, the failures of prohibition, and finding a new source of tax revenue (since we're half broke) than with matters of potential addiction.
As Vox reports, the authors of the study aren't suggesting that legalization shouldn't take place; that is another, broader debate. They merely wish to present the fact that legalization has a particular side effect that we should be aware of.
While this study is unlikely to change anybody's stance on whether weed should be legalized, it does show us a critical element to be considered when discussing drug policy. No drug is perfectly safe, and we have reason to believe that legalizing marijuana will mean that more people will have a hard time with it. Let's hope that legalization proponents keep that in mind as they rack up their victories.
For some reason, the bodies of deceased monks stay "fresh" for a long time.
It's definitely happening, and it's definitely weird. After the apparent death of some monks, their bodies remain in a meditating position without decaying for an extraordinary length of time, often as long as two or three weeks.
Tibetan Buddhists, who view death as a process rather than an event, might assert that the spirit has not yet finished with the physical body. For them, thukdam begins with a "clear light" meditation that allows the mind to gradually unspool, eventually dissipating into a state of universal consciousness no longer attached to the body. Only at that time is the body free to die.
Whether you believe this or not, it is a fascinating phenomenon: the fact remains that their bodies don't decompose like other bodies. (There have been a handful of other unexplained instances of delayed decomposition elsewhere in the world.)
The scientific inquiry into just what is going on with thukdam has attracted the attention and support of the Dalai Lama, the highest monk in Tibetan Buddhism. He has reportedly been looking for scientists to solve the riddle for about 20 years. He is a supporter of science, writing, "Buddhism and science are not conflicting perspectives on the world, but rather differing approaches to the same end: seeking the truth."
The most serious study of the phenomenon so far is being undertaken by The Thukdam Project of the University of Wisconsin-Madison's Center for Healthy Minds. Neuroscientist Richard Davidson is one of the founders of the center and has published hundreds of articles about mindfulness.
Davidson first encountered thukdam after his Tibetan monk friend Geshe Lhundub Sopa died, officially on August 28, 2014. Davidson last saw him five days later: "There was absolutely no change. It was really quite remarkable."
The science so far
The Thukdam Project published its first annual report this winter. It discussed a recent study in which electroencephalograms failed to detect any brain activity in 13 monks who had practiced thukdam and had been dead for at least 26 hours. Davidson was senior author of the study.
While some might be inclined to say, well, that's that, Davidson sees the research as just a first step on a longer road. Philosopher Evan Thompson, who is not involved in The Thukdam Project, tells Tricycle, "If the thinking was that thukdam is something we can measure in the brain, this study suggests that's not the right place to look."
In any event, the question remains: why are these apparently deceased monks so slow to begin decomposition? While environmental factors can slow or speed up the process a bit, usually decomposition begins about four minutes after death and becomes quite obvious over the course of the next day or so.
As the Dalai Lama said:
"What science finds to be nonexistent we should all accept as nonexistent, but what science merely does not find is a completely different matter. An example is consciousness itself. Although sentient beings, including humans, have experienced consciousness for centuries, we still do not know what consciousness actually is: its complete nature and how it functions."
As thukdam researchers continue to seek a signal of post-mortem consciousness of some sort, it's fair to ask what — and where — consciousness is in the first place. It is a question with which Big Think readers are familiar. We write about new theories all the time: consciousness happens on a quantum level; consciousness is everywhere.
So far, though, says Tibetan medical doctor Tawni Tidwell, also a Thukdam Project member, searches beyond the brain for signs of consciousness have gone nowhere. She is encouraged, however, that a number of Tibetan monks have come to the U.S. for medical knowledge that they can take home. When they arrive back in Tibet, she says, "It's not the Westerners who are doing the measuring and poking and prodding. It's the monastics who trained at Emory."
When Olympic athletes perform dazzling feats of athletic prowess, they are using the same principles of physics that gave birth to stars and planets.
- Much of the beauty of gymnastics comes from the physics principle called the conservation of angular momentum.
- Conservation of angular momentum tells us that when a spinning object changes how its matter is distributed, it changes its rate of spin.
- Conservation of angular momentum links the formation of planets in star-forming clouds to the beauty of a gymnast's spinning dismount from the uneven bars.
It is that time again when we watch in awe as Olympic athletes perform dazzling feats of athletic prowess. But as we stare in rapt attention at the speed, grace, and strength they exhibit, it is also a good time to pay attention to how they embody, literally, fundamental principles that shape the entire universe. Yes, I'm talking about physics. On our screens, these athletes are giving us lessons in the principles that giants like Isaac Newton struggled mightily to articulate.
Naturally, there are many Olympic events from which we could learn some basic principles of physics. Swimming shows us hydrodynamic drag. Boxing teaches us about force and impulse. (Ouch!) But today, we will focus on gymnastics and the cosmic importance of the conservation of angular momentum.
The conservation of angular momentum
Much of the beauty of gymnastics comes from the spins and flips athletes perform as they launch themselves into the air from the vault or uneven bars. These are all examples of rotations — and so much of the structure and history of the universe, from planets to galaxies, comes down to the physics of rotating objects. And so much of the physics of rotating objects comes down to the conservation of angular momentum.
Let's start with the conservation of regular or "linear" momentum. Momentum is the product of mass and velocity. Way back in the age of Galileo and Newton, physicists came to understand that in the interactions between bodies, the sum of their momenta had to be conserved (which really means "does not change"). This is a familiar idea to anyone who has played billiards: when a moving pool ball strikes a stationary one head-on, the first ball stops while the second scoots away. The total momentum of the system (the mass times velocity of both balls taken together) is conserved: the originally moving ball ends up at rest, and the originally stationary ball carries away all of the system's momentum.
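The pool-ball story above can be checked with a few lines of arithmetic. This is a minimal sketch assuming an elastic head-on collision between two balls of equal mass (the ball mass and speed below are illustrative numbers, not measurements):

```python
# Head-on collision between two pool balls of equal mass, as described
# above: the moving ball stops and the target ball scoots away.
m1 = m2 = 0.17                    # kg, a typical pool-ball mass (assumed)
v1_before, v2_before = 2.0, 0.0   # m/s: one ball moving, one at rest

# In an elastic head-on collision between equal masses,
# the two balls simply exchange velocities.
v1_after, v2_after = v2_before, v1_before

# Total momentum before and after the collision:
p_before = m1 * v1_before + m2 * v2_before
p_after = m1 * v1_after + m2 * v2_after

print(p_before, p_after)  # both 0.34 kg*m/s: momentum is conserved
```

Whatever numbers you plug in for the mass and initial speed, the two totals come out equal, which is exactly what "conserved" means.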
Rotating objects also obey a conservation law, but now it is not just the mass of an object that matters. The distribution of mass (where the mass is located relative to the center of the rotation) is also a factor; physicists package it into a quantity called the moment of inertia. Conservation of angular momentum tells us that if a spinning object is not subject to any external twisting forces (torques), then any change in how its matter is distributed must be balanced by a change in its rate of spin. In the analogy with linear momentum, the moment of inertia plays the role of mass, and the rate of spin plays the role of velocity.
There are many places in cosmic physics where this conservation of angular momentum is key. My favorite example is the formation of stars. Every star begins its life as a giant cloud of slowly spinning interstellar gas. The clouds are usually supported against their own gravitational weight by gas pressure, but sometimes a small nudge from, say, a passing supernova blast wave will force the cloud to begin gravitational collapse. As the cloud begins to shrink, the conservation of angular momentum forces the spin rate of material in the cloud to speed up. As material is falling inward, it also rotates around the cloud's center at ever higher rates. Eventually, some of that gas is going so fast that a balance between the gravity of the newly forming star and what is called centrifugal force is achieved. That stuff then stops moving inward and goes into orbit around the young star, forming a disk, some material of which eventually becomes planets. So, the conservation of angular momentum is, literally, why we have planets in the universe!
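The spin-up of a collapsing cloud follows directly from holding angular momentum fixed. Here is a toy calculation that treats the cloud as a uniform sphere; the mass, radii, and initial spin rate are illustrative placeholders, not astrophysical measurements:

```python
# Spin-up of a collapsing interstellar cloud, modeled as a uniform
# sphere whose moment of inertia is I = (2/5) * m * r^2.
# All numbers are illustrative, chosen to show the scaling.
m = 1.0                        # cloud mass (arbitrary units; it cancels out)
r_start, r_end = 100.0, 1.0    # the cloud shrinks by a factor of 100 in radius

I_start = 0.4 * m * r_start**2
I_end = 0.4 * m * r_end**2

omega_start = 1.0              # initial spin rate (arbitrary units)

# With no external torque, angular momentum L = I * omega is fixed,
# so shrinking the moment of inertia forces the spin rate up.
omega_end = I_start * omega_start / I_end

print(omega_end)  # 10000.0: shrinking 100x in radius spins the cloud up 10,000x
```

The spin rate scales as the inverse square of the radius, which is why even a slowly turning cloud ends up whirling fast enough for centrifugal balance to halt the infall and build a disk.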
Gymnastics, a cosmic sport
How does this appear in gymnastics? When athletes hurl themselves into the air to perform a flip, the only force acting on them is gravity. But since gravity only affects their "center of mass," it cannot apply forces in a way that changes the athlete's spin. But the gymnasts can do that for themselves by using the conservation of angular momentum.
By changing how their mass is arranged, gymnasts can change how fast they spin. You can see this in the dismount phase of the uneven bar competitions. When a gymnast comes off the bars and performs a flip by tucking their legs inward, they can quickly increase their rotation rate in midair. The sudden dramatic increase in the speed of their flip is what makes us gasp in astonishment. It is both scary and a beautiful testament to the athletes' ability to intuitively control the physics of their bodies. And it is also the exact same physics that controls the birth of planets.
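The tuck described above is the same calculation as the collapsing cloud, just on a human scale. In this sketch, the moments of inertia for the layout and tuck positions and the initial spin rate are assumed round numbers for illustration, not measured values for any athlete:

```python
# Conservation of angular momentum L = I * omega during a gymnast's tuck.
# The moments of inertia below are illustrative assumptions, not data.
I_layout = 15.0     # kg*m^2, body fully extended (assumed)
I_tuck = 5.0        # kg*m^2, legs pulled in tight (assumed)
omega_layout = 2.0  # rad/s, spin rate when leaving the bar (assumed)

# Mid-air, gravity pulls only on the center of mass and exerts no
# torque, so the angular momentum is fixed:
L = I_layout * omega_layout

# Tucking moves mass closer to the rotation axis, shrinking the moment
# of inertia, so the spin rate must rise to keep L constant.
omega_tuck = L / I_tuck

print(omega_tuck)  # 6.0 rad/s: three times the layout spin rate
```

Under these assumed numbers, pulling into the tuck triples the flip rate, which is the sudden mid-air acceleration that makes the dismount so striking.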
"As above so below," goes the old saying. You should keep that in mind as you watch the glory that is the Olympics. That is because it is not just athletes that have this intuitive understanding of physics. We all have it, and we use it every day, from walking down the stairs to swinging a hammer. So, it is no exaggeration to claim that the first place we came to understand the deepest principles of physics was not in contemplating the heavens but moving through the world in our own earthbound flesh.