Researchers identify genes linked to severe repetitive behaviors
A lab identifies which genes are linked to abnormal repetitive behaviors found in addiction and schizophrenia.
These behaviors, termed stereotypies, are also apparent in animal models of drug addiction and autism.
In a new study published in the European Journal of Neuroscience, researchers at the McGovern Institute for Brain Research have identified genes that are activated in the brain prior to the initiation of these severe repetitive behaviors.
"Our lab has found a small set of genes that are regulated in relation to the development of stereotypic behaviors in an animal model of drug addiction," says MIT Institute Professor Ann Graybiel, who is the senior author of the paper. "We were surprised and interested to see that one of these genes is a susceptibility gene for schizophrenia. This finding might help to understand the biological basis of repetitive, stereotypic behaviors as seen in a range of neurologic and neuropsychiatric disorders, and in otherwise 'typical' people under stress."
A shared molecular pathway
In work led by Research Scientist Jill Crittenden, scientists in the Graybiel lab exposed mice to amphetamine, a psychomotor stimulant that drives hyperactivity and confined stereotypies in humans and in laboratory animals and that is used to model symptoms of schizophrenia.
They found that stimulant exposure that drives the most prolonged repetitive behaviors led to activation of genes regulated by Neuregulin 1, a signaling molecule that is important for a variety of cellular functions including neuronal development and plasticity. Neuregulin 1 gene mutations are risk factors for schizophrenia.
The new findings highlight a shared molecular and circuit pathway for stereotypies that are caused by drugs of abuse and in brain disorders, and have implications for why stimulant intoxication is a risk factor for the onset of schizophrenia.
"Experimental treatment with amphetamine has long been used in studies on rodents and other animals in tests to find better treatments for schizophrenia in humans, because there are some behavioral similarities across the two otherwise very different contexts," explains Graybiel, who is also an investigator at the McGovern Institute and a professor of brain and cognitive sciences at MIT. "It was striking to find Neuregulin 1 — potentially one hint to shared mechanisms underlying some of these similarities."
Drug exposure linked to repetitive behaviors
Although many studies have measured gene expression changes in animal models of drug addiction, this study is the first to evaluate genome-wide changes specifically associated with restricted repetitive behaviors.
Stereotypies are difficult to measure without labor-intensive direct observation, because they consist of fine movements and idiosyncratic behaviors. In this study, the authors administered amphetamine (or saline control) to mice and then used photobeam breaks to measure how much the mice ran around. The researchers identified prolonged periods when the mice were not running around (i.e., were potentially engaged in confined stereotypies), and then they videotaped the mice during these periods to observationally score the severity of restricted repetitive behaviors (e.g., sniffing or licking stereotypies).
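The logic of that first screening step (find long stretches with few beam breaks, then review video for those windows) can be sketched in a few lines. This is an illustrative toy, not the authors' actual analysis code; the threshold and window length are made-up parameters.

```python
# Toy sketch (not the study's code): flag prolonged periods with few
# photobeam breaks, which may indicate confined stereotypy rather than running.
def low_locomotion_periods(beam_breaks_per_min, threshold=2, min_minutes=10):
    """Return (start, end) minute indices of runs where activity stays below threshold."""
    periods, start = [], None
    for i, count in enumerate(beam_breaks_per_min):
        if count < threshold:
            if start is None:
                start = i  # a candidate low-activity run begins
        else:
            if start is not None and i - start >= min_minutes:
                periods.append((start, i))  # run was long enough to flag
            start = None
    if start is not None and len(beam_breaks_per_min) - start >= min_minutes:
        periods.append((start, len(beam_breaks_per_min)))
    return periods

# 5 min of running, 15 min nearly still, 5 min of running
activity = [30] * 5 + [0] * 15 + [25] * 5
print(low_locomotion_periods(activity))  # → [(5, 20)]
```

Windows returned by a filter like this would then be scored by a human observer from video, since low locomotion alone cannot distinguish stereotypy from rest.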
They gave amphetamine to each mouse once a day for 21 days and found that, on average, mice showed very little stereotypy on the first day of drug exposure but that, by the seventh day of exposure, all of the mice showed a prolonged period of stereotypy that gradually became shorter and shorter over the subsequent two weeks.
"We were surprised to see the stereotypy diminishing after one week of treatment. We had actually planned a study based on our expectation that the repetitive behaviors would become more intense, but then we realized that this was an opportunity to look at what gene changes were unique to that day of high stereotypy," says first author Jill Crittenden.
The authors compared gene expression changes in the brains of mice treated with amphetamine for one day, seven days, or 21 days. They hypothesized that the gene changes specific to the seventh day of treatment, when stereotypy peaked, were the most likely to underlie extreme repetitive behaviors and could identify risk-factor genes for such symptoms in disease.
A shared anatomical pathway
Previous work from the Graybiel lab has shown that stereotypy is directly correlated to circumscribed gene activation in the striatum, a forebrain region that is key for habit formation. In animals with the most intense stereotypy, most of the striatum does not show gene activation, but immediate early gene induction remains high in clusters of cells called striosomes. Striosomes have recently been shown to have powerful control over cells that release dopamine, a neuromodulator that is severely disrupted in drug addiction and in schizophrenia. Strikingly, striosomes contain high levels of Neuregulin 1.
"Our new data suggest that the upregulation of Neuregulin-responsive genes in animals with severely repetitive behaviors reflects gene changes in the striosomal neurons that control the release of dopamine," Crittenden explains. "Dopamine can directly impact whether an animal repeats an action or explores new actions, so our study highlights a potential role for a striosomal circuit in controlling action-selection in health and in neuropsychiatric disease."
Patterns of behavior and gene expression
Striatal gene expression levels were measured by sequencing messenger RNAs (mRNAs) in dissected brain tissue. mRNAs are read out from "active" genes to instruct protein-synthesis machinery in how to make the protein that corresponds to the gene's sequence. Proteins are the main constituents of a cell, thereby controlling each cell's function. The number of times a particular mRNA sequence is found reflects the frequency at which the gene was being read out at the time that the cellular material was collected.
To identify genes that were read out into mRNA before the period of prolonged stereotypy, the researchers collected brain tissue 20 minutes after amphetamine injection, which is about 30 minutes before peak stereotypy. They then identified which genes had significantly different levels of corresponding mRNAs in drug-treated mice than in mice treated with saline.
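The core comparison described above (find genes whose mRNA counts differ between drug-treated and saline-treated mice) can be illustrated with a toy fold-change filter. This is a simplified sketch with invented gene names and numbers, not the study's pipeline; real RNA-seq analyses also apply normalization and statistical significance tests.

```python
import math

# Toy sketch (not the study's pipeline): flag genes whose mRNA counts differ
# substantially between drug-treated and saline-control samples.
def differential_genes(drug_counts, saline_counts, min_fold=2.0, pseudocount=1.0):
    """Return {gene: fold_change} for genes with |log2 fold change| >= log2(min_fold)."""
    hits = {}
    for gene in drug_counts:
        # pseudocount avoids division by zero for genes absent in one condition
        fold = (drug_counts[gene] + pseudocount) / (saline_counts.get(gene, 0) + pseudocount)
        if abs(math.log2(fold)) >= math.log2(min_fold):
            hits[gene] = round(fold, 2)
    return hits

# Hypothetical read counts for three genes
drug = {"Upregulated_A": 120, "Housekeeping": 100, "Repressed_B": 20}
saline = {"Upregulated_A": 30, "Housekeeping": 95, "Repressed_B": 80}
print(differential_genes(drug, saline))
# → {'Upregulated_A': 3.9, 'Repressed_B': 0.26}
```

Fold changes above 1 correspond to activated genes and below 1 to repressed genes, mirroring the activation and repression patterns the study tracked across the one-, seven-, and 21-day treatments.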
A wide variety of genes showed modest mRNA increases after the first amphetamine exposure, which induced mild hyperactivity and a range of behaviors such as walking, sniffing, and rearing in the mice.
By the seventh day of treatment, all of the mice were engaged for prolonged periods in one specific repetitive behavior, such as sniffing the wall. Likewise, there were fewer genes that were activated by the seventh day relative to the first treatment day, but they were strongly activated in all mice that received the stereotypy-inducing amphetamine treatment.
By the 21st day of treatment, the stereotypy behaviors were less intense, as was the gene upregulation — fewer genes were strongly activated, and more were repressed, relative to the other treatments. "It seemed that the mice had developed tolerance to the drug, both in terms of their behavioral response and in terms of their gene activation response," says Crittenden.
"Trying to seek patterns of gene regulation starting with behavior is correlative work, and we did not prove 'causality' in this first small study," explains Graybiel. "But we hope that the striking parallels between the scope and selectivity of the mRNA and behavioral changes that we detected will help in further work on the tremendously challenging goal of treating addiction."
This work was funded by the National Institute of Child Health and Human Development, the Saks-Kavanaugh Foundation, the Broderick Fund for Phytocannabinoid Research at MIT, the James and Pat Poitras Research Fund, The Simons Foundation, and The Stanley Center for Psychiatric Research at the Broad Institute.
We explore the history of blood types and how they are classified to find out what makes the Rh-null type important to science and dangerous for those who live with it.
- Fewer than 50 people worldwide have 'golden blood' — or Rh-null.
- Blood is considered Rh-null if it lacks all of the 61 possible antigens in the Rh system.
- It's also very dangerous to live with this blood type, as so few people have it.
Golden blood sounds like the latest in medical quackery. As in, get a golden blood transfusion to balance your tantric midichlorians and receive a free charcoal ice cream cleanse. Don't let the New-Agey moniker throw you. Golden blood is actually the nickname for Rh-null, the world's rarest blood type.
As Mosaic reports, the type is so rare that only about 43 people have been reported to have it worldwide, and until 1961, when it was first identified in an Aboriginal Australian woman, doctors assumed embryos with Rh-null blood would simply die in utero.
But what makes Rh-null so rare, and why is it so dangerous to live with? To answer that, we'll first have to explore why hematologists classify blood types the way they do.
A (brief) bloody history
Our ancestors understood little about blood. Even the most basic of blood knowledge — blood inside the body is good, blood outside is not ideal, too much blood outside is cause for concern — escaped humanity's grasp for an embarrassing number of centuries.
Absent this knowledge, our ancestors devised less-than-scientific theories as to what blood was, theories that varied wildly across time and culture. To pick just one, the physicians of Shakespeare's day believed blood to be one of four bodily fluids or "humors" (the others being black bile, yellow bile, and phlegm).
Handed down from ancient Greek physicians, humorism stated that these bodily fluids determined someone's personality. Blood was considered hot and moist, resulting in a sanguine temperament. The more blood people had in their systems, the more passionate, charismatic, and impulsive they would be. Teenagers were considered to have a natural abundance of blood, and men had more than women.
Humorism led to all sorts of poor medical advice. Most famously, Galen of Pergamum used it as the basis for his prescription of bloodletting. Sporting a "when in doubt, let it out" mentality, Galen declared blood the dominant humor, and bloodletting an excellent way to balance the body. Blood's relation to heat also made it a go-to for fever reduction.
While bloodletting remained common until well into the 19th century, William Harvey's discovery of the circulation of blood in 1628 would put medicine on its path to modern hematology.
Soon after Harvey's discovery, the earliest blood transfusions were attempted, but it wasn't until 1665 that the first successful transfusion was performed by British physician Richard Lower. Lower's operation was between dogs, and his success prompted physicians like Jean-Baptiste Denis to try to transfuse blood from animals to humans, a process called xenotransfusion. The death of human patients ultimately led to the practice being outlawed.
The first successful human-to-human transfusion wouldn't be performed until 1818, when British obstetrician James Blundell managed it to treat postpartum hemorrhage. But even with a proven technique in place, in the following decades many blood-transfusion patients continued to die mysteriously.
Enter Austrian physician Karl Landsteiner. In 1901 he began his work to classify blood groups. Exploring the work of Leonard Landois — the physiologist who showed that when the red blood cells of one animal are introduced to a different animal's, they clump together — Landsteiner thought a similar reaction might occur in intra-human transfusions, which would explain why transfusion success was so spotty. In 1909, he classified the A, B, AB, and O blood groups, and for his work he received the 1930 Nobel Prize for Physiology or Medicine.
What causes blood types?
It took us a while to grasp the intricacies of blood, but today, we know that this life-sustaining substance consists of:
- Red blood cells — cells that carry oxygen and remove carbon dioxide throughout the body;
- White blood cells — immune cells that protect the body against infection and foreign agents;
- Platelets — cells that help blood clot; and
- Plasma — a liquid that carries salts and enzymes.
Each component has a part to play in blood's function, but the red blood cells are responsible for our differing blood types. These cells have proteins* covering their surface called antigens, and the presence or absence of particular antigens determines blood type — type A blood has only A antigens, type B only B, type AB both, and type O neither. Red blood cells sport another antigen called the RhD protein. When it is present, a blood type is said to be positive; when it is absent, it is said to be negative. The typical combinations of A, B, and RhD antigens give us the eight common blood types (A+, A-, B+, B-, AB+, AB-, O+, and O-).
Blood antigen proteins play a variety of cellular roles, but recognizing foreign cells in the blood is the most important for this discussion.
Think of antigens as backstage passes to the bloodstream, while our immune system is the doorman. If the immune system recognizes an antigen, it lets the cell pass. If it does not recognize an antigen, it initiates the body's defense systems and destroys the invader. So, a very aggressive doorman.
While our immune systems are thorough, they are not too bright. If a person with type A blood receives a transfusion of type B blood, the immune system won't recognize the new substance as a life-saving necessity. Instead, it will consider the red blood cells invaders and attack. This is why so many people either grew ill or died during transfusions before Landsteiner's brilliant discovery.
This is also why people with O negative blood are considered "universal donors." Since their red blood cells lack A, B, and RhD antigens, immune systems don't have a way to recognize these cells as foreign and so leave them well enough alone.
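The doorman analogy boils down to one rule: a transfusion is safe (at the ABO/RhD level) only if the donor's red cells carry no antigen the recipient's immune system hasn't seen. Here is an illustrative sketch of that rule; it deliberately ignores plasma antibodies and the dozens of other antigen systems discussed below.

```python
# Illustrative simplification of ABO/RhD red-cell compatibility: a recipient
# can receive cells only if the donor introduces no antigen the recipient lacks.
def can_receive(recipient, donor):
    antigens = {
        "A+": {"A", "Rh"}, "A-": {"A"},
        "B+": {"B", "Rh"}, "B-": {"B"},
        "AB+": {"A", "B", "Rh"}, "AB-": {"A", "B"},
        "O+": {"Rh"}, "O-": set(),
    }
    # donor's antigen set must be a subset of the recipient's
    return antigens[donor] <= antigens[recipient]

print(can_receive("A+", "O-"))   # True: O- cells carry no foreign antigen
print(can_receive("A+", "B-"))   # False: the B antigen would trigger an attack
print(can_receive("AB+", "B+"))  # True: AB+ recipients tolerate all eight types
```

Note how `"O-"` maps to the empty set, which is a subset of every other set: that is the "universal donor" property in miniature.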
How is Rh-null the rarest blood type?
Let's return to golden blood. In truth, the eight common blood types are an oversimplification of how blood types actually work. As Smithsonian.com points out, "[e]ach of these eight types can be subdivided into many distinct varieties," resulting in millions of different blood types, each classified on a multitude of antigen combinations.
Here is where things get tricky. The RhD protein previously mentioned refers to only one of 61 potential proteins in the Rh system. Blood is considered Rh-null if it lacks all 61 possible antigens in the Rh system. This not only makes it rare, but it also means it can be accepted by anyone with a rare blood type within the Rh system.
This is why it is considered "golden blood." It is worth its weight in gold.
As Mosaic reports, golden blood is incredibly important to medicine, but also very dangerous to live with. If an Rh-null carrier needs a blood transfusion, they can find it difficult to locate a donor, and blood is notoriously difficult to transport internationally. Rh-null carriers are encouraged to donate blood as insurance for themselves, but with so few donors spread out over the world and limits on how often they can donate, this can also put an altruistic burden on those select few who agree to donate for others.
Some bloody good questions about blood types
A nurse takes blood samples from a pregnant woman at the North Hospital (Hopital Nord) in Marseille, southern France.
Photo by BERTRAND LANGLOIS / AFP
There remain many mysteries regarding blood types. For example, we still don't know why humans evolved the A and B antigens. Some theories point to these antigens as a byproduct of the diseases various populations contracted throughout history. But we can't say for sure.
In this absence of knowledge, various myths and questions have grown around the concept of blood types in the popular consciousness. Here are some of the most common and their answers.
Do blood types affect personality?
Japan's blood type personality theory is a contemporary resurrection of humorism. The idea states that your blood type directly affects your personality, so type A blood carriers are kind and fastidious, while type B carriers are optimistic and do their own thing. However, a 2003 study sampling 180 men and 180 women found no relationship between blood type and personality.
The theory makes for a fun question on a Cosmopolitan quiz, but that's as accurate as it gets.
Should you alter your diet based on your blood type?
Remember Galen of Pergamum? In addition to bloodletting, he also prescribed his patients to eat certain foods depending on which humors needed to be balanced. Wine, for example, was considered a hot and dry drink, so it would be prescribed to treat a cold. In other words, the belief that your diet should complement your blood type is yet another holdover of humorism theory.
Created by Peter J. D'Adamo, the Blood Type Diet argues that one's diet should match one's blood type. Type A carriers should eat a meat-free diet of whole grains, legumes, fruits, and vegetables; type B carriers should eat green vegetables, certain meats, and low-fat dairy; and so on.
However, a study from the University of Toronto analyzed the data from 1,455 participants and found no evidence to support the theory. While people can lose weight and become healthier on the diet, it probably has more to do with eating all those leafy greens than blood type.
Are there links between blood types and certain diseases?
There is evidence to suggest that different blood types may increase the risk of certain diseases. One analysis suggested that type O blood decreases the risk of having a stroke or heart attack, while AB blood appears to increase it. On the other hand, type O carriers have a greater chance of developing peptic ulcers and skin cancer.
None of this is to say that your blood type will foredoom your medical future. Many factors, such as diet and exercise, influence your health, likely to a greater extent than blood type.
What is the most common blood type?
In the United States, the most common blood type is O+. Roughly one in three people sports this type of blood. Of the eight well-known blood types, the least common is AB-. Only one in 167 people in the U.S. have it.
Do animals have blood types?
They most certainly do, but they are not the same as ours. This difference is why those 17th-century patients who thought, "Animal blood, now that's the ticket!" ultimately had their tickets punched. In fact, blood types are distinct between species. Unhelpfully, scientists sometimes use the same nomenclature to describe these different types. Cats, for example, have A and B antigens, but these are not the same A and B antigens found in humans.
Interestingly, xenotransfusion is making a comeback. Scientists are working to genetically engineer the blood of pigs to potentially produce human-compatible blood.
Scientists are also looking into creating synthetic blood. If they succeed, they may be able to ease the current blood shortage, while also devising a way to create blood for rare blood type carriers. While this may make golden blood less golden, it would certainly make it easier to live with.

*While antigens are typically proteins, they can be other molecules as well, such as polysaccharides.
The author of 'How We Read Now' explains.
During the pandemic, many college professors abandoned assignments from printed textbooks and turned instead to digital texts or multimedia coursework.
As a professor of linguistics, I have been studying how electronic communication compares to traditional print when it comes to learning. Is comprehension the same whether a person reads a text onscreen or on paper? And are listening and viewing content as effective as reading the written word when covering the same material?
The answers to both questions are often “no,” as I discuss in my book “How We Read Now,” released in March 2021. The reasons relate to a variety of factors, including diminished concentration, an entertainment mindset and a tendency to multitask while consuming digital content.
Print versus digital reading
The benefits of print particularly shine through when experimenters move from posing simple tasks – like identifying the main idea in a reading passage – to ones that require mental abstraction – such as drawing inferences from a text. Print reading also improves the likelihood of recalling details – like “What was the color of the actor's hair?” – and remembering where in a story events occurred – “Did the accident happen before or after the political coup?”
Studies show that both grade school students and college students assume they'll get higher scores on a comprehension test if they have done the reading digitally. And yet, they actually score higher when they have read the material in print before being tested.
Educators need to be aware that the method used for standardized testing can affect results. Studies of Norwegian tenth graders and U.S. third through eighth graders report higher scores when standardized tests were administered using paper. In the U.S. study, the negative effects of digital testing were strongest among students with low reading achievement scores, English language learners and special education students.
My own research and that of colleagues approached the question differently. Rather than having students read and take a test, we asked how they perceived their overall learning when they used print or digital reading materials. Both high school and college students overwhelmingly judged reading on paper as better for concentration, learning and remembering than reading digitally.
The discrepancies between print and digital results are partly related to paper's physical properties. With paper, there is a literal laying on of hands, along with the visual geography of distinct pages. People often link their memory of what they've read to how far into the book it was or where it was on the page.
But equally important is mental perspective, and what reading researchers call a “shallowing hypothesis.” According to this theory, people approach digital texts with a mindset suited to casual social media, and devote less mental effort than when they are reading print.
Podcasts and online video
Given increased use of flipped classrooms – where students listen to or view lecture content before coming to class – along with more publicly available podcasts and online video content, many school assignments that previously entailed reading have been replaced with listening or viewing. These substitutions have accelerated during the pandemic and move to virtual learning.
Surveying U.S. and Norwegian university faculty in 2019, University of Stavanger Professor Anne Mangen and I found that 32% of U.S. faculty were now replacing texts with video materials, and 15% reported doing so with audio. The numbers were somewhat lower in Norway. But in both countries, 40% of respondents who had changed their course requirements over the past five to 10 years reported assigning less reading today.
A primary reason for the shift to audio and video is students refusing to do assigned reading. While the problem is hardly new, a 2015 study of more than 18,000 college seniors found only 21% usually completed all their assigned course reading.
Maximizing mental focus
Researchers found similar results with university students reading an article versus listening to a podcast of the text. A related study confirms that students do more mind-wandering when listening to audio than when reading.
Results with younger students are similar, but with a twist. A study in Cyprus concluded that the relationship between listening and reading skills flips as children become more fluent readers. While second graders had better comprehension with listening, eighth graders showed better comprehension when reading.
Research on learning from video versus text echoes what we see with audio. For example, researchers in Spain found that fourth through sixth graders who read texts showed far more mental integration of the material than those watching videos. The authors suspect that students “read” the videos more superficially because they associate video with entertainment, not learning.
The collective research shows that digital media have common features and user practices that can constrain learning. These include diminished concentration, an entertainment mindset, a propensity to multitask, lack of a fixed physical reference point, reduced use of annotation and less frequent reviewing of what has been read, heard or viewed.
Digital texts, audio and video all have educational roles, especially when providing resources not available in print. However, for maximizing learning where mental focus and reflection are called for, educators – and parents – shouldn't assume all media are the same, even when they contain identical words.