Chocolate: A Dangerous Drug?
Nora D. Volkow, M.D., became Director of the National Institute on Drug Abuse (NIDA) at the National Institutes of Health in May 2003. NIDA supports most of the world's research on the health aspects of drug abuse and addiction.
Dr. Volkow's work has been instrumental in demonstrating that drug addiction is a disease of the human brain. As a research psychiatrist and scientist, Dr. Volkow pioneered the use of brain imaging to investigate the toxic effects of drugs and their addictive properties. Her studies have documented changes in the dopamine system affecting the actions of frontal brain regions involved with motivation, drive, and pleasure and the decline of brain dopamine function with age. She has also made important contributions to the neurobiology of obesity, ADHD, and the behavioral changes that occur with aging.
Question: Are there certain patterns of food consumption that lead to addiction?
Nora Volkow: Oh, yeah, and I do love chocolate. Actually, I have found very few people that don't like chocolate. There are certain instances, of course, where you see people that are morbidly obese, where control over food is basically almost impossible. They don't want to be obese; I've never encountered anybody that wants to be obese. So I've been intrigued. That was the first question in my brain: we were seeing these changes in the conditioned responses and in the ability to exert control in people that are addicted, and the behavior is similar to what you see in people that are morbidly obese. So that's how I started to systematically use imaging to understand the brain. And again, dopamine is very important. Dopamine drives the motivation to eat. In animals, you can actually predict how much an animal is willing to press a lever in order to get food on the basis of how much that stimulus releases dopamine. The more it releases dopamine, the more the drive to get the food.
Now, why is it that some people are more sensitive to food versus something else, or more sensitive to chocolate versus french fries? Well, again, food is more complicated than drugs. With drugs, abuse starts with the rewarding response. With food, what drives eating behavior is, first, chemical signaling aimed at maintaining a balance of calories and energy requirements, signals that come from all throughout the body. That's one. And the other is pleasure and reward. Food can be very, very rewarding and reinforcing. And I would put the concept forward that most people that overeat, and we all overeat here or there, do it because food is pleasurable. And also because food can decrease anxiety. So in a stressful situation, you can eat, and that will decrease stress responses in your body. So food has the function of maintaining energy, of activating reward, and also of decreasing stress. It's not absurd that we have coined this term "comfort food," because it does decrease stress responses.
Now, even when glucose signals are saying you have enough energy, you can overcome the normal satiety response with food that you remember and know tastes very, very good. That's why, even at satiety, you have dessert, and one of those desserts is chocolate, right? You may be satiated, and there's no more dopamine that's going to be triggered by seeing a piece of chicken at that point. But it may be triggered by a conditioned response that you have with that particular chocolate.
So just like we were discussing with drugs, where you've got conditioning: even if you are satiated, if you bring in a stimulus that's salient enough, because you've had it in the past and it tastes very, very good, that will trigger the release of dopamine that will drive you to eat it.
Why do some of us fall into compulsive patterns and others do not? Many factors are going to be playing roles here. Again, aspects of vulnerability, but also conditions. Look around yourself when you're in an airport and they cancel the flights. You'll immediately see people going in there and starting to eat. So when you are stressed, you are much more likely to fall into one of these patterns of compulsively eating more food than you wanted to. And this again has to do with some of the conditioned responses.
Our brains did not evolve at all for us to take drugs. What happened was that by just randomness of nature, certain chemicals, which we call drugs, are able to activate the same circuits that develop there in our brain to ensure that we will engage in a particular behavior. So therefore, it's not surprising that there's such a tremendous overlap. And the question that emerges is, why is then though, that if this is such an important process for survival, could it be that it goes wrong in such a way that people compulsively overeat and become obese at the expense of their own well being? Well, of course, this is of great interest because we are facing a massive obesity epidemic that is affecting the health of our society, and this is not just a problem in the United States, it's a problem everywhere. And this has to do with the social factors of easy access to food, food availability, diversity of food. Food that is extraordinarily appealing, that creates these conditioned responses.
So we've mastered the art of marketing the most powerful food reinforcers. For me, I would sort of say it's chocolate, but for someone else it may be something else. And it's not just one chocolate: I can go there and there's all this diversity of chocolates. So I'm conditioned, and if I get satiated by the taste of one chocolate, I can turn around to the left and there's a whole other variety that's now intriguing. So we've generated a system where, many times, I actually ask myself: no wonder we have a problem of obesity in this country. I am surprised we don't have an even more serious one, because we are conditioned to the diversity of food. Food stimuli are everywhere. I walk, and now I'm in New York City, and oh, my God, I walk and all of these stores are showing the most appealing food. And I'm conditioned to it; everybody's conditioned to it. I was telling you the story where you show the drug to a person that's addicted to drugs and you get that conditioned stimulus response. We've done exactly the same study with food: we take people that are not obese, we just show them the food when they are food deprived, and you get exactly the same response. You see the food, the dopamine system gets activated, and that engages the motivational drive. That's why when you go and see the Godiva chocolates behind the glass, you want them, you want them! Of course you want them. And what you have to do is say to your brain, your frontal cortex, "No, I'm not going to eat it! No!" So you have to inhibit.
So we're constantly inhibiting. That's what happens to us when we get exposed to all of these food stimuli all over the place: we have to inhibit the response to want to eat them. It's just the way that our brain is hard-wired, and that reaction has to be modulated.
Recorded on November 6, 2009
Scientist Nora Volkow’s research shows links between food and addiction. Food, just like drugs, is linked to dopamine.
China has reached a new record for nuclear fusion at 120 million degrees Celsius.
This article was originally published on our sister site, Freethink.
China wants to build a mini-star on Earth and house it in a reactor. Many teams across the globe have this same bold goal --- which would create unlimited clean energy via nuclear fusion.
But according to Chinese state media, New Atlas reports, the team at the Experimental Advanced Superconducting Tokamak (EAST) has set a new world record: temperatures of 120 million degrees Celsius for 101 seconds.
Yeah, that's hot. So what? Nuclear fusion reactions require an insane amount of heat and pressure. The sun's core sits at roughly 15 million degrees C, but because a reactor on Earth can't match the sun's crushing pressure, it needs temperatures around ten times hotter, on the order of 150 million degrees C.
If scientists can essentially build a sun on Earth, they can create endless energy by mimicking how the sun does it. In nuclear fusion, the extreme heat and pressure create a plasma. Then, within that plasma, two or more hydrogen nuclei crash together, merge into a heavier atom, and release a ton of energy in the process.
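To see where that "ton of energy" comes from, here is a back-of-the-envelope check of the mass lost in the deuterium-tritium reaction, the fuel typically used in tokamaks like EAST. The isotope masses are standard published values, and the script is only an illustrative sketch:

```python
# Back-of-the-envelope: energy released by D-T fusion via E = delta_m * c^2.
# Masses in unified atomic mass units (u); 1 u is equivalent to ~931.494 MeV.
M_DEUTERIUM = 2.014102   # mass of a deuterium nucleus plus electron, in u
M_TRITIUM   = 3.016049   # tritium
M_HELIUM4   = 4.002602   # helium-4
M_NEUTRON   = 1.008665   # free neutron
U_TO_MEV    = 931.494    # energy equivalent of 1 u, in MeV

def dt_fusion_energy_mev():
    """Mass lost when D + T -> He-4 + n, converted to MeV."""
    mass_in = M_DEUTERIUM + M_TRITIUM
    mass_out = M_HELIUM4 + M_NEUTRON
    return (mass_in - mass_out) * U_TO_MEV

print(f"{dt_fusion_energy_mev():.1f} MeV")  # ~17.6 MeV per reaction
```

About 17.6 MeV per fusion event, millions of times the energy of a typical chemical bond, which is why a tiny amount of fuel goes such a long way.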
Nuclear fusion milestones: The team at EAST built a giant metal torus (similar in shape to a giant donut) with a series of magnetic coils. The coils hold hot plasma where the reactions occur. They've reached many milestones along the way.
According to New Atlas, in 2016, the scientists at EAST could heat hydrogen plasma to roughly 50 million degrees C for 102 seconds. Two years later, they reached 100 million degrees for 10 seconds.
The temperatures are impressive, but the short reaction times and the lack of pressure are further obstacles. Fusion is simple for the sun because stars are massive and gravity compresses the core from every direction. That pressure squeezes the hydrogen in the sun's core so immensely that nuclei fuse into heavier ones, releasing energy.
But on Earth, we have to supply all of the pressure to keep the reaction going, and it has to be perfectly even. It's hard to do this for any length of time, and it uses a ton of energy. So the reactions usually fizzle out in minutes or seconds.
Still, the latest record of 120 million degrees and 101 seconds is one more step toward sustaining longer and hotter reactions.
Why does this matter? No one denies that humankind needs a clean, unlimited source of energy.
We all recognize that oil and gas are limited resources. But even wind and solar power --- renewable energies --- are fundamentally limited. They are dependent upon a breezy day or a cloudless sky, which we can't always count on.
Nuclear fusion is clean, safe, and environmentally sustainable --- its fuel is a nearly limitless resource since it is simply hydrogen (which can be easily made from water).
With each new milestone, we are creeping closer and closer to a breakthrough for unlimited, clean energy.
The symbol for love is the heart, but the brain may be more accurate.
- How love makes us feel can only be defined on an individual basis, but what it does to the body, specifically the brain, is now less abstract thanks to science.
- One of the problems with early-stage attraction, according to anthropologist Helen Fisher, is that it activates parts of the brain that are linked to drive, craving, obsession, and motivation, while other regions that deal with decision-making shut down.
- Dr. Fisher, professor Ted Fischer, and psychiatrist Gail Saltz explain the different types of love, explore the neuroscience of love and attraction, and share tips for sustaining relationships that are healthy and mutually beneficial.
We explore the history of blood types and how they are classified to find out what makes the Rh-null type important to science and dangerous for those who live with it.
- Fewer than 50 people worldwide have 'golden blood' — or Rh-null.
- Blood is considered Rh-null if it lacks all of the 61 possible antigens in the Rh system.
- It's also very dangerous to live with this blood type, as so few people have it.
Golden blood sounds like the latest in medical quackery. As in, get a golden blood transfusion to balance your tantric midichlorians and receive a free charcoal ice cream cleanse. Don't let the New-Agey moniker throw you. Golden blood is actually the nickname for Rh-null, the world's rarest blood type.
As Mosaic reports, the type is so rare that only about 43 people have been reported to have it worldwide, and until 1961, when it was first identified in an Aboriginal Australian woman, doctors assumed embryos with Rh-null blood would simply die in utero.
But what makes Rh-null so rare, and why is it so dangerous to live with? To answer that, we'll first have to explore why hematologists classify blood types the way they do.
A (brief) bloody history
Our ancestors understood little about blood. Even the most basic of blood knowledge — blood inside the body is good, blood outside is not ideal, too much blood outside is cause for concern — escaped humanity's grasp for an embarrassing number of centuries.
Absent this knowledge, our ancestors devised less-than-scientific theories as to what blood was, theories that varied wildly across time and culture. To pick just one, the physicians of Shakespeare's day believed blood to be one of four bodily fluids or "humors" (the others being black bile, yellow bile, and phlegm).
Handed down from ancient Greek physicians, humorism stated that these bodily fluids determined someone's personality. Blood was considered hot and moist, resulting in a sanguine temperament. The more blood people had in their systems, the more passionate, charismatic, and impulsive they would be. Teenagers were considered to have a natural abundance of blood, and men had more than women.
Humorism led to all sorts of poor medical advice. Most famously, Galen of Pergamon used it as the basis for his prescription of bloodletting. Sporting a "when in doubt, let it out" mentality, Galen declared blood the dominant humor, and bloodletting an excellent way to balance the body. Blood's relation to heat also made it a go-to for fever reduction.
While bloodletting remained common until well into the 19th century, William Harvey's discovery of the circulation of blood in 1628 would put medicine on its path to modern hematology.
Soon after Harvey's discovery, the earliest blood transfusions were attempted, but it wasn't until 1665 that the first successful transfusion was performed by British physician Richard Lower. Lower's operation was between dogs, and his success prompted physicians like Jean-Baptiste Denis to try to transfuse blood from animals to humans, a process called xenotransfusion. The death of human patients ultimately led to the practice being outlawed.
The first successful human-to-human transfusion wouldn't be performed until 1818, when British obstetrician James Blundell managed it to treat postpartum hemorrhage. But even with a proven technique in place, in the following decades many blood-transfusion patients continued to die mysteriously.
Enter Austrian physician Karl Landsteiner. In 1901 he began his work to classify blood groups. Exploring the work of Leonard Landois — the physiologist who showed that when the red blood cells of one animal are introduced to a different animal's, they clump together — Landsteiner thought a similar reaction may occur in intra-human transfusions, which would explain why transfusion success was so spotty. In 1909, he classified the A, B, AB, and O blood groups, and for his work he received the 1930 Nobel Prize for Physiology or Medicine.
What causes blood types?
It took us a while to grasp the intricacies of blood, but today, we know that this life-sustaining substance consists of:
- Red blood cells — cells that carry oxygen and remove carbon dioxide throughout the body;
- White blood cells — immune cells that protect the body against infection and foreign agents;
- Platelets — cells that help blood clot; and
- Plasma — a liquid that carries salts and enzymes.
Each component has a part to play in blood's function, but the red blood cells are responsible for our differing blood types. These cells have proteins* covering their surface called antigens, and the presence or absence of particular antigens determines blood type — type A blood has only A antigens, type B only B, type AB both, and type O neither. Red blood cells sport another antigen called the RhD protein. When it is present, a blood type is said to be positive; when it is absent, it is said to be negative. The typical combinations of A, B, and RhD antigens give us the eight common blood types (A+, A-, B+, B-, AB+, AB-, O+, and O-).
Blood antigen proteins play a variety of cellular roles, but recognizing foreign cells in the blood is the most important for this discussion.
Think of antigens as backstage passes to the bloodstream, while our immune system is the doorman. If the immune system recognizes an antigen, it lets the cell pass. If it does not recognize an antigen, it initiates the body's defense systems and destroys the invader. So, a very aggressive doorman.
While our immune systems are thorough, they are not too bright. If a person with type A blood receives a transfusion of type B blood, the immune system won't recognize the new substance as a life-saving necessity. Instead, it will consider the red blood cells invaders and attack. This is why so many people either grew ill or died during transfusions before Landsteiner's brilliant discovery.
This is also why people with O negative blood are considered "universal donors." Since their red blood cells lack A, B, and RhD antigens, immune systems have no way to recognize these cells as foreign and so leave them well enough alone.
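That antigen-matching rule is simple enough to sketch in code. This is only an illustrative model of the eight common ABO/RhD types (the function names are mine, and real cross-matching involves far more than this):

```python
# Illustrative model: a donation is safe if the donor's red cells carry
# no antigen the recipient's immune system would treat as foreign.
ANTIGENS = {"A": {"A"}, "B": {"B"}, "AB": {"A", "B"}, "O": set()}

def antigens_of(blood_type):
    """Antigens on red cells for a type written like 'O-' or 'AB+'."""
    group, rh = blood_type[:-1], blood_type[-1]
    result = set(ANTIGENS[group])
    if rh == "+":
        result.add("RhD")  # positive types also carry the RhD antigen
    return result

def can_donate(donor, recipient):
    """Donor cells pass the 'doorman' if every antigen is already familiar."""
    return antigens_of(donor) <= antigens_of(recipient)

print(can_donate("O-", "AB+"))  # True: no antigens, nothing to attack
print(can_donate("A+", "O-"))   # False: A and RhD are both foreign
```

Run every donor type against every recipient type and O- is the only one that passes everywhere, which is exactly the "universal donor" property described above.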
How is Rh-null the rarest blood type?
Let's return to golden blood. In truth, the eight common blood types are an oversimplification of how blood types actually work. As Smithsonian.com points out, "[e]ach of these eight types can be subdivided into many distinct varieties," resulting in millions of different blood types, each classified on a multitude of antigen combinations.
Here is where things get tricky. The RhD protein mentioned earlier is only one of 61 potential antigens in the Rh system. Blood is considered Rh-null if it lacks all 61. This not only makes it rare; it also means Rh-null blood can be accepted by anyone with a rare blood type within the Rh system.
This is why it is considered "golden blood." It is worth its weight in gold.
As Mosaic reports, golden blood is incredibly important to medicine but also very dangerous to live with. If an Rh-null carrier needs a blood transfusion, they can find it difficult to locate a donor, and blood is notoriously difficult to transport internationally. Rh-null carriers are encouraged to donate blood as insurance for themselves, but with so few donors spread across the world and limits on how often they can donate, this also places an altruistic burden on the select few who agree to donate for others.
Some bloody good questions about blood types
There remain many mysteries regarding blood types. For example, we still don't know why humans evolved the A and B antigens. Some theories point to these antigens as a byproduct of the diseases various populations encountered throughout history. But we can't say for sure.
In this absence of knowledge, various myths and questions have grown around the concept of blood types in the popular consciousness. Here are some of the most common and their answers.
Do blood types affect personality?
Japan's blood type personality theory is a contemporary resurrection of humorism. The idea states that your blood type directly affects your personality, so type A blood carriers are kind and fastidious, while type B carriers are optimistic and do their own thing. However, a 2003 study sampling 180 men and 180 women found no relationship between blood type and personality.
The theory makes for a fun question on a Cosmopolitan quiz, but that's as accurate as it gets.
Should you alter your diet based on your blood type?
Remember Galen of Pergamon? In addition to bloodletting, he also prescribed certain foods to his patients depending on which humors needed to be balanced. Wine, for example, was considered a hot and dry drink, so it would be prescribed to treat a cold. In other words, the belief that your diet should complement your blood type is yet another holdover of humorism.
Created by Peter J. D'Adamo, the Blood Type Diet argues that one's diet should match one's blood type. Type A carriers should eat a meat-free diet of whole grains, legumes, fruits, and vegetables; type B carriers should eat green vegetables, certain meats, and low-fat dairy; and so on.
However, a study from the University of Toronto analyzed the data from 1,455 participants and found no evidence to support the theory. While people can lose weight and become healthier on the diet, it probably has more to do with eating all those leafy greens than blood type.
Are there links between blood types and certain diseases?
There is evidence to suggest that different blood types may increase the risk of certain diseases. One analysis suggested that type O blood decreases the risk of having a stroke or heart attack, while AB blood appears to increase it. With that said, type O carriers have a greater chance of developing peptic ulcers and skin cancer.
None of this is to say that your blood type seals your medical fate. Many factors, such as diet and exercise, influence your health, likely to a greater extent than blood type.
What is the most common blood type?
In the United States, the most common blood type is O+. Roughly one in three people sports this type of blood. Of the eight well-known blood types, the least common is AB-. Only one in 167 people in the U.S. have it.
Do animals have blood types?
They most certainly do, but they are not the same as ours. This difference is why those 17th-century patients who thought, "Animal blood, now that's the ticket!" ultimately had their tickets punched. In fact, blood types are distinct between species. Unhelpfully, scientists sometimes use the same nomenclature to describe these different types. Cats, for example, have A and B antigens, but these are not the same A and B antigens found in humans.
Interestingly, xenotransfusion is making a comeback. Scientists are working to genetically engineer the blood of pigs to potentially produce human compatible blood.
Scientists are also looking into creating synthetic blood. If they succeed, they may be able to ease the current blood shortage, while also devising a way to create blood for rare blood type carriers. While this may make golden blood less golden, it would certainly make it easier to live with.

*While antigens are typically proteins, they can be other molecules as well, such as polysaccharides.
A new study suggests that reports of the impending infertility of the human male are greatly exaggerated.
- A new review of a famous study on declining sperm counts finds several flaws.
- The old report makes unfounded assumptions, has faulty data, and tends toward panic.
- The new report does not rule out that sperm counts are going down, only that this could be quite normal.
Several years ago, a meta-analysis of studies on human fertility came out warning us about the declining sperm counts of Western men. It was widely shared, and its findings were featured on the covers of popular magazines. Indeed, its findings were alarming: a nearly 60 percent decline in sperm per milliliter since 1973 with no end in sight. It was only a matter of time, the authors argued, until men were firing blanks, literally.
Well… never mind.
It turns out that the impending demise of humanity was greatly exaggerated. As the predicted infertility wave crashed upon us, there was neither a great rush of men to fertility clinics nor a sudden dearth of new babies. The only discussions about population decline focus on urbanization and the fact that people choose not to have kids rather than not being able to have them.
Now, a new analysis of the 2017 study says that lower sperm counts are nothing to be surprised by. Published in Human Fertility, its authors point to flaws in the original paper's data and interpretation and call for a more careful reanalysis.
Counting tiny things is difficult
The original 2017 report analyzed 185 studies on 43,000 men and their reproductive health. Its findings were clear: "a significant decline in sperm counts… between 1973 and 2011, driven by a 50-60 percent decline among men unselected by fertility from North America, Europe, Australia and New Zealand."
However, the new analysis points out flaws in the data. As many as a third of the men in the studies were of unknown age, an important factor in reproductive health. In 45 percent of cases, the year of sample collection was unknown, a big detail to miss in a study measuring change over time. The quality controls and conditions for sample collection and analysis varied widely from study to study, which likely influenced the measured sperm counts in the samples.
Another study from 2013 also points out that the methods for determining sperm count were only standardized in the 1980s, which occurred after some of the data points were collected for the original study. It is entirely possible that the early studies gave inaccurately high sperm counts.
This is not to say that the 2017 paper is entirely useless; its methodology was much more rigorous than that of previous studies on the subject, which also claimed to identify a decline in sperm counts. But rigor relative to its predecessors does not make its own problems go away.
Garbage in, garbage out
Predictable as always, the media went crazy. Discussions of the decline of masculinity took off, both in mainstream and less-than-reputable forums; concerns about the imagined feminizing traits of soy products continued to increase; and the authors of the original study were called upon to discuss the findings themselves in a number of articles.
However, as this new review points out, some of the findings of that meta-analysis are debatable at best. For example, the 2017 report suggests that "declining mean [sperm count] implies that an increasing proportion of men have sperm counts below any given threshold for sub-fertility or infertility," despite little empirical evidence that this is the case.
The WHO offers a large range for what it considers to be a healthy sperm count, from 15 to 250 million sperm per milliliter. The benefits to fertility above a count of 40 million are seen as minimal, and the original study found a mean sperm concentration of 47 million sperm per milliliter.
Healthy sperm, healthy man?
The claim that sperm count is evidence of larger health problems is also scrutinized in this new article. While it is true that many major health problems can impact reproductive health, there is little evidence that it is the "canary in the coal mine" for overall well-being. A number of studies suggest that any relation between lifestyle choices and this part of reproductive health is limited at best.
Lastly, ideas that environmental factors could be at play have been debunked since 2017. While the original paper considered the idea that pollutants, especially from plastics, could be at fault, it is now known that this kind of pollution is worse in the parts of the world that the original paper observed higher sperm counts in (i.e., non-Western nations).
There never was a male fertility crisis
The authors of the new review do not deny that some measurements are showing lower sperm counts, but they do question the claim that this is catastrophic or part of a larger pathological issue. They propose a new interpretation of the data. Dubbed the "Sperm Count Biovariability hypothesis," it is summarized as:
"Sperm count varies within a wide range, much of which can be considered non-pathological and species-typical. Above a critical threshold, more is not necessarily an indicator of better health or higher probability of fertility relative to less. Sperm count varies across bodies, ecologies, and time periods. Knowledge about the relationship between individual and population sperm count and life-historical and ecological factors is critical to interpreting trends in average sperm counts and their relationships to human health and fertility."
Still, the authors do not rule out that sperm counts "could decline due to negative environmental exposures, or that this may carry implications for men's health and fertility."
However, they disagree that a decline in absolute sperm count is necessarily a bad sign for men's health and fertility. We aren't at a civilization-ending catastrophe just yet.