The Dark Side of Antioxidants
Not all vitamins are good for all people, all the time. In fact, some can kill you. And guess what? We know where the bodies are buried.
The story of the dark side of antioxidant research isn't well known outside of medical circles. It's an unseemly story, profoundly unsettling; a story that refuses to be made pretty or happy or uplifting no matter how hard you try to duct-tape a silver lining around it. It doesn't fit the "antioxidants are good for you" mantra that sells billions of dollars per year of blueberry- and pomegranate-fortified granola bars and tocopherol-enriched cereals, acai-berry Jell-O mixes, juices and yogurts with added vitamins, organic baby foods, and so forth, not to mention the billions of dollars of nutritional supplements sold each year (to say nothing of the sub-industry of books and magazines devoted to nutrition).
Still, it's a story that needs to be told. And some of us know where the bodies are buried.
For decades, mainstream medicine pooh-poohed the possibility that vitamins or supplements could "move the needle" on major diseases. Two-time Nobel laureate Linus Pauling was harshly criticized in the 1970s and 80s for suggesting a role for Vitamin C in prevention and treatment of cancer. Even so, laboratory workers had known for years that changes to diet could influence the rate of tumor appearance in lab animals. By the early 1980s, case-control studies and epidemiological evidence from a variety of sources had begun to accumulate, showing that persons who routinely ate large quantities of fresh fruits and vegetables consistently did better with regard to cardiovascular disease (and other diseases) than most people.
In 1981, Sir Richard Peto and colleagues published a paper in Nature that dared to ask the simple question: "Can dietary beta-carotene materially reduce human cancer rates?" (Nature, 290:201-208) Shortly thereafter, the National Cancer Institute (whose Chemoprevention branch was headed by Dr. Michael B. Sporn, one of the coauthors of the Nature article) decided to green-light two large intervention-based studies of the cancer-preventing effects of nutritional supplements: a study in Finland involving beta-carotene and alpha-tocopherol (Vitamin E), and a U.S.-based study involving retinol (a form of Vitamin A) and beta-carotene.
The Finland study (conducted by Finland's National Institute for Health and Welfare) was initially designed to encompass 18,000 male smokers between the ages of 50 and 69. Why just smokers? And why male, and 50+ years old? Lung cancer is ten times more likely to affect smokers; hence a cancer study limited to smokers would need only a tenth as many participants as a study involving the general population. Based on what was known about the age-specific rates of lung cancer among Finnish men, study designers calculated that the desired effect size (a hoped-for 25% decrease in cancer incidence over a period of 6 years) would be detectable with the required statistical power if 18,000 older male smokers made up the study group. As it turned out, the age distribution of actual volunteers didn't match the demographics of the eligibility group (volunteers skewed toward the young end of the eligibility range), and as a result the study's enrollment target had to be raised to 27,000 to preserve that statistical power.
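The kind of power calculation the designers performed can be sketched with the standard normal-approximation formula for comparing two proportions. The baseline incidence below is a hypothetical placeholder, not a figure from the study; the z-constants correspond to a conventional two-sided 5% significance level and 80% power.

```python
import math

# Hypothetical inputs for illustration (NOT the study's actual numbers):
# assumed cumulative lung-cancer incidence over the trial period in the
# placebo arm, and the hoped-for 25% reduction in the treatment arm.
p_placebo = 0.02
p_treated = p_placebo * (1 - 0.25)  # 25% reduction -> 0.015

z_alpha = 1.96    # two-sided significance level of 5%
z_beta = 0.8416   # 80% power

# Normal-approximation sample size per group for comparing two proportions.
variance_sum = p_placebo * (1 - p_placebo) + p_treated * (1 - p_treated)
n_per_group = math.ceil(
    (z_alpha + z_beta) ** 2 * variance_sum / (p_placebo - p_treated) ** 2
)

print(n_per_group)  # -> 10793 per arm under these assumed inputs
```

Note how sensitive the result is to the assumed baseline rate; a shift in the age mix of actual volunteers (and hence their expected cancer rate) is exactly the kind of change that forces an enrollment target upward.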
Full-scale recruitment of subjects into the ATBC (Alpha-Tocopherol Beta-Carotene) Lung Cancer Prevention Study began in April 1985 and continued until final enrollment reached 29,246 men in June 1988. Enrollees were randomized into one of four equal-sized groups, receiving either 50 mg/day (about 6 times the RDA) of alpha-tocopherol, 20 mg/day of beta-carotene (equivalent to around 3 times the RDA of Vitamin A), both supplements together, or placebo only.
At the same time, which is to say starting in 1985 (after some very small, very brief pilot studies to validate recruitment mechanics), the Carotene and Retinol Efficacy Trial (CARET) started enrolling volunteers in the U.S. Unlike Finland's ATBC study, CARET's volunteers were both male and female, and were either heavy smokers or came from asbestos-exposed workplace environments. They ranged in age from 45 to 69 and were divided initially into four groups (30 mg/day beta-carotene only, 25,000 IU/day retinol only, beta-carotene plus retinol, or placebo), but in 1988 the treatment groups were consolidated into one group taking both beta-carotene and retinol. The study design called for continuing the vitamin regimen through 1997, with reporting of results to occur in 1998.
Alas, things went horribly awry, and CARET never got that far.
When the Finns reported results from the ATBC study in April 1994, the findings sent shock waves through the medical world. Not only had alpha-tocopherol and beta-carotene not provided the expected protective effect against lung cancer; the supplement-treated groups actually experienced more cancer than the placebo group—18% more, in fact.
This was an astonishing result, utterly bewildering, as it contradicted numerous prior animal studies that had shown Vitamin E and beta-carotene to be promising cancer preventatives. Surely an error had occurred. Something had to have gone wrong. One thing it couldn't be was chance variation: with almost 30,000 participants (three quarters of them in treatment groups), this was not a small study. The results couldn't be a statistical fluke.
As it turns out, the Finnish investigators had actually done a meticulous job from start to finish. In analyzing their data, they had looked for possible confounding factors. The only thing they found of interest was that heavy drinkers in the treatment group got cancer more often than light drinkers.
Two weeks before the Finnish study hit, the National Cancer Institute was awash in conference calls. Accounts vary as to who knew what, when, but CARET's lead investigator, who had seen the Finnish group's data prior to publication, knew that NCI now had a serious problem on its hands. CARET was doing essentially the same experiment the Finns had done, except it was giving even bigger doses of supplements to its U.S. participants, and the study was due to run for another three and a half years. What if CARET's treatment group was also experiencing elevated cancer rates? Participants might be dying needlessly.
When statisticians presented interim results to CARET's Safety Endpoint Monitoring Committee in August 1994, four months after the Finnish study appeared in print, it became clear that CARET participants were, if anything, faring worse than the participants in the ATBC study. Even so, the safety committee found itself deadlocked on whether to call a premature halt to CARET. The study's formal stopping criteria (as given by something called the O’Brien–Fleming early-stopping boundary) had not been met. Ultimately the committee decided to continue accumulating data.
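A rough illustration of why the committee's hands were tied: under an O'Brien–Fleming design, the evidence threshold at an interim look is deliberately much stricter than at the final analysis. The sketch below uses the standard tabulated constant for two equally spaced looks at an overall two-sided 5% level; this is an assumption for illustration, as CARET's actual monitoring plan may have been parameterized differently.

```python
import math

# O'Brien-Fleming boundaries: z_k = C * sqrt(K / k) at look k of K planned looks.
# For K = 2 looks and overall two-sided alpha = 0.05, the tabulated constant C
# is approximately 1.977 (standard group-sequential tables).
C = 1.977
K = 2

boundaries = [C * math.sqrt(K / k) for k in range(1, K + 1)]
# Interim look demands |z| > ~2.80; the final look only |z| > ~1.98.
```

An adverse trend alarming enough to worry clinicians can therefore still fall short of the interim boundary, which is essentially the position the committee found itself in during August 1994.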
A second interim statistical analysis was presented to CARET's safety committee in September 1995, one year after the first analysis. According to the committee:
At that time it was clear that the excess of lung cancer had continued to accumulate in the intervention regimen at about the same rate during the time since the first interim analysis. Further, the cardiovascular disease excess persisted. The conditional power calculations showed that it was extremely unlikely that the trial could show a beneficial effect of the intervention, even if the adverse effect ceased to occur and a delayed protective effect began to appear. Therefore the SEMC voted unanimously to recommend to NCI that the trial regimen should be stopped but the follow-up should continue.
The study was halted—but not until January 1996, nearly two years after final publication of the Finnish results. (Even then, CARET participants were contacted by snail mail to let them know of the study's early termination and the reasons for it. See this writeup for details.)
CARET's results were published in The New England Journal of Medicine in May 1996. Once again, shock waves reverberated throughout the medical world. Participants who took beta-carotene and Vitamin A supplements had shown a 28% higher rate of lung cancer. They also fared 26% worse for cardiovascular-related mortality, and 17% worse for all-cause mortality.
There was great reluctance in the medical community to believe the results. Perhaps the even-worse results of the CARET study (relative to the Finnish experiment) had to do with the decision to include 2,044 asbestos-exposed individuals in the treatment group of 9,241 persons? Not so, it turns out. Subgroup analysis comparing the asbestos-exposed group with the heavy-smoker group showed that "There was no statistical evidence of heterogeneity of the relative risk among these subgroups."
What the CARET study had, in fact, done was not just replicate the ATBC results but provide the beginnings of a dose-response curve. The Finns had used 20 mg/day of beta-carotene; CARET employed a dose 50% higher. The result had been roughly 50% more excess cancer (a 28% increase versus ATBC's 18%).
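The dose-response comparison is simple arithmetic on the two trials' published figures:

```python
# Figures from the two trials as reported above.
atbc_dose_mg, caret_dose_mg = 20, 30    # beta-carotene, mg/day
atbc_excess, caret_excess = 0.18, 0.28  # excess lung-cancer incidence vs. placebo

dose_ratio = caret_dose_mg / atbc_dose_mg  # 1.5, i.e. a 50% higher dose
excess_ratio = caret_excess / atbc_excess  # ~1.56, i.e. roughly 50% more excess cancer
```

Two points do not make a curve, of course; but the rough proportionality between dose and excess incidence is what suggested the beginnings of a dose-response relationship.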
It was hard to understand the results of the ATBC and CARET studies in light of the fact that another large trial involving beta-carotene, the Physicians' Health Study, had reported neither harm nor benefit from 50 mg of beta-carotene taken every other day for 12 years. However, the Physicians' Health Study population was younger and healthier than the ATBC or CARET study groups and was predominantly (89%) made up of non-smokers. This turned out to be quite important. (Read on.)
It's been almost 20 years since the ATBC and CARET results were reported. What have we learned in that time?
In 2007, Bjelakovic et al. undertook a systematic review of existing literature on antioxidant studies covering the time frame 1977 to 2006. The systematic review procedure was conducted using the well-regarded methodology of the Cochrane Collaboration, a group that specializes in (and is known for) high-quality meta-analyses. In analyzing the 47 most rigorously designed studies of supplement effectiveness, Bjelakovic et al. found that 15,366 study subjects (out of a total treatment population of 99,095 persons) died while taking antioxidants, whereas 9,131 placebo-takers, in control groups totalling 81,843 persons, died in those same studies. (This is not including ATBC or CARET results.) The studies in question used beta-carotene, Vitamin E, Vitamin A, Vitamin C, and/or selenium.
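Because the treatment and control populations differ in size, the raw counts above are easier to interpret as rates:

```python
# Bjelakovic et al. (2007) counts as quoted above.
deaths_treated, n_treated = 15_366, 99_095
deaths_placebo, n_placebo = 9_131, 81_843

rate_treated = deaths_treated / n_treated    # ~15.5% mortality in antioxidant arms
rate_placebo = deaths_placebo / n_placebo    # ~11.2% mortality in placebo arms
relative_risk = rate_treated / rate_placebo  # ~1.39, crude and unadjusted
```

A crude pooled ratio like this ignores differences between the individual trials (follow-up length, population age, dosage); the Cochrane analysis itself used proper meta-analytic weighting, so this back-of-envelope figure should be read only as "the treated groups died at a visibly higher rate," not as the review's formal risk estimate.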
In a separate meta-analysis, Miller et al. found a dose-dependent relationship of Vitamin E with all-cause mortality for 135,967 participants in 19 clinical trials. At daily doses below about 150 International Units, Vitamin E appears to be helpful; above that, harmful. Miller et al. concluded:
In view of the increased mortality associated with high dosages of beta-carotene and now vitamin E, use of any high-dosage vitamin supplements should be discouraged until evidence of efficacy is documented from appropriately designed clinical trials.
How are we to make sense of these results? Why have so many studies shown a harmful effect for antioxidants when so many other studies (particularly those carried out in animals, but also those carried out in predominantly healthy human populations) have shown a clear benefit?
The answer may have to do with something called apoptosis, otherwise known as programmed cell death. The body has ways of determining when cells have become dysfunctional to the point of needing to be told to shut down. Most cancer therapies exert their effect by inducing apoptosis, and it's fairly well accepted that in normal, healthy individuals, precancerous cells are constantly being formed, then destroyed through apoptosis. Antioxidants are known to interfere with apoptosis. In essence, they promote the survival of normal cells as well as cells that shouldn't be allowed to live.
If you're a young non-smoker in good health, the level of cell turnover (from apoptosis) in your body is nowhere near as high as the level of turnover in an older person, or someone at high risk of cancer. Therefore, antioxidants are apt to do more good than harm in a young, healthy person. But if your body is harboring cancer cells, you don't want antioxidants to encourage their growth by interfering with their apoptosis. That's the real lesson of antioxidant research.
The food industry and the people who make nutritional supplements have no interest in telling you any of the things you've read here. But now that you know the story of the dark side of antioxidants (a story made possible by thousands of ordinary people who died in the name of science), you owe it to yourself to take the story to heart. If you're a smoker or at high risk for heart disease or cancer, consider scaling back your use of antioxidant supplements (Vitamins A and E in particular); it could save your life. And please, if you found any of this information helpful, share it with family, friends, Facebook and Twitter followers, and others. The story needs to get out.
Why mega-eruptions like the ones that covered North America in ash are the least of your worries.
- The supervolcano under Yellowstone produced three massive eruptions over the past few million years.
- Each eruption covered much of what is now the western United States in an ash layer several feet deep.
- The last eruption was 640,000 years ago, but that doesn't mean the next eruption is overdue.
The end of the world as we know it
Panoramic view of Yellowstone National Park
Image: Heinrich Berann for the National Park Service – public domain
Of the many freak ways to shuffle off this mortal coil – lightning strikes, shark bites, falling pianos – here's one you can safely scratch off your worry list: an outbreak of the Yellowstone supervolcano.
As the map below shows, previous eruptions at Yellowstone were so massive that the ash fall covered most of what is now the western United States. A similar event today would not only claim countless lives directly, but also create enough subsidiary disruption to kill off global civilisation as we know it. A relatively recent eruption of the Toba supervolcano in Indonesia may have come close to killing off the human species (see further below).
However, just because a scenario is grim does not mean that it is likely (insert topical political joke here). In this case, the doom mongers claiming an eruption is 'overdue' are wrong. Yellowstone is not a library book or an oil change. Just because the previous mega-eruption happened long ago doesn't mean the next one is imminent.
Ash beds of North America
Ash beds deposited by major volcanic eruptions in North America.
Image: USGS – public domain
This map shows the location of the Yellowstone plateau and the ash beds deposited by its three most recent major outbreaks, plus two other eruptions – one similarly massive, the other the most recent one in North America.
The Huckleberry Ridge eruption occurred 2.1 million years ago. It ejected 2,450 km3 (588 cubic miles) of material, making it the largest known eruption in Yellowstone's history and in fact the largest eruption in North America in the past few million years.
This is the oldest of the three most recent caldera-forming eruptions of the Yellowstone hotspot. It created the Island Park Caldera, which lies partly within Yellowstone National Park in Wyoming and extends westward into Idaho. Ash from this eruption covered an area from southern California to North Dakota, and from southern Idaho to northern Texas.
About 1.3 million years ago, the Mesa Falls eruption ejected 280 km3 (67 cubic miles) of material and created the Henry's Fork Caldera, located in Idaho, west of Yellowstone.
It was the smallest of the three major Yellowstone eruptions, both in terms of material ejected and area covered: 'only' most of present-day Wyoming, Colorado, Kansas and Nebraska, and about half of South Dakota.
The Lava Creek eruption was the most recent major eruption of Yellowstone: about 640,000 years ago. It was the second-largest eruption in North America in the past few million years, creating the Yellowstone Caldera.
It ejected only about 1,000 km3 (240 cubic miles) of material, i.e. less than half of the Huckleberry Ridge eruption. However, its debris spread over a significantly wider area: essentially the Huckleberry Ridge footprint plus larger slices of both Canada and Mexico, plus most of Texas, Louisiana, Arkansas, and Missouri.
This eruption occurred about 760,000 years ago. It was centered on southern California, where it created the Long Valley Caldera, and spewed out 580 km3 (139 cubic miles) of material. This makes it North America's third-largest eruption of the past few million years.
The material ejected by this eruption is known as the Bishop ash bed, and covers the central and western parts of the Lava Creek ash bed.
Mount St Helens
The eruption of Mount St Helens in 1980 was the deadliest and most destructive volcanic event in U.S. history: it left a mile-wide crater, killed 57 people and caused economic damage in the neighborhood of $1 billion.
Yet by Yellowstone standards, it was tiny: Mount St Helens ejected only 0.25 km3 (0.06 cubic miles) of material, most of the ash settling in a relatively narrow band across Washington State and Idaho. By comparison, the Lava Creek eruption left a large swathe of North America under up to two metres of debris.
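The cubic-kilometre and cubic-mile figures quoted throughout can be cross-checked with a single conversion constant (1 cubic mile ≈ 4.168 km³):

```python
KM3_PER_MI3 = 1.609344 ** 3  # ~4.168 cubic kilometres per cubic mile

# Ejecta volumes in cubic kilometres, as given in the text.
eruptions_km3 = {
    "Huckleberry Ridge": 2450,
    "Lava Creek": 1000,
    "Long Valley (Bishop)": 580,
    "Mesa Falls": 280,
    "Mount St Helens 1980": 0.25,
}

eruptions_mi3 = {name: km3 / KM3_PER_MI3 for name, km3 in eruptions_km3.items()}
# Rounded, these reproduce the article's figures:
# 588, 240, 139 and 67 cubic miles, and 0.06 for Mount St Helens.
```

The conversion also makes the scale difference vivid: Huckleberry Ridge ejected roughly ten thousand times as much material as Mount St Helens did in 1980.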
The difference between quakes and volcanoes
The volume of dense rock equivalent (DRE) ejected by the Huckleberry Ridge event dwarfs all other North American eruptions. It is itself overshadowed by the DRE ejected at the most recent eruption at Toba (present-day Indonesia). This was one of the largest known eruptions ever and a relatively recent one: only 75,000 years ago. It is thought to have caused a global volcanic winter which lasted up to a decade and may be responsible for the bottleneck in human evolution: around that time, the total human population suddenly and drastically plummeted to between 1,000 and 10,000 breeding pairs.
Image: USGS – public domain
So, what are the chances of something that massive happening anytime soon? The aforementioned mongers of doom often claim that major eruptions occur at intervals of 600,000 years and point out that the last one was 640,000 years ago. Except that (a) the first interval was about 200,000 years longer, (b) two intervals is not a lot to base a prediction on, and (c) those intervals don't really mean anything anyway. Not in the case of volcanic eruptions, at least.
Earthquakes can be 'overdue' because the stress on fault lines is built up consistently over long periods, which means quakes can be predicted with a relative degree of accuracy. But this is not how volcanoes behave. They do not accumulate magma at constant rates. And the subterranean pressure that causes the magma to erupt does not follow a schedule.
What's more, previous super-eruptions do not necessarily imply future ones. Scientists are not convinced that there ever will be another big eruption at Yellowstone. Smaller eruptions, however, are much likelier. Since the Lava Creek eruption, there have been about 30 smaller outbreaks at Yellowstone, the last lava flow being about 70,000 years ago.
As for the immediate future (give or take a century): the magma chamber beneath Yellowstone is only 5 percent to 15 percent molten. Most scientists agree that is as un-alarming as it sounds, and that it's statistically more sensible to worry about death by lightning, shark, or piano.
Strange Maps #1041
Got a strange map? Let me know at email@example.com.
The pandemic has many people questioning whether they ever want to go back to the office.
If one thing is clear about remote work, it's this: Many people prefer it and don't want their bosses to take it away.
When the pandemic forced office employees into lockdown and cut them off from spending in-person time with their colleagues, many almost immediately realized that they favored remote work over their traditional office routines and norms.
As remote workers of all ages contemplate their futures – and as some offices and schools start to reopen – many Americans are asking hard questions about whether they wish to return to their old lives, and what they're willing to sacrifice or endure in the years to come.
Even before the pandemic, there were people asking whether office life jibed with their aspirations.
We spent years studying “digital nomads” – workers who had left behind their homes, cities and most of their possessions to embark on what they call “location independent” lives. Our research taught us several important lessons about the conditions that push workers away from offices and major metropolitan areas, pulling them toward new lifestyles.
Legions of people now have the chance to reinvent their relationship to their work in much the same way.
Big-city bait and switch
Most digital nomads started out excited to work in career-track jobs for prestigious employers. Moving to cities like New York and London, they wanted to spend their free time meeting new people, going to museums and trying out new restaurants.
But then came the burnout.
Although these cities certainly host institutions that can inspire creativity and cultivate new relationships, digital nomads rarely had time to take advantage of them. Instead, the high cost of living, time constraints and work demands contributed to an oppressive culture of materialism and workaholism.
Pauline, 28, who worked in advertising helping large corporate clients to develop brand identities through music, likened city life for professionals in her peer group to a “hamster wheel.” (The names used in this article are pseudonyms, as required by research protocol.)
“The thing about New York is it's kind of like the battle of the busiest,” she said. “It's like, 'Oh, you're so busy? No, I'm so busy.'”
Most of the digital nomads we studied had been lured into what urbanist Richard Florida termed “creative class” jobs – positions in design, tech, marketing and entertainment. They assumed this work would prove fulfilling enough to offset what they sacrificed in terms of time spent on social and creative pursuits.
Yet these digital nomads told us that their jobs were far less interesting and creative than they had been led to expect. Worse, their employers continued to demand that they be “all in” for work – and accept the controlling aspects of office life without providing the development, mentorship or meaningful work they felt they had been promised. As they looked to the future, they saw only more of the same.
Ellie, 33, a former business journalist who is now a freelance writer and entrepreneur, told us: “A lot of people don't have positive role models at work, so then it's sort of like 'Why am I climbing the ladder to try and get this job? This doesn't seem like a good way to spend the next twenty years.'”
By their late 20s to early 30s, digital nomads were actively researching ways to leave their career-track jobs in top-tier global cities.
Looking for a fresh start
Although they left some of the world's most glamorous cities, the digital nomads we studied were not homesteaders working from the wilderness; they needed access to the conveniences of contemporary life in order to be productive. Looking abroad, they quickly learned that places like Bali in Indonesia and Chiang Mai in Thailand had the necessary infrastructure to support them at a fraction of the cost of their former lives.
With more and more companies now offering employees the choice to work remotely, there's no reason to think digital nomads have to travel to Southeast Asia – or even leave the United States – to transform their work lives.
During the pandemic, some people have already migrated away from the nation's most expensive real estate markets to smaller cities and towns to be closer to nature or family. Many of these places still possess vibrant local cultures. As commutes to work disappear from daily life, such moves could leave remote workers with more available income and more free time.
The digital nomads we studied often used savings in time and money to try new things, like exploring side hustles. One recent study even found, somewhat paradoxically, that the sense of empowerment that came from embarking on a side hustle actually improved performance in workers' primary jobs.
The future of work, while not entirely remote, will undoubtedly offer more remote options to many more workers. Although some business leaders are still reluctant to accept their employees' desire to leave the office behind, local governments are embracing the trend, with several U.S. cities and states – along with countries around the world – developing plans to attract remote workers.
This migration, whether domestic or international, has the potential to enrich communities and cultivate more satisfying work lives.
The potential of CRISPR technology is incredible, but the threats are too serious to ignore.
- CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) is a revolutionary technology that gives scientists the ability to alter DNA. On the one hand, this tool could mean the elimination of certain diseases. On the other, there are concerns (both ethical and practical) about its misuse and the yet-unknown consequences of such experimentation.
- "The technique could be misused in horrible ways," says counter-terrorism expert Richard A. Clarke. Clarke lists biological weapons as one of the potential threats: "Threats for which we don't have any known antidote." CRISPR co-inventor, biochemist Jennifer Doudna, echoes the concern, recounting a nightmare involving the technology, eugenics, and a meeting with Adolf Hitler.
- Should this kind of tool even exist? Do the positives outweigh the potential dangers? How could something like this ever be regulated, and should it be? These questions and more are considered by Doudna, Clarke, evolutionary biologist Richard Dawkins, psychologist Steven Pinker, and physician Siddhartha Mukherjee.
Measuring a person's movements and poses, smart clothes could be used for athletic training, rehabilitation, or health-monitoring.