40 Years of Government Nutrition Data May Be Flawed
Americans severely under-report how much food they eat, and this has affected decades of nutrition data.
For forty years, the CDC's National Health and Nutrition Examination Survey (NHANES) has collected data on the caloric intake of Americans, answering the simple and important question of how much food we eat. This research has in turn been used to inform public health policy. NHANES is the basis for national standards of height, weight, and blood pressure, and its findings are often used to develop programs in the public battle against obesity.
But there's just one "tiny" problem. According to an analysis conducted by exercise scientist Edward Archer at the University of South Carolina, NHANES is very likely invalid, and for a simple reason that almost anybody could point out: all of the data it collects on caloric intake are self-reported.
Try to remember precisely what you ate over the past twenty-four hours and you'll see why this is a problem. Not only are people inept at estimating how many calories are in the foods they eat, they're also bad at recalling what they consumed and when.
With this in mind, Archer performed calculations simply to gauge whether NHANES' data are physiologically plausible. In 1991, a team of physiologists determined that average, free-living individuals must consume at least 35% more calories than their basal metabolic rate (BMR) -- the number of calories they expend at rest -- in order to maintain their weight and health. This accounts for the energy people need for everyday activities, from walking around, to playing sports, to gardening. Archer used a well-established equation to estimate the BMR of individuals in the NHANES study, multiplied those values by 1.35, and compared the results to the self-reported energy intakes in NHANES. What did that comparison yield? The majority of respondents, 28,993 men and 34,369 women, reported eating fewer calories than even the bare minimum needed to maintain their weight.
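The plausibility check is easy to sketch. The snippet below uses the Mifflin-St Jeor equation as a stand-in BMR estimate (the paper's exact equation may differ) and flags any reported intake below 1.35 × BMR as physiologically implausible:

```python
def bmr_mifflin_st_jeor(weight_kg, height_cm, age_yr, sex):
    """Estimate basal metabolic rate in kcal/day (Mifflin-St Jeor equation).

    Note: a stand-in for illustration; Archer's paper may use a
    different predictive equation.
    """
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
    return base + 5 if sex == "male" else base - 161

def is_physiologically_plausible(reported_kcal, weight_kg, height_cm,
                                 age_yr, sex, activity_floor=1.35):
    """A free-living adult needs at least ~1.35 x BMR to maintain weight,
    so any self-reported intake below that floor is implausible."""
    bmr = bmr_mifflin_st_jeor(weight_kg, height_cm, age_yr, sex)
    return reported_kcal >= activity_floor * bmr
```

For example, a 40-year-old, 70 kg, 175 cm man has an estimated BMR of about 1,599 kcal/day, so a reported intake of 1,800 kcal/day falls below the ~2,158 kcal/day floor and would be flagged as implausible.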
Judging by America's infamously expanding waistlines, the respondents' estimates are plainly wrong. To Archer, this reinforces what previous research has clearly shown: Americans severely under-report how much food they eat, and this has affected decades of nutrition data.
As Archer told me, one possible fix is to replace self-reporting with more empirical methods of determining caloric intake. The doubly labeled water technique, which uses well-established knowledge of carbon dioxide metabolism to calculate how much energy living subjects expend, could be adapted for use in NHANES and other federally funded health studies, he says. Doubly labeled water is considered a key biomarker for long-term energy intake.
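The doubly labeled water method ultimately yields a rate of CO2 production, which can be converted to energy expenditure with the classic Weir equation. A hedged sketch, assuming a CO2 production rate in moles per day and a typical respiratory quotient of 0.85 (both illustrative values, not from the study):

```python
def dlw_energy_expenditure_kcal(rco2_mol_per_day, respiratory_quotient=0.85):
    """Convert a CO2 production rate (as measured by doubly labeled water)
    into total energy expenditure in kcal/day via the Weir equation."""
    vco2_l = rco2_mol_per_day * 22.4       # mol/day -> liters/day at STP
    vo2_l = vco2_l / respiratory_quotient  # O2 consumption implied by the RQ
    return 3.941 * vo2_l + 1.106 * vco2_l  # Weir (1949) coefficients
```

A typical adult producing around 20 mol of CO2 per day works out to roughly 2,600 kcal/day of expenditure, comfortably above the implausibly low intakes many NHANES respondents reported.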
Archer, a former college football and professional polo player, exudes enthusiasm. He believes that science can truly help people live healthy and productive lives, but is dismayed that public health research consistently relies on imprecise methodology. Determined to change that, Archer hopes that his current study will impress upon people that the status quo of self-reporting isn't working.
"The nation's major surveillance tool for studying the relationships between nutrition and health is not valid. It is time to stop spending tens of millions of health research dollars collecting invalid data and find more accurate measures," he said.
Though the study was only published in PLoS ONE on Wednesday, Archer is already catching flak. Some dissenters question his calculations, which certainly aren't perfectly precise, since he obviously couldn't measure each subject's metabolism or energy intake. Others insist that self-reporting is simply the only viable option. Critics also point out that his research is funded by Coca-Cola, which I asked him about. Archer explained that it's an unrestricted research grant: the soft drink maker contractually has no input or say in what he researches or publishes.
But more important than the specifics of Archer's study is the message it shouts. Self-reports of personal health and food intake are utterly inaccurate, and if we are ever to get a clear picture of the obesity epidemic and how to solve it, we need to start by collecting credible data.
Source: Archer E, Hand GA, Blair SN (2013) Validity of U.S. Nutritional Surveillance: National Health and Nutrition Examination Survey Caloric Energy Intake Data, 1971-2010. PLoS ONE 8(10): e76632. doi:10.1371/journal.pone.0076632
(Image: Scale & Tape Measure via Shutterstock)