Don't believe every science study you read, because sometimes not even their authors believe them. Here are the issues corrupting good, honest science – and how to fix them.
09 September, 2016
Pseudoscience caught in the act! This TIME article stretched a study linking cocoa flavanols to slowed or reversed age-related cognitive decline into the claim that eating chocolate can win you a Nobel Prize.
<p dir="ltr"><span>It’s a dirty little secret in the science community that most published scientific studies aren’t 100% true. As Nobel Prize-winning biologist Thomas Sudhof told </span><a href="http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1002547" target="_blank"><span>PLOS</span></a><span>, there are a host of problems with science journals. He summarizes them as five problems: </span></p> <p dir="ltr"><span>1. Hidden conflicts of interest between the journal and its reviewers</span></p> <p dir="ltr"><span>2. Trivial accountability measures for journals and reviewers</span></p> <p dir="ltr"><span>3. Expensive publishing costs and limited journals for authors to publish in</span></p> <p dir="ltr"><span>4. A murky, hodge-podge peer-review process</span></p> <p dir="ltr"><span>5. Experiments with unreproducible results</span></p> <p dir="ltr"><span>Once these studies are published, they land in the media's unreliable little hands: some outlets are genuinely confused by the science, while others deliberately sensationalize it for publicity. Depending on the day and the news outlet, coffee will either kill you or be the secret to eternal life (depending, of course, on the orifice through which you administer it). Owning a certain pet can make you infertile. Smelling farts can prevent cancer. Eating chocolate can turn you into a Nobel Prize winner. Watching pornography could make men better weightlifters. The list could, and unfortunately does, go on. </span></p> <p>It’s perhaps best said by <a href="https://www.youtube.com/watch?v=0Rnq1NpHdmw" target="_blank">John Oliver</a> in his excellent report on sham science studies: “In science, you don’t just get to cherry-pick the parts that justify what you were going to do anyway. That’s religion. 
<em>You’re thinking of religion</em>,” he says.</p> <p>Much of the information gets dumbed down or selectively sensationalized as it passes from news source to news source, and some of it was dodgy from the start due to publicity-hungry scientists, which you can kind of understand (but not entirely forgive) as their continued funding depends on finding things that are spectacular, even if a little fictional. And yet it appears grant money is pissing down over Aston University in England, where a study concluded that toast falling off a table will tend to fall butter-side down. This important information was published in the European Journal of Physics.</p> <p dir="ltr"><span>The five problems Sudhof described above are big. All of them need to be fixed. When they are, papers published in scientific journals would not only be more honest; they’d be more varied. More kinds of research would be published – smaller experiments, overlooked topics, and even experiments that had unfavorable or negative results. All of those outcomes would make scientific papers more approachable to the general public. It would also cut down on the amount of pseudoscience that attempts to explain the actual science and ends up confusing everyone.</span></p> <p dir="ltr"><span>So is there a way to fix those 5 problems? You bet! At least from the scientific end (the media is another kettle of fish). 
Sudhof offers 6 easy tips scientists can use to fix their publication problems and get the public interested in their work:</span></p> <p dir="ltr"><img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xODQwODA2NC9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYyMzcxNjg0MH0.-Iq-BX2rgN5yH7hKtP6WMjSH8-UQBoERREjjbIcpBKk/img.jpg?width=980" id="13e5c" class="rm-shortcode" data-rm-shortcode-id="cdad172f58b35c11428f54c6abca60fd" data-rm-shortcode-name="rebelmouse-image"></p> <p dir="ltr"><em>Credit: Laurie Vazquez/Big Think</em></p> <p dir="ltr"><em> </em><strong>1. Post research to preprint servers before publication, giving researchers time to improve their work</strong></p> <p dir="ltr"><span>When a scientist runs an experiment and has a significant result to report, their first step is to write it all up. Their second step is to find a journal to publish in. This is an enormous pain for many reasons, but one of the biggest is that every journal uses a different submission format. Journals collect and publish materials in different ways; streamlining the editorial process by putting all the journals on the same publishing system would let researchers focus more on honing their results, instead of futzing with formatting. Cold Spring Harbor Laboratory’s </span><a href="http://www.ariessys.com/views-press/press-releases/deployment-editorial-manager-ingest-service-biorxiv-preprint-server-simplifies-author-submission-journals-plos-one/" target="_blank"><span>bioRxiv</span></a><span> is already doing this. Hopefully more platforms follow. </span></p> <p dir="ltr"><span> </span><strong>2. Clarifying review forms to give workable feedback to authors</strong></p> <p dir="ltr"><span>Because each journal has its own submission format, they’ve also got their own publishing process. 
That means they use different methods to review papers, and those methods are often forms that are “cumbersome or insufficient to provide thoughtful and constructive feedback to authors,” Sudhof explains. Streamlining those forms would cut down on the amount of back and forth between the researcher and the journal, again allowing them to focus more on clarifying their work than formatting it.</span></p> <p dir="ltr"><span> </span><strong>3. Reviewer and editor training that puts burgeoning and established reviewers on the same playing field</strong></p> <p dir="ltr"><span>Journals have a variety of people reviewing proposed publications. Some of them were trained decades ago. Some of them are brand-new to reviewing. None of them have a standardized review process that tells them what to look for. Investing in training allows them to assess papers fairly and give constructive feedback to the researcher.</span></p> <p dir="ltr"><strong>4. Reduce the complexity of experiments to make the results easier to reproduce</strong></p> <p dir="ltr"><span>“Many experiments are by design impossible to repeat,” Sudhof writes. “Many current experiments are so complex that differences in outcome can always be attributed to differences in experimental conditions (as is the case for many recent neuroscience studies because of the complexity of the nervous system). If an experiment depends on multiple variables that cannot be reliably held constant, the scientific community should not accept the conclusions from such an experiment as true or false.”</span></p> <p dir="ltr"><strong>5. Validate the methods of the experiment</strong></p> <p dir="ltr"><span>Sudhof again: “Too often, papers in premier journals are published without sufficient experimental controls—they take up too much space in precious journal real estate!—or with reagents that have not been vetted after they were acquired.”</span></p> <p dir="ltr"><strong>6. 
Publish ALL results, not just ones that support the conclusion you want to make</strong></p> <p dir="ltr"><span>Journals are a business, and as such they tend to publish results that will encourage people to buy them. In this case, that means focusing on experiments with positive results. Sudhof takes particular issue with this, citing the “near impossibility of actually publishing negative results, owing to the reluctance of journals—largely motivated by economic pressures—to devote precious space to such papers, and to the reluctance of authors to acknowledge mistakes.” However, not all journals are like that. </span><a href="https://www.plos.org/who-we-are" target="_blank"><span>PLOS ONE</span></a><span> lets scientists publish “negative, null and inconclusive” results, not just ones that support the experiment. That allows for a more comprehensive understanding of the experiment, and can even provide more helpful data than positive results. Hopefully more journals follow suit.</span></p> <p dir="ltr"><span>By taking these 6 steps, scientists would make their results clearer to the public. That would make discoveries easier to understand, help increase scientific curiosity, and cut down misinformation. It would also </span><a href="http://bigthink.com/laurie-vazquez/a-quick-and-easy-guide-to-understanding-scientists" target="_blank"><span>force scientists to communicate in plain English</span></a><span>, which would make a serious dent in the amount of pseudoscience we hear on a daily basis. 
Physicist and renowned skeptic </span><a href="http://bigthink.com/neurobonkers/how-to-use-the-feynman-technique-to-identify-pseudoscience" target="_blank"><span>Richard Feynman</span></a><span> explained it to us this way: “Without using the new word which you have just learned, try to rephrase what you have just learned in your own language.” Pseudoscience explanations are larded with jargon and often can’t be explained in plain English; without the jargon, the explanation falls apart at the seams. Actual science can – and should – do better.</span></p> <p><span>Plus, the sooner pseudoscience goes away, the happier – and smarter – we’ll all be. The ball’s in your court, scientists. Run with it. </span></p>