We’ve all heard the expression “According to research…” followed by some scientific finding that, on the strength of that phrase alone, we are expected to accept without question.

But as it turns out, even in a field as supposedly objective as biomedicine, reliability and validity are sorely wanting.

In a recent Wall Street Journal article, science writer Richard Harris bemoans the state of biomedical research:

“The issue isn’t just wasted time and money. Many observers now think that biomedical research world-wide has been so compromised that it is slowing and diverting the search for new treatments and cures.”

Small sample sizes and bias in research design result in findings that overstate their conclusions. “[M]ost published research findings,” Harris quotes Stanford researcher John Ioannidis as saying, “are false.”

The problem becomes evident when other scientists try to replicate the findings of published studies. Replication is considered the gold standard in research: if a follow-up study doesn’t yield the same results, the original findings cannot be relied upon.

A University of Bordeaux paper found that only half of 156 biomedical studies referred to in English-language newspapers could be replicated. A 2011 study by researchers at Bayer could replicate only 25 percent of sampled drug research. And C. Glenn Begley, chief cancer researcher at Amgen, could reproduce only six of the 53 studies he investigated.

Harris attributes the problem to several factors: contaminated research materials, bad research design (particularly inadequate sampling), poor training, and the incentives to produce sensational results.

Scientists in Cambridge, Massachusetts, studying research on Lou Gehrig’s disease (ALS), “found serious defects in almost all of the underlying research. The studies often used fewer than a dozen mice per experiment and didn’t take care to avoid significant sources of bias, such as genetic variability in the animals.”

Harris also refers to an NIH researcher who went looking for the best methodology classes so they could be replicated elsewhere: “He put out a call to universities asking for suggestions but found essentially nothing.”

And the incentives?

“Scientists hoping to land good jobs or university tenure also need to have their studies published in one of a handful of top journals. No paper in the prestigious journal Nature? No job interview. That provides further incentive to pretty up one’s work by leaving out inconvenient findings, enhancing images or even avoiding experiments that could undercut a surprising conclusion.”

It’s something to think about the next time someone tells you what “research” has found.