Towards a Better Understanding of Flawed Science


Vox explores the increasing difficulty of getting scientific studies done scientifically, and how much that costs society. Is one solution for all of us to be more keenly aware that science is mistake-prone?

“From study design to dissemination of research, there are dozens of ways science can go off the rails,” reports Vox. “Many of the scientific studies that are published each year are poorly designed, redundant, or simply useless. Researchers looking into the problem have found that more than half of studies fail to take steps to reduce biases, such as blinding whether people receive treatment or placebo. In an analysis of 300 clinical research papers about epilepsy — published in 1981, 1991, and 2001 — 71 percent were categorized as having no enduring value. Of those, 55.6 percent were classified as inherently unimportant and 38.8 percent as not new. All told, according to one estimate, about $200 billion — or the equivalent of 85 percent of global spending on research — is routinely wasted on flawed and redundant studies.”

Science is often flawed. It’s time we embraced that. (Vox, May 13, 2015)


  1. It’s not that science is flawed – it’s more that people don’t understand or choose to pursue real science. Science is inherently skeptical – a new finding, rather than generating enthusiastic excitement and automatic acceptance, should instead generate a raft of efforts to disprove or qualify the finding. Rigorous science assumes that human beings are flawed observers and that only repeated, measured observations by multiple people over different times and settings can provide sufficient data to establish an actual scientific truth. And even such truths are only true until further data contradicts them.

    Nowadays, “Science” is used as a marketing tool to promote products, professions, or viewpoints, and anyone who looks can find a “scientist” to support his/her pet theory or product. That is not real science. The only thing science is supposed to promote is a search for what is true and reproducibly real, regardless of whether it turns someone a profit or advances their career. Marrying profit to science inevitably corrupts the process, as the scientist is no longer objective when his/her livelihood depends on turning out a particular “right” answer. Even very honest people can be corrupted in this way.

    The only real way out is to stop any funding of academic scientific research by industry. Industry can do their own studies, but they have to pay for them and the public knows the source is biased. Academic research should be funded only by government or non-profit sources who have no interest in the outcome. And of course, industry folks need to be completely removed from any kind of oversight role in the government. The FDA should be manned primarily by lay people who have no stake in the industry they are regulating. To do otherwise brings about exactly the situation we face today.

    — Steve


  2. An article about approaching the research with caution… must itself be approached with caution. Follow the Vox piece to the study in question (Epilepsy Behav. 2013 Sep;28[3]:522-9) and you will find that it is not nearly so simple. The method of determining a paper’s value is rather arbitrary and definitely favors recently published research, since it defines value almost entirely by the number of citations. It only considers citations in 2011, citations between 2001 and 2011, and citations in a “standard” epilepsy text — plus another few points at the PI’s discretion. A major breakthrough in 1981 that was later supplanted by a more effective or safer treatment will be judged as having no enduring value. By that same standard we would also say that Charcot, Janet or Babinski did not produce anything of enduring value either.

    It is definitely true that there is a lot of research and clinical literature out there that is essentially useless before the ink even dries. Every field has its “throw-away” journals, which generally have lower standards for peer review and rather obvious links to pharma. This study just isn’t the best way to show that. And keep in mind that there are also journals that are much more reliable and more likely to publish data of lasting importance — e.g. JAMA, NEJM, or Lancet.


    • “Throw-away” journals are not the only problem. Elite journals are just as bad, if not worse; look at their rates of retractions and corrections and it’s pretty obvious. The whole system is flawed, from funding to peer review to assessing the “importance” of studies.


  3. The problem with the study, according to the Wall Street Journal, which ran a fairly detailed story, was not just the money that inadvertently changed hands; more important was evidence suggesting that at least the editor was aware of, or suspected, an issue with safety. He was certainly aware before the drug was pulled, and even then it was over a year before the journal published an article to make people aware that a serious danger might exist. When asked why they waited, they said it was the responsibility of the study’s author to correct or retract any information. A lot of times, studies are deliberately skewed, and the general public finds it hard to track down all the pieces; but imagine if medical professionals took to heart their obligation to truth: all the stigma, stereotypes, and blatantly false information the public could be made aware of. The drug is suspected of having caused around 55,000 deaths. I find it particularly difficult to see the editor and the journal as innocent if they could have stopped even just a handful of those deaths.
