If you’re not familiar with Dipak Das, you might be familiar with his work as a researcher at the University of Connecticut, where he studied resveratrol. Resveratrol, a molecule found in red wine, has been touted as nothing short of a miracle: it has been suggested to have cardiovascular benefits, to protect against diabetes and obesity, and to improve longevity. Companies have poured millions of dollars into developing supplements and medications based, at least in part, on Das’s research. In January 2012, Das’s research came under scrutiny when details of a 3-year investigation were made public. Citing 145 instances of falsified or fabricated data, the University of Connecticut dismissed Das from his position.

Then there’s the case of the social psychologist Diederik Stapel of Tilburg University in the Netherlands. In 2011, Stapel admitted to falsifying data in dozens of his most acclaimed studies. Incidents like those at the University of Connecticut and in the Netherlands are no longer considered anomalous. A disturbing trend of retractions has emerged, citing everything from minor errors to outright data fabrication in published research. In the past decade, there has been a 10-fold increase in scientific journal retractions. A meta-analysis of survey data indicated that 1.97% to 14.12% of respondents had falsified or fabricated research data, and that 33.7% to 72.0% admitted to questionable research practices. Data manipulation is not always a nefarious act, and multiple factors may influence how research data get manipulated.

In 2005, a meta-researcher named John Ioannidis published 2 articles on this subject, the first of which was “Why Most Published Research Findings Are False.” In it, Ioannidis offered a mathematical argument for why modern research findings so often turn out to be wrong. He identified several consistencies across troubled studies, including recurring researcher bias, poor research technique, claims of an effect based on a single study, and a tendency for researchers to gravitate toward exciting theories rather than more plausible ones.


Based on his calculations, Ioannidis identified 6 corollary factors that increase the likelihood that research results are false:

  1. A small sample size
  2. A small effect size
  3. Testing many relationships when pre-study odds are low
  4. High flexibility in experimental designs, definitions, outcomes, and analytical modes
  5. Increased financial interests, prejudices, and conflicts of interest
  6. The “Proteus phenomenon,” which occurs in hot scientific fields where multiple independent teams race to publish, producing rapidly alternating extreme claims
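Several of these factors can be made concrete with the positive predictive value (PPV) formula from Ioannidis’s paper, which gives the post-study probability that a claimed finding is true in terms of the pre-study odds R, the type I error rate α, the study’s power 1 − β, and a bias term u. The sketch below is a minimal illustration of that formula; the specific parameter values in the examples are assumptions chosen only to show the trend, not figures from the paper.

```python
def ppv(R, alpha=0.05, power=0.80, u=0.0):
    """Post-study probability that a claimed finding is true,
    following the PPV formula in Ioannidis (2005).

    R     -- pre-study odds that a probed relationship is true
    alpha -- type I error rate (significance threshold)
    power -- 1 - beta, probability of detecting a true effect
    u     -- fraction of would-be null results reported as positive due to bias
    """
    beta = 1.0 - power
    true_positives = (1 - beta) * R + u * beta * R
    all_positives = R + alpha - beta * R + u - u * alpha + u * beta * R
    return true_positives / all_positives

# Illustrative (assumed) scenarios, all with 1:10 pre-study odds:
well_powered = ppv(R=0.10, power=0.80)       # adequately powered, no bias
underpowered = ppv(R=0.10, power=0.20)       # small sample / small effect
biased = ppv(R=0.10, power=0.80, u=0.30)     # moderate bias added
```

With these assumed inputs, a well-powered, unbiased study of a plausible hypothesis yields a PPV above 50%, while low power or moderate bias alone pushes it below 50%, the regime in which most published positive findings would be false.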

Getting published is how a researcher receives funding, tenure, and career advancement. This process can create an intellectual conflict of interest, Ioannidis suggests, which can result in unintended or unconscious data manipulation. Another problematic aspect of modern publishing is the peer review process, which he believes may be used as an instrument to suppress opposing ideas. As a rule, journals like to publish sensational, attention-grabbing headlines and steer away from research with null findings. With so many researchers and so few journals, it is easy to imagine how competition may bias researchers and possibly influence their data.

In his second paper, Ioannidis examined disproved research and its lasting effects on subsequent work. He looked at 49 highly cited articles, all with mainstream acceptance in the scientific community, using citation counts as an indicator of each article’s reach. The topics ranged from vitamin E and its purported reduction of heart disease risk to the superiority of angioplasty over thrombolysis. Of the 49 articles, 45 claimed effective interventions; of those 45, 14 were later shown to be exaggerated or wrong. Most alarming, even after a study has been disproven, it is often cited as correct for many years. Ideas die hard, especially in the public’s mind: to this day, many people still believe that gastric ulcers are caused by stress and have never heard of H pylori.

One might think that researchers would have lambasted Ioannidis’ work. The opposite happened: it has been widely accepted in the research community, and “Why Most Published Research Findings Are False” is the most downloaded article on PLoS Medicine. Ioannidis brought into the spotlight some real inadequacies in how research is done, and he offers suggestions for improving it that will, one hopes, continue to shape how research is conducted and evaluated. The environment for publishing research is changing, with calls for more transparency and increased scrutiny. In 2010, Ivan Oransky and Adam Marcus launched the blog “Retraction Watch” in an effort to make the public aware of retracted publications and the reasons behind a journal’s decision to pull an article. This way, questionable research doesn’t simply disappear while its initial sensationalized influence lives on, distorting perceptions about an intervention or a promising course of research.


  1. Fanelli D. How many scientists fabricate and falsify research? A systematic review and meta-analysis of survey data. PLoS ONE. 2009;4(5):e5738. doi:10.1371/journal.pone.0005738. May 29, 2009. http://www.plosone.org/article/info:doi%2F10.1371%2Fjournal.pone.0005738.
  2. Freedman DH. Lies, damned lies, and medical science. The Atlantic. November 2010. http://www.theatlantic.com/magazine/archive/2010/11/lies-damned-lies-and-medical-science/8269/#.
  3. Ioannidis JPA. Why most published research findings are false. PLoS Medicine. 2005;2(8):e124. doi:10.1371/journal.pmed.0020124. August 30, 2005. http://www.plosmedicine.org/article/info:doi/10.1371/journal.pmed.0020124.
  4. Ioannidis JPA. Contradicted and initially stronger effects in highly cited clinical research. JAMA. 2005;294(2):218-228. July 13, 2005. http://jama.jamanetwork.com/article.aspx?volume=294&issue=2&page=218.
  5. Publish and be wrong. The Economist. October 9, 2008. http://www.economist.com/node/12376658.
  6. Resveratrol fraud case update: Dipak Das loses editor’s chair, lawyer issues statement refuting all charges. Retraction Watch. http://retractionwatch.wordpress.com/2012/01/12/resveratrol-fraud-case-update-dipak-das-loses-editors-chair-laywer-issues-statement-refuting-all-charges/.
  7. Silberman E. Resveratrol researcher falsified 145 studies. Pharmalot. January 12, 2012. http://www.pharmalot.com/2012/01/heart-researcher-falsified-red-wine-research/.
  8. Van Noorden R. Science publishing: the trouble with retractions. Nature. 2011;478:26-28. October 5, 2011. http://www.nature.com/news/2011/111005/full/478026a.html.
  9. Vogel G. Report: Dutch ‘lord of the data’ forged dozens of studies. Science Insider. October 31, 2011. http://news.sciencemag.org/scienceinsider/2011/10/report-dutch-lord-of-the-data-fo.html.