Quantifying and addressing the prevalence and bias of study designs in the environmental and social sciences

  • 1.

    Donnelly, C. A. et al. Four principles to make evidence synthesis more useful for policy. Nature 558, 361–364 (2018).

  • 2.

    McKinnon, M. C., Cheng, S. H., Garside, R., Masuda, Y. J. & Miller, D. C. Sustainability: map the evidence. Nature 528, 185–187 (2015).

  • 3.

    Rubin, D. B. For objective causal inference, design trumps analysis. Ann. Appl. Stat. 2, 808–840 (2008).

  • 4.

    Peirce, C. S. & Jastrow, J. On small differences in sensation. Mem. Natl Acad. Sci. 3, 73–83 (1884).

  • 5.

    Fisher, R. A. Statistical methods for research workers. (Oliver and Boyd, 1925).

  • 6.

    Angrist, J. D. & Pischke, J.-S. Mostly harmless econometrics: an empiricist’s companion. (Princeton University Press, 2008).

  • 7.

    de Palma, A. et al. Challenges with inferring how land-use affects terrestrial biodiversity: study design, time, space and synthesis. in Next Generation Biomonitoring: Part 1 163–199 (Elsevier Ltd., 2018).

  • 8.

    Sagarin, R. & Pauchard, A. Observational approaches in ecology open new ground in a changing world. Front. Ecol. Environ. 8, 379–386 (2010).

  • 9.

    Shadish, W. R., Cook, T. D. & Campbell, D. T. Experimental and quasi-experimental designs for generalized causal inference. (Houghton Mifflin, 2002).

  • 10.

    Rosenbaum, P. R. Design of observational studies. vol. 10 (Springer, 2010).

  • 11.

    Light, R. J., Singer, J. D. & Willett, J. B. By design: planning research on higher education. (Harvard University Press, 1990).

  • 12.

    Ioannidis, J. P. A. Why most published research findings are false. PLOS Med. 2, e124 (2005).

  • 13.

    Open Science Collaboration. Estimating the reproducibility of psychological science. Science 349, aac4716 (2015).

  • 14.

    John, L. K., Loewenstein, G. & Prelec, D. Measuring the prevalence of questionable research practices with incentives for truth telling. Psychol. Sci. 23, 524–532 (2012).

  • 15.

    Kerr, N. L. HARKing: hypothesizing after the results are known. Personal. Soc. Psychol. Rev. 2, 196–217 (1998).

  • 16.

    Zhao, Q., Keele, L. J. & Small, D. S. Comment: will competition-winning methods for causal inference also succeed in practice? Stat. Sci. 34, 72–76 (2019).

  • 17.

    Friedman, J., Hastie, T. & Tibshirani, R. The elements of statistical learning. vol. 1 (Springer Series in Statistics, 2001).

  • 18.

    Underwood, A. J. Beyond BACI: experimental designs for detecting human environmental impacts on temporal variations in natural populations. Mar. Freshw. Res. 42, 569–587 (1991).

  • 19.

    Stewart-Oaten, A. & Bence, J. R. Temporal and spatial variation in environmental impact assessment. Ecol. Monogr. 71, 305–339 (2001).

  • 20.

    Eddy, T. D., Pande, A. & Gardner, J. P. A. Massive differential site-specific and species-specific responses of temperate reef fishes to marine reserve protection. Glob. Ecol. Conserv. 1, 13–26 (2014).

  • 21.

    Sher, A. A. et al. Native species recovery after reduction of an invasive tree by biological control with and without active removal. Ecol. Eng. 111, 167–175 (2018).

  • 22.

    Imbens, G. W. & Rubin, D. B. Causal Inference in Statistics, Social, and Biomedical Sciences. (Cambridge University Press, 2015).

  • 23.

    Greenhalgh, T. How to read a paper: the basics of Evidence Based Medicine. (John Wiley & Sons, Ltd, 2019).

  • 24.

    Salmond, S. S. Randomized controlled trials: methodological concepts and critique. Orthop. Nurs. 27 (2008).

  • 25.

    Geijzendorffer, I. R. et al. How can global conventions for biodiversity and ecosystem services guide local conservation actions? Curr. Opin. Environ. Sustainability 29, 145–150 (2017).

  • 26.

    Dimick, J. B. & Ryan, A. M. Methods for evaluating changes in health care policy. JAMA 312, 2401 (2014).

  • 27.

    Ding, P. & Li, F. A bracketing relationship between difference-in-differences and lagged-dependent-variable adjustment. Political Anal. 27, 605–615 (2019).

  • 28.

    Christie, A. P. et al. Simple study designs in ecology produce inaccurate estimates of biodiversity responses. J. Appl. Ecol. 56, 2742–2754 (2019).

  • 29.

    Watson, M. et al. An analysis of the quality of experimental design and reliability of results in tribology research. Wear 426–427, 1712–1718 (2019).

  • 30.

    Kilkenny, C. et al. Survey of the quality of experimental design, statistical analysis and reporting of research using animals. PLoS ONE 4, e7824 (2009).

  • 31.

    Christie, A. P. et al. The challenge of biased evidence in conservation. Conserv. Biol. https://doi.org/10.1111/cobi.13577 (2020).

  • 32.

    Christie, A. P. et al. Poor availability of context-specific evidence hampers decision-making in conservation. Biol. Conserv. 248, 108666 (2020).

  • 33.

    Moscoe, E., Bor, J. & Bärnighausen, T. Regression discontinuity designs are underutilized in medicine, epidemiology, and public health: a review of current and best practice. J. Clin. Epidemiol. 68, 132–143 (2015).

  • 34.

    Goldenhar, L. M. & Schulte, P. A. Intervention research in occupational health and safety. J. Occup. Med. 36, 763–778 (1994).

  • 35.

    Junker, J. et al. A severe lack of evidence limits effective conservation of the World’s primates. BioScience https://doi.org/10.1093/biosci/biaa082 (2020).

  • 36.

    Altindag, O., Joyce, T. J. & Reeder, J. A. Can nonexperimental methods provide unbiased estimates of a breastfeeding intervention? A within-study comparison of peer counseling in Oregon. Eval. Rev. 43, 152–188 (2019).

  • 37.

    Chaplin, D. D. et al. The internal and external validity of the regression discontinuity design: a meta-analysis of 15 within-study comparisons. J. Policy Anal. Manag. 37, 403–429 (2018).

  • 38.

    Cook, T. D., Shadish, W. R. & Wong, V. C. Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons. J. Policy Anal. Manag. 27, 724–750 (2008).

  • 39.

    Ioannidis, J. P. A. et al. Comparison of evidence of treatment effects in randomized and nonrandomized studies. J. Am. Med. Assoc. 286, 821–830 (2001).

  • 40.

    dos Santos Ribas, L. G., Pressey, R. L., Loyola, R. & Bini, L. M. A global comparative analysis of impact evaluation methods in estimating the effectiveness of protected areas. Biol. Conserv. 246, 108595 (2020).

  • 41.

    Benson, K. & Hartz, A. J. A comparison of observational studies and randomized, controlled trials. N. Engl. J. Med. 342, 1878–1886 (2000).

  • 42.

    Smokorowski, K. E. et al. Cautions on using the Before-After-Control-Impact design in environmental effects monitoring programs. Facets 2, 212–232 (2017).

  • 43.

    França, F. et al. Do space-for-time assessments underestimate the impacts of logging on tropical biodiversity? An Amazonian case study using dung beetles. J. Appl. Ecol. 53, 1098–1105 (2016).

  • 44.

    Duvendack, M., Hombrados, J. G., Palmer-Jones, R. & Waddington, H. Assessing ‘what works’ in international development: meta-analysis for sophisticated dummies. J. Dev. Effectiveness 4, 456–471 (2012).

  • 45.

    Sutherland, W. J. et al. Building a tool to overcome barriers in research-implementation spaces: The Conservation Evidence database. Biol. Conserv. 238, 108199 (2019).

  • 46.

    Gusenbauer, M. & Haddaway, N. R. Which academic search systems are suitable for systematic reviews or meta-analyses? Evaluating retrieval qualities of Google Scholar, PubMed, and 26 other resources. Res. Synth. Methods 11, 181–217 (2020).

  • 47.

    Konno, K. & Pullin, A. S. Assessing the risk of bias in choice of search sources for environmental meta‐analyses. Res. Synth. Methods 11, 698–713 (2020).

  • 48.

    Butsic, V., Lewis, D. J., Radeloff, V. C., Baumann, M. & Kuemmerle, T. Quasi-experimental methods enable stronger inferences from observational data in ecology. Basic Appl. Ecol. 19, 1–10 (2017).

  • 49.

    Brownstein, N. C., Louis, T. A., O’Hagan, A. & Pendergast, J. The role of expert judgment in statistical inference and evidence-based decision-making. Am. Statistician 73, 56–68 (2019).

  • 50.

    Hahn, J., Todd, P. & Klaauw, W. Identification and estimation of treatment effects with a regression-discontinuity design. Econometrica 69, 201–209 (2001).

  • 51.

    Slavin, R. E. Best evidence synthesis: an intelligent alternative to meta-analysis. J. Clin. Epidemiol. 48, 9–18 (1995).

  • 52.

    Slavin, R. E. Best-evidence synthesis: an alternative to meta-analytic and traditional reviews. Educ. Researcher 15, 5–11 (1986).

  • 53.

    Shea, B. J. et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ 358, 1–8 (2017).

  • 54.

    Sterne, J. A. C. et al. ROBINS-I: a tool for assessing risk of bias in non-randomised studies of interventions. BMJ 355, i4919 (2016).

  • 55.

    Guyatt, G. et al. GRADE guidelines: 11. Making an overall rating of confidence in effect estimates for a single outcome and for all outcomes. J. Clin. Epidemiol. 66, 151–157 (2013).

  • 56.

    Davies, G. M. & Gray, A. Don’t let spurious accusations of pseudoreplication limit our ability to learn from natural experiments (and other messy kinds of ecological monitoring). Ecol. Evol. 5, 5295–5304 (2015).

  • 57.

    Lortie, C. J., Stewart, G., Rothstein, H. & Lau, J. How to critically read ecological meta-analyses. Res. Synth. Methods 6, 124–133 (2015).

  • 58.

    Gutzat, F. & Dormann, C. F. Exploration of concerns about the evidence-based guideline approach in conservation management: hints from medical practice. Environ. Manag. 66, 435–449 (2020).

  • 59.

    Greenhalgh, T. Will COVID-19 be evidence-based medicine’s nemesis? PLOS Med. 17, e1003266 (2020).

  • 60.

    Barlow, J. et al. The future of hyperdiverse tropical ecosystems. Nature 559, 517–526 (2018).

  • 61.

    Gurevitch, J. & Hedges, L. V. Statistical issues in ecological meta‐analyses. Ecology 80, 1142–1149 (1999).

  • 62.

    Stone, J. C., Glass, K., Munn, Z., Tugwell, P. & Doi, S. A. R. Comparison of bias adjustment methods in meta-analysis suggests that quality effects modeling may have less limitations than other approaches. J. Clin. Epidemiol. 117, 36–45 (2020).

  • 63.

    Rhodes, K. M. et al. Adjusting trial results for biases in meta-analysis: combining data-based evidence on bias with detailed trial assessment. J. R. Stat. Soc. Ser. A (Stat. Soc.) 183, 193–209 (2020).

  • 64.

    Efthimiou, O. et al. Combining randomized and non-randomized evidence in network meta-analysis. Stat. Med. 36, 1210–1226 (2017).

  • 65.

    Welton, N. J., Ades, A. E., Carlin, J. B., Altman, D. G. & Sterne, J. A. C. Models for potentially biased evidence in meta-analysis using empirically based priors. J. R. Stat. Soc. Ser. A (Stat. Soc.) 172, 119–136 (2009).

  • 66.

    Turner, R. M., Spiegelhalter, D. J., Smith, G. C. S. & Thompson, S. G. Bias modelling in evidence synthesis. J. R. Stat. Soc. Ser. A (Stat. Soc.) 172, 21–47 (2009).

  • 67.

    Shackelford, G. E. et al. Dynamic meta-analysis: a method of using global evidence for local decision making. bioRxiv 2020.05.18.078840, https://doi.org/10.1101/2020.05.18.078840 (2020).

  • 68.

    Sutherland, W. J., Pullin, A. S., Dolman, P. M. & Knight, T. M. The need for evidence-based conservation. Trends Ecol. Evol. 19, 305–308 (2004).

  • 69.

    Ioannidis, J. P. A. Meta-research: Why research on research matters. PLOS Biol. 16, e2005468 (2018).

  • 70.

    LaLonde, R. J. Evaluating the econometric evaluations of training programs with experimental data. Am. Econ. Rev. 76, 604–620 (1986).

  • 71.

    Long, Q., Little, R. J. & Lin, X. Causal inference in hybrid intervention trials involving treatment choice. J. Am. Stat. Assoc. 103, 474–484 (2008).

  • 72.

    Thomson Reuters. ISI Web of Knowledge. http://www.isiwebofknowledge.com (2019).

  • 73.

    Stroup, W. W. Generalized linear mixed models: modern concepts, methods and applications. (CRC press, 2012).

  • 74.

    Bolker, B. M. et al. Generalized linear mixed models: a practical guide for ecology and evolution. Trends Ecol. Evol. 24, 127–135 (2009).

  • 75.

    R Core Team. R: A language and environment for statistical computing. R Foundation for Statistical Computing (2019).

  • 76.

    Bates, D., Mächler, M., Bolker, B. & Walker, S. Fitting linear mixed-effects models using lme4. J. Stat. Softw. 67, 1–48 (2015).

  • 77.

    Venables, W. N. & Ripley, B. D. Modern Applied Statistics with S. (Springer, 2002).

  • 78.

    Stan Development Team. RStan: the R interface to Stan. R package version 2.19.3 (2020).


  • Source: Ecology - nature.com
