Article title

The Problem of New Evidence: P-Hacking and Pre-Analysis Plans

Abstracts

EN
We provide a novel articulation of the epistemic peril of p-hacking using three resources from philosophy: predictivism, Bayesian confirmation theory, and model selection theory. We defend a nuanced position on p-hacking: p-hacking is sometimes, but not always, epistemically pernicious. Our argument requires a novel understanding of Bayesianism, since a standard criticism of Bayesian confirmation theory is that it cannot represent the influence of biased methods. We then turn to pre-analysis plans, a methodological device used to mitigate p-hacking. Some say that pre-analysis plans are epistemically meritorious while others deny this, and in practice pre-analysis plans are often violated. We resolve this debate with a modest defence of pre-analysis plans. Further, we argue that pre-analysis plans can be epistemically relevant even if the plan is not strictly followed—and suggest that allowing for flexible pre-analysis plans may be the best available policy option.
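
The mechanism at issue can be made concrete. Under a true null hypothesis, a researcher who pre-specifies a single outcome obtains a significant result at the nominal 5% rate; a researcher who tests many outcomes and reports whichever clears p < 0.05 obtains one far more often. The short Python simulation below is a minimal sketch of that gap (the article itself contains no code; the sample sizes, the number of outcomes, and the normal-approximation test are all illustrative choices, not the authors'):

```python
# Minimal, illustrative simulation of p-hacking: in a world where the
# treatment does nothing, compare (a) testing one pre-specified outcome
# with (b) testing ten outcomes and reporting the smallest p-value.
import math
import random
import statistics

def two_sample_p(x, y):
    """Two-sided p-value for a difference in means (normal approximation)."""
    se = math.sqrt(statistics.variance(x) / len(x) + statistics.variance(y) / len(y))
    z = (statistics.mean(x) - statistics.mean(y)) / se
    return math.erfc(abs(z) / math.sqrt(2))  # P(|Z| >= |z|) for a standard normal

random.seed(0)
TRIALS, N, OUTCOMES = 2000, 50, 10  # illustrative parameters

prespecified_hits = hacked_hits = 0
for _ in range(TRIALS):
    # Null world: treatment and control are drawn from the same distribution,
    # so every "significant" result is a false positive.
    ps = []
    for _ in range(OUTCOMES):
        treated = [random.gauss(0, 1) for _ in range(N)]
        control = [random.gauss(0, 1) for _ in range(N)]
        ps.append(two_sample_p(treated, control))
    prespecified_hits += ps[0] < 0.05   # the outcome fixed in a pre-analysis plan
    hacked_hits += min(ps) < 0.05       # report whichever outcome "worked"

print(f"false-positive rate, pre-specified outcome: {prespecified_hits / TRIALS:.3f}")  # ~0.05
print(f"false-positive rate, best of {OUTCOMES} outcomes:   {hacked_hits / TRIALS:.3f}")  # ~0.40
```

With ten independent outcomes, the "report the best one" strategy crosses the 5% threshold in roughly 1 − 0.95^10 ≈ 40% of null experiments, while the pre-specified test stays at its nominal level. This is the sense in which undisclosed analytic flexibility, rather than any individual test, does the epistemic damage, and why fixing the analysis in advance can restore the nominal error rate.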

Journal

Diametros

Year

2020

Volume

17

Issue

66

Pages

24

Dates

published
2020-10

Contributors

author
  • Zoë Hitzig (Harvard University)
  • Jacob Stegenga (University of Cambridge)


Identifiers

ISSN
1733-5566

YADDA identifier

bwmeta1.element.desklight-aff1d1e1-bae6-4945-a444-7adba0705955