Printed from https://ideas.repec.org/a/sae/anname/v578y2001i1p50-70.html

Does Research Design Affect Study Outcomes in Criminal Justice?

Author

Listed:
  • David Weisburd

    (Department of Criminology and Criminal Justice at the University of Maryland and a professor of criminology at the Hebrew University Law School in Jerusalem)

  • Cynthia M. Lum

    (Department of Criminology and Criminal Justice at the University of Maryland)

  • Anthony Petrosino

    (Center for Evaluation, Initiative for Children Program at the American Academy of Arts and Sciences and a research associate at Harvard University)

Abstract

Does the type of research design used in a crime and justice study influence its conclusions? Scholars agree in theory that randomized experimental studies have higher internal validity than nonrandomized studies. But there is no consensus regarding the costs of relying on nonrandomized studies when drawing conclusions about criminal justice interventions. To examine these issues, the authors look at the relationship between research design and study outcomes in a broad review of research evidence on crime and justice commissioned by the National Institute of Justice. Their findings suggest that design does have a systematic effect on outcomes in criminal justice studies. The weaker a design, as indicated by internal validity, the more likely a study is to report a result in favor of treatment and the less likely it is to report a harmful effect of treatment. Even when comparing randomized studies with strong quasi-experimental research designs, systematic and statistically significant differences are observed.
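The abstract's core analysis can be illustrated with a small, purely hypothetical sketch: each study is assigned an internal-validity score (here 1 = weakest design, 5 = randomized experiment) and a coded outcome (-1 = harmful effect, 0 = null, 1 = favors treatment), and mean outcomes are compared across design quality. The scores and data below are invented for illustration; they are not the study's data or its exact coding scheme.

```python
# Hypothetical sketch of the paper's type of analysis: does the average
# reported outcome shift with design quality? All numbers are invented.

def mean_outcome_by_score(studies):
    """Average reported outcome (-1/0/1) for each design-quality score."""
    totals = {}
    for score, outcome in studies:
        totals.setdefault(score, []).append(outcome)
    return {score: sum(v) / len(v) for score, v in sorted(totals.items())}

# Invented sample: weaker designs (low scores) tend to report
# favorable results; randomized designs (score 5) report fewer.
studies = [
    (1, 1), (1, 1), (2, 1), (2, 0), (3, 1), (3, 0),
    (4, 0), (4, -1), (5, 0), (5, -1), (5, 0),
]

print(mean_outcome_by_score(studies))
```

In this toy data the mean outcome declines as the validity score rises, which is the pattern the paper reports across its review.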

Suggested Citation

  • David Weisburd & Cynthia M. Lum & Anthony Petrosino, 2001. "Does Research Design Affect Study Outcomes in Criminal Justice?," The ANNALS of the American Academy of Political and Social Science, vol. 578(1), pages 50-70, November.
  • Handle: RePEc:sae:anname:v:578:y:2001:i:1:p:50-70
    DOI: 10.1177/000271620157800104

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/000271620157800104
    Download Restriction: no

    File URL: https://libkey.io/10.1177/000271620157800104?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    2. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    3. Sandra M. Nutley & Huw T.O. Davies, 1999. "The Fall and Rise of Evidence in Criminal Justice," Public Money & Management, Taylor & Francis Journals, vol. 19(1), pages 47-54, January.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. David Weisburd & Cynthia M. Lum & Sue-Ming Yang, 2003. "When can we Conclude that Treatments or Programs “Don't Work”?," The ANNALS of the American Academy of Political and Social Science, vol. 587(1), pages 31-48, May.
    2. Kevin Petersen & Robert C. Davis & David Weisburd & Bruce Taylor, 2022. "Effects of second responder programs on repeat incidents of family abuse: An updated systematic review and meta‐analysis," Campbell Systematic Reviews, John Wiley & Sons, vol. 18(1), March.
    3. Friedrich Lösel & Andreas Beelmann, 2003. "Effects of Child Skills Training in Preventing Antisocial Behavior: A Systematic Review of Randomized Evaluations," The ANNALS of the American Academy of Political and Social Science, vol. 587(1), pages 84-109, May.
    4. David P. Farrington, 2003. "Methodological Quality Standards for Evaluation Research," The ANNALS of the American Academy of Political and Social Science, vol. 587(1), pages 49-68, May.
    5. Brandon C. Welsh & David P. Farrington, 2003. "Effects of Closed-Circuit Television on Crime," The ANNALS of the American Academy of Political and Social Science, vol. 587(1), pages 110-135, May.
    6. David Weisburd & John E. Eck, 2004. "What Can Police Do to Reduce Crime, Disorder, and Fear?," The ANNALS of the American Academy of Political and Social Science, vol. 593(1), pages 42-65, May.
    7. Iain Chalmers, 2003. "Trying to do more Good than Harm in Policy and Practice: The Role of Rigorous, Transparent, Up-to-Date Evaluations," The ANNALS of the American Academy of Political and Social Science, vol. 589(1), pages 22-40, September.
    8. Dominic Pearson & David Torgerson & Cynthia McDougall & Roger Bowles, 2010. "Parable of Two Agencies, One of Which Randomizes," The ANNALS of the American Academy of Political and Social Science, vol. 628(1), pages 11-29, March.
    9. Lawrence W. Sherman & Heather Strang, 2004. "Experimental Ethnography: The Marriage of Qualitative and Quantitative Research," The ANNALS of the American Academy of Political and Social Science, vol. 595(1), pages 204-222, September.
    10. Kevin Petersen & David Weisburd & Sydney Fay & Elizabeth Eggins & Lorraine Mazerolle, 2023. "Police stops to reduce crime: A systematic review and meta‐analysis," Campbell Systematic Reviews, John Wiley & Sons, vol. 19(1), March.
    11. Lawrence W. Sherman, 2003. "Experimental Evidence and Governmental Administration," The ANNALS of the American Academy of Political and Social Science, vol. 589(1), pages 226-233, September.
    12. Mark W. Lipsey, 2003. "Those Confounded Moderators in Meta-Analysis: Good, Bad, and Ugly," The ANNALS of the American Academy of Political and Social Science, vol. 587(1), pages 69-81, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jeffrey A. Smith & Petra E. Todd, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    2. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    3. Miguel Angel Malo & Fernando Muñoz-Bullón, 2006. "Employment promotion measures and the quality of the job match for persons with disabilities," Hacienda Pública Española / Review of Public Economics, IEF, vol. 179(4), pages 79-111, September.
    4. David H. Dean & Robert C. Dolan & Robert M. Schmidt, 1999. "Evaluating the Vocational Rehabilitation Program Using Longitudinal Data," Evaluation Review, vol. 23(2), pages 162-189, April.
    5. Astrid Grasdal, 2001. "The performance of sample selection estimators to control for attrition bias," Health Economics, John Wiley & Sons, Ltd., vol. 10(5), pages 385-398, July.
    6. Eichler, Martin & Lechner, Michael, 1996. "Public Sector Sponsored Continuous Vocational Training in East Germany : Institutional Arrangements, Participants, and Results of Empirical Evaluations," Discussion Papers 549, Institut fuer Volkswirtschaftslehre und Statistik, Abteilung fuer Volkswirtschaftslehre.
    7. Smith, Jeffrey, 2000. "Evaluation aktiver Arbeitsmarktpolitik : Erfahrungen aus Nordamerika (Evaluating Active Labor Market Policies : Lessons from North America)," Mitteilungen aus der Arbeitsmarkt- und Berufsforschung, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 33(3), pages 345-356.
    8. Dehejia, Rajeev H., 2005. "Program evaluation as a decision problem," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 141-173.
    9. Michael Lechner, 2000. "An Evaluation of Public-Sector-Sponsored Continuous Vocational Training Programs in East Germany," Journal of Human Resources, University of Wisconsin Press, vol. 35(2), pages 347-375.
    10. Deborah A. Cobb‐Clark & Thomas Crossley, 2003. "Econometrics for Evaluations: An Introduction to Recent Developments," The Economic Record, The Economic Society of Australia, vol. 79(247), pages 491-511, December.
    11. Thomas Brodaty & Bruno Crépon & Denis Fougère, 2007. "Les méthodes micro-économétriques d'évaluation et leurs applications aux politiques actives de l'emploi (Micro-econometric evaluation methods and their applications to active labor market policies)," Economie & Prévision, La Documentation Française, vol. 0(1), pages 93-118.
    12. Bryson, Alex & Dorsett, Richard & Purdon, Susan, 2002. "The use of propensity score matching in the evaluation of active labour market policies," LSE Research Online Documents on Economics 4993, London School of Economics and Political Science, LSE Library.
    13. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    14. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    15. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    16. Ravallion, Martin, 2008. "Evaluating Anti-Poverty Programs," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 59, pages 3787-3846, Elsevier.
    17. Justine Burns & Malcolm Keswell & Rebecca Thornton, 2009. "Evaluating the Impact of Health Programmes," SALDRU Working Papers 40, Southern Africa Labour and Development Research Unit, University of Cape Town.
    18. Michael Lechner, 1999. "Nonparametric bounds on employment and income effects of continuous vocational training in East Germany," Econometrics Journal, Royal Economic Society, vol. 2(1), pages 1-28.
    19. Hujer, Reinhard & Caliendo, Marco, 2000. "Evaluation of Active Labour Market Policy: Methodological Concepts and Empirical Estimates," IZA Discussion Papers 236, Institute of Labor Economics (IZA).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:anname:v:578:y:2001:i:1:p:50-70. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register. Registration allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references, in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.