
Analysis of No-Difference Findings in Evaluation Research

Author

Listed:
  • George Julnes

    (University of Michigan)

  • Lawrence B. Mohr

    (University of Michigan)

Abstract

Conclusions of no difference are becoming increasingly important in evaluation research. We delineate three major uses of no-difference findings and analyze their meanings. (1) No-difference findings in randomized experiments can be interpreted as support for conclusions of the absence of a meaningful treatment effect, but only if the proper analytic methods are used. (2) Statistically based conclusions in quasi-experiments do not allow causal statements about the treatment impact but do provide a metric to judge the size of the resulting difference. (3) Using no-difference findings to conclude equivalence on control variables is inefficient and potentially misleading. The final section of the article presents alternative methods by which conclusions of no difference may be supported when applicable. These methods include the use of arbitrarily high alpha levels, interval estimation, and power analysis.
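To make those three strategies concrete, the sketch below shows one way they might look in practice for a two-group randomized comparison; it is an illustration, not code from the article. The simulated outcomes, the margin of meaningful difference (delta), and the elevated alpha level are hypothetical choices made for the example.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
treat = rng.normal(loc=0.1, scale=1.0, size=80)    # hypothetical treatment-group outcomes
control = rng.normal(loc=0.0, scale=1.0, size=80)  # hypothetical control-group outcomes
delta = 0.5        # smallest difference regarded as substantively meaningful (assumed)
alpha_high = 0.25  # deliberately high alpha, so a failure to reject is more informative

n1, n2 = len(treat), len(control)
diff = treat.mean() - control.mean()
pooled_sd = np.sqrt(((n1 - 1) * treat.var(ddof=1) + (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2))
se = pooled_sd * np.sqrt(1.0 / n1 + 1.0 / n2)
df = n1 + n2 - 2

# (1) Conventional two-sample t-test, judged against the deliberately high alpha level.
t_stat, p_val = stats.ttest_ind(treat, control)
print(f"t = {t_stat:.2f}, p = {p_val:.3f}; no difference retained at alpha = {alpha_high}: {p_val > alpha_high}")

# (2) Interval estimation: is the 95% CI for the difference contained in (-delta, +delta)?
half_width = stats.t.ppf(0.975, df) * se
ci_low, ci_high = diff - half_width, diff + half_width
print(f"95% CI for the difference: ({ci_low:.2f}, {ci_high:.2f}); inside +/-{delta}: {(-delta < ci_low) and (ci_high < delta)}")

# (3) Power analysis: probability this design would detect a true difference equal to delta.
t_crit = stats.t.ppf(0.975, df)
ncp = delta / se  # noncentrality parameter if the true difference equalled delta
power = stats.nct.sf(t_crit, df, ncp) + stats.nct.cdf(-t_crit, df, ncp)
print(f"Power to detect a difference of {delta}: {power:.2f}")

Under this logic, a failure to reject at the high alpha level, a confidence interval lying entirely inside the margin, or adequate power against the smallest meaningful effect each lends more support to a no-difference conclusion than a conventional nonsignificant test alone.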

Suggested Citation

  • George Julnes & Lawrence B. Mohr, 1989. "Analysis of No-Difference Findings in Evaluation Research," Evaluation Review, vol. 13(6), pages 628-655, December.
  • Handle: RePEc:sae:evarev:v:13:y:1989:i:6:p:628-655
    DOI: 10.1177/0193841X8901300604

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X8901300604
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X8901300604?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription

    References listed on IDEAS

    1. Goodman, S.N. & Royall, R., 1988. "Evidence and scientific research," American Journal of Public Health, American Public Health Association, vol. 78(12), pages 1568-1574.
    2. Cohen, Patricia, 1982. "To be or not to be: Control and balancing of type I and type II errors," Evaluation and Program Planning, Elsevier, vol. 5(3), pages 247-253, January.
    3. Ashenfelter, Orley, 1987. "The case for evaluating training programs with randomized trials," Economics of Education Review, Elsevier, vol. 6(4), pages 333-338, August.
    4. Orley Ashenfelter, 1986. "The Case for Evaluating Training Programs with Randomized Trials," Working Papers 583, Princeton University, Department of Economics, Industrial Relations Section.
    5. Thomas Fraker & Rebecca Maynard, 1987. "The Adequacy of Comparison Group Designs for Evaluations of Employment-Related Programs," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 194-227.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. repec:mpr:mprres:4778 is not listed on IDEAS
    2. Deborah Peikes & Sean Orzol & Lorenzo Moreno & Nora Paxton, "undated". "State Partnership Initiative: Selection of Comparison Groups for the Evaluation and Selected Impact Estimates," Mathematica Policy Research Reports f8760335b9ab4a39bdf2c3533, Mathematica Policy Research.
    3. David Card, 2022. "Design-Based Research in Empirical Microeconomics," American Economic Review, American Economic Association, vol. 112(6), pages 1773-1781, June.
    4. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    5. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    6. Ichimura, Hidehiko & Todd, Petra E., 2007. "Implementing Nonparametric and Semiparametric Estimators," Handbook of Econometrics, in: J.J. Heckman & E.E. Leamer (ed.), Handbook of Econometrics, edition 1, volume 6, chapter 74, Elsevier.
    7. Rajeev Dehejia, 2013. "The Porous Dialectic: Experimental and Non-Experimental Methods in Development Economics," WIDER Working Paper Series wp-2013-011, World Institute for Development Economic Research (UNU-WIDER).
    8. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    9. Clampit, Jack & Gaffney, Nolan & Fabian, Frances & Stafford, Thomas, 2023. "Institutional misalignment and escape-based FDI: A prospect theory lens," International Business Review, Elsevier, vol. 32(3).
    10. Bruno Van der Linden, 1997. "Effets des formations professionnelles et des aides à l'embauche : exploitation d'une enquête auprès d'employeurs belges," Économie et Prévision, Programme National Persée, vol. 131(5), pages 113-130.
    11. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," NBER Working Papers 26080, National Bureau of Economic Research, Inc.
    12. Jeffrey A. Smith & Petra E. Todd, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    13. J. Ignacio Gimenez-Nadal & José Alberto Molina, 2016. "Commuting Time And Household Responsibilities: Evidence Using Propensity Score Matching," Journal of Regional Science, Wiley Blackwell, vol. 56(2), pages 332-359, March.
    14. Metcalf, Charles E., 1997. "The Advantages of Experimental Designs for Evaluating Sex Education Programs," Children and Youth Services Review, Elsevier, vol. 19(7), pages 507-523, November.
    15. James J. Heckman & V. Joseph Hotz & Marcelo Dabos, 1987. "Do We Need Experimental Data To Evaluate the Impact of Manpower Training On Earnings?," Evaluation Review, vol. 11(4), pages 395-427, August.
    16. Flores, Carlos A. & Mitnik, Oscar A., 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," IZA Discussion Papers 4451, Institute of Labor Economics (IZA).
    17. Gimenez-Nadal, José Ignacio & Molina, José Alberto & Silva Quintero, Edgar, 2016. "How Forced Displacements Caused by a Violent Conflict Affect Wages in Colombia," IZA Discussion Papers 9926, Institute of Labor Economics (IZA).
    18. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    19. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    20. Ari Hyytinen & Jaakko Meriläinen & Tuukka Saarimaa & Otto Toivanen & Janne Tukiainen, 2018. "When does regression discontinuity design work? Evidence from random election outcomes," Quantitative Economics, Econometric Society, vol. 9(2), pages 1019-1051, July.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:13:y:1989:i:6:p:628-655. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item and lets you accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact SAGE Publications (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.