
Kausale Evaluation von Pilotprojekten: Die Nutzung von Randomisierung in der Praxis

Author

  • Arni, Patrick (University of Bristol)

Abstract

Impact evaluations often face the challenge of establishing causality between the new policy measure under study and the resulting outcomes. A lack of comparability between the program group (new policy) and the control group (status quo) often makes a causal interpretation of the estimated effects difficult: are the results really driven by program effects, or rather by selection effects? Randomization, that is, random assignment to the program and control groups, ensures a very high degree of comparability. This paper discusses the possibilities of using randomized studies in the evaluation of pilot projects. First, it discusses the arguments in favor of randomization as well as the limitations of the method in practice. The second part of the paper shows where randomized evaluation studies are already being used in Europe; these examples demonstrate the range of applications and the potential of the method. Third, it turns to the practice of planning and implementing randomized studies, discussing a number of key points that should be kept in mind when implementing such causal evaluations.
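
To make the contrast between selection effects and program effects concrete, the following minimal Python sketch (purely illustrative and not taken from the paper) simulates a hypothetical pilot program: a latent "motivation" variable drives both voluntary take-up and the outcome, so a naive comparison of participants and non-participants overstates the effect, while random assignment to program and control groups recovers the true effect. All variable names and parameter values are assumptions chosen for illustration.

    import numpy as np

    # Purely illustrative simulation; all numbers below are made up.
    rng = np.random.default_rng(seed=42)
    n = 10_000

    # A latent trait ("motivation") affects both program take-up under
    # self-selection and the outcome itself, creating a selection effect.
    motivation = rng.normal(0.0, 1.0, size=n)
    true_effect = 0.5

    # Potential outcomes without (y0) and with (y1) the new policy measure.
    y0 = motivation + rng.normal(0.0, 1.0, size=n)
    y1 = y0 + true_effect

    # (a) Self-selection: more motivated individuals opt into the program.
    takes_up = (motivation + rng.normal(0.0, 1.0, size=n)) > 0
    naive_diff = y1[takes_up].mean() - y0[~takes_up].mean()

    # (b) Randomization: assignment is independent of motivation, so the
    # program and control groups are comparable by construction.
    assigned = rng.random(n) < 0.5
    experimental_diff = y1[assigned].mean() - y0[~assigned].mean()

    print(f"true program effect:               {true_effect:.2f}")
    print(f"naive comparison (selection bias): {naive_diff:.2f}")
    print(f"randomized comparison:             {experimental_diff:.2f}")

The naive comparison mixes the program effect with the difference in motivation between participants and non-participants; under random assignment that difference vanishes in expectation, which is the sense in which randomization ensures comparability between the two groups.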

Suggested Citation

  • Arni, Patrick, 2012. "Kausale Evaluation von Pilotprojekten: Die Nutzung von Randomisierung in der Praxis," IZA Standpunkte 52, Institute of Labor Economics (IZA).
  • Handle: RePEc:iza:izasps:sp52

    Download full text from publisher

    File URL: https://docs.iza.org/sp52.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Gerard J. van den Berg & Bas van der Klaauw, 2006. "Counseling And Monitoring Of Unemployed Workers: Theory And Evidence From A Controlled Social Experiment," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 47(3), pages 895-936, August.
    2. Luc Behaghel & Bruno Crépon & Marc Gurgand, 2014. "Private and Public Provision of Counseling to Job Seekers: Evidence from a Large Controlled Experiment," American Economic Journal: Applied Economics, American Economic Association, vol. 6(4), pages 142-174, October.
    3. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 128(2), pages 531-580.
    4. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    5. Guido Schwerdt & Dolores Messer & Ludger Woessmann & Stefan C. Wolter, 2011. "Effects of Adult Education Vouchers on the Labor Market: Evidence from a Randomized Field Experiment," CESifo Working Paper Series 3331, CESifo.
    7. Falk, Armin & Lalive, Rafael & Zweimüller, Josef, 2005. "The success of job applications: a new approach to program evaluation," Labour Economics, Elsevier, vol. 12(6), pages 739-748, December.
    8. Schneider Hilmar & Zimmermann Klaus F. & Uhlendorff Arne, 2013. "Ökonometrie vs. Projektdesign: Lehren aus der Evaluation eines Modellprojekts zur Umsetzung des Workfare-Konzepts," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 233(1), pages 65-85, February.
    9. Peter F. Lutz & Malte Sandner, 2010. "Zur Effizienz früher Hilfen: Forschungsdesign und erste Ergebnisse eines randomisierten kontrollierten Experiments," Vierteljahrshefte zur Wirtschaftsforschung / Quarterly Journal of Economic Research, DIW Berlin, German Institute for Economic Research, vol. 79(3), pages 79-97.
    10. DiNardo, John & Lee, David S., 2011. "Program Evaluation and Research Designs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 4, chapter 5, pages 463-536, Elsevier.
    11. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    12. List, John A. & Rasul, Imran, 2011. "Field Experiments in Labor Economics," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 4, chapter 2, pages 103-228, Elsevier.
    13. Rosholm, Michael, 2008. "Experimental Evidence on the Nature of the Danish Employment Miracle," IZA Discussion Papers 3620, Institute of Labor Economics (IZA).
    14. Per Engström & Pathric Hägglund & Per Johansson, 2017. "Early Interventions and Disability Insurance: Experience from a Field Experiment," Economic Journal, Royal Economic Society, vol. 127(600), pages 363-392, March.
    15. Stijn Baert & Bart Cockx & Niels Gheyle & Cora Vandamme, 2013. "Do Employers Discriminate Less if Vacancies are Difficult to Fill? Evidence from a Field Experiment," CESifo Working Paper Series 4093, CESifo.
    16. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    17. Eva Kislingerová, 2012. "Recenze publikace Stanislawa Sudola Řízení vědy. Hlavní problémy a diskuse," Ekonomika a Management, Prague University of Economics and Business, vol. 2012(2), pages 1-65.
    18. David E. Card & Pablo Ibarraran & Juan Miguel Villa, 2011. "Building in an Evaluation Component for Active Labor Market Programs: A Practitioner's Guide," SPD Working Papers 1101, Inter-American Development Bank, Office of Strategic Planning and Development Effectiveness (SPD).
    19. White, Howard, 2006. "Impact evaluation: the experience of the Independent Evaluation Group of the World Bank," MPRA Paper 1111, University Library of Munich, Germany.
    20. Donald B. Rubin, 1977. "Assignment to Treatment Group on the Basis of a Covariate," Journal of Educational and Behavioral Statistics, , vol. 2(1), pages 1-26, March.
    21. Bruce D. Meyer, 1995. "Lessons from the U.S. Unemployment Insurance Experiments," Journal of Economic Literature, American Economic Association, vol. 33(1), pages 91-131, March.
    22. Ashenfelter, Orley, 1987. "The case for evaluating training programs with randomized trials," Economics of Education Review, Elsevier, vol. 6(4), pages 333-338, August.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Boockmann Bernhard & Buch Claudia M. & Schnitzer Monika, 2014. "Evidenzbasierte Wirtschaftspolitik in Deutschland: Defizite und Potentiale," Perspektiven der Wirtschaftspolitik, De Gruyter, vol. 15(4), pages 307-323, December.
    2. EFI - Commission of Experts for Research and Innovation (ed.), 2013. "Research, innovation and technological performance in Germany - EFI Report 2013," Reports on Research, Innovation and Technological Performance in Germany, Expertenkommission Forschung und Innovation (EFI) - Commission of Experts for Research and Innovation, Berlin, volume 127, number 2013e, March.
    3. Kugler Franziska & Schwerdt Guido & Wößmann Ludger, 2014. "Ökonometrische Methoden zur Evaluierung kausaler Effekte der Wirtschaftspolitik," Perspektiven der Wirtschaftspolitik, De Gruyter, vol. 15(2), pages 105-132, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pieter Gautier & Paul Muller & Bas van der Klaauw & Michael Rosholm & Michael Svarer, 2018. "Estimating Equilibrium Effects of Job Search Assistance," Journal of Labor Economics, University of Chicago Press, vol. 36(4), pages 1073-1125.
    2. Michael C. Knaus & Michael Lechner & Anthony Strittmatter, 2022. "Heterogeneous Employment Effects of Job Search Programs: A Machine Learning Approach," Journal of Human Resources, University of Wisconsin Press, vol. 57(2), pages 597-636.
    3. Bernhard Boockmann & Tobias Brändle, 2019. "Coaching, Counseling, Case‐Working: Do They Help the Older Unemployed Out of Benefit Receipt and Back Into the Labor Market?," German Economic Review, Verein für Socialpolitik, vol. 20(4), pages 436-468, November.
    4. van der Klaauw, Bas & Ziegler, Lennart, 2019. "A Field Experiment on Labor Market Speeddates for Unemployed Workers," IZA Discussion Papers 12140, Institute of Labor Economics (IZA).
    5. Schneider Hilmar & Zimmermann Klaus F. & Uhlendorff Arne, 2013. "Ökonometrie vs. Projektdesign: Lehren aus der Evaluation eines Modellprojekts zur Umsetzung des Workfare-Konzepts," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 233(1), pages 65-85, February.
    6. Malory Rennoir & Ilan Tojerow, 2019. "Évaluation de l’ensemble du dispositif de contrôle de la disponibilité des chômeurs, tel que mis en œuvre au sein du Forem," ULB Institutional Repository 2013/292150, ULB -- Universite Libre de Bruxelles.
    7. Cuong NGUYEN, 2016. "An Introduction to Alternative Methods in Program Impact Evaluation," Journal of Economic and Social Thought, KSP Journals, vol. 3(3), pages 349-375, September.
    8. Ioana E. Marinescu, 2017. "Job search monitoring and assistance for the unemployed," IZA World of Labor, Institute of Labor Economics (IZA), pages 380-380, August.
    9. Thomas, Ranjeeta & Jones, Andrew M & Squire, Lyn, 2010. "Methods for Evaluating Innovative Health Programs (EIHP): A Multi-Country Study," MPRA Paper 29402, University Library of Munich, Germany.
    10. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 128(2), pages 531-580.
    11. Karnani, Mohit, 2016. "Freshmen teachers and college major choice: Evidence from a random assignment in Chile," MPRA Paper 76062, University Library of Munich, Germany.
    12. Boockmann Bernhard & Buch Claudia M. & Schnitzer Monika, 2014. "Evidenzbasierte Wirtschaftspolitik in Deutschland: Defizite und Potentiale," Perspektiven der Wirtschaftspolitik, De Gruyter, vol. 15(4), pages 307-323, December.
    13. Benjamin Schünemann & Michael Lechner & Conny Wunsch, 2015. "Do Long-Term Unemployed Workers Benefit from Targeted Wage Subsidies?," German Economic Review, Verein für Socialpolitik, vol. 16(1), pages 43-64, February.
    14. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    15. Michèle Belot & Philipp Kircher & Paul Muller, 2019. "Providing Advice to Jobseekers at Low Cost: An Experimental Study on Online Advice," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 86(4), pages 1411-1447.
    16. Damien Bricard & Zeynep Or & Anne Penneau, 2018. "Méthodologie de l’évaluation d’impact de l’expérimentation Parcours santé des aînés (Paerpa)," Working Papers DT74, IRDES institut for research and information in health economics, revised Jun 2018.
    17. Loi, Massimo & Rodrigues, Margarida, 2012. "A note on the impact evaluation of public policies: the counterfactual analysis," MPRA Paper 42444, University Library of Munich, Germany.
    18. Sokbae Lee & Yoon-Jae Whang, 2009. "Nonparametric Tests of Conditional Treatment Effects," Cowles Foundation Discussion Papers 1740, Cowles Foundation for Research in Economics, Yale University.
    19. Muller, Paul & van der Klaauw, Bas & Heyma, Arjan, 2017. "Comparing Econometric Methods to Empirically Evaluate Job-Search Assistance," IZA Discussion Papers 10531, Institute of Labor Economics (IZA).
    20. David Card & Jochen Kluve & Andrea Weber, 2018. "What Works? A Meta Analysis of Recent Active Labor Market Program Evaluations," Journal of the European Economic Association, European Economic Association, vol. 16(3), pages 894-931.

    More about this item

    Keywords

    Pilot project; control group; field experiment; randomization; causal evaluation

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • C81 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Methodology for Collecting, Estimating, and Organizing Microeconomic Data; Data Access
    • H43 - Public Economics - - Publicly Provided Goods - - - Project Evaluation; Social Discount Rate
    • I38 - Health, Education, and Welfare - - Welfare, Well-Being, and Poverty - - - Government Programs; Provision and Effects of Welfare Programs


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:iza:izasps:sp52. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us creating those links by adding the relevant references in the same way as above, for each refering item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Holger Hinte (email available below). General contact details of provider: https://edirc.repec.org/data/izaaade.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.