
Evaluer l’impact d’un micro-programme social : Une étude de cas expérimentale

Authors

  • Yannick L'Horty
  • Emmanuel Duguet
  • Pascale Petit

Abstract

The value of experimental impact-evaluation methods is that they measure the effects of a social program rigorously even in the absence of pre-existing databases or a structural theoretical framework for the mechanisms at work, while identifying the program's specific effects independently of the observable and unobservable characteristics of its beneficiaries. We illustrate these virtues of experimental methods with a case study of a small-scale, innovative, complex program aimed at very specific populations, all characteristics that make it a priori impossible to evaluate by other methods. The program provides support in the search for a work-experience placement to ninth-grade students (élèves de troisième) living in priority neighborhoods under the French urban policy (politique de la ville). The experiment covers 6 middle schools classified RAR, i.e. 28 classes and 550 students, in two départements, Essonne and Yvelines. To correct for differences in composition between the treatment and control groups in such a small sample, we reconstructed the control group using the propensity-score method proposed by Rubin. We show that the support program has no effect either on whether the placement actually takes place or on its quality, assessed through the student's satisfaction, even though these are the program's stated objectives. It does, however, influence students' choice of educational track, which is the objective of the placement itself: supported students less often refuse orientation toward short, vocational tracks.
(This abstract was borrowed from another version of this item.)

Suggested Citation

  • Yannick L'Horty & Emmanuel Duguet & Pascale Petit, 2011. "Evaluer l’impact d’un micro-programme social : Une étude de cas expérimentale," TEPP Research Report 2011-09, TEPP.
  • Handle: RePEc:tep:tepprr:rr11-09

    Download full text from publisher

    File URL: http://www.tepp-repec.eu/RePEc/files/tepprr/TEPP-RR-2011-9-ylh-ed-pp.pdf
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yannick L’Horty & Emmanuel Duguet & Pascale Petit, 2012. "Une évaluation expérimentale d'un micro-programme social," Revue française d'économie, Presses de Sciences-Po, vol. 0(1), pages 107-127.
    2. Emmanuel Duguet & Yannick L’Horty & Pascale Petit, 2011. "Faut-il accompagner les jeunes des quartiers ségrégés dans leur première expérience professionnelle ? Une évaluation expérimentale," Erudite Working Paper 2011-07, Erudite.
    3. Yannick L’Horty & Emmanuel Duguet & Pascale Petit, 2011. "Faut-il accompagner les jeunes des quartiers ségrégés dans leur première expérience professionnelle ? Une évaluation aléatoire," Documents de recherche 11-04, Centre d'Études des Politiques Économiques (EPEE), Université d'Evry Val d'Essonne.
    4. Beliyou Haile & Carlo Azzarri & Cleo Roberts & David J. Spielman, 2017. "Targeting, bias, and expected impact of complex innovations on developing-country agriculture: evidence from Malawi," Agricultural Economics, International Association of Agricultural Economists, vol. 48(3), pages 317-326, May.
    5. Ho, Thong Quoc & Nie, Zihan & Alpizar, Francisco & Carlsson, Fredrik & Nam, Pham Khanh, 2022. "Celebrity endorsement in promoting pro-environmental behavior," Journal of Economic Behavior & Organization, Elsevier, vol. 198(C), pages 68-86.
    6. Hasan Bakhshi & John Edwards & Stephen Roper & Judy Scully & Duncan Shaw & Lorraine Morley & Nicola Rathbone, 2013. "An Experimental Approach to Industrial Policy Evaluation: The case of Creative Credits," Research Papers 0004, Enterprise Research Centre.
    7. Emilie Bourdu & Olivier Bouba-Olga, 2012. "Évaluation d'impact d'un nouveau service public de formation professionnelle," Post-Print hal-00613021, HAL.
    8. Ashish Arora & Michelle Gittelman & Sarah Kaplan & John Lynch & Will Mitchell & Nicolaj Siggelkow & Aaron K. Chatterji & Michael Findley & Nathan M. Jensen & Stephan Meier & Daniel Nielson, 2016. "Field experiments in strategy research," Strategic Management Journal, Wiley Blackwell, vol. 37(1), pages 116-132, January.
    9. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    10. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    11. Büttner, Thomas, 2008. "Ankündigungseffekt oder Maßnahmewirkung? Eine Evaluation von Trainingsmaßnahmen zur Überprüfung der Verfügbarkeit (Notification or participation : which treatment actually activates job-seekers? An ev," Zeitschrift für ArbeitsmarktForschung - Journal for Labour Market Research, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 41(1), pages 25-40.
    12. Belot, Michèle & James, Jonathan, 2014. "A new perspective on the issue of selection bias in randomized controlled field experiments," Economics Letters, Elsevier, vol. 124(3), pages 326-328.
    13. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    14. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    15. Bruno Crépon & Marc Gurgand & Thierry Kamionka & Laurent Lequien, 2013. "Is Counseling Welfare Recipients Cost-Effective ? Lessons from a Random Experiment," Working Papers 2013-01, Center for Research in Economics and Statistics.
    16. Su, Huei-Chun & Colander, David, 2021. "The Economist As Scientist, Engineer, Or Plumber?," Journal of the History of Economic Thought, Cambridge University Press, vol. 43(2), pages 297-312, June.
    17. Guizar-Mateos, Isai & Miranda, Mario J. & Gonzalez-Vega, Claudio, 2013. "The Role of Credit and Deposits in the Dynamics of Technology Decisions and Poverty Traps," 2013 Annual Meeting, August 4-6, 2013, Washington, D.C. 149860, Agricultural and Applied Economics Association.
    18. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    19. Dorsett, Richard & Oswald, Andrew J., 2014. "Human Well-being and In-Work Benefits: A Randomized Controlled Trial," IZA Discussion Papers 7943, Institute of Labor Economics (IZA).
    20. Victor R. Fuchs & Alan B. Krueger & James M. Poterba, 1997. "Why do Economists Disagree About Policy?," NBER Working Papers 6151, National Bureau of Economic Research, Inc.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.