
The Social Experiment Market

Author

Listed:
  • David Greenberg
  • Mark Shroder
  • Matthew Onstott

Abstract

In social experiments, individuals, households, or organizations are randomly assigned to two or more policy interventions. Elsewhere, we have summarized 143 experiments completed by autumn 1996. Here, we use the information we have gathered on these experiments and findings from informal telephone interviews to investigate the social experiment market--the buyers and sellers in the market that governs the production of experiments. We discuss target populations, types of interventions tested, trends in design, funding sources, industry concentration, the role of economists in social experimentation, the reasons few social experiments have been conducted outside the United States, and the future of the social experiment market.
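
As an illustrative aside (not code from the article): the random assignment described above amounts to shuffling the study units and dealing them out across the interventions. The function name, seed, and household labels in the sketch below are hypothetical, chosen only for illustration.

    import random

    def random_assignment(units, interventions, seed=0):
        """Shuffle the units and deal them out evenly across the interventions."""
        rng = random.Random(seed)
        shuffled = list(units)
        rng.shuffle(shuffled)
        return {unit: interventions[i % len(interventions)]
                for i, unit in enumerate(shuffled)}

    # Example: assign six households to a treatment or a control condition.
    print(random_assignment(["hh1", "hh2", "hh3", "hh4", "hh5", "hh6"],
                            ["treatment", "control"]))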

Suggested Citation

  • David Greenberg & Mark Shroder & Matthew Onstott, 1999. "The Social Experiment Market," Journal of Economic Perspectives, American Economic Association, vol. 13(3), pages 157-172, Summer.
  • Handle: RePEc:aea:jecper:v:13:y:1999:i:3:p:157-172
    Note: DOI: 10.1257/jep.13.3.157

    Download full text from publisher

    File URL: http://www.aeaweb.org/articles.php?doi=10.1257/jep.13.3.157
    Download Restriction: no

    References listed on IDEAS

    1. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    2. Burtless, Gary, 1990. "The Economist's Lament: Public Assistance in America," Journal of Economic Perspectives, American Economic Association, vol. 4(1), pages 57-78, Winter.
    3. Gueron, Judith M, 1990. "Work and Welfare: Lessons on Employment Programs," Journal of Economic Perspectives, American Economic Association, vol. 4(1), pages 79-98, Winter.
    4. Robert LaLonde & Rebecca Maynard, 1987. "How Precise Are Evaluations of Employment and Training Programs," Evaluation Review, vol. 11(4), pages 428-451, August.
    5. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    6. Daniel Friedlander & David H. Greenberg & Philip K. Robins, 1997. "Evaluating Government Training Programs for the Economically Disadvantaged," Journal of Economic Literature, American Economic Association, vol. 35(4), pages 1809-1855, December.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Antoni Bosch-Domènech & José G. Montalvo & Rosemarie Nagel & Albert Satorra, 2002. "One, Two, (Three), Infinity, ...: Newspaper and Lab Beauty-Contest Experiments," American Economic Review, American Economic Association, vol. 92(5), pages 1687-1701, December.
    2. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    3. Spermann, Alexander & Strotmann, Harald, 2005. "The Targeted Negative Income Tax (TNIT) in Germany: Evidence from a Quasi Experiment," ZEW Discussion Papers 05-68, ZEW - Leibniz Centre for European Economic Research.
    4. Rothstein, Jesse & Von Wachter, Till, 2016. "Social Experiments in the Labor Market," Department of Economics, Working Paper Series qt7957p9g6, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    5. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    6. Margaret Dalziel, 2018. "Why are there (almost) no randomised controlled trial-based evaluations of business support programmes?," Palgrave Communications, Palgrave Macmillan, vol. 4(1), pages 1-9, December.
    7. Carol Harvey & Michael J. Camasso & Radha Jagannathan, 2000. "Evaluating Welfare Reform Waivers under Section 1115," Journal of Economic Perspectives, American Economic Association, vol. 14(4), pages 165-188, Fall.
    8. Kling, Jeffrey R., 2007. "Methodological Frontiers of Public Finance Field Experiments," National Tax Journal, National Tax Association;National Tax Journal, vol. 60(1), pages 109-127, March.
    9. Ted Palmer & Anthony Petrosino, 2003. "The “Experimenting Agency”," Evaluation Review, vol. 27(3), pages 228-266, June.
    10. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    11. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    12. David Greenberg & Burt S. Barnow, 2014. "Flaws in Evaluations of Social Programs," Evaluation Review, vol. 38(5), pages 359-387, October.
    13. Hasan Bakhshi & John Edwards & Stephen Roper & Judy Scully & Duncan Shaw & Lorraine Morley & Nicola Rathbone, 2013. "An Experimental Approach to Industrial Policy Evaluation: The case of Creative Credits," Research Papers 0004, Enterprise Research Centre.
    14. Moshe Justman, 2016. "Economic Research and Education Policy: Project STAR and Class Size Reduction," Melbourne Institute Working Paper Series wp2016n37, Melbourne Institute of Applied Economic and Social Research, The University of Melbourne.
    15. Burt S. Barnow & David Greenberg, 2015. "Do Estimated Impacts on Earnings Depend on the Source of the Data Used to Measure Them? Evidence From Previous Social Experiments," Evaluation Review, vol. 39(2), pages 179-228, April.
    16. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 49(3), pages 871-905, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    2. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    3. Smith, Jeffrey, 2000. "Evaluation aktiver Arbeitsmarktpolitik : Erfahrungen aus Nordamerika (Evaluating Active Labor Market Policies : Lessons from North America)," Mitteilungen aus der Arbeitsmarkt- und Berufsforschung, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 33(3), pages 345-356.
    4. Donal O'Neill, 2000. "Evaluating Labour Market Interventions," Economics Department Working Paper Series n990300, Department of Economics, National University of Ireland - Maynooth.
    5. Michael Lechner, 2000. "An Evaluation of Public-Sector-Sponsored Continuous Vocational Training Programs in East Germany," Journal of Human Resources, University of Wisconsin Press, vol. 35(2), pages 347-375.
    6. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    7. A. Smith, Jeffrey & E. Todd, Petra, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    8. Metcalf, Charles E., 1997. "The Advantages of Experimental Designs for Evaluating Sex Education Programs," Children and Youth Services Review, Elsevier, vol. 19(7), pages 507-523, November.
    9. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    10. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2000. "The Long-Term Gains from GAIN: A Re-Analysis of the Impacts of the California GAIN Program," NBER Working Papers 8007, National Bureau of Economic Research, Inc.
    11. Paul Ryan, 2001. "The School-to-Work Transition: A Cross-National Perspective," Journal of Economic Literature, American Economic Association, vol. 39(1), pages 34-92, March.
    12. de Crombrugghe, D.P.I. & Espinoza, H. & Heijke, J.A.M., 2010. "Job-training programmes with low completion rates: the case of Projoven-Peru," ROA Research Memorandum 004, Maastricht University, Research Centre for Education and the Labour Market (ROA).
    13. Michael J. Puma & Nancy R. Burstein, 1994. "The national evaluation of the food stamp employment and training program," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 13(2), pages 311-330.
    14. Miguel Angel Malo & Fernando Muñoz-Bullón, 2006. "Employment promotion measures and the quality of the job match for persons with disabilities," Hacienda Pública Española / Review of Public Economics, IEF, vol. 179(4), pages 79-111, September.
    15. David H. Dean & Robert C. Dolan & Robert M. Schmidt, 1999. "Evaluating the Vocational Rehabilitation Program Using Longitudinal Data," Evaluation Review, vol. 23(2), pages 162-189, April.
    16. Andersson, Fredrik W. & Holzer, Harry J. & Lane, Julia & Rosenblum, David & Smith, Jeffrey A., 2013. "Does Federally-Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," IZA Discussion Papers 7621, Institute of Labor Economics (IZA).
    17. Cansino Muñoz-Repiso, José Manuel & Sánchez Braza, Antonio, 2011. "Effectiveness of Public Training Programs Reducing the Time Needed to Find a Job/Eficacia de los programas públicos de formación en la reducción del tiempo necesario para encontrar un empleo," Estudios de Economia Aplicada, Estudios de Economia Aplicada, vol. 29, April.
    18. Astrid Grasdal, 2001. "The performance of sample selection estimators to control for attrition bias," Health Economics, John Wiley & Sons, Ltd., vol. 10(5), pages 385-398, July.
    19. Eichler, Martin & Lechner, Michael, 1996. "Public Sector Sponsored Continuous Vocational Training in East Germany : Institutional Arrangements, Participants, and Results of Empirical Evaluations," Discussion Papers 549, Institut fuer Volkswirtschaftslehre und Statistik, Abteilung fuer Volkswirtschaftslehre.

    More about this item

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:aea:jecper:v:13:y:1999:i:3:p:157-172. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Michael P. Albert (email available below). General contact details of provider: https://edirc.repec.org/data/aeaaaea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.