Printed from https://ideas.repec.org/p/enr/rpaper/0004.html

An Experimental Approach to Industrial Policy Evaluation: The case of Creative Credits

Authors

  • Hasan Bakhshi (Nesta)
  • John Edwards (Aston University Business School)
  • Stephen Roper (Warwick University Business School)
  • Judy Scully (Aston University Business School)
  • Duncan Shaw (Warwick University Business School)
  • Lorraine Morley (Warwick University Business School)
  • Nicola Rathbone (Aston University Business School)

Abstract

Experimental methods of policy evaluation are well-established in social policy and development economics but are rare in industrial and innovation policy. In this paper we consider the arguments for applying experimental methods to industrial policy measures, and propose an experimental policy evaluation approach (which we call RCT+). This combines the randomised assignment of firms to treatment and control groups with a longitudinal data collection strategy incorporating quantitative and qualitative data (so-called mixed methods). The RCT+ approach is designed to provide a causative rather than purely summative evaluation, i.e. to assess both ‘whether’ and ‘how’ programme outcomes are achieved. We test the RCT+ approach in an evaluation of Creative Credits – a UK business-to-business innovation voucher initiative intended to promote new innovation partnerships between SMEs and creative service providers. The results suggest the potential value of experimental approaches to industrial policy evaluation, and the benefits of mixed methods and longitudinal data collection in industrial policy evaluations.
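
As a concrete illustration of the quantitative core of such a design, the sketch below shows one way to randomise applicant firms into treatment and control groups and to compare mean outcomes across the two groups after a follow-up survey wave. It is a minimal, hypothetical Python example, not code from the paper or the Creative Credits programme; the firm identifiers, sample size, outcome variable, and function names are all invented for illustration.

```python
# Minimal illustrative sketch (not the authors' code or data): randomised
# assignment of applicant firms to treatment and control groups, followed by
# a simple difference-in-means comparison of a follow-up survey outcome.
import random
import statistics


def assign_groups(firm_ids, treated_share=0.5, seed=42):
    """Randomly split applicant firms into treatment and control groups."""
    rng = random.Random(seed)
    shuffled = list(firm_ids)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * treated_share)
    return set(shuffled[:cut]), set(shuffled[cut:])


def difference_in_means(outcomes, treated):
    """Estimate the average treatment effect as mean(treated) - mean(control)."""
    treated_vals = [y for firm, y in outcomes.items() if firm in treated]
    control_vals = [y for firm, y in outcomes.items() if firm not in treated]
    return statistics.mean(treated_vals) - statistics.mean(control_vals)


if __name__ == "__main__":
    # Hypothetical applicant pool and a 0/1 follow-up outcome (e.g. "introduced
    # a new product or service since baseline").
    firms = [f"firm_{i:03d}" for i in range(200)]
    treated, control = assign_groups(firms)

    rng = random.Random(1)  # simulated survey responses, for illustration only
    outcomes = {f: int(rng.random() < (0.55 if f in treated else 0.40)) for f in firms}

    print(f"treated: {len(treated)}, control: {len(control)}")
    print(f"difference in means: {difference_in_means(outcomes, treated):+.3f}")
```

In the RCT+ design described in the abstract, a summative comparison of this kind would be complemented by repeated quantitative survey waves and qualitative data from participating firms, which is what distinguishes the approach from a bare randomised trial.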

Suggested Citation

  • Hasan Bakhshi & John Edwards & Stephen Roper & Judy Scully & Duncan Shaw & Lorraine Morley & Nicola Rathbone, 2013. "An Experimental Approach to Industrial Policy Evaluation: The case of Creative Credits," Research Papers 0004, Enterprise Research Centre.
  • Handle: RePEc:enr:rpaper:0004

    Download full text from publisher

    File URL: http://enterpriseresearch.ac.uk/wp-content/uploads/2013/12/ERC-RP4-Babski-et-al1.pdf
    File Function: First version, 2013
    Download Restriction: no

    References listed on IDEAS

    1. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    2. Nicholas Bloom & Benn Eifert & Aprajit Mahajan & David McKenzie & John Roberts, 2013. "Does Management Matter? Evidence from India," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 128(1), pages 1-51.
    3. Jens Ludwig & Jeffrey R. Kling & Sendhil Mullainathan, 2011. "Mechanism Experiments and Policy Evaluations," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 17-38, Summer.
    4. Hewitt-Dundas, Nola & Roper, Stephen, 2011. "Creating advantage in peripheral regions: The role of publicly funded R&D centres," Research Policy, Elsevier, vol. 40(6), pages 832-841, July.
    5. Tuomas Takalo & Tanja Tanayama, 2010. "Adverse selection and financing of innovation: is there a need for R&D subsidies?," The Journal of Technology Transfer, Springer, vol. 35(1), pages 16-41, February.
    6. Niosi, Jorge, 2003. "Alliances are not enough explaining rapid growth in biotechnology firms," Research Policy, Elsevier, vol. 32(5), pages 737-750, May.
    7. Garcia, Abraham & Mohnen, Pierre, 2010. "Impact of government support on R&D and innovation," MERIT Working Papers 2010-034, United Nations University - Maastricht Economic and Social Research Institute on Innovation and Technology (MERIT).
    8. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    9. Donaldson, Stewart I. & Gooler, Laura E., 2003. "Theory-driven evaluation in action: lessons from a $20 million statewide Work and Health Initiative," Evaluation and Program Planning, Elsevier, vol. 26(4), pages 355-366, November.
    10. Maarten Cornet & Björn Vroomen & Marc van der Steeg, 2006. "Do innovation vouchers help SMEs to cross the bridge towards science?," CPB Discussion Paper 58, CPB Netherlands Bureau for Economic Policy Analysis.
    11. Nola Hewitt-Dundas & Stephen Roper, 2009. "Output Additionality of Public Support for Innovation: Evidence for Irish Manufacturing Plants," European Planning Studies, Taylor & Francis Journals, vol. 18(1), pages 107-122, September.
    12. Espen Bratberg & Astrid Grasdal & Alf Erling Risa, 2002. "Evaluating Social Policy by Experimental and Nonexperimental Methods," Scandinavian Journal of Economics, Wiley Blackwell, vol. 104(1), pages 147-171, March.
    13. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    14. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    15. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    16. Johan Lambrecht & Fabrice Pirnay, 2005. "An evaluation of public support measures for private external consultancies to SMEs in the Walloon Region of Belgium," Entrepreneurship & Regional Development, Taylor & Francis Journals, vol. 17(2), pages 89-108, March.
    17. Miriam Bruhn & David McKenzie, 2009. "In Pursuit of Balance: Randomization in Practice in Development Field Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 1(4), pages 200-232, October.
    18. David Greenberg & Mark Shroder & Matthew Onstott, 1999. "The Social Experiment Market," Journal of Economic Perspectives, American Economic Association, vol. 13(3), pages 157-172, Summer.
    19. Meuleman, Miguel & De Maeseneire, Wouter, 2012. "Do R&D subsidies affect SMEs’ access to external financing?," Research Policy, Elsevier, vol. 41(3), pages 580-591.
    20. Lerner, Josh, 1999. "The Government as Venture Capitalist: The Long-Run Impact of the SBIR Program," The Journal of Business, University of Chicago Press, vol. 72(3), pages 285-318, July.
    21. Rosenbusch, Nina & Brinckmann, Jan & Bausch, Andreas, 2011. "Is innovation always beneficial? A meta-analysis of the relationship between innovation and performance in SMEs," Journal of Business Venturing, Elsevier, vol. 26(4), pages 441-457, July.
    22. Trüby, Johannes & Rammer, Christian & Müller, Kathrin, 2008. "The Role of Creative Industries in Industrial Innovation," ZEW Discussion Papers 08-109, ZEW - Leibniz Centre for European Economic Research.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bakhshi, Hasan & Edwards, John S. & Roper, Stephen & Scully, Judy & Shaw, Duncan & Morley, Lorraine & Rathbone, Nicola, 2015. "Assessing an experimental approach to industrial policy evaluation: Applying RCT+ to the case of Creative Credits," Research Policy, Elsevier, vol. 44(8), pages 1462-1472.
    2. Ashish Arora & Michelle Gittelman & Sarah Kaplan & John Lynch & Will Mitchell & Nicolaj Siggelkow & Aaron K. Chatterji & Michael Findley & Nathan M. Jensen & Stephan Meier & Daniel Nielson, 2016. "Field experiments in strategy research," Strategic Management Journal, Wiley Blackwell, vol. 37(1), pages 116-132, January.
    3. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    4. Michael Christian Lehman, 2014. "Long-Run Effects Of Conditional Cash Transfers," Anais do XLI Encontro Nacional de Economia [Proceedings of the 41st Brazilian Economics Meeting] 223, ANPEC - Associação Nacional dos Centros de Pós-Graduação em Economia [Brazilian Association of Graduate Programs in Economics].
    5. McKenzie, David, 2012. "Beyond baseline and follow-up: The case for more T in experiments," Journal of Development Economics, Elsevier, vol. 99(2), pages 210-221.
    6. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    7. Susan Athey & Guido Imbens, 2016. "The Econometrics of Randomized Experiments," Papers 1607.00698, arXiv.org.
    8. Margaret Dalziel, 2018. "Why are there (almost) no randomised controlled trial-based evaluations of business support programmes?," Palgrave Communications, Palgrave Macmillan, vol. 4(1), pages 1-9, December.
    9. Baldwin, Kate & Bhavnani, Rikhil R., 2015. "Ancillary Studies of Experiments: Opportunities and Challenges," Journal of Globalization and Development, De Gruyter, vol. 6(1), pages 113-146, June.
    10. McKenzie, David, 2011. "How can we learn whether firm policies are working in Africa? Challenges (and solutions?) for experiments and structural models," Policy Research Working Paper Series 5632, The World Bank.
    11. Brautzsch, Hans-Ulrich & Günther, Jutta & Loose, Brigitte & Ludwig, Udo & Nulsch, Nicole, 2015. "Can R&D subsidies counteract the economic crisis? – Macroeconomic effects in Germany," Research Policy, Elsevier, vol. 44(3), pages 623-633.
    12. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    13. Wu, Aihua, 2017. "The signal effect of Government R&D Subsidies in China: Does ownership matter?," Technological Forecasting and Social Change, Elsevier, vol. 117(C), pages 339-345.
    14. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    15. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    16. Demeulemeester, Sarah & Hottenrott, Hanna, 2015. "R&D subsidies and firms' cost of debt," DICE Discussion Papers 201, Heinrich Heine University Düsseldorf, Düsseldorf Institute for Competition Economics (DICE).
    17. Aufenanger, Tobias, 2018. "Treatment allocation for linear models," FAU Discussion Papers in Economics 14/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics, revised 2018.
    18. Jason T. Kerwin & Rebecca L. Thornton, 2021. "Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures," The Review of Economics and Statistics, MIT Press, vol. 103(2), pages 251-264, May.
    19. Gani Aldashev & Georg Kirchsteiger & Alexander Sebald, 2017. "Assignment Procedure Biases in Randomised Policy Experiments," Economic Journal, Royal Economic Society, vol. 127(602), pages 873-895, June.
    20. Lota Tamini & Ibrahima Bocoum & Ghislain Auger & Kotchikpa Gabriel Lawin & Arahama Traoré, 2019. "Enhanced Microfinance Services and Agricultural Best Management Practices: What Benefits for Smallholders Farmers? An Evidence from Burkina Faso," CIRANO Working Papers 2019s-11, CIRANO.

    More about this item

    Keywords

    evaluation; experimental; industrial policy; innovation; creative; qualitative research

    JEL classification:

    • Z18 - Other Special Topics - - Cultural Economics - - - Public Policy
    • D04 - Microeconomics - - General - - - Microeconomic Policy: Formulation; Implementation; Evaluation
    • D83 - Microeconomics - - Information, Knowledge, and Uncertainty - - - Search; Learning; Information and Knowledge; Communication; Belief; Unawareness


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:enr:rpaper:0004. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Olivia Garcia (email available below). General contact details of provider: https://edirc.repec.org/data/wbswauk.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.