Printed from https://ideas.repec.org/p/osf/metaar/d5eud.html

Panel Data and Experimental Design

Authors

  • Burlig, Fiona
  • Preonas, Louis
  • Woerman, Matt

Abstract

How should researchers design experiments to detect treatment effects with panel data? In this paper, we derive analytical expressions for the variance of panel estimators under non-i.i.d. error structures, which inform power calculations in panel data settings. Using Monte Carlo simulation, we demonstrate that, with correlated errors, traditional methods for experimental design result in experiments that are incorrectly powered with proper inference. Failing to account for serial correlation yields overpowered experiments in short panels and underpowered experiments in long panels. Using both data from a randomized experiment in China and a high-frequency dataset of U.S. electricity consumption, we show that these results hold in real-world settings. Our theoretical results enable us to achieve correctly powered experiments in both simulated and real data. This paper provides researchers with the tools to design well-powered experiments in panel data settings.
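The abstract's central claim, that power calculations ignoring serial correlation misstate an experiment's true power, can be checked by simulation. The sketch below is not the authors' method: it is a minimal Monte Carlo power calculation assuming a hypothetical AR(1) error structure and a simple pre/post difference-in-differences design, with inference via unit-level collapsing (in the spirit of Bertrand, Duflo, and Mullainathan, 2004, cited in the references). All parameter values and function names are illustrative.

```python
import numpy as np
from scipy import stats


def ar1_errors(rng, n_units, n_periods, rho, sigma):
    """Draw an (n_units x n_periods) matrix of stationary AR(1) errors."""
    e = np.empty((n_units, n_periods))
    # Initialize from the stationary distribution of the AR(1) process.
    e[:, 0] = rng.normal(0.0, sigma / np.sqrt(1.0 - rho**2), n_units)
    for t in range(1, n_periods):
        e[:, t] = rho * e[:, t - 1] + rng.normal(0.0, sigma, n_units)
    return e


def simulated_power(n_units=50, t_pre=5, t_post=5, tau=0.5, rho=0.5,
                    sigma=1.0, n_sims=400, alpha=0.05, seed=0):
    """Share of simulations in which a level-alpha test detects the effect tau.

    Inference collapses each unit to a single pre/post difference, so the
    resulting t-test is robust to arbitrary within-unit serial correlation.
    """
    rng = np.random.default_rng(seed)
    treated = np.arange(n_units) < n_units // 2  # first half of units treated
    rejections = 0
    for _ in range(n_sims):
        y = ar1_errors(rng, n_units, t_pre + t_post, rho, sigma)
        y[treated, t_pre:] += tau  # add treatment effect in post periods
        # Collapse the panel: one pre/post difference per unit.
        d = y[:, t_pre:].mean(axis=1) - y[:, :t_pre].mean(axis=1)
        _, p = stats.ttest_ind(d[treated], d[~treated], equal_var=False)
        rejections += p < alpha
    return rejections / n_sims
```

Comparing, say, `simulated_power(rho=0.0)` against `simulated_power(rho=0.8)` at the same sample size illustrates how serial correlation shifts the sample needed to reach a target power such as 0.8, which is the design question the paper addresses analytically.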

Suggested Citation

  • Burlig, Fiona & Preonas, Louis & Woerman, Matt, 2017. "Panel Data and Experimental Design," MetaArXiv d5eud, Center for Open Science.
  • Handle: RePEc:osf:metaar:d5eud
    DOI: 10.31219/osf.io/d5eud

    Download full text from publisher

    File URL: https://osf.io/download/58ba02b46c613b01f53b3292/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/d5eud?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    2. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    3. Meredith Fowlie & Michael Greenstone & Catherine Wolfram, 2018. "Do Energy Efficiency Investments Deliver? Evidence from the Weatherization Assistance Program," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 133(3), pages 1597-1644.
    4. McKenzie, David, 2012. "Beyond baseline and follow-up: The case for more T in experiments," Journal of Development Economics, Elsevier, vol. 99(2), pages 210-221.
    5. Koichiro Ito & Takanori Ida & Makoto Tanaka, 2015. "The Persistence of Moral Suasion and Economic Incentives: Field Experimental Evidence from Energy Demand," NBER Working Papers 20910, National Bureau of Economic Research, Inc.
    6. Allcott, Hunt, 2011. "Social norms and energy conservation," Journal of Public Economics, Elsevier, vol. 95(9-10), pages 1082-1095, October.
    7. David Atkin & Azam Chaudhry & Shamyla Chaudry & Amit K. Khandelwal & Eric Verhoogen, 2017. "Organizational Barriers to Technology Adoption: Evidence from Soccer-Ball Producers in Pakistan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 132(3), pages 1101-1164.
    8. Christopher Blattman & Nathan Fiala & Sebastian Martinez, 2014. "Generating Skilled Self-Employment in Developing Countries: Experimental Evidence from Uganda," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 129(2), pages 697-752.
    9. Petrenko, Anna, "undated". "Labeling of Finished Products as a Component of the Information Support of Marketing Activity of Enterprises in the Vegetable-Product Subcomplex," Agricultural and Resource Economics: International Scientific E-Journal, vol. 2(01).
    11. David Atkin & Amit K. Khandelwal & Adam Osman, 2017. "Exporting and Firm Performance: Evidence from a Randomized Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 132(2), pages 551-615.
    12. Rachel Glennerster & Kudzai Takavarasha, 2013. "Running Randomized Evaluations: A Practical Guide," Economics Books, Princeton University Press, edition 1, number 10085, December.
    13. David McKenzie, 2017. "Identifying and Spurring High-Growth Entrepreneurship: Experimental Evidence from a Business Plan Competition," American Economic Review, American Economic Association, vol. 107(8), pages 2278-2307, August.
    14. Nicholas Bloom & James Liang & John Roberts & Zhichun Jenny Ying, 2015. "Does Working from Home Work? Evidence from a Chinese Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(1), pages 165-218.
    15. Katrina Jessoe & David Rapson, 2014. "Knowledge Is (Less) Power: Experimental Evidence from Residential Energy Use," American Economic Review, American Economic Association, vol. 104(4), pages 1417-1438, April.
    16. Alberto Abadie & Susan Athey & Guido W Imbens & Jeffrey M Wooldridge, 2023. "When Should You Adjust Standard Errors for Clustering?," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 138(1), pages 1-35.
    17. Meredith Fowlie & Catherine Wolfram & C. Anna Spurlock & Annika Todd & Patrick Baylis & Peter Cappers, 2017. "Default Effects and Follow-On Behavior: Evidence from an Electricity Pricing Program," NBER Working Papers 23553, National Bureau of Economic Research, Inc.
    18. Nicholas Bloom & Benn Eifert & Aprajit Mahajan & David McKenzie & John Roberts, 2013. "Does Management Matter? Evidence from India," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 128(1), pages 1-51.
    19. A. Colin Cameron & Douglas L. Miller, 2015. "A Practitioner’s Guide to Cluster-Robust Inference," Journal of Human Resources, University of Wisconsin Press, vol. 50(2), pages 317-372.
    20. David Card & Stefano DellaVigna & Ulrike Malmendier, 2011. "The Role of Theory in Field Experiments," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 39-62, Summer.
    21. Hunt Allcott & Michael Greenstone, 2017. "Measuring the Welfare Effects of Residential Energy Efficiency Programs," NBER Working Papers 23386, National Bureau of Economic Research, Inc.
    22. Marianne Bertrand & Esther Duflo & Sendhil Mullainathan, 2004. "How Much Should We Trust Differences-In-Differences Estimates?," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 119(1), pages 249-275.
    23. Arellano, M, 1987. "Computing Robust Standard Errors for Within-Groups Estimators," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 49(4), pages 431-434, November.
    24. Koichiro Ito & Takanori Ida & Makoto Tanaka, 2018. "Moral Suasion and Economic Incentives: Field Experimental Evidence from Energy Demand," American Economic Journal: Economic Policy, American Economic Association, vol. 10(1), pages 240-267, February.
    25. Sarah Baird & J. Aislinn Bohren & Craig McIntosh & Berk Özler, 2018. "Optimal Design of Experiments in the Presence of Interference," The Review of Economics and Statistics, MIT Press, vol. 100(5), pages 844-860, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tonke, Sebastian, 2020. "Imperfect Procedural Knowledge: Evidence from a Field Experiment to Encourage Water Conservation," VfS Annual Conference 2020 (Virtual Conference): Gender Economics 224536, Verein für Socialpolitik / German Economic Association.
    2. Weber, Sylvain & Puddu, Stefano & Pacheco, Diana, 2017. "Move it! How an electric contest motivates households to shift their load profile," Energy Economics, Elsevier, vol. 68(C), pages 255-270.
    3. Wang, Yuan & Xu, Huayu & Xu, Xiaoguang & Zhou, Yongmei, 2025. "The power of children in energy conservation: Evidence from a randomized controlled trial," Journal of Development Economics, Elsevier, vol. 174(C).
    4. Tonke, Sebastian, 2024. "Providing procedural knowledge: A field experiment to encourage resource conservation in Namibia," Journal of Development Economics, Elsevier, vol. 166(C).
    5. Maya Papineau & Nicholas Rivers, "undated". "Visualizing Energy Efficiency: A Picture is Worth More Than 1,022 Words," Carleton Economic Papers 19-10, Carleton University, Department of Economics.
    6. Brent, Daniel A. & Friesen, Lana & Gangadharan, Lata & Leibbrandt, Andreas, 2017. "Behavioral Insights from Field Experiments in Environmental Economics," International Review of Environmental and Resource Economics, now publishers, vol. 10(2), pages 95-143, May.
    7. Singhal, Puja & Pahle, Michael & Kalkuhl, Matthias & Levesque, Antoine & Sommer, Stephan & Berneiser, Jessica, 2022. "Beyond good faith: Why evidence-based policy is necessary to decarbonize buildings cost-effectively in Germany," Energy Policy, Elsevier, vol. 169(C).
    8. Asensio, Omar Isaac & Delmas, Magali A., 2016. "The dynamics of behavior change: Evidence from energy conservation," Journal of Economic Behavior & Organization, Elsevier, vol. 126(PA), pages 196-212.
    9. Cardella, Eric & Ewing, Brad & Williams, Ryan Blake, "undated". "Green is Good – The Impact of Information Nudges on the Adoption of Voluntary Green Power Plans," 2018 Annual Meeting, February 2-6, 2018, Jacksonville, Florida 266583, Southern Agricultural Economics Association.
    10. Werner, Peter & Riedl, Arno, 2018. "The role of experiments for policy design," Research Memorandum 022, Maastricht University, Graduate School of Business and Economics (GSBE).
    11. Peters, Jörg & Langbein, Jörg & Roberts, Gareth, 2016. "Policy evaluation, randomized controlled trials, and external validity—A systematic review," Economics Letters, Elsevier, vol. 147(C), pages 51-54.
    12. Ossokina, Ioulia V. & Kerperien, Stephan & Arentze, Theo A., 2021. "Does information encourage or discourage tenants to accept energy retrofitting of homes?," Energy Economics, Elsevier, vol. 103(C).
    13. Praveen K. Kopalle & Jesse Burkhardt & Kenneth Gillingham & Lauren S. Grewal & Nailya Ordabayeva, 2024. "Delivering affordable clean energy to consumers," Journal of the Academy of Marketing Science, Springer, vol. 52(5), pages 1452-1474, October.
    14. Guido Friebel & Matthias Heinz & Miriam Krueger & Nikolay Zubanov, 2017. "Team Incentives and Performance: Evidence from a Retail Chain," American Economic Review, American Economic Association, vol. 107(8), pages 2168-2203, August.
    15. Wu, Libo & Zhou, Yang, 2025. "Social norms and energy conservation in China," Resource and Energy Economics, Elsevier, vol. 82(C).
    17. R. Aaron Hrozencik & Jordan F. Suter & Paul J. Ferraro & Nathan Hendricks, 2024. "Social comparisons and groundwater use: Evidence from Colorado and Kansas," American Journal of Agricultural Economics, John Wiley & Sons, vol. 106(2), pages 946-966, March.
    18. Ta, Chi L., 2024. "Do conservation contests work? An analysis of a large-scale energy competitive rebate program," Journal of Environmental Economics and Management, Elsevier, vol. 124(C).
    20. Jeffrey D. Michler & Anna Josephson, 2022. "Recent developments in inference: practicalities for applied economics," Chapters, in: A Modern Guide to Food Economics, chapter 11, pages 235-268, Edward Elgar Publishing.
    21. Brülisauer, Marcel & Goette, Lorenz & Jiang, Zhengyi & Schmitz, Jan & Schubert, Renate, 2020. "Appliance-specific feedback and social comparisons: Evidence from a field experiment on energy conservation," Energy Policy, Elsevier, vol. 145(C).
    22. MacKinnon, James G. & Nielsen, Morten Ørregaard & Webb, Matthew D., 2023. "Cluster-robust inference: A guide to empirical practice," Journal of Econometrics, Elsevier, vol. 232(2), pages 272-299.

    More about this item

    JEL classification:

    • B4 - Schools of Economic Thought and Methodology - - Economic Methodology
    • C23 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Models with Panel Data; Spatio-temporal Models
    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • O1 - Economic Development, Innovation, Technological Change, and Growth - - Economic Development
    • Q4 - Agricultural and Natural Resource Economics; Environmental and Ecological Economics - - Energy

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:d5eud. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.