
Presidential address: Evidence-based decision making: What will it take for the decision makers to care?

Author

Listed:
  • Rebecca A. Maynard (University of Pennsylvania)

Abstract

No abstract is available for this item.

Suggested Citation

  • Rebecca A. Maynard, 2006. "Presidential address: Evidence-based decision making: What will it take for the decision makers to care?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(2), pages 249-265.
  • Handle: RePEc:wly:jpamgt:v:25:y:2006:i:2:p:249-265
    DOI: 10.1002/pam.20169

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1002/pam.20169
    File Function: Link to full text; subscription required
    Download Restriction: no

    File URL: https://libkey.io/10.1002/pam.20169?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item.
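    Both full-text links above follow the same pattern: the article's DOI (10.1002/pam.20169) appended to a resolver base URL. A minimal sketch in Python, purely illustrative and not part of the RePEc record, that rebuilds the two links from the DOI:

        # Illustrative only: reconstruct the two full-text resolver URLs listed
        # above from the article's DOI, using the URL patterns shown in the
        # "File URL" entries for this item.
        doi = "10.1002/pam.20169"

        handle_url = f"http://hdl.handle.net/{doi}"   # publisher link via the Handle System
        libkey_url = f"https://libkey.io/{doi}"       # LibKey redirect via a library subscription

        print(handle_url)  # http://hdl.handle.net/10.1002/pam.20169
        print(libkey_url)  # https://libkey.io/10.1002/pam.20169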

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Steven Glazerman & Dan M. Levy & David Myers, 2003. "Nonexperimental Versus Experimental Estimates of Earnings Impacts," The ANNALS of the American Academy of Political and Social Science, , vol. 589(1), pages 63-93, September.
    3. repec:mpr:mprres:4632 is not listed on IDEAS
    4. repec:mpr:mprres:2739 is not listed on IDEAS
    5. repec:mpr:mprres:4607 is not listed on IDEAS
    6. Roberto Agodini & Mark Dynarski, "undated". "Are Experiments the Only Option? A Look at Dropout Prevention Programs," Mathematica Policy Research Reports 51241adbf9fa4a26add6d54c5, Mathematica Policy Research.
    7. Peter Z. Schochet, "undated". "Statistical Power for Random Assignment Evaluations of Education Programs," Mathematica Policy Research Reports 6749d31ad72d4acf988f7dce5, Mathematica Policy Research.
    8. Roberto Agodini & Mark Dynarski, 2004. "Are Experiments the Only Option? A Look at Dropout Prevention Programs," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 180-194, February.
    9. repec:mpr:mprres:3694 is not listed on IDEAS
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; you can subscribe to its RSS feed for this item.


    Cited by:

    1. Hirokazu Yoshikawa & Robert G. Myers & Kathleen McCartney & Kristen L. Bub & Julieta Lugo-Gil & Maria A. Ramos & Felicia Knaul, 2007. "Early Childhood Education in Mexico: Expansion, quality improvement, and curricular reform," Papers inwopa07/41, Innocenti Working Papers.
    2. Hirokazu Yoshikawa & Robert G. Myers & Kathleen McCartney & Kristen L. Bub & Julieta Lugo-Gil & Maria A. Ramos & Felicia Knaul, 2008. "La educación durante la primera infancia en México: expansión, mejora de la calidad, y reforma curricular," Papers inwopa08/48, Innocenti Working Papers.
    3. Kum, Hye-Chung & Joy Stewart, C. & Rose, Roderick A. & Duncan, Dean F., 2015. "Using big data for evidence based governance in child welfare," Children and Youth Services Review, Elsevier, vol. 58(C), pages 127-136.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kenneth Fortson & Natalya Verbitsky-Savitz & Emma Kopa & Philip Gleason, 2012. "Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates," Mathematica Policy Research Reports 27f871b5b7b94f3a80278a593, Mathematica Policy Research.
    2. Jean Stockard, 2013. "Merging the accountability and scientific research requirements of the No Child Left Behind Act: using cohort control groups," Quality & Quantity: International Journal of Methodology, Springer, vol. 47(4), pages 2225-2257, June.
    3. Fortson, Kenneth & Gleason, Philip & Kopa, Emma & Verbitsky-Savitz, Natalya, 2015. "Horseshoes, hand grenades, and treatment effects? Reassessing whether nonexperimental estimators are biased," Economics of Education Review, Elsevier, vol. 44(C), pages 100-113.
    4. Thomas D. Cook & William R. Shadish & Vivian C. Wong, 2008. "Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(4), pages 724-750.
    5. Andrew P. Jaciw, 2016. "Applications of a Within-Study Comparison Approach for Evaluating Bias in Generalized Causal Inferences From Comparison Groups Studies," Evaluation Review, , vol. 40(3), pages 241-276, June.
    6. Jason K. Luellen & William R. Shadish & M. H. Clark, 2005. "Propensity Scores," Evaluation Review, , vol. 29(6), pages 530-558, December.
    7. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, , vol. 42(2), pages 147-175, April.
    8. Andrew P. Jaciw, 2016. "Assessing the Accuracy of Generalized Inferences From Comparison Group Studies Using a Within-Study Comparison Approach," Evaluation Review, , vol. 40(3), pages 199-240, June.
    9. William Bosshardt & Neela Manage, 2011. "Does Calculus Help in Principles of Economics Courses? Estimates Using Matching Estimators," The American Economist, Sage Publications, vol. 56(1), pages 29-37, May.
    10. Thomas D. Cook & Dominique Foray, 2007. "Building the Capacity to Experiment in Schools: A Case Study of the Institute of Educational Sciences in the US Department of Education," Economics of Innovation and New Technology, Taylor & Francis Journals, vol. 16(5), pages 385-402.
    11. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
    12. Aga, Deribe Assefa, 2016. "Factors affecting the success of development projects : A behavioral perspective," Other publications TiSEM 867ae95e-d53d-4a68-ad46-6, Tilburg University, School of Economics and Management.
    13. Kenneth Fortson & Philip Gleason & Emma Kopa & Natalya Verbitsky-Savitz, "undated". "Horseshoes, Hand Grenades, and Treatment Effects? Reassessing Bias in Nonexperimental Estimators," Mathematica Policy Research Reports 1c24988cd5454dd3be51fbc2c, Mathematica Policy Research.
    14. repec:mpr:mprres:4565 is not listed on IDEAS
    15. Jared Coopersmith & Thomas D. Cook & Jelena Zurovac & Duncan Chaplin & Lauren V. Forrow, 2022. "Internal And External Validity Of The Comparative Interrupted Time‐Series Design: A Meta‐Analysis," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(1), pages 252-277, January.
    16. Fatih Unlu & Douglas Lee Lauen & Sarah Crittenden Fuller & Tiffany Berglund & Elc Estrera, 2021. "Can Quasi‐Experimental Evaluations That Rely On State Longitudinal Data Systems Replicate Experimental Results?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(2), pages 572-613, March.
    17. Handa, Sudhanshu & Pineda, Heiling & Esquivel, Yannete & Lopez, Blancadilia & Gurdian, Nidia Veronica & Regalia, Ferdinando, 2009. "Non-formal basic education as a development priority: Evidence from Nicaragua," Economics of Education Review, Elsevier, vol. 28(4), pages 512-522, August.
    18. Alberto Abadie & Guido W. Imbens, 2008. "On the Failure of the Bootstrap for Matching Estimators," Econometrica, Econometric Society, vol. 76(6), pages 1537-1557, November.
    19. Maureen A. Pirog & Anne L. Buffardi & Colleen K. Chrisinger & Pradeep Singh & John Briney, 2009. "Are the alternatives to randomized assignment nearly as good? Statistical corrections to nonrandomized evaluations," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 28(1), pages 169-172.
    20. Jill M. Constantine & Neil S. Seftor & Emily Sama Martin & Tim Silva & David Myers, "undated". "A Study of the Effect of Talent Search on Secondary and Postsecondary Outcomes in Florida, Indiana, and Texas," Mathematica Policy Research Reports 469b015a64754790a4ab0944f, Mathematica Policy Research.
    21. Maciej Jakubowski, 2015. "Latent variables and propensity score matching: a simulation study with application to data from the Programme for International Student Assessment in Poland," Empirical Economics, Springer, vol. 48(3), pages 1287-1325, May.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wly:jpamgt:v:25:y:2006:i:2:p:249-265. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: http://www3.interscience.wiley.com/journal/34787/home.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.