Printed from https://ideas.repec.org/a/wly/jpamgt/v25y2006i2p249-265.html

Presidential address: Evidence-based decision making: What will it take for the decision makers to care?

Author

Listed:
  • Rebecca A. Maynard

    (University of Pennsylvania)

Abstract

No abstract is available for this item.

Suggested Citation

  • Rebecca A. Maynard, 2006. "Presidential address: Evidence-based decision making: What will it take for the decision makers to care?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(2), pages 249-265.
  • Handle: RePEc:wly:jpamgt:v:25:y:2006:i:2:p:249-265
    DOI: 10.1002/pam.20169

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1002/pam.20169
    File Function: Link to full text; subscription required
    Download Restriction: no

    File URL: https://libkey.io/10.1002/pam.20169?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. repec:mpr:mprres:2739 is not listed on IDEAS
    2. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    3. repec:mpr:mprres:3694 is not listed on IDEAS
    4. repec:mpr:mprres:4632 is not listed on IDEAS
    5. repec:mpr:mprres:4607 is not listed on IDEAS
    6. Steven Glazerman & Dan M. Levy & David Myers, 2003. "Nonexperimental Versus Experimental Estimates of Earnings Impacts," The ANNALS of the American Academy of Political and Social Science, , vol. 589(1), pages 63-93, September.
    7. Peter Z. Schochet, "undated". "Statistical Power for Random Assignment Evaluations of Education Programs," Mathematica Policy Research Reports 6749d31ad72d4acf988f7dce5, Mathematica Policy Research.
    8. Roberto Agodini & Mark Dynarski, 2004. "Are Experiments the Only Option? A Look at Dropout Prevention Programs," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 180-194, February.
    9. Roberto Agodini & Mark Dynarski, "undated". "Are Experiments the Only Option? A Look at Dropout Prevention Programs," Mathematica Policy Research Reports 51241adbf9fa4a26add6d54c5, Mathematica Policy Research.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project. Subscribe to its RSS feed for this item.


    Cited by:

    1. Jeremy L. Hall, 2009. "Evidence-Based Practice and the Use of Information in State Agency Decision-Making," Working Papers 2009-10, University of Kentucky, Institute for Federalism and Intergovernmental Relations.
    2. Hirokazu Yoshikawa & Robert G. Myers & Kathleen McCartney & Kristen L. Bub & Julieta Lugo-Gil & Maria A. Ramos & Felicia Knaul, 2007. "Early Childhood Education in Mexico: Expansion, quality improvement, and curricular reform," Papers inwopa07/41, Innocenti Working Papers.
    3. Hirokazu Yoshikawa & Robert G. Myers & Kathleen McCartney & Kristen L. Bub & Julieta Lugo-Gil & Maria A. Ramos & Felicia Knaul, 2008. "La educación durante la primera infancia en México: expansión, mejora de la calidad, y reforma curricular," Papers inwopa08/48, Innocenti Working Papers.
    4. Kum, Hye-Chung & Joy Stewart, C. & Rose, Roderick A. & Duncan, Dean F., 2015. "Using big data for evidence based governance in child welfare," Children and Youth Services Review, Elsevier, vol. 58(C), pages 127-136.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Fortson, Kenneth & Gleason, Philip & Kopa, Emma & Verbitsky-Savitz, Natalya, 2015. "Horseshoes, hand grenades, and treatment effects? Reassessing whether nonexperimental estimators are biased," Economics of Education Review, Elsevier, vol. 44(C), pages 100-113.
    2. Kenneth Fortson & Natalya Verbitsky-Savitz & Emma Kopa & Philip Gleason, 2012. "Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates," Mathematica Policy Research Reports 27f871b5b7b94f3a80278a593, Mathematica Policy Research.
    3. Jean Stockard, 2013. "Merging the accountability and scientific research requirements of the No Child Left Behind Act: using cohort control groups," Quality & Quantity: International Journal of Methodology, Springer, vol. 47(4), pages 2225-2257, June.
    4. Thomas D. Cook & William R. Shadish & Vivian C. Wong, 2008. "Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(4), pages 724-750.
    5. Andrew P. Jaciw, 2016. "Applications of a Within-Study Comparison Approach for Evaluating Bias in Generalized Causal Inferences From Comparison Groups Studies," Evaluation Review, , vol. 40(3), pages 241-276, June.
    6. Jason K. Luellen & William R. Shadish & M. H. Clark, 2005. "Propensity Scores," Evaluation Review, , vol. 29(6), pages 530-558, December.
    7. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, , vol. 42(2), pages 147-175, April.
    8. Andrew P. Jaciw, 2016. "Assessing the Accuracy of Generalized Inferences From Comparison Group Studies Using a Within-Study Comparison Approach," Evaluation Review, , vol. 40(3), pages 199-240, June.
    9. Jared Coopersmith & Thomas D. Cook & Jelena Zurovac & Duncan Chaplin & Lauren V. Forrow, 2022. "Internal And External Validity Of The Comparative Interrupted Time‐Series Design: A Meta‐Analysis," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(1), pages 252-277, January.
    10. Fatih Unlu & Douglas Lee Lauen & Sarah Crittenden Fuller & Tiffany Berglund & Elc Estrera, 2021. "Can Quasi‐Experimental Evaluations That Rely On State Longitudinal Data Systems Replicate Experimental Results?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(2), pages 572-613, March.
    11. Elizabeth Ty Wilde & Robinson Hollister, 2007. "How close is close enough? Evaluating propensity score matching using data from a class size reduction experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 455-477.
    12. Kristen Harknett, 2006. "Does Receiving an Earnings Supplement Affect Union Formation? Estimating Effects for Program Participants Using Propensity Score Matching," Evaluation Review, , vol. 30(6), pages 741-778, December.
    13. Colacelli, Mariana & Blackburn, David J.H., 2009. "Secondary currency: An empirical analysis," Journal of Monetary Economics, Elsevier, vol. 56(3), pages 295-308, April.
    14. Handa, Sudhanshu & Pineda, Heiling & Esquivel, Yannete & Lopez, Blancadilia & Gurdian, Nidia Veronica & Regalia, Ferdinando, 2009. "Non-formal basic education as a development priority: Evidence from Nicaragua," Economics of Education Review, Elsevier, vol. 28(4), pages 512-522, August.
    15. Robin Jacob & Marie-Andree Somers & Pei Zhu & Howard Bloom, 2016. "The Validity of the Comparative Interrupted Time Series Design for Evaluating the Effect of School-Level Interventions," Evaluation Review, , vol. 40(3), pages 167-198, June.
    16. Alberto Abadie & Guido W. Imbens, 2008. "On the Failure of the Bootstrap for Matching Estimators," Econometrica, Econometric Society, vol. 76(6), pages 1537-1557, November.
    17. Maureen A. Pirog & Anne L. Buffardi & Colleen K. Chrisinger & Pradeep Singh & John Briney, 2009. "Are the alternatives to randomized assignment nearly as good? Statistical corrections to nonrandomized evaluations," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 28(1), pages 169-172.
    18. Sauermann, Jan & Stenberg, Anders, 2020. "Assessing Selection Bias in Non-Experimental Estimates of the Returns to Workplace Training," IZA Discussion Papers 13789, IZA Network @ LISER.
    19. Wendy Janssens, 2005. "Measuring Externalities in Program Evaluation," Tinbergen Institute Discussion Papers 05-017/2, Tinbergen Institute, revised 30 Mar 2006.
    20. Jill M. Constantine & Neil S. Seftor & Emily Sama Martin & Tim Silva & David Myers, "undated". "A Study of the Effect of Talent Search on Secondary and Postsecondary Outcomes in Florida, Indiana, and Texas," Mathematica Policy Research Reports 469b015a64754790a4ab0944f, Mathematica Policy Research.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wly:jpamgt:v:25:y:2006:i:2:p:249-265. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Doing so allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: http://www3.interscience.wiley.com/journal/34787/home.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.