Printed from https://ideas.repec.org/a/eee/epplan/v48y2015icp124-131.html

Evidence-based programs registry: Blueprints for Healthy Youth Development

Authors

  • Mihalic, Sharon F.
  • Elliott, Delbert S.

Abstract

There is a growing demand for evidence-based programs to promote healthy youth development, but this growth has been accompanied by confusion stemming from varying definitions of "evidence-based" and mixed messages regarding which programs can claim this designation. The registries that identify evidence-based programs, while intended to help users sift through the findings and claims regarding programs, have often led to more confusion with their differing standards and program ratings. The advantages of using evidence-based programs and the importance of adopting a high standard of evidence, especially when taking programs to scale, are described. One evidence-based registry is highlighted: Blueprints for Healthy Youth Development, hosted at the University of Colorado Boulder. Unlike any previous initiative of its kind, Blueprints established unmatched standards for identifying evidence-based programs and has acted in a way similar to the FDA, evaluating evidence, data, and research to determine which programs meet its high standard of proven efficacy.

Suggested Citation

  • Mihalic, Sharon F. & Elliott, Delbert S., 2015. "Evidence-based programs registry: Blueprints for Healthy Youth Development," Evaluation and Program Planning, Elsevier, vol. 48(C), pages 124-131.
  • Handle: RePEc:eee:epplan:v:48:y:2015:i:c:p:124-131
    DOI: 10.1016/j.evalprogplan.2014.08.004

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718914000925
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2014.08.004?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Susanne James‐Burdumy & Mark Dynarski & John Deke, 2008. "After‐School Program Effects On Behavior: Results From The 21st Century Community Learning Centers Program National Evaluation," Economic Inquiry, Western Economic Association International, vol. 46(1), pages 13-18, January.
    2. Thomas D. Cook & William R. Shadish & Vivian C. Wong, 2008. "Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(4), pages 724-750.
    3. Núria Rodríguez-Planas, 2012. "Longer-Term Impacts of Mentoring, Educational Services, and Learning Incentives: Evidence from a Randomized Trial in the United States," American Economic Journal: Applied Economics, American Economic Association, vol. 4(4), pages 121-139, October.
    4. Susanne James-Burdumy & Mark Dynarski & John Deke, "undated". "When Elementary Schools Stay Open Late: Results from The National Evaluation of the 21st-Century Community Learning Centers Program (Journal Article)," Mathematica Policy Research Reports f422818e926344eca132aa7cd, Mathematica Policy Research.

    Citations

    Citations are extracted by the CitEc Project. Subscribe to its RSS feed for this item.


    Cited by:

    1. Axford, Nick & Morpeth, Louise & Bjornstad, Gretchen & Hobbs, Tim & Berry, Vashti, 2022. "“What works” registries of interventions to improve child and youth psychosocial outcomes: A critical appraisal," Children and Youth Services Review, Elsevier, vol. 137(C).
    2. Eric C. Brown & Pablo Montero-Zamora & Francisco Cardozo-Macías & María Fernanda Reyes-Rodríguez & John S. Briney & Juliana Mejía-Trujillo & Augusto Pérez-Gómez, 2021. "A Comparison of Cut Points for Measuring Risk Factors for Adolescent Substance Use and Antisocial Behaviors in the U.S. and Colombia," IJERPH, MDPI, vol. 18(2), pages 1-14, January.
    3. Henneberger, Angela K. & Mushonga, Dawnsha R., 2021. "Peer selection as a mechanism for preventing adolescent substance use: Current approaches and future directions," Children and Youth Services Review, Elsevier, vol. 120(C).
    4. Tamara M. Haegerich & Corinne David-Ferdon & Rita K. Noonan & Brian J. Manns & Holly C. Billie, 2017. "Technical Packages in Injury and Violence Prevention to Move Evidence Into Practice," Evaluation Review, , vol. 41(1), pages 78-108, February.
    5. Zack, Melissa K. & Karre, Jennifer K. & Olson, Jonathan & Perkins, Daniel F., 2019. "Similarities and differences in program registers: A case study," Evaluation and Program Planning, Elsevier, vol. 76(C), pages 1-1.
    6. Agans, Jennifer P. & Maley, Mary & Rainone, Nicolette & Cope, Marie & Turner, Andrew & Eckenrode, John & Pillemer, Karl, 2020. "Evaluating the evidence for youth outcomes in 4-H: A scoping review," Children and Youth Services Review, Elsevier, vol. 108(C).
    7. Gunnar Bjørnebekk & Dagfinn Mørkrid Thøgersen, 2021. "Possible Interventions for Preventing the Development of Psychopathic Traits among Children and Adolescents?," IJERPH, MDPI, vol. 19(1), pages 1-14, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Visaria, Sujata & Dehejia, Rajeev & Chao, Melody M. & Mukhopadhyay, Anirban, 2016. "Unintended consequences of rewards for student attendance: Results from a field experiment in Indian classrooms," Economics of Education Review, Elsevier, vol. 54(C), pages 173-184.
    2. Samari, Goleen & Catalano, Ralph & Alcalá, Héctor E. & Gemmill, Alison, 2020. "The Muslim Ban and preterm birth: Analysis of U.S. vital statistics data from 2009 to 2018," Social Science & Medicine, Elsevier, vol. 265(C).
    3. Jared Coopersmith & Thomas D. Cook & Jelena Zurovac & Duncan Chaplin & Lauren V. Forrow, 2022. "Internal And External Validity Of The Comparative Interrupted Time‐Series Design: A Meta‐Analysis," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(1), pages 252-277, January.
    4. Alejandro Cid & Martin Rossi, 2011. "Giving a Second Chance: an After-School Program in a Shantytown Interacting with Parents Type," Documentos de Trabajo/Working Papers 1108, Facultad de Ciencias Empresariales y Economia, Universidad de Montevideo.
    5. Dustan, Andrew, 2020. "Can large, untargeted conditional cash transfers increase urban high school graduation rates? Evidence from Mexico City's Prepa Sí," Journal of Development Economics, Elsevier, vol. 143(C).
    6. David Wittenburg & Kenneth Fortson & David Stapleton & Noelle Denny-Brown & Rosalind Keith & David R. Mann & Heinrich Hock & Heather Gordon, "undated". "Promoting Opportunity Demonstration: Design Report," Mathematica Policy Research Reports a7bdd8ca145748bd892b3438d, Mathematica Policy Research.
    7. Caitlin Kearns & Douglas Lee Lauen & Bruce Fuller, 2020. "Competing With Charter Schools: Selection, Retention, and Achievement in Los Angeles Pilot Schools," Evaluation Review, , vol. 44(2-3), pages 111-144, April.
    8. Armin Falk & Fabian Kosse & Pia Pinger, 2026. "Mentoring and Schooling Decisions: Causal Evidence," Journal of Political Economy, University of Chicago Press, vol. 134(1), pages 366-396.
    9. French, Robert & Oreopoulos, Philip, 2017. "Behavioral barriers transitioning to college," Labour Economics, Elsevier, vol. 47(C), pages 48-63.
    10. Blackman, Allen & Villalobos, Laura, 2021. "¿Usar o perder los bosques?: Extracción regulada de madera y pérdida de cobertura forestal en México," IDB Publications (Working Papers) 11094, Inter-American Development Bank.
    11. Will Dobbie & Roland G. Fryer, Jr, 2013. "The Medium-Term Impacts of High-Achieving Charter Schools on Non-Test Score Outcomes," NBER Working Papers 19581, National Bureau of Economic Research, Inc.
    12. Fatih Unlu & Douglas Lee Lauen & Sarah Crittenden Fuller & Tiffany Berglund & Elc Estrera, 2021. "Can Quasi‐Experimental Evaluations That Rely On State Longitudinal Data Systems Replicate Experimental Results?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(2), pages 572-613, March.
    13. Uwe Dulleck & Juliana Silva-Goncalves & Benno Torgler, 2014. "Impact Evaluation of an Incentive Program on Educational Achievement of Indigenous Students," CREMA Working Paper Series 2014-13, Center for Research in Economics, Management and the Arts (CREMA).
    14. Yang Tang & Thomas D. Cook, 2018. "Statistical Power for the Comparative Regression Discontinuity Design With a Pretest No-Treatment Control Function: Theory and Evidence From the National Head Start Impact Study," Evaluation Review, , vol. 42(1), pages 71-110, February.
    15. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," NBER Working Papers 26080, National Bureau of Economic Research, Inc.
    16. Jordan H. Rickles & Michael Seltzer, 2014. "A Two-Stage Propensity Score Matching Strategy for Treatment Effect Estimation in a Multisite Observational Study," Journal of Educational and Behavioral Statistics, , vol. 39(6), pages 612-636, December.
    17. Deribe Assefa Aga & N. Noorderhaven & B. Vallejo, 2018. "Project beneficiary participation and behavioural intentions promoting project sustainability: The mediating role of psychological ownership," Development Policy Review, Overseas Development Institute, vol. 36(5), pages 527-546, September.
    18. Daniel Litwok, 2020. "Using Nonexperimental Methods to Address Noncompliance," Upjohn Working Papers 20-324, W.E. Upjohn Institute for Employment Research.
    19. Newman, Sandra & Holupka, C. Scott, 2025. "Assisted housing and healthy child development," Journal of Housing Economics, Elsevier, vol. 70(C).
    20. David M. Rindskopf & William R. Shadish & M. H. Clark, 2018. "Using Bayesian Correspondence Criteria to Compare Results From a Randomized Experiment and a Quasi-Experiment Allowing Self-Selection," Evaluation Review, , vol. 42(2), pages 248-280, April.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:48:y:2015:i:c:p:124-131. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.