
Do evidence summaries increase health policy‐makers' use of evidence from systematic reviews? A systematic review

Author

Listed:
  • Jennifer Petkovic
  • Vivian Welch
  • Marie Helena Jacob
  • Manosila Yoganathan
  • Ana Patricia Ayala
  • Heather Cunningham
  • Peter Tugwell

Abstract

Plain language summary: Policy briefs make systematic reviews easier to understand, but there is little evidence of an impact on the use of study findings

It is likely that evidence summaries are easier to understand than complete systematic reviews. Whether these summaries increase the use of evidence from systematic reviews in policymaking is not clear.

What is this review about?

Systematic reviews are long and technical documents that may be hard for policymakers to use when making decisions. Evidence summaries are short documents that describe the research findings in systematic reviews, and they may make systematic reviews easier to use. Other names for evidence summaries are policy briefs, evidence briefs, summaries of findings, and plain language summaries. The goal of this review was to learn whether evidence summaries help policymakers use evidence from systematic reviews. This review also aimed to identify the best ways to present an evidence summary in order to increase the use of the evidence.

What is the aim of this review?

This review summarizes the evidence from six randomized controlled trials that assessed the effectiveness of systematic review summaries on policymakers' decision making, or the most effective ways to present evidence summaries to increase policymakers' use of the evidence.

What are the main findings of this review?

This review included six randomized controlled studies. A randomized controlled study is one in which the participants are divided randomly (by chance) into separate groups to compare different treatments or other interventions. This method of dividing people into groups means that the groups will be similar and that the effects of the treatments they receive will be compared more fairly. At the time the study is done, it is not known which treatment is the better one. The researchers who did these studies invited people from Europe, North America, South America, Africa, and Asia to take part. Two studies looked at “policy briefs,” one looked at an “evidence summary,” two looked at a “summary of findings table,” and one compared a “summary of findings table” to an evidence summary. None of these studies looked at how policymakers directly used evidence from systematic reviews in their decision making, but two studies found little to no difference in how they used the summaries. The studies relied on self-reports from decision makers and included questions such as, “Is this summary easy to understand?” Some of the studies looked at users' knowledge, understanding, beliefs, or how credible (trustworthy) they believed the summaries to be; the studies that looked at these outcomes found little to no difference. Study participants rated the graded entry format, which allows the reader to choose how much information to read, higher for usability than the full systematic review. Participants felt that all evidence summary formats were easier to understand than full systematic reviews.

What do the findings of this review mean?

Our review suggests that evidence summaries help policymakers to better understand the findings presented in systematic reviews. In short, evidence summaries should be developed to make it easier for policymakers to understand the evidence presented in systematic reviews. However, there is currently very little evidence on the best way to present systematic review evidence to policymakers.

How up to date is this review?

The authors of this review searched for studies through June 2016.

Executive summary/Abstract

Background

Systematic reviews are important for decision makers. They offer many potential benefits, but they are often written in technical language, are too long, and do not contain contextual details, which makes them hard to use for decision-making. Strategies to promote the use of evidence by decision makers are required, and evidence summaries have been suggested as one facilitator. Evidence summaries include policy briefs, briefing papers, briefing notes, evidence briefs, abstracts, summary of findings tables, and plain language summaries. Many organizations develop and disseminate systematic review evidence summaries for different populations or subsets of decision makers. However, evidence on the usefulness and effectiveness of systematic review summaries is lacking. We present an overview of the available evidence on systematic review evidence summaries.

Objectives

This systematic review aimed to 1) assess the effectiveness of evidence summaries on policy-makers' use of the evidence and 2) identify the most effective summary components for increasing policy-makers' use of the evidence.

Search methods

We searched several online databases (Medline, EMBASE, CINAHL, Cochrane Central Register of Controlled Trials, Global Health Library, Popline, Africa-wide, Public Affairs Information Services, Worldwide Political Science Abstracts, Web of Science, and DFID), the websites of research groups and organizations that produce evidence summaries, and the reference lists of included summaries and related systematic reviews. These databases were searched in March-April 2016.

Selection criteria

Eligible studies included randomised controlled trials (RCTs), non-randomised controlled trials (NRCTs), controlled before-after (CBA) studies, and interrupted time series (ITS) studies. We included studies of policymakers at all levels as well as health system managers. We included studies examining any type of “evidence summary”, “policy brief”, or other product derived from systematic reviews that presented evidence in a summarized form. These interventions could be compared to active comparators (e.g. other summary formats) or no intervention. The primary outcomes were: 1) use of systematic review summaries in decision-making (e.g. self-reported use of the evidence in policy-making or decision-making) and 2) policymaker understanding, knowledge, and/or beliefs (e.g. changes in knowledge scores about the topic included in the summary). We also assessed perceived relevance, credibility, usefulness, understandability, and desirability (e.g. format) of the summaries.

Results

Our database search combined with our grey literature search yielded 10,113 references after removal of duplicates. Of these, 54 were reviewed in full text, and we included 6 studies (reported in 7 papers, 1,661 participants) as well as protocols from 2 ongoing studies. Two studies assessed the use of evidence summaries in decision-making and found little to no difference in effect. There was also little to no difference in effect for knowledge, understanding, or beliefs (4 studies) and perceived usefulness or usability (3 studies). Summary of Findings tables and graded entry summaries were perceived as slightly easier to understand than complete systematic reviews. Two studies assessed formatting changes and found that, for Summary of Findings tables, certain elements, such as reporting study event rates and absolute differences, were preferred, as was avoiding the use of footnotes. No studies assessed adverse effects. The risks of bias in these studies were mainly assessed as unclear or low; however, two studies were assessed as being at high risk of bias for incomplete outcome data due to very high rates of attrition.

Authors' conclusions

Evidence summaries may be easier to understand than complete systematic reviews. However, their ability to increase the use of systematic review evidence in policymaking is unclear.

Suggested Citation

  • Jennifer Petkovic & Vivian Welch & Marie Helena Jacob & Manosila Yoganathan & Ana Patricia Ayala & Heather Cunningham & Peter Tugwell, 2018. "Do evidence summaries increase health policy‐makers' use of evidence from systematic reviews? A systematic review," Campbell Systematic Reviews, John Wiley & Sons, vol. 14(1), pages 1-52.
  • Handle: RePEc:wly:camsys:v:14:y:2018:i:1:p:1-52
    DOI: 10.4073/csr.2018.8

    Download full text from publisher

    File URL: https://doi.org/10.4073/csr.2018.8
    Download Restriction: no

    File URL: https://libkey.io/10.4073/csr.2018.8?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Benoît Béchard & Joachim Kimmerle & Justin Lawarée & Pierre-Oliver Bédard & Sharon E. Straus & Mathieu Ouimet, 2022. "The Impact of Information Presentation and Cognitive Dissonance on Processing Systematic Review Summaries: A Randomized Controlled Trial on Bicycle Helmet Legislation," IJERPH, MDPI, vol. 19(10), pages 1-17, May.
    2. Paul Fenton Villar & Hugh Waddington, 2019. "Within study comparisons and risk of bias in international development: Systematic review and critical appraisal," Campbell Systematic Reviews, John Wiley & Sons, vol. 15(1-2), June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bogdan MINJINA, 2015. "The Use of the Evidence from the Behavioral Sciences in the Organizational Decision-Making Process," Management Dynamics in the Knowledge Economy, College of Management, National University of Political Studies and Public Administration, vol. 3(3), pages 381-408, September.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wly:camsys:v:14:y:2018:i:1:p:1-52. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://doi.org/10.1111/(ISSN)1891-1803.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.