Table 3 Evidence summary formats and results

From: The effectiveness of evidence summaries on health policymakers' and health system managers' use of evidence from systematic reviews: a systematic review

Study: Brownson 2011 [23]
Type of evidence summary: Policy brief
Format of summary: Printed leaflet/booklet, with a PDF version for those who preferred online access
Method of delivery: Mailed, with a follow-up telephone call; emailed if preferred
Components: The front cover varied according to story- versus data-driven framing and was color printed (including data or a story). The third and fourth pages were the same across all four briefs. Data-driven briefs contained two statements with percentages related to mammography screening; story-driven briefs contained two personal stories related to mammography. All briefs included data on uninsured women, women not up to date on mammograms, breast cancer mortality compared to other causes, the benefits of mammograms, and recommendations.
Outcomes: The briefs were considered understandable and credible (mean ratings ranged from 4.3 to 4.5 on a 5.0 Likert scale). Likelihood of using the brief differed by study condition for staff members (p = 0.041) and legislators (p = 0.018). Staff members found the story-focused brief with state-level data the most useful; legislators found the data-focused brief with state-level data the most useful.

Study: Carrasco-Labra 2016 [30]
Type of evidence summary: Summary of findings table
Format of summary: Table
Method of delivery: Emailed link to online survey
Components: The new format of the summary of findings table moved the number of participants and studies to the outcomes column; presented the quality of evidence with the main reasons for downgrading; renamed "footnotes" to "explanations"; expressed baseline risk and corresponding risk as percentages; added a column presenting the absolute risk reduction (risk difference) or mean difference; removed the comments column; added a "what happens" column; and omitted the description of the GRADE evidence definitions.
Outcomes: Participants using the new summary of findings table format had a higher proportion of correct answers for almost all questions. The new format was more accessible: it was easier to understand information about the effects (MD 0.4, SE 0.19), and it displayed results in a way that was more helpful for decision-making (MD 0.5, SE 0.18). Overall, participants preferred the new format (MD 2.8, SD 1.6).

Study: Dobbins 2009 [25]
Type of evidence summary: Evidence summaries
Format of summary: Text
Method of delivery: Targeted, tailored emails
Components: Short summary including key findings and recommendations
Outcomes: The post-intervention change in Global Evidence-Informed Decision-making was 0.74 (95% CI 0.26–1.22) for the group receiving only access to healthevidence.ca, –0.42 (–1.10, 0.26) for the group receiving tailored, targeted emails, and –0.09 (–0.78, 0.60) for the knowledge broker group. The changes in health policies and programs (HPP) after the intervention were –0.28 (–1.20, 0.65) for the group receiving only access to the healthevidence.ca website, 1.67 (0.37, 2.97) for the group receiving tailored, targeted messages, and –0.19 (–1.50, 1.12) for the group with access to a knowledge broker. Tailored, targeted messages were more effective than the knowledge broker intervention or access to www.health-evidence.ca in organizations with a culture that highly values research.

Study: Masset 2013 [26, 29]
Type of evidence summary: Policy brief
Format of summary: Text, colored leaflet
Method of delivery: Email
Components: Introduction to the problem, description of methodology, conclusions, and policy implications; two versions included expert commentary
Outcomes: Respondents with stronger baseline beliefs about the agricultural interventions rated the policy brief more favourably. The policy brief was less effective in changing respondents' ratings of the strength of the evidence and the effectiveness of the intervention.

Study: Opiyo 2013 [27]
Type of evidence summary: Summary of findings table; graded entry summary of evidence
Format of summary: Text, tables
Method of delivery: Email
Components: Summary of findings table. The graded entry format included a summary and interpretation of the main findings and conclusions, a contextually framed narrative report, and a summary of findings table.
Outcomes: There were no differences between groups in the odds of correct responses to key clinical questions. Both packs B and C improved understanding. Pack C, compared to pack A, was associated with a significantly higher mean "value and accessibility" score and with 1.5 times the odds of judgments about the quality of evidence being clear and accessible. More than half of participants preferred narrative report formats to the full version of the systematic review (SR) (53% versus 25%). A higher percentage of respondents (60%) found SRs more difficult to read than narrative reports, although some (17%) said SRs were easy to read. About half of participants (51%) found SRs easier to read than summary of findings tables (26%).

Study: Vandvik 2012 [28]
Type of evidence summary: Summary of findings table
Format of summary: Table
Method of delivery: Email
Components: Tables presented outcomes, number of participants, summary of findings, and quality assessment using GRADE
Outcomes: Participants preferred presentation of study event rates over their omission, absolute risk differences over absolute risks, and additional information in table cells over footnotes. Panelists presented with time frame information in the tables, rather than only in footnotes, were more likely to answer questions about the time frame correctly, and those presented with risk differences, rather than absolute risks, were more likely to interpret confidence intervals for absolute effects correctly. Regardless of table format, information was considered easy to find and comprehend, and helpful in making recommendations.