
Applying GRADE-CERQual to qualitative evidence synthesis findings: introduction to the series

Abstract

The GRADE-CERQual (‘Confidence in the Evidence from Reviews of Qualitative research’) approach provides guidance for assessing how much confidence to place in findings from systematic reviews of qualitative research (or qualitative evidence syntheses). The approach has been developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. Confidence in the evidence from qualitative evidence syntheses is an assessment of the extent to which a review finding is a reasonable representation of the phenomenon of interest. CERQual provides a systematic and transparent framework for assessing confidence in individual review findings, based on consideration of four components: (1) methodological limitations, (2) coherence, (3) adequacy of data, and (4) relevance. A fifth component, dissemination (or publication) bias, may also be important and is being explored. As with the GRADE (Grading of Recommendations Assessment, Development, and Evaluation) approach for effectiveness evidence, CERQual suggests summarising evidence in succinct, transparent, and informative Summary of Qualitative Findings tables. These tables are designed to communicate the review findings and the CERQual assessment of confidence in each finding. This article is the first of a seven-part series providing guidance on how to apply the CERQual approach. In this paper, we describe the rationale and conceptual basis for CERQual, the aims of the approach, how the approach was developed, and its main components. We also outline the purpose and structure of this series and discuss the growing role for qualitative evidence in decision-making. Papers 3, 4, 5, 6, and 7 in this series discuss each CERQual component, including the rationale for including the component in the approach, how the component is conceptualised, and how it should be assessed. Paper 2 discusses how to make an overall assessment of confidence in a review finding and how to create a Summary of Qualitative Findings table. The series is intended primarily for those undertaking qualitative evidence syntheses or using their findings in decision-making processes but is also relevant to guideline development agencies, primary qualitative researchers, and implementation scientists and practitioners.

Why an approach to assessing confidence in the evidence from reviews of qualitative research is needed

Decisions on health, social care, and other interventions, programmes, and policies need to be based on the best available evidence [1]. While different stakeholders may attach different importance to different types of evidence [2, 3], there is wide agreement that a broad range of evidence is needed to inform decisions. This is particularly so for more complex interventions or policies as well as for programmes or policies whose implementation may impact across institutions and systems, such as across schools or across the education, health, or social care system. For example, evidence may be needed on the values that people attach to different outcomes, on effects of an intervention on health or social outcomes, on the acceptability and feasibility of the intervention, on resource use and cost-effectiveness, on equity impacts, on ethics, and on implementation and scale-up considerations at different levels [1, 4, 5]. Diverse evidence may also be needed to understand why evidence-informed policies are not adopted in specific jurisdictions or are not implemented successfully [6,7,8]. This is an important consideration across all settings, but particularly in low- and middle-income countries where resources are limited and need to be used effectively [1, 9]. Data from qualitative research contributes critical information to addressing this need.

Qualitative research aims to explore people’s perceptions and experiences of the world around them, including their perspectives on health and illness, health and social care services, and wider health and social system policies and processes. In recent years, systematic reviews of qualitative research (also known as qualitative evidence syntheses) have become more common and the methods for undertaking these reviews are now well developed [10,11,12]. Evidence from qualitative evidence syntheses is increasingly incorporated into decision-making processes, including in health technology assessments, guideline development [13], and policy formulation, to complement evidence on the effects of interventions and on resource use. Qualitative evidence is also now being used within decision support tools such as the DECIDE evidence-to-decision frameworks [4] and SURE evidence-based policy briefs [14] and to inform decisions on implementation strategies. This wider use of qualitative evidence, including by organisations such as the World Health Organization (WHO), the European Commission Initiative on Breast Cancer, and the National Institute for Health and Care Excellence (NICE) in the UK, has highlighted the need for approaches that help users decide how much emphasis to give to such evidence in their decisions [15]. However, prior to the development of the approach described in this paper, there was no accepted, structured method for assessing confidence in the evidence from qualitative evidence syntheses [16]. The lack of such methods may constrain the use of qualitative evidence to inform decision-making.

The ‘Confidence in the Evidence from Reviews of Qualitative research’ (GRADE-CERQual) approach provides guidance for assessing how much confidence to place in findings from qualitative evidence syntheses. It complements other ‘Grading of Recommendations Assessment, Development, and Evaluation’ (GRADE) tools for assessing how much confidence to place in evidence on the effectiveness and harms of interventions and their resource use and in evidence about diagnostic tests [17, 18]. The guidance in this series has been developed in collaboration and agreement with the GRADE Working Group (www.gradeworkinggroup.org).

Aims of the CERQual approach

The GRADE-CERQual approach (hereafter referred to as CERQual) has been developed to support people using findings from qualitative evidence syntheses in decision-making processes. CERQual allows the user to make a transparent assessment of how much confidence decision-makers and other users can place in individual review findings from syntheses of qualitative evidence. We define a review finding as an analytic output from a qualitative evidence synthesis that, based on data from primary studies, describes a phenomenon or an aspect of a phenomenon [16]. Many involved in using the findings of qualitative evidence syntheses may already be making these assessments of confidence intuitively or informally. As we see it, there are two main concerns with this. Firstly, such assessments are not transparent, so it is not possible for others to see how the assessments were made and to decide whether they agree with them. Secondly, different assessors may use different criteria for assessing confidence, so assessments are not systematised across assessors (or even from one assessment to another for the same assessor). Combined with the lack of transparency, this makes it difficult to understand, and where necessary critique, the basis for assessments. Broadly speaking, CERQual seeks to systematise the process of assessing confidence in the evidence from qualitative evidence syntheses and make these assessments explicit and transparent.

In developing CERQual, we were informed by the principles and methods of qualitative research and have also sought to apply lessons learned from the GRADE Working Group’s development of similar tools for other types of evidence. Table 1 lists the strengths of the CERQual approach, many of which are shared with other GRADE tools. CERQual is an emerging approach, and our knowledge of how to apply it is evolving. We therefore anticipate that guidance on applying CERQual will also evolve over time.

Table 1 Strengths of the CERQual approach

Assumptions underlying the development of CERQual

As a pragmatic approach, CERQual makes several assumptions and acknowledges ongoing methodological debates in the field of qualitative research:

  • We acknowledge that some within the qualitative research community have argued that synthesising data across multiple qualitative studies challenges the integrity of the contributing primary studies and that findings from this synthesis process may therefore not be trustworthy (e.g., [19,20,21]). However, in our approach to qualitative evidence synthesis and the development of CERQual, we have adopted the ‘subtle realist’ position [22], which maintains that the existence of phenomena does not depend on our subjective perceptions of them. In other words, social reality is not entirely constructed. Based on this, we suggest that synthesis can potentially provide a deeper understanding of a phenomenon than is achievable by any single study, that this understanding can be viewed as trustworthy, and that it is therefore desirable to synthesise data from multiple qualitative studies. Others have made similar arguments in relation to ethnography, noting that comparisons of different ethnographic studies ‘are fruitful because they lead to empirical generalisations, they expose analytical problems, and they allow for falsification of hypotheses’ ([23] page 207).

  • We acknowledge debates on the most appropriate methods for synthesis but argue that concerns regarding the synthesis process and its outputs should be addressed in how a synthesis is undertaken (for example, by using methods that help to preserve the context of the primary studies in the analysis process) rather than being seen as a complete barrier to conducting syntheses or to the practical usefulness of qualitative synthesis findings to decision-makers [24].

  • CERQual assumes that qualitative research, in addition to its more interpretative and exploratory functions, has an instrumental role to play in informing decisions. In other words, qualitative research holds the potential to produce knowledge that can directly inform decision-making processes.

  • CERQual acknowledges that a well-conducted qualitative evidence synthesis does not automatically produce useful findings applicable to a range of contexts. As with primary qualitative research, sophisticated processes of analysis and interpretation are required. CERQual aims to accommodate the interpretivist nature of qualitative synthesis by, for example, encouraging the review authors to examine possible theoretical contributions and to be sensitive to the importance of context when assessing confidence in the evidence.

  • The CERQual approach is intended to be applied to well-conducted syntheses that report their methods and limitations in a transparent way. We believe that applying CERQual to the findings of a poorly conducted or poorly reported synthesis would be challenging and would not yield useful results. Paper 2 in the series provides guidance on assessing how well a review was conducted [25].

Additional file 1 describes the purpose of CERQual and what it is not intended to address.

How was the CERQual approach developed?

Overall, we used a pragmatic and iterative approach to develop each CERQual component: brainstorming concepts within the development team; undertaking formal and informal searches of the literature for definitions; following up relevant citations; talking to experts in the field of qualitative evidence synthesis; developing consensus through multiple face-to-face meetings and teleconferences; and seeking feedback through ongoing engagement with the qualitative evidence synthesis community, the GRADE Working Group, and organisations that commission, produce, or use systematic reviews.

Initial development of the CERQual approach

The initial stages of the process for developing CERQual, which started in 2010, are outlined elsewhere [16] (see Additional file 2). This work led to an approach in which four components—methodological limitations, coherence, adequacy of data, and relevance—contribute to an overall assessment of confidence in an individual review finding. We presented this version of the approach in 2015 to a group of 27 invited methodologists, researchers, and end users from more than 12 international organisations and 10 countries, with a broad range of experience in qualitative research, the development of GRADE, or guideline development. This group, together with others who registered interest in the approach, constitutes the wider CERQual Project Group and has played a significant role in refining the approach.

Further development of the CERQual approach

We took several other steps to further develop the approach. Firstly, we undertook a coordinated programme of implementation activity involving training workshops, seminars, and presentations, during which we actively sought, collated, and shared feedback to enhance understanding and further development of the CERQual components and their practical application. Between 2015 and 2017, at least 10 workshops and seminars and eight presentations were undertaken. Secondly, in 2015 and 2016, we implemented a small-group feedback approach in which we facilitated brief discussions of individual CERQual components, either within our host organisations or in response to specific invitations from other organisations. Thirdly, we applied the CERQual approach within diverse qualitative evidence syntheses in the areas of health and social care [6,7,8, 26,27,28,29,30,31,32,33] and also supported other teams in using CERQual by providing guidance through face-to-face or virtual training meetings and commenting on draft Summary of Qualitative Findings tables. At least 10 syntheses were supported in this way (for example, [34, 35]). We then gathered structured feedback from early users of CERQual through an online feedback form that was made available to all CERQual users and through short individual discussions with six members of review teams and two members of the CERQual Project Group. The questions included in the online feedback form and individual discussions are available in Additional file 3. These experiences and the feedback we received from users contributed to the further refinement of the approach, including how each component should be conceptualised and applied. As far as possible, we used a consensus approach in these processes. While no formal guidelines exist for the development of an assessment approach of this kind, our process closely resembles the recommended approach for developing guidelines for reporting research processes [36].

The role of dissemination bias

In earlier discussions of CERQual, we identified dissemination bias (sometimes called publication bias) as a potential fifth component in assessing how much confidence to place in qualitative evidence synthesis findings. We initiated pilot work to explore how dissemination bias has been conceptualised in qualitative research, its perceived scope, and how it might be assessed. Some of this work is discussed in paper 7 in this series [37] and elsewhere [38, 39].

An overview of the CERQual approach to assessing confidence in the evidence

We have defined confidence in the evidence as an assessment of the extent to which a review finding is a reasonable representation of the phenomenon of interest. This assessment communicates the extent to which the review finding is likely to be substantially different from the phenomenon of interest. By ‘substantially different’, we mean different enough that it might change how the finding influences a decision about health, social care, or other interventions [16]. For instance, if a review finding suggests that a new social care intervention is very acceptable to most service users and we have high confidence in this finding (indicating that it is highly likely that the finding is a reasonable representation of acceptability to service users), decision-makers may assess that it is appropriate to recommend that the intervention be implemented, assuming that the desirable effects outweigh the undesirable effects for other decision criteria. However, if we have very low confidence in this finding and it is therefore unclear whether the intervention is acceptable to most service users, decision-makers may assess that it is not appropriate to recommend its implementation.

CERQual involves an assessment of each individual review finding in terms of four components: (1) methodological limitations, (2) coherence, (3) adequacy of data, and (4) relevance (Table 2). The assessments of the four components collectively contribute to an overall assessment of whether findings from a qualitative evidence synthesis provide a reasonable representation of the health or social care issue, intervention, or programme (phenomenon) of interest (Fig. 1). Our approach is comparable to that of GRADE for effectiveness evidence, where review authors assess the confidence, or certainty, in the estimates of effect for each critical and important outcome by evaluating risk of bias, inconsistency, indirectness, imprecision, and publication bias [17].
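To illustrate the structure of a CERQual assessment, the sketch below shows one way a review team might record the four component judgements and the resulting overall level of confidence for a single review finding. This is a minimal illustration only: CERQual does not prescribe any software or scoring algorithm, the overall level remains a judgement made by the review authors (see paper 2 [25]), and all names and example values here are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum


class Confidence(Enum):
    """The four levels used to report confidence in a review finding."""
    HIGH = "high"
    MODERATE = "moderate"
    LOW = "low"
    VERY_LOW = "very low"


@dataclass
class ComponentAssessment:
    """Judgement for one CERQual component, with the reasons for that judgement."""
    concerns: str     # e.g. "no or very minor", "minor", "moderate", "serious"
    explanation: str  # why the review team reached this judgement


@dataclass
class ReviewFindingAssessment:
    """A review finding, its four component judgements, and the overall confidence.

    The overall confidence is assigned by the review authors after weighing all
    four components together; it is not calculated mechanically from the fields.
    """
    finding: str
    methodological_limitations: ComponentAssessment
    coherence: ComponentAssessment
    adequacy_of_data: ComponentAssessment
    relevance: ComponentAssessment
    overall_confidence: Confidence
    overall_explanation: str


# Hypothetical example of how one finding might be recorded
example = ReviewFindingAssessment(
    finding="Service users valued continuity of carer during antenatal visits.",
    methodological_limitations=ComponentAssessment(
        "minor", "Sampling was poorly described in several studies."),
    coherence=ComponentAssessment(
        "no or very minor", "Data from all contributing studies supported the finding."),
    adequacy_of_data=ComponentAssessment(
        "moderate", "Relatively thin data from three small studies."),
    relevance=ComponentAssessment(
        "minor", "Most studies were conducted in high-income settings."),
    overall_confidence=Confidence.MODERATE,
    overall_explanation="Moderate concerns about adequacy of data; minor concerns "
                        "about the other components.",
)
```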

Table 2 Definitions of the components of the CERQual approach
Fig. 1 Overview of the GRADE-CERQual series of papers

When using CERQual, we assess confidence in each individual review finding. We acknowledge that review findings can be presented in a range of ways (e.g., as themes or theories) as well as at different levels (e.g., descriptive/aggregative and interpretive). We report confidence in each review finding as high, moderate, low, or very low (Table 3).

Table 3 Descriptions of level of confidence in a review finding in the CERQual approach [16]

A key product of a CERQual assessment, as with other GRADE approaches, is a succinct, transparent, and informative Summary of Qualitative Findings (SoQF) table. The SoQF table facilitates the use of review findings in decision-making processes and is purpose-designed to communicate to users both the overall confidence assessment for each review finding and an explanation of this assessment. In paper 2 in this series, we describe how synthesis authors proceed from a full review finding to a summary of a review finding, for inclusion in a SoQF table [25]. The SoQF table is complemented by a CERQual Evidence Profile which includes the explicit judgements for each CERQual component that contributes to the overall confidence assessment for each review finding [25]. Additional file 4 outlines minimum criteria that need to be met for review authors to assert fidelity to the GRADE-CERQual approach. We noted earlier that the development of CERQual has been informed by the principles and methods underlying both primary qualitative research and qualitative evidence synthesis. Those applying CERQual should, in our experience, have a good understanding of both qualitative primary research methods and qualitative evidence synthesis methods to apply the approach appropriately.
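As a rough sketch of what one row of such a table could contain, the snippet below prints a plain-text row with a summarised finding, the overall confidence level, and the explanation of the assessment. The three columns shown are an assumption for illustration; the actual SoQF table and Evidence Profile formats are described in paper 2 [25].

```python
def soqf_row(summary: str, confidence: str, explanation: str, width: int = 45) -> str:
    """Format one illustrative Summary of Qualitative Findings row as plain text."""
    return f"{summary[:width]:<{width}} | {confidence:<10} | {explanation}"


header = soqf_row("Summarised review finding", "Confidence", "Explanation of confidence assessment")
print(header)
print("-" * len(header))
print(soqf_row(
    "Service users valued continuity of carer.",
    "Moderate",
    "Moderate concerns about adequacy of data; minor concerns about the other components.",
))
```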

As noted in relation to GRADE for evidence of effectiveness, confidence in the evidence exists on a continuum. One limitation of a CERQual assessment is the discrete categorisation of confidence into levels (high, moderate, etc.), which inevitably involves a degree of arbitrariness. However, we would argue that the accessibility and transparency of this approach outweigh its limitations [40].

Applying CERQual across types of qualitative data and synthesis methods

An ever-expanding array of qualitative synthesis methods is available [41, 42]. Our aspiration is that CERQual could be applied to findings from syntheses based on any type of qualitative data that use a variety of synthesis methods and that address a range of questions. Within the domains of health and social care, this includes questions such as people’s views or experiences of a health or social care issue, how different stakeholders and population groups value different health or social care outcomes, stakeholders’ views on the acceptability and feasibility of health or social care interventions or options and on how an intervention might work, and factors affecting the implementation of an intervention or option. So far, experience in using CERQual has been concentrated in reviews with findings that are aggregative in nature, for example, related to health care users’ and providers’ experiences and understanding of health issues and health service delivery [6, 7, 28, 29, 31]. We have yet to gather experience of using CERQual across the full range of synthesis methods and types of review findings. This is an important area for future research. For example, we need to explore how decision-makers use review findings at different levels of abstraction: are findings that carry a high level of abstraction as immediately useful to decision-makers as those grounded within contextual parameters such as time, place, and culture? We also need to explore the usefulness for decision-making of findings from syntheses that use different synthesis methods.

Purpose and structure of this series of papers

This series of papers aims to provide guidance to users on how to apply the GRADE-CERQual approach. The series takes the reader through all the stages involved in making an assessment of confidence in findings from qualitative evidence syntheses, including how we have conceptualised each component of CERQual and how these components relate to other concepts in the fields of primary qualitative research and qualitative evidence synthesis. Figure 1 provides an overview of the series and the CERQual approach, while Fig. 2 provides a guide for navigating through the series, including an overview of the purpose of each paper and the relevance of each paper to review authors, methodologists, and people using CERQual assessments.

Fig. 2

How the papers in the GRADE-CERQual series can be used

The second paper in the series discusses how to move from a full description of a review finding to a summary of a review finding—an important step in the process of applying CERQual; how to determine to which review findings to apply CERQual; how to make an overall CERQual assessment; and how to create a Summary of Qualitative Findings table [25]. The next four papers in the series describe each of the four CERQual components in depth, including how this component is conceptualised and how it should be operationalised as part of a CERQual assessment [43,44,45,46]. The final paper in the series discusses dissemination bias in qualitative research and its potential implications for qualitative evidence synthesis and CERQual assessments [37].

This series is the first to discuss in detail how to apply the GRADE-CERQual approach. The series has been developed primarily for those undertaking qualitative evidence syntheses or those supporting the use of the findings of such syntheses in decision-making processes. The series is also relevant to guideline development and health technology assessment agencies, decision-makers, and qualitative researchers. It will also be useful for those seeking to understand recommendations and other decisions to which qualitative evidence with CERQual assessments has contributed. We will provide further guidance on using CERQual within decision-making processes, including guideline development, in additional papers planned within our publications strategy.

Different readers will use this series in different ways (Fig. 2). Those conducting qualitative evidence syntheses may choose to read the series in its entirety to help them to apply the approach. Those supporting the use of qualitative evidence in decision-making may use the series as a reference guide to better understand how CERQual assessments are undertaken and how they are to be interpreted. Qualitative researchers may use the articles to understand the diverse information to be reported in primary qualitative studies, so as to facilitate the application of CERQual to future syntheses. Implementation researchers can use these articles in undertaking qualitative evidence syntheses related to implementation and in using the findings of these syntheses.

In writing this series, we have tried to draw on examples from published or ongoing syntheses addressing a disparate range of questions and contexts in order to show how CERQual should be applied. We also highlight key methodological issues to be considered at each stage or that arise from using CERQual. However, because CERQual is a relatively new approach, the pool of worked examples is not yet extensive and is drawn largely from the areas of health and social care. We believe, though, that CERQual can and should be applied to findings from qualitative evidence syntheses across all fields, including agriculture, crime and justice, education, the environment, and international development. We encourage readers to share with us their applied examples from these domains.

Conclusions: the widening influence of qualitative evidence

The increasing use of qualitative evidence within a range of decision-making processes, and the growing awareness of the roles that qualitative evidence can play in decision-making [47], suggest that we are entering a new era for qualitative research. Perhaps the most important function that qualitative evidence can play in decision-making, including in the development of guidelines and health technology assessments, is to represent the views and experiences of a wide range of stakeholders. Engaging key stakeholders, such as the public and health care consumers, providers, and managers, in decision-making is widely promoted and recognised as a key to encouraging participative democracy and public accountability [13, 48,49,50,51,52,53,54]. However, to date, stakeholder engagement in decision-making in many settings has largely been accomplished through dialogues and consultations with these stakeholders [55] and through including stakeholder representatives in decision-making forums, such as guideline panels [56]. While important and valuable, such engagement is limited by the knowledge and experience that individuals can bring to such dialogues and forums. For decisions that may impact on very large numbers of people, individual stakeholder representatives alone cannot be expected to effectively represent the views of all affected groups, and consultations seldom reach all sectors of society. By drawing on the global qualitative research literature, qualitative evidence syntheses have the potential to greatly widen the range of views and experiences represented in decision-making, thereby helping to ensure that the choices made take these views into account. This may also contribute to increased transparency and accountability in public decision-making [57, 58]. CERQual plays a key role in this process by providing decision-makers with assessments of how much confidence they can place in such evidence.

Many decision-makers acknowledge the need to widen the range of evidence that they examine, to address questions such as the acceptability of interventions and programmes as well as other factors that might impact on their implementation. Many are aware that qualitative research provides a valuable pool of knowledge from which they can draw. We believe that the CERQual approach will help to increase the usability of findings from qualitative evidence syntheses, including use by those who are implementing interventions across fields such as education, health, social care and justice. CERQual will evolve as we gain more experience in applying the approach across diverse review findings derived from different synthesis approaches. Table 4 identifies several important areas for further methodological research, including how to apply CERQual in syntheses that include qualitative and quantitative data; how best to present CERQual assessments together with other kinds of evidence; ways of applying CERQual to syntheses of sources that have not used formal qualitative research procedures; and whether CERQual requires adaptation for application to more interpretive synthesis outputs, such as logic models. We hope that those using the approach will help us to develop and improve what is presented in this series. We encourage readers to join the CERQual Project Group and to engage with our website (www.cerqual.org), on which new developments will continue to be flagged.

Table 4 Way forward and research agenda for CERQual

Open peer review

Peer review reports for this article are available in Additional file 5.

Abbreviations

CERQual: Confidence in the Evidence from Reviews of Qualitative research

DECIDE: Developing and Evaluating Communication Strategies to Support Informed Decisions and Practice Based on Evidence (an EU-funded research project)

GRADE: Grading of Recommendations Assessment, Development, and Evaluation

SoQF: Summary of Qualitative Findings

SURE: Supporting the use of research evidence for policy in African health systems (an EU-funded research project)

References

  1. Oxman AD, Lavis JN, Lewin S, Fretheim A. SUPPORT Tools for evidence-informed health Policymaking (STP) 1: what is evidence-informed policymaking? Health Res Policy Syst. 2009;7(Suppl 1):S1.


  2. Nabyonga-Orem J, Mijumbi R. Evidence for informing health policy development in low-income countries (LICs): perspectives of policy actors in Uganda. Int J Health Policy Manag. 2015;4(5):285–93.


  3. Rashidian A, Eccles MP, Russell I. Falling on stony ground? A qualitative study of implementation of clinical guidelines’ prescribing recommendations in primary care. Health Policy. 2008;85(2):148–61.


  4. Treweek S, Oxman AD, Alderson P, Bossuyt PM, Brandt L, Brozek J, Davoli M, Flottorp S, Harbour R, Hill S, et al. Developing and Evaluating Communication Strategies to Support Informed Decisions and Practice Based on Evidence (DECIDE): protocol and preliminary results. Implement Sci. 2013;8:6.


  5. Davies P. Evidence-based government: how can we make it happen? CHSRF (Canadian Health Services Research Foundation) 7th annual invitational workshop - leveraging knowledge: tools and strategies for action. Montreal, Canada; 2005.

  6. Colvin CJ, de Heer J, Winterton L, Mellenkamp M, Glenton C, Noyes J, Lewin S, Rashidian A. A systematic review of qualitative evidence on barriers and facilitators to the implementation of task-shifting in midwifery services. Midwifery. 2013;29(10):1211–21.


  7. Glenton C, Colvin CJ, Carlsen B, Swartz A, Lewin S, Noyes J, Rashidian A. Barriers and facilitators to the implementation of lay health worker programmes to improve access to maternal and child health: qualitative evidence synthesis. Cochrane Database Syst Rev. 2013;10:CD010414.


  8. Rashidian A, Shakibazadeh E, Karimi-Shahanjarini A, Glenton C, Noyes J, Lewin S, Colvin C, Laurant M. Barriers and facilitators to the implementation of doctor-nurse substitution strategies in primary care: qualitative evidence synthesis (protocol). Cochrane Database Syst Rev. 2013;2:CD010412.


  9. Leach-Kemon K, Chou DP, Schneider MT, Tardif A, Dieleman JL, Brooks BP, Hanlon M, Murray CJ. The global financial crisis has led to a slowdown in growth of funding to improve health in many developing countries. Health Aff (Millwood). 2012;31(1):228–35.


  10. Gough D, Oliver S, Thomas J. An introduction to systematic reviews. Thousand Oaks: Sage Publications; 2012.


  11. Noyes J, Popay J, Pearson A, Hannes K, Booth A, Cochrane Qualitative Research Methods Group. Chapter 20: Qualitative research and Cochrane reviews. In: Higgins JPT, Green S, editors. Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 [updated March 2011]. The Cochrane Collaboration; 2011. Available from: http://handbook-5-1.cochrane.org/

  12. Saini M, Shlonsky A. Systematic synthesis of qualitative research. Oxford, UK: Oxford University Press; 2012.


  13. Ansari S, Rashidian A. Guidelines for guidelines: are they up to the task? A comparative assessment of clinical practice guideline development handbooks. PLoS One. 2012;7(11):e49864.


  14. The SURE Collaboration. SURE Guides for Preparing and Using Evidence-Based Policy Briefs. Version 2.1 [updated November 2011]. The SURE Collaboration; 2011. Available from: http://www.who.int/evidence/sure/guides/en/

  15. Lewin S, Bosch-Capblanch X, Oliver S, Akl EA, Vist GE, Lavis J, Ghersi D, Røttingen J, Steinmann P, Gulmezoglu M, et al. Guidance for evidence-informed policies about health systems: assessing how much confidence to place in the research evidence. PLoS Med. 2012;9(3):e1001187.


  16. Lewin S, Glenton C, Munthe-Kaas H, Carlsen B, Colvin CJ, Gulmezoglu M, Noyes J, Booth A, Garside R, Rashidian A. Using qualitative evidence in decision making for health and social interventions: an approach to assess confidence in findings from qualitative evidence syntheses (GRADE-CERQual). PLoS Med. 2015;12(10):e1001895.


  17. Guyatt G, Oxman AD, Akl EA, Kunz R, Vist G, Brozek J, Norris S, Falck-Ytter Y, Glasziou P, DeBeer H, et al. GRADE guidelines: 1. Introduction-GRADE evidence profiles and summary of findings tables. J Clin Epidemiol. 2011;64(4):383–94.


  18. Hsu J, Brozek JL, Terracciano L, Kreis J, Compalati E, Stein AT, Fiocchi A, Schunemann HJ. Application of GRADE: making evidence-based recommendations about diagnostic tests in clinical practice guidelines. Implement Sci. 2011;6:62.


  19. Sandelowski M, Docherty S, Emden C. Focus on qualitative methods qualitative metasynthesis: issues and techniques. Res Nurs Health. 1997;20:365–72.


  20. Thorne S. Metasynthetic madness: what kind of monster have we created? Qual Health Res. 2017;27(1):3–13.


  21. Thorne S, Jensen L, Kearney M, Noblit G, Sandelowski M. Reflections on the methodological and ideological agenda in qualitative meta-synthesis. Qual Health Res. 2004;14:1342–65.


  22. Hammersley M. What’s wrong with ethnography?—methodological explorations. London: Routledge; 1992.


  23. Barth F. Analytical dimensions in the comparison of social organizations. Am Anthropol. 1972;74(1-2):207–20.


  24. Pope C, Mays N, Popay J. Synthesising qualitative and quantitative health evidence: a guide to methods. United Kingdom: McGraw-Hill Education; 2007.


  25. Lewin S, Bohren M, Rashidian A, Glenton C, Munthe-Kaas HM, Carlsen B, Colvin CJ, Tuncalp Ö, Noyes J, Booth A et al. Applying GRADE-CERQual to qualitative evidence synthesis findings—paper 2: how to make an overall CERQual assessment of confidence and create a Summary of Qualitative Findings table. Implement Sci. 2018;13(Suppl 1). https://doi.org/10.1186/s13012-017-0689-2.

  26. Ames HMR, Glenton C, Lewin S. Parents’ and informal caregivers’ views and experiences of communication about routine childhood vaccination: a synthesis of qualitative evidence. Cochrane Database Syst Rev. 2017;2:CD011787.


  27. Aslam RW, Hendry M, Carter B, Noyes J, Rycroft-Malone J, Booth A, Pasterfield D, Charles JM, Craine N, Tudor Edwards R, et al. Interventions for preventing unintended repeat pregnancies among adolescents (protocol). Cochrane Database Syst Rev. 2015;1:CD011477.

  28. Bohren MA, Hunter EC, Munthe-Kaas HM, Souza JP, Vogel JP, Gulmezoglu AM. Facilitators and barriers to facility-based delivery in low- and middle-income countries: a qualitative evidence synthesis. Reprod Health. 2014;11(1):71.


  29. Bohren MA, Vogel JP, Hunter EC, Lutsiv O, Makh SK, Souza JP, Aguiar C, Saraiva Coneglian F, Diniz AL, Tuncalp O, et al. The mistreatment of women during childbirth in health facilities globally: a mixed-methods systematic review. PLoS Med. 2015;12(6):e1001847. discussion e1001847


  30. Munabi-Babigumira S, Glenton C, Lewin S, Fretheim A, Nabudere H. Factors that influence the provision of intrapartum and postnatal care by skilled birth attendants in low- and middle-income countries: a qualitative evidence synthesis. Cochrane Database Syst Rev. 2017;(11):CD011558.

  31. Munthe-Kaas HM, Hammerstrøm KT, et al. Effekt av og erfaringer med kontinuitetsfremmende tiltak i barnevernsinstitusjoner [Effect of and experiences with interventions to promote continuity in child welfare institutions]. Oslo: Norwegian Knowledge Centre for the Health Services; 2013.


  32. O'Brien TD, Noyes J, Spencer LH, Kubis HP, Hastings RP, Edwards RT, Bray N, Whitaker R. Keep fit exercise interventions to improve health, fitness and well-being of children and young people who use wheelchairs: mixed-method systematic review protocol. J Adv Nurs. 2014;70(12):2942–51.


  33. Whitaker R, Hendry M, Booth A, Carter B, Charles J, Craine N, Edwards RT, Lyons M, Noyes J, Pasterfield D, et al. Intervention Now To Eliminate Repeat Unintended Pregnancy in Teenagers (INTERUPT): a systematic review of intervention effectiveness and cost-effectiveness, qualitative and realist synthesis of implementation factors and user engagement. BMJ Open. 2014;4(4):e004733.


  34. Downe S, Finlayson K, Tuncalp Ӧ, Metin Gulmezoglu A. What matters to women: a systematic scoping review to identify the processes and outcomes of antenatal care provision that are important to healthy pregnant women. BJOG. 2016;123(4):529–39.


  35. Odendaal WA, Goudge J, Griffiths F, Tomlinson M, Leon N, Daniels K. Healthcare workers’ perceptions and experiences of using mHealth technologies to deliver primary healthcare services: qualitative evidence synthesis (protocol). Cochrane Database Syst Rev. 2015;11:CD011942.

  36. Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med. 2010;7(2):e1000217.


  37. Booth A, Lewin S, Glenton C, Munthe-Kaas HM, Meerpohl J, Rees R, Noyes J, Rashidian A, Berg R, Nyakango B, et al. Applying GRADE-CERQual to qualitative evidence synthesis findings—paper 7: understanding the potential impacts of dissemination bias. Implement Sci. 2018;13(Suppl 1). https://doi.org/10.1186/s13012-017-0694-5.

  38. Toews I, Booth A, Berg RC, Lewin S, Glenton C, Munthe-Kaas HM, Noyes J, Schroter S, Meerpohl JJ. Further exploration of dissemination bias in qualitative research required to facilitate assessment within qualitative evidence syntheses. J Clin Epidemiol. 2017;88:133–39.

  39. Toews I, Glenton C, Lewin S, Berg RC, Noyes J, Booth A, Marusic A, Malicki M, Munte-Kaas HM, Meerpohl JJ. Extent, awareness and perception of dissemination bias in qualitative research: an explorative survey. PLoS One. 2016;11(8):e0159290.


  40. Guyatt GH, Oxman AD, Vist GE, Kunz R, Falck-Ytter Y, Alonso-Coello P, Schunemann HJ, Group GW. GRADE: an emerging consensus on rating quality of evidence and strength of recommendations. BMJ. 2008;336(7650):924–6.


  41. Barnett-Page E, Thomas J. Methods for the synthesis of qualitative research: a critical review. BMC Med Res Methodol. 2009;9:59.


  42. Booth A, Noyes J, Flemming K, Gerhardus A, Wahlster P, Van Der Wilt GJ, Mozygemba K, Refolo P, Sacchini D, Tummers M, et al. Guidance on choosing qualitative evidence synthesis methods for use in health technology assessments of complex interventions [online]. 2016. Available from: http://www.integrate-hta.eu/downloads/

  43. Colvin CJ, Garside R, Wainwright M, Lewin S, Bohren M, Glenton C, Munthe-Kaas HM, Carlsen B, Tuncalp Ö, Noyes J, et al. Applying GRADE-CERQual to qualitative evidence synthesis findings—paper 4: how to assess coherence. Implement Sci. 2018;13(Suppl 1). https://doi.org/10.1186/s13012-017-0691-8.

  44. Glenton C, Carlsen B, Lewin S, Munthe-Kaas HM, Colvin CJ, Tuncalp Ö, Bohren M, Noyes J, Booth A, Garside R, et al. Applying GRADE-CERQual to qualitative evidence synthesis findings—paper 5: how to assess adequacy of data. Implement Sci. 2018;13(Suppl 1). https://doi.org/10.1186/s13012-017-0692-7.

  45. Munthe-Kaas HM, Bohren M, Carlsen B, Glenton C, Lewin S, Colvin CJ, Tuncalp Ö, Noyes J, Booth A, Garside R, et al. Applying GRADE-CERQual to qualitative evidence synthesis findings—paper 3: how to assess methodological limitations. Implement Sci. 2018;13(Suppl 1). https://doi.org/10.1186/s13012-017-0690-9.

  46. Noyes J, Booth A, Lewin S, Carlsen B, Glenton C, munthe-Kaas HM, Colvin CJ, Garside R, Bohren M, Rashidian A, et al. Applying GRADE-CERQual to qualitative evidence synthesis findings—paper 6: how to assess relevance of the data. Implement Sci. 2018;13(Suppl 1). https://doi.org/10.1186/s13012-017-0693-6.

  47. Glenton C, Lewin S, Norris SL. Using evidence from qualitative research to develop WHO guidelines (Chapter 15). In: World Health Organization, editor. Handbook for Guideline Development. 2nd ed. Geneva: WHO; 2016.


  48. Oxman AD, Lewin S, Lavis JN, Fretheim A. SUPPORT Tools for evidence-informed health Policymaking (STP) 15: engaging the public in evidence-informed policymaking. Health Res Policy Syst. 2009;7(Suppl 1):S15.


  49. Abelson J, Blacksher EA, Li KK, Boesveld SE, Goold SD. Public deliberation in health policy and bioethics: mapping an emerging, interdisciplinary field. J Public Deliberation. 2013;9(1)

  50. Davies C, Wetherell M, Barnett E. Citizens at the centre: deliberative participation in healthcare decisions. Bristol: Policy Press; 2006.


  51. Street J, Duszynski K, Krawczyk S, Braunack-Mayer A. The use of citizens’ juries in health policy decision-making: a systematic review. Soc Sci Med. 2014;109:1–9.


  52. McCoy DC, Hall JA, Ridge M. A systematic review of the literature for evidence on health facility committees in low- and middle-income countries. Health Policy Plan. 2012;27(6):449–66.


  53. Molyneux S, Atela M, Angwenyi V, Goodman C. Community accountability at peripheral health facilities: a review of the empirical literature and development of a conceptual framework. Health Policy Plan. 2012;27(7):541–54.


  54. NICE. Community engagement: improving health and wellbeing and reducing health inequalities. NICE Guideline. London, UK: National Institute for Health and Care Excellence (NICE); 2016.


  55. Lavis JN, Boyko JA, Oxman AD, Lewin S, Fretheim A. SUPPORT Tools for evidence-informed health Policymaking (STP) 14: organising and using policy dialogues to support evidence-informed policymaking. Health Res Policy Syst. 2009;7(Suppl 1):S14.


  56. WHO. Handbook for guideline development (2nd edition). Geneva: World Health Organization; 2016.


  57. Daniels N. Accountability for reasonableness. BMJ. 2000;321(7272):1300–1.


  58. Daniels N, Sabin JE. Accountability for reasonableness: an update. BMJ. 2008;337:a1850.



Acknowledgements

Our thanks for their feedback to those who participated in the GRADE-CERQual Project group meetings in January 2014 or June 2015 or gave comments to the paper: Elie Akl, Heather Ames, Zhenggang Bai, Rigmor Berg, Jackie Chandler, Karen Daniels, Hans de Beer, Kenny Finlayson, Signe Flottorp, Bela Ganatra, Manasee Mishra, Susan Munabi-Babigumira, Andy Oxman, Tomas Pantoja, Hector Pardo-Hernandez, Vicky Pileggi, Kent Ranson, Rebecca Rees, Holger Schünemann, Anna Selva, Elham Shakibazadeh, Birte Snilstveit, James Thomas, Hilary Thompson, Judith Thornton, Joseph D. Tucker, and Josh Vogel. Thanks also to Sarah Rosenbaum for developing the figures used in this series of papers and to the members of the GRADE Working Group for their feedback on this series. The guidance in this paper has been developed in collaboration and agreement with the GRADE Working Group (www.gradeworkinggroup.org).

Funding

This work, including the publication charge for this article, was supported by funding from the Alliance for Health Policy and Systems Research, WHO (http://www.who.int/alliance-hpsr/en/). Additional funding was provided by the Department of Reproductive Health and Research, WHO (www.who.int/reproductivehealth/about_us/en/); Norad (Norwegian Agency for Development Cooperation: www.norad.no); the Research Council of Norway (www.forskningsradet.no); and the Cochrane Methods Innovation Fund. SL is supported by funding from the South African Medical Research Council (www.mrc.ac.za). The funders had no role in the study design, data collection and analysis, preparation of the manuscript, or the decision to publish.

Availability of data and materials

Additional materials are available on the GRADE-CERQual website (www.cerqual.org).

To join the CERQual Project Group and our mailing list, please visit our website: http://www.cerqual.org/contact/. Developments in CERQual are also made available via our Twitter feed: @CERQualNet.

About this supplement

This article has been published as part of Implementation Science Volume 13 Supplement 1, 2018: Applying GRADE-CERQual to Qualitative Evidence Synthesis Findings. The full contents of the supplement are available online at https://implementationscience.biomedcentral.com/articles/supplements/volume-13-supplement-1.

Author information

Authors and Affiliations

Authors

Contributions

All authors participated in the design of the CERQual approach: SL, AB, CG, HM-K, AR, MW, MB, ÖT, CJC, RG, BC, SF, and JN led the conceptualisation of the CERQual approach while EL provided input into that process. SL and AB wrote the first draft of the manuscript. All authors contributed to the writing of the manuscript. All authors have read and approved the manuscript.

Corresponding author

Correspondence to Simon Lewin.

Ethics declarations

Ethics approval and consent to participate

Not applicable. This study did not involve any formal data collection from humans or animals.

Consent for publication

Not applicable

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

The purpose of CERQual and what CERQual is not intended to address. (PDF 621 kb)

Additional file 2:

Methods used to develop the CERQual approach—2010 to 2015. (PDF 647 kb)

Additional file 3:

Questions included in the CERQual online feedback form and short individual discussions. (PDF 468 kb)

Additional file 4:

Minimum criteria for fidelity to the GRADE-CERQual approach in a qualitative evidence synthesis. (PDF 353 kb)

Additional file 5:

Open peer review reports. (PDF 142 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Lewin, S., Booth, A., Glenton, C. et al. Applying GRADE-CERQual to qualitative evidence synthesis findings: introduction to the series. Implementation Sci 13 (Suppl 1), 2 (2018). https://doi.org/10.1186/s13012-017-0688-3

