Methods for designing interventions to change healthcare professionals’ behaviour: a systematic review
Implementation Science volume 12, Article number: 30 (2017)
Abstract
Background
Systematic reviews consistently indicate that interventions to change healthcare professional (HCP) behaviour are haphazardly designed and poorly specified. Clarity about methods for designing and specifying interventions is needed. The objective of this review was to identify published methods for designing interventions to change HCP behaviour.
Methods
A search of MEDLINE, Embase, and PsycINFO was conducted from 1996 to April 2015. Using inclusion/exclusion criteria, a broad screen of abstracts by one rater was followed by a strict screen of full text for all potentially relevant papers by three raters. An inductive approach was first applied to the included studies to identify commonalities and differences between the descriptions of methods across the papers. Based on this process and knowledge of related literatures, we developed a data extraction framework that included, for example, the level of change (e.g. individual versus organization), the context of development, a brief description of the method, and the tasks included in the method (e.g. barrier identification, component selection, use of theory).
Results
A total of 3966 titles and abstracts and 64 full-text papers were screened to yield 15 papers included in the review, each outlining one design method. All of the papers reported methods developed within a specific context. Thirteen papers included barrier identification and 13 included linking barriers to intervention components, although not the same 13 papers. Thirteen papers targeted individual HCPs, with only one paper targeting change across individual, organization, and system levels. The use of theory and user engagement were each included in 13 of the 15 papers.
Conclusions
There is agreement across methods on four tasks that need to be completed when designing individual-level interventions: identifying barriers, selecting intervention components, using theory, and engaging end-users. Methods also include further tasks beyond these four. Examples of methods for designing organisation- and system-level interventions were limited. Further analysis of design tasks could facilitate the development of detailed guidelines for designing interventions.
Background
Our project sought to advance the methods for translating research knowledge into practice. Knowledge translation (KT) is ‘a dynamic and iterative process that includes the synthesis, dissemination, exchange and ethically sound application of knowledge to improve health, provide more effective health services and products and strengthen the healthcare system’ [1]. One of the critical aspects of KT is that it requires healthcare professionals (HCPs) to change practice [2].
HCPs’ practice can be influenced by a wide range of factors; for example, a recent review identified 57 clusters of factors [3]. Interventions to change practice range from those targeted at HCPs (e.g. educational materials, audit and feedback) to those targeted at consumers and policy-makers. The evidence base for many of these interventions remains incomplete [4], and there is an on-going need to design more effective interventions.
Systematic reviews of KT interventions to change HCPs’ practice consistently indicate that interventions are haphazardly designed and poorly specified, limiting replication, understanding, and generalizability [5, 6]. Limitations in intervention design also impede evaluations of interventions [7, 8]. One issue contributing to these shortcomings is a lack of agreed, practical ‘how to’ guidance for designing KT interventions.
Recommendations have been made to ensure that intervention design includes an assessment and prioritization of barriers, identification of potential adopters and practice environments, and consideration of both the potential effectiveness and feasibility of the chosen strategies [2], but these recommendations do not necessarily provide an approach to the design or development of the intervention [9]. Various potential tools (e.g. Behaviour Change Technique (BCT) taxonomy [10]) and sources describing a range of methods for mapping barriers and facilitators to KT interventions exist [11], but sources describing a range of complete methods for intervention design are few.
The aim of the present study was to contribute to the design of such a resource by synthesising literature about methods for designing KT interventions. Our specific objective was to systematically identify published methods for designing interventions to change HCPs’ behaviour.
Methods
A systematic review was undertaken. We did not publish a protocol. The initial literature search included MEDLINE, Embase, and PsycINFO from 1996 to April 2013. An identical search was conducted on April 20, 2015 to identify papers published since the initial search. A sensitive search strategy was designed in consultation with an information science specialist (CF) using both subject headings and text terms and comprised a combination of three facets: professional behaviour change; theory, framework or technique; and interventions. The search strategies used are detailed in Additional file 1. Reference lists of included papers were screened for additional papers, as were articles known to the review team. Our search started in 1996 as this was consistent with the introduction of the evidence-based medicine movement [12] and an associated increase in evidence-to-practice publications [13].
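To make the three-facet structure concrete, the sketch below assembles a Boolean query by OR-ing terms within each facet and AND-ing the facets together. The terms shown are hypothetical placeholders for illustration only; they are not the strategy actually used, which is provided in Additional file 1.

```python
# Illustrative sketch of a three-facet search structure.
# Terms are hypothetical placeholders, not the strategy in Additional file 1.
facets = {
    "professional behaviour change": ["professional practice", "clinician behavio*r change"],
    "theory, framework or technique": ["theor*", "framework*", "taxonom*"],
    "interventions": ["intervention*", "implementation strateg*"],
}

# Terms within a facet are combined with OR; the facets are combined with AND.
query = " AND ".join(
    "(" + " OR ".join(terms) + ")" for terms in facets.values()
)
print(query)
```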
Papers were included if two criteria were met: (1) the paper described a method (process, tasks, approach) for designing interventions to change HCPs’ behaviour or practice, and (2) the primary focus of the paper was on the intervention design process (as opposed to, e.g. on intervention evaluation). We defined interventions as: ‘a method or technique designed to enhance adoption, implementation and sustainability of a clinical/therapeutic program or practice, a specific clinical/therapeutic practice or delivery system/organizational arrangement being tested or implemented to improve healthcare outcomes’ [14]. A HCP was defined as any member of the healthcare team providing care, and their behaviour was defined as objectively observable actions (as opposed to, e.g. their knowledge or reasoning).
Protocol papers were included if the primary aim of the protocol was to describe intervention design methods or process. Papers were excluded if they pertained to HCPs’ behaviours not related to their clinical practice (e.g. HCPs’ eating healthily, exercising). While papers that report the implementation and evaluation of interventions may include descriptions of how the intervention was designed, this is rarely in a detailed and replicable manner [15]. As our interest was in providing a resource to guide researchers in the process of intervention design, we excluded papers that lacked enough detail for replication. These decisions were made based on the consensus of the three reviewers (HLC, JES, NK). Due to resource limitations, we also excluded articles that were not in English.
Titles and abstracts were each screened by a single rater (the screening was divided among HLC, JES, and NK); full papers were then reviewed independently by three raters (HLC, JES, NK). An interrater reliability analysis using the Kappa statistic was performed to determine consistency among raters for the full-text review.
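The sketch below illustrates the statistic used for this reliability analysis: Cohen’s kappa for a pair of raters, i.e. observed agreement corrected for chance agreement. The decision vectors are hypothetical and are not the review data; under the Landis and Koch [16] thresholds, values between 0.41 and 0.60 are read as moderate agreement.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e) for two raters rating the same items."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n           # observed agreement
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    p_e = sum((counts_a[l] / n) * (counts_b[l] / n) for l in labels)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical full-text include (1) / exclude (0) decisions for two raters.
rater_1 = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
rater_2 = [1, 0, 1, 1, 0, 0, 0, 0, 0, 0]
print(round(cohens_kappa(rater_1, rater_2), 2))  # ~0.52 for these example data
```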
For all included full texts, general descriptive information (authors, year, journal, name of method if so named) was extracted and tabulated. To extract and analyse data about the methods, a two-stage process was carried out. Stage 1 involved generating a framework for data extraction and analysis. Three reviewers (HLC, JES, NK) progressed in iterative cycles of reading and discussing the included papers to identify similarities and differences between them and used these discussions to develop a list of items to be extracted. The iterative cycles continued until agreement between the three authors was reached. In part, this process was necessary to improve our understanding of the tasks that constitute intervention design and to allow us to extract data beyond a brief description of each method. The resulting descriptive variables to be extracted were those believed to be the most critical and included a brief description of the design tasks. In stage 2, data extraction was conducted independently by two individuals (HLC, JES), followed by consensus discussions for discrepancies.
Results
Prior to de-duplication, there were 4667 records (MEDLINE 1512, Embase 1567, PsycINFO 1588). Once duplicates were removed, we had 3966 citations to screen (Fig. 1). We excluded 3902 records based on the title and abstract screen, resulting in 64 articles assessed for eligibility with a full-text screen. Following full-text review and consensus discussion, 49 articles were excluded, leaving a total of 15 articles in the review. Reasons for exclusion at full-text review were: not about intervention design (n = 33), not about HCPs’ behaviour (n = 13), not enough detail for replication (n = 2), and not in English (n = 1). The mean Kappa statistic across all pairs was 0.43, indicating moderate agreement [16].
The stage 1 process of data extraction resulted in four categories for extraction (a schematic sketch of an extraction record follows the list):

1. The context in which the method was developed: either generic or specific (i.e. behaviour, providers, setting, clinical condition), described as fully as possible.
2. The level of change that the method was focused on (i.e. individual, organization, system, other).
3. Whether the method incorporated any other type of published approach, tool, or resource as a component of the design process (e.g. incorporated the Theoretical Domains Framework (TDF) [17] as part of the process).
4. A brief overall description of the tasks included in the method: whether the method included barrier identification, whether it included a process of component selection that linked barriers to intervention components, the use of theory at any stage of the design process, and whether users were engaged in intervention design (i.e. was input sought regarding feasibility or acceptability of the intervention from the potential targets for behaviour change).
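A minimal sketch of how a single paper might be recorded under these four categories is shown below; the class, field names, and example values are illustrative assumptions rather than the extraction form actually used in the review.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ExtractedMethod:
    """One included paper, described using the four extraction categories above."""
    citation: str
    development_context: str        # category 1: generic or specific context
    levels_of_change: List[str]     # category 2: individual, organization, system, other
    incorporated_tools: List[str]   # category 3: e.g. TDF, BCT taxonomy, Intervention Mapping
    task_description: str           # category 4: brief description of the design tasks
    barrier_identification: bool    # category 4: barriers identified?
    links_barriers_to_components: bool
    uses_theory: bool
    engages_users: bool

# Hypothetical example record; values are for illustration only.
example = ExtractedMethod(
    citation="Author et al. (year)",
    development_context="specific: prescribing in primary care",
    levels_of_change=["individual"],
    incorporated_tools=["TDF", "BCT taxonomy"],
    task_description="barrier interviews; mapping barriers to components; piloting",
    barrier_identification=True,
    links_barriers_to_components=True,
    uses_theory=True,
    engages_users=True,
)
```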
General descriptive information about the papers
Table 1 provides a summary of the 15 included papers, which were published between 2001 and 2014. The 15 papers appeared in five journals: eight in Implementation Science, three in BMC Health Services Research, two in Quality and Safety in Health Care, and one in each of two other journals. Three of the 15 papers reported a formal name or label for the method: the Analysis, Development, Design, Implementation, Evaluation (ADDIE) Method [18], the Quality Enhancement Research Initiative (QUERI) [14], and the Theoretical Domains Framework Implementation (TDFI) approach [19].
Contexts, target levels, and incorporating other processes/steps/tools/resources
All 15 included papers specified a context in which the method was initially developed but indicated that the method could be used outside of that particular context; indeed, this was the purpose of the papers. Examples ranged from broad contexts, such as quality improvement [20] or patient safety [18], to specific contexts, such as general practitioners’ behaviours for the treatment of low back pain [21] or occupational therapists’ caseload management [22]. Thirteen of the 15 papers proposed methods targeting individual HCPs; one of these [22] also proposed methods targeted at the team level, but not at the organization. Only one paper targeted change across individual, organization, and system levels [14]. The remaining paper focused on the feasibility of the intervention rather than on change at a specific level per se [23].
Eleven of the 15 papers incorporated other approaches, tools, or resources as a component of the intervention design process [14, 19–28]. Four of these [20, 23, 26, 27] incorporated Intervention Mapping [29], another three incorporated both the TDF and the BCT taxonomy [19, 21, 25], two incorporated the BCT taxonomy without the TDF [22, 28], and two incorporated the Medical Research Council (MRC) framework [22, 24]. Five of the 15 incorporated just one other published tool [20, 24, 26–28], and one incorporated three published tools [25].
Design tasks in the methods
All 15 identified papers set out a number of tasks required for the design process; the number of tasks ranged from two [14] to seven [28], with a median of five.
All but two [22, 23] of the papers included some form of barrier identification. One of these papers [22] reported an assumption that barriers had already been identified in previous work and provided a method for linking barriers to intervention components. The other paper [23] focused on adapting an intervention using stakeholder engagement and did not address barrier identification specifically. In the papers where barrier identification was covered, the methods included observations, interviews and/or focus groups [14, 18, 30–32], surveys [32], literature reviews [24], structured reflection by the researchers [32], job analysis and expert consensus [18], and undertaking a predictive study to identify factors influencing the behaviour [28]. Six papers used the structured interview processes outlined in the TDF [19, 21, 25] or in Intervention Mapping [20, 26, 27].
All but two papers [23, 24] included linking barriers to intervention components. As mentioned, one of these papers [23] focused on adapting an intervention using stakeholder engagement and did not address linking to intervention components specifically. The second paper [24] did conduct a barriers assessment but did not describe how barriers were linked to intervention components. Methods to link barriers to intervention components included mapping TDF-identified barriers to the BCT taxonomy [19, 21, 25], as well as using the structured approaches described in Intervention Mapping [20, 26, 27]. One paper used what it termed ‘development panels’, which involved staff, research experts, clinical experts, and local champions participating in a range of meetings and consultations [14].

All but two papers [24, 32] included the use of theory. Some papers (e.g. [21]) used broad theoretical frameworks by incorporating pre-existing approaches that had used theory in their development (e.g. the TDF [17]). Other papers used more specific, discrete theories chosen based on the context in which the specific intervention was being developed. Examples included using Social Cognitive Theory [33] to design a computer-delivered intervention to enhance the use of practice guidelines in general practices [31], and using theories of risk perception to improve physical activity in cardiovascular patients as part of the Intervention Mapping process [26].

All but two papers [18, 30] included some approach to gathering input on the design of the intervention from the users or individuals who were the target of the intervention. In all cases, this involved testing, piloting, or showing the intervention to the targets and gathering feedback in the form of discussion or interview. In two cases, this included formal cognitive interviews [28, 31].
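To illustrate the linking task in the simplest possible form, the sketch below pairs a few TDF domains with candidate BCTs and returns suggestions for domains identified as barriers. The pairings are illustrative assumptions only, not the validated mappings applied in the included papers.

```python
# Illustrative (not validated) pairings of TDF domains with candidate BCTs.
candidate_bcts_by_tdf_domain = {
    "Knowledge": ["Instruction on how to perform the behaviour"],
    "Skills": ["Demonstration of the behaviour", "Behavioural practice/rehearsal"],
    "Beliefs about capabilities": ["Verbal persuasion about capability", "Graded tasks"],
    "Environmental context and resources": ["Restructuring the physical environment"],
    "Social influences": ["Social support (practical)"],
}

def suggest_components(barrier_domains):
    """Return candidate intervention components for TDF domains flagged as barriers."""
    return {d: candidate_bcts_by_tdf_domain.get(d, []) for d in barrier_domains}

print(suggest_components(["Knowledge", "Environmental context and resources"]))
```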
Discussion
We conducted a systematic review of methods for designing interventions to change HCPs’ behaviour. We found 15 papers that outlined 15 methods. All of the papers reported methods that were developed within a specific context. Thirteen papers targeted only individual HCPs, with one paper targeting change across individual, organization, and system levels. The methods consisted of at least two tasks and, at most, seven tasks. Thirteen papers included some form of barrier identification and 13 provided direction for linking barriers to intervention components; however, these were not the same 13 papers. Thirteen papers included the use of theory, and 13 included gathering input on the design of the intervention from the targets of the intervention.
A number of publications related to designing interventions were not included in this review: for example, the MRC guidance for developing and evaluating complex interventions [9], publications outlining the KT process [34, 35], tools and frameworks for assessing barriers to KT interventions [17], and taxonomies of behaviour change techniques [10]. While these publications are certainly of high relevance to the intervention development and evaluation process in general, they were not included here as they were all judged to provide limited detail about the specific, replicable actions needed to design interventions. For example, the MRC guidance emphasises the importance of designing interventions but provides little concrete guidance on how to actually do this in practice. In addition, a number of papers were identified that specifically stated intervention design as an aim [36, 37] but that lacked the detail that would allow replication. It is possible that there are additional papers not included in this review that could facilitate intervention design. The majority of the methods found (11/15) incorporated other tools or resources, albeit not in identical ways. While we do not know the rationale for doing so, it could be that existing tools alone are felt to be inadequate for intervention design. Future studies of additional design methods that incorporate other existing tools and resources will likely help advance methodologies for designing interventions.
Two additional papers not included in our review received significant discussion during consensus and therefore warrant mention. The first was Eccles et al. [38], which includes a description of using theory to design a KT intervention. We felt this paper provided a rationale and description of conceptual issues related to using theory to develop an intervention; however, the degree of detail on how to design an intervention is limited. The second was the paper on Intervention Mapping [29]. This paper outlined a method for designing interventions to change health behaviours, not HCPs’ behaviours, and was therefore excluded. However, we did find four methods papers for designing interventions to change HCPs’ behaviour that incorporated Intervention Mapping [20, 23, 26, 27]. It is likely that other methods for designing interventions to change health behaviours could similarly be adapted to design interventions to change HCPs’ practice.
Two main gaps are evident in the intervention design literature reviewed here. First, few methods target change in organisations or systems, or at least few were developed with a focus on the organisation or system. We found only one study [14] that did this explicitly. A second study [22] targeted teams as well as individuals but did not address the organizational level. We found few methods that specifically take organisation- and system-level contexts into consideration, and few that consider all levels. While many of the methods’ approaches to barrier identification could result in a focus on the organisation or system should barriers at these levels be identified (for example, see French et al. [21]), the implicit focus of these papers was individual behaviour change. Future studies should consider how, and under what circumstances, to ensure that organisational and system-level change is considered.
The process of undertaking this review highlighted a second gap: the need for a better understanding of what activities constitute intervention design. Our iterative process of determining the intervention design variables for extraction, and the subsequent extraction of those variables, has led to a better understanding of the steps inherent in intervention design, at least according to current methods. There appear to be four steps common to intervention design: barrier identification, linking barriers to intervention components, use of theory, and user engagement (i.e. seeking input on the feasibility or acceptability of the intervention from the potential targets). While we do not yet understand the best order for these tasks, nor what additional tasks are required, this set does represent a simple structure of prototypical steps for the design of a KT intervention. A better understanding of these tasks, as well as more in-depth consideration of additional tasks that should be, but are not yet, routinely adopted, would improve intervention design methods.
Almost all of the methods found (13/15) used theory at some point in the intervention design tasks, yet evidence indicates that theory is rarely used in the design of interventions, or at least rarely reported [39, 40]. Selecting one of these published methods, or building on them, is likely to guide researchers to use theory. Our review did not measure the degree to which these methods are used, but this would be a useful area of future research. Additionally, future methodological work could focus on best practices for the use of theory to design an intervention.
Several limitations of this review warrant discussion. We used only one rater for the title and abstract review. Although support exists for the validity of using one rater [41], having two raters would have reduced the possibility of omitting a potentially relevant study. All of the included methods were developed specifically for healthcare environments; this could be due in part to our search strategy. Methods from disciplines outside of healthcare could yield additional and suitable approaches, as could methods developed prior to 1996. Several further limitations might have reduced the number of potential methods found. Due to the challenges of adequately searching books, we did not include books or book chapters in our search. Our inclusion criteria meant that we did not include studies that reported on the testing of an intervention in addition to its development; in part, we did this to isolate methods described in enough detail to replicate and to adequately guide the design of an intervention. Lastly, we did not search grey literature, making the review susceptible to publication bias [42], and we used only three databases. It is possible that additional methods exist.
Conclusions
This systematic review identified 15 published and replicable methods for designing interventions to change HCPs’ behaviour. We encourage its use as a resource and as a catalyst for improving the quality and quantity of design methods. Although these methods included varied steps, there was general agreement that designing an intervention for individual-level change includes identifying barriers, selecting intervention components, using theory, and engaging end-users. Methods for designing organisation- and system-level interventions were limited. Further comparative analysis of how the common tasks are completed in the different methods would provide a starting point for developing more detailed guidelines for designing KT interventions. Future research should focus on the degree to which these methods have been used, determining how such methods could be better adopted, and further developing both guidance for the existing methods and, potentially, new methods.
Abbreviations
- ADDIE: Analysis, Development, Design, Implementation, Evaluation Method
- TDFI: Theoretical Domains Framework Implementation Approach
- HCP: Healthcare professional
- KT: Knowledge translation
- MRC: Medical Research Council
- QUERI: Quality Enhancement Research Initiative
- TDF: Theoretical Domains Framework
References
1. Canadian Institutes of Health Research. About knowledge translation. 2008. Retrieved from http://www.cihr-irsc.gc.ca/e/29418.html#2.
2. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7(1):50.
3. Flottorp SA, Oxman AD, Krause J, Musila NR, Wensing M, Godycki-Cwirko M, Baker R, Eccles MP. A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci. 2013;8(1):35.
4. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. BMJ. 1998;317(7156):465–8.
5. Flodgren G, Parmelli E, Doumit G, Gattellari M, O’Brien MA, Grimshaw J, Eccles MP. Local opinion leaders: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2011;(8):CD000125. doi:10.1002/14651858.CD000125.pub4.
6. Michie S, Johnston M. Theories and techniques of behaviour change: developing a cumulative science of behaviour change. Health Psychol Rev. 2012;6(1):1–6.
7. Gardner B, Whittington C, McAteer J, Eccles MP, Michie S. Using theory to synthesise evidence from behaviour change interventions: the example of audit and feedback. Soc Sci Med. 2010;70(10):1618–25.
8. Van Hoof TJ, Miller NE, Meehan TP. Do published studies of educational outreach provide documentation of potentially important characteristics? Am J Med Qual. 2013;28(6):480–4.
9. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M, Medical Research Council G. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.
10. Michie S, Richardson M, Johnston M, Abraham C, Francis J, Hardeman W, Eccles MP, Cane J, Wood CE. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med. 2013;46(1):81–95.
11. Straus S, Tetroe J, Graham ID. Knowledge translation in health care: moving from evidence to practice. West Sussex: John Wiley & Sons; 2013.
12. Sackett DL, Rosenberg WM, Gray JM, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn’t. BMJ. 1996;312(7023):71–2.
13. Rycroft-Malone J, Seers K, Titchen A, Harvey G, Kitson A, McCormack B. What counts as evidence in evidence-based practice? J Adv Nurs. 2004;47(1):81–90.
14. Curran GM, Mukherjee S, Allee E, Owen RR. A process for developing an implementation intervention: QUERI Series. Implement Sci. 2008;3(1):17.
15. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, Altman DG, Barbour V, Macdonald H, Johnston M. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348.
16. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–74.
17. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7(1):37.
18. Battles J. Improving patient safety by instructional systems design. Qual Saf Health Care. 2006;15 Suppl 1:i25–9.
19. Taylor N, Lawton R, Slater B, Foy R. The demonstration of a theory-based approach to the design of localized patient safety interventions. Implement Sci. 2013;8(1):123.
20. Van Bokhoven M, Kok G, Van der Weijden T. Designing a quality improvement intervention: a systematic approach. Qual Saf Health Care. 2003;12(3):215–20.
21. French SD, Green SE, O’Connor DA, McKenzie JE, Francis JJ, Michie S, Buchbinder R, Schattner P, Spike N, Grimshaw JM. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework. Implement Sci. 2012;7(1):38.
22. Kolehmainen N, Francis JJ. Specifying content and mechanisms of change in interventions to change professionals’ practice: an illustration from the Good Goals study in occupational therapy. Implement Sci. 2012;7(1):100.
23. Cabassa LJ, Druss B, Wang Y, Lewis-Fernández R. Collaborative planning approach to inform the implementation of a healthcare manager intervention for Hispanics with serious mental illness: a study protocol. Implement Sci. 2011;6:80.
24. Clyne B, Bradley MC, Hughes CM, Clear D, McDonnell R, Williams D, Fahey T, Smith SM. Addressing potentially inappropriate prescribing in older patients: development and pilot study of an intervention in primary care (the OPTI-SCRIPT study). BMC Health Serv Res. 2013;13(1):307.
25. Porcheret M, Main C, Croft P, McKinley R, Hassell A, Dziedzic K. Development of a behaviour change intervention: a case study on the practical application of theory. Implement Sci. 2014;9(1):42.
26. Sassen B, Kok G, Mesters I, Crutzen R, Cremers A, Vanhees L. A web-based intervention for health professionals and patients to decrease cardiovascular risk attributable to physical inactivity: development process. JMIR Res Protocols. 2012;1(2):e21.
27. Schmid AA, Andersen J, Kent T, Williams LS, Damush TM. Using intervention mapping to develop and adapt a secondary stroke prevention program in Veterans Health Administration medical centers. Implement Sci. 2010;5(1):11.
28. Foy R, Francis JJ, Johnston M, Eccles M, Lecouturier J, Bamford C, Grimshaw J. The development of a theory-based intervention to promote appropriate disclosure of a diagnosis of dementia. BMC Health Serv Res. 2007;7:207.
29. Bartholomew LK, Parcel GS, Kok G, Gottlieb NH. Planning health promotion programs: an intervention mapping approach. San Francisco: John Wiley & Sons; 2011.
30. Chandler C, Meta J, Ponzo C, Nasuwa F, Kessy J, Mbakilwa H, Haaland A, Reyburn H. The development of effective behaviour change interventions to support the use of malaria rapid diagnostic tests by Tanzanian clinicians. Implement Sci. 2014;9:83.
31. McDermott L, Yardley L, Little P, Ashworth M, Gulliford M. Developing a computer delivered, theory based intervention for guideline implementation in general practice. BMC Fam Pract. 2010;11(1):90.
32. Fretheim A, Oxman AD, Flottorp S. Improving prescribing of antihypertensive and cholesterol-lowering drugs: a method for identifying and addressing barriers to change. BMC Health Serv Res. 2004;4(1):23.
33. Bandura A. Self-efficacy: toward a unifying theory of behavioral change. Psychol Rev. 1977;84(2):191.
34. Graham ID, Logan J, Harrison MB, Straus SE, Tetroe J, Caswell W, Robinson N. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13–24.
35. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7.
36. Flottorp S, Oxman AD. Identifying barriers and tailoring interventions to improve the management of urinary tract infections and sore throat: a pragmatic study using qualitative methods. BMC Health Serv Res. 2003;3(1):3.
37. Forsetlund L, Bjørndal A. Identifying barriers to the use of research faced by public health physicians in Norway and developing an intervention to reduce them. J Health Serv Res Policy. 2002;7(1):10–8.
38. Eccles M, Grimshaw J, Walker A, Johnston M, Pitts N. Changing the behavior of healthcare professionals: the use of theory in promoting the uptake of research findings. J Clin Epidemiol. 2005;58(2):107–12.
39. Colquhoun HL, Brehaut JC, Sales A, Ivers N, Grimshaw J, Michie S, Carroll K, Chalifoux M, Eva KW. A systematic review of the use of theory in randomized controlled trials of audit and feedback. Implement Sci. 2013;8:66.
40. Davies P, Walker AE, Grimshaw JM. A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implement Sci. 2010;5:14.
41. Edwards P, Clarke M, DiGuiseppi C, Pratap S, Roberts I, Wentz R. Identification of randomized controlled trials in systematic reviews: accuracy and reliability of screening records. Stat Med. 2002;21(11):1635–40.
42. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151(4):264–9.
Acknowledgements
At the time of this work, Dr. Colquhoun held a CIHR and KT Canada Postdoctoral Fellowship. Dr. Squires holds a Canadian Institutes of Health Research New Investigator Award and a University Research Chair in Health Evidence Implementation. At the time of this work, Dr. Kolehmainen held an MRC Population Health Scientist Fellowship (G0902129). Dr. Grimshaw holds a Canada Research Chair in Health Knowledge Transfer and Uptake. The authors accept full responsibility for the manuscript. Funders were not involved in the conduct of the study or the preparation of the manuscript.
Funding
This study was undertaken with no funding.
Availability of data and materials
The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.
Authors’ contributions
HLC contributed to the conception and design of the study, the acquisition, analysis, and interpretation of data, and drafted the manuscript. JES and NK contributed to the acquisition, analysis, and interpretation of data in this study. CF contributed to the design and conduct of the search strategy. JG contributed to the conception and design of the study as well as the interpretation of the data. All authors contributed edits to, read and approved the final version of the manuscript.
Competing interests
The authors declare that they have no competing interests.
Consent for publication
Not applicable.
Ethics approval and consent to participate
As this paper is a synthesis of published literature, ethics approval was not required.
Additional file
Additional file 1: Intervention Development: Search strategies. (DOCX 16 kb)
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
About this article
Cite this article
Colquhoun, H.L., Squires, J.E., Kolehmainen, N. et al. Methods for designing interventions to change healthcare professionals’ behaviour: a systematic review. Implementation Sci 12, 30 (2017). https://doi.org/10.1186/s13012-017-0560-5