  • Systematic review
  • Open access

Building capacity in dissemination and implementation science: a systematic review of the academic literature on teaching and training initiatives



Background

The field of dissemination and implementation (D&I) science has grown significantly in recent years. Alongside this growth, demand for D&I training from researchers and implementers has increased. Research describing and evaluating D&I training opportunities, referred to here as ‘capacity building initiatives’ (CBIs), can help provide an understanding of different methods of training as well as training successes and challenges. However, to gain a more detailed understanding of the evidence-base and how D&I CBIs are being reported in publications, a field-wide examination of the academic literature is required.


Methods

Systematic review to identify the type and range of D&I CBIs discussed and/or appraised in the academic literature. EMBASE, MEDLINE and PsycINFO were searched between January 2006 and November 2019. Articles were included if they reported on a D&I CBI that was developed by the authors of each included article or their host institution. Two reviewers independently screened the articles and extracted data using a standardised form.


Results

Thirty-one articles (from a total of 4181) were included. From these, 41 distinct D&I CBIs were identified, spanning 8 countries and focussed on a variety of contexts and professions. CBIs ranged from short courses to training institutes to components of academic programmes. Nearly half were delivered face-to-face, with the remainder delivered remotely or in a blended format. CBIs often stipulated specific eligibility criteria and strict application processes, and/or were oversubscribed. The way in which the D&I CBIs were reported and/or evaluated varied considerably.


Conclusions

Increasing the number of training opportunities, as well as broadening their reach (to a wider range of learners), would help address the recognised deficit in D&I training. Standardisation in the reporting of D&I CBIs would enable the D&I community to better understand findings across different contexts and scientific professions so that training gaps can be identified and overcome. More detailed examination of publications on D&I CBIs, as well as the wider literature on capacity building, would be of significant merit to the field.



Background

The failure to optimally use research to improve population outcomes and reduce service inefficiencies is an endemic challenge to health and social care systems worldwide [1,2,3]. A critical and acknowledged issue is the considerable gap between what we know we should be doing based on the evidence versus what gets implemented in healthcare settings [3, 4]. Dissemination and implementation science (referred to hereafter as ‘D&I’) investigates ways to close ‘research to practice’ gaps (‘implementation science’) and spread knowledge and information to practice settings (‘dissemination science’) [5, 6].

The critical role of D&I in enhancing the application of evidence-based interventions has led to the discipline’s rapid advancement in recent years [7, 8]. Significant steps have been taken to build D&I capacity (defined as ‘a process which leads to greater individual, organisation or system capabilities to conduct and implement high-quality research and practice’ [9,10,11,12]) in recognition that a robust and sustainable workforce is required to successfully implement or maintain health and social care interventions of known effectiveness [13, 14].

Efforts to build capacity in D&I take many forms [15,16,17,18]. In the USA, as early as 1998, research organisations and initiatives were established (e.g. the Veterans Health Administration ‘Quality Enhancement Research Initiative’ (QUERI) [19,20,21]), with the aim of investigating ways to efficiently implement research-driven best practices. Academic and government institutions, centres and departments dedicated to the field have since been created in the USA [22,23,24,25,26], Canada [27, 28], Australia [29, 30], the UK [31, 32] and other countries [33,34,35], alongside global efforts [36]. Opportunities for D&I funding are increasingly available [7, 37,38,39], professional societies and groups have been set up [40,41,42] and there are a growing number of scientific conferences and meetings [43,44,45,46,47]. In 2006, the specialist journal Implementation Science was launched, followed in 2019 by its companion journal, Implementation Science Communications; several other journals and libraries that publish D&I-related research (e.g. the Cochrane Library) have also developed over the years [7].

Alongside these efforts, another very important way to build D&I capacity is through the development and delivery of teaching and training initiatives—referred to hereafter as ‘capacity building initiatives’ (CBIs). These endeavours may be aimed at individuals conducting research (i.e. ‘researchers’), those faced with translating evidence into practice (i.e. ‘implementers’) [15,16,17,18] or those tasked with training others in D&I principles and methodologies (i.e. ‘educators’). Such training endeavours include short courses, workshops and webinars, or they may form part of academic programmes [16,17,18, 46,47,48,49]—all of these can be important in ensuring individuals have the requisite knowledge and skill-set to successfully implement scientific discoveries across diverse populations [14,15,16,17, 50, 51]. Given the value of D&I CBIs, it is of interest to examine the type and range of training opportunities available [15,16,17,18] and how these extend to a wide range of individuals (implementers, researchers and educators) [14, 17, 52,53,54,55,56].

In 2015, Implementation Science expressed a renewed interest in research describing and critically appraising D&I training initiatives [52]. Since this editorial, several descriptive and/or evaluative articles on D&I CBIs have been published [18, 47, 49, 57, 58], but for those working in D&I to gain a more detailed understanding of the evidence-base, a field-wide perspective of the published literature is required [59]. A useful starting point to address this gap is through the review and documentation of D&I CBIs that have been written up in the academic literature. Examining the way in which training endeavours are reported can help highlight variabilities in reporting and enable comparisons of different CBIs against set criteria (e.g. mode of delivery, duration, target audience) so that gaps in training (and the reporting of training) can be identified.

To date, several articles published between 2013 and 2019 have reviewed (at least in part) D&I CBIs and resources, specifically related to teaching and training. In 2013, an article that focussed on developing the next generation of implementation researchers highlighted selected D&I training programmes, conferences and resources [55]. In 2017, a mapping exercise of D&I research training initiatives, stemming from the National Institutes of Health’s 2013 meeting on training, measurement and reporting was published—comprising training institutes, academic programmes and courses, webinars and career development awards [17]. In the same year, an expert assessment on training opportunities in D&I was documented [59] together with a content analysis of D&I resources using public, web-based information [60]. More recently (in 2018–2019), studies have identified D&I training initiatives to help inform medical education [61], training needs in public health [62] and mental health [16], and a review of online D&I resources was performed [63]. Taking this evidence collectively, the value of D&I CBIs in developing and harnessing skills in implementation research and evidence translation can be seen. Taking the evidence individually, however, most of the research is geographically restricted, focussing only on D&I CBIs in the USA [59] or the USA and/or Canada [16, 55, 61, 63]. While one paper considered D&I training efforts on a global level [62], this was not the main aim of the work, and thus, information on the characteristics of the CBIs and gaps in training needs was understandably limited.

With these thoughts in mind and heeding the call from Implementation Science on the need for publications on D&I CBIs [52], we present the findings of a systematic review aimed at identifying the type and range of D&I training opportunities reported in the academic literature. This review is part of a larger programme of work aimed at describing and appraising D&I CBIs. The aim of this paper is to provide a detailed description of our review methodology and a high-level summary of the main features and characteristics of the training initiatives and how these are reported. We also reflect on the implications of our findings and put forward recommendations on the future reporting of CBIs in the context of D&I science.


Methods

Search strategy

EMBASE, MEDLINE and PsycINFO were searched (using the OVID interface) for relevant articles published between January 2006 and November 2019. The cut-off point was set at 2006 in line with the inception of implementation science [64]—where most of the relevant articles identified in our initial scoping of the literature were published. The search strategy was informed by several reviews and discussion papers on D&I-related terms [65,66,67,68,69] together with a brainstorming exercise involving both authors (RD, DD) to generate applicable terminology. Terms relating to (1) implementation science (e.g. ‘knowledge translation’, ‘implementation research’) and (2) teaching and training (e.g. ‘capacity building’, ‘curriculum’) were included. To avoid a priori assumptions on the type of content (i.e. topics) the CBIs may cover, the search strategy was restricted to generic terms relating to D&I (e.g. ‘implementation science’) rather than specific terms that focussed on D&I methodologies or concepts (e.g. ‘hybrid designs’ [70], ‘implementation outcomes’ [71], or theories and frameworks, e.g. ‘Consolidated Framework for Implementation Research’ [72]). To tighten the search specificity, the search strategy was customised using appropriate wildcards (e.g. course$) and Boolean operators (i.e. OR, AND), and restricted to titles and abstracts. The sensitivity of the search was assessed by forward and backward citation searching of included articles and through handsearching key implementation and behavioural science journals (e.g. Implementation Science, Translational Behavioral Medicine). The final search was conducted on 21st November 2019 (see Table 1 for a full list of search terms).
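To illustrate the general shape of such a strategy, a minimal sketch follows: the two term sets are each combined with OR, joined with AND, and restricted to title/abstract fields. The terms shown are examples taken from the text above, not the authors’ full search strategy (see Table 1 for that).

```python
# Illustrative sketch only: building a minimal OVID-style title/abstract query
# from the two term sets described above. The terms are examples from the text,
# not the full Table 1 strategy; '$' is the OVID truncation wildcard.
d_and_i_terms = ['"implementation science"', '"knowledge translation"', '"implementation research"']
training_terms = ['"capacity building"', 'curriculum', 'course$', 'workshop$']

def build_query(set_a, set_b, fields=".ti,ab."):
    """OR the terms within each set, AND the two sets together, limit to fields."""
    return f'(({" OR ".join(set_a)}) AND ({" OR ".join(set_b)})){fields}'

print(build_query(d_and_i_terms, training_terms))
```

Restricting the combined query to titles and abstracts (rather than full text) is what the authors describe as tightening search specificity.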

Table 1 Search strategy

Inclusion criteria

At the first stage of screening (title and abstract), any empirical or review article that discussed CBIs in D&I and/or related areas (e.g. ‘improvement science’, ‘quality improvement’, ‘translational research’) in the context of teaching or training was included. At the second stage of screening (full text), tighter restrictions applied. Articles in which authors discussed or appraised (as a primary or secondary focus) a D&I CBI they (or their host institution) developed were included—this comprised CBIs where the whole focus of the training was on D&I (e.g. a D&I workshop) or only part of the focus (e.g. a D&I module that formed part of a larger postgraduate programme in public health). Articles were not restricted based on their methodological focus—in other words, we included all D&I CBIs that met our inclusion criteria, irrespective of the type of information provided on the D&I CBI or the level of detail.

Exclusion criteria

Dissertations and doctoral theses, books/book reviews, conference posters/presentations and editorials/commentaries were excluded, as were articles not in English. Review papers were excluded following citation searching for relevant articles, as were articles that focussed on training in other areas of healthcare improvement (e.g. patient safety or quality improvement) if they did not include an element of D&I within the training (e.g. [73, 74]). Articles were also excluded if they: described D&I-related conferences or conference proceedings (e.g. [56, 75, 76]), unless there was a specific D&I CBI within the conference (e.g. a workshop) that delegates could register for; examined how D&I methodologies or knowledge translation techniques could be used to better implement training programmes [77] or training centres [78] without the focus of the training itself being on D&I; assessed D&I training needs [14, 55, 79] or competencies in D&I [80]; or discussed the development of D&I-related research centres [21] without reference to a specific D&I CBI. Equally, articles that provided an overview of meetings to discuss how to advance the field of D&I [81, 82], focussed on the development of collaboratives to encourage new research partnerships [83], presented general repositories for D&I resources or training opportunities [60, 63] or comprised calls from journals for work relating to D&I CBIs [23] were excluded.

Finally, articles that focussed on the development of training programmes for mentors working in translational science [84, 85] were excluded unless the content of the mentoring was on D&I science (versus more generally on how to be an effective mentor), as were those that explored mentoring approaches as a way of assisting knowledge translation without actually discussing a D&I-related mentoring scheme [86].

Screening of articles

Articles were screened for relevance by the lead author (RD). The second author (DD) independently screened a random selection of 20% of the articles at the first stage of screening (title and abstract) and 100% at the second stage (full text). Discrepancies were resolved through discussions between the authors until consensus was reached.

Data extraction

A standardised form was developed to extract data from the included articles and to help synthesise the data in the review. The ‘Template for Intervention Description and Replication’ (TIDieR) checklist [87], which specifies the clear reporting of interventions (in our context, ‘training interventions’), was used as a starting point to identify which of its items would be relevant to the review aims. Additional criteria of potential relevance were identified by the lead author (RD) and agreed by the second author (DD) by searching Google Scholar and electronic databases (e.g. PubMed) and by consulting the EQUATOR Network website for relevant guidelines. Operational definitions for each criterion were developed and tested across all included articles to ensure reliability, validity and consistency in the data extraction process (see Suppl. file 1 for data extraction form).

For articles where authors discussed more than one D&I CBI they had developed (i.e. they presented a suite of D&I CBIs that were independent of one another, such as workshops or postgraduate courses), data was extracted on each CBI separately. Data was extracted by both authors (RD, DD) across all articles to ensure consistency and accuracy in the reporting of findings.

Quality assessment

The eligibility criteria for articles in our review led to the inclusion of heterogeneous research in terms of aims and methodological focus. It was therefore deemed inappropriate to appraise the methodological quality of the articles. Instead, we used the data extraction form (Suppl. file 1) to describe key characteristics of the D&I CBIs, delineate commonalities and differences between them and highlight key learnings when taking the evidence collectively.


Results

Search results

The search retrieved 5564 articles, with a total of 4181 remaining after the removal of duplicates (N = 1383). A further 3938 articles were excluded at the title and abstract stage, resulting in 243 full-text articles assessed for eligibility. Of these, 212 articles were excluded (see Fig. 1 for reasons), leaving 31 articles for inclusion.
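As a quick sanity check, the article flow above is internally consistent (figures taken directly from the text):

```python
# Arithmetic check of the screening flow reported above.
retrieved = 5564
duplicates = 1383
after_dedup = retrieved - duplicates   # articles screened on title/abstract
full_text = after_dedup - 3938         # articles assessed at full text
included = full_text - 212             # final included set
print(after_dedup, full_text, included)  # 4181 243 31
```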

Fig. 1 PRISMA flowchart of results

There was a high level of agreement (> 90%) regarding inclusion between the reviewers (RD and DD) at both stages of screening, with disagreements quickly and easily resolved.
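A minimal sketch of how such inter-rater agreement could be computed follows; the decision lists here are invented for illustration (the review reports only that agreement exceeded 90%, not the raw screening data).

```python
# Hypothetical sketch: percent agreement between two reviewers' screening
# decisions (True = include). The data below is invented for illustration.
reviewer_a = [True, False, False, True, False, False, True, False, False, False]
reviewer_b = [True, False, False, True, False, True, True, False, False, False]

agreed = sum(a == b for a, b in zip(reviewer_a, reviewer_b))
percent_agreement = 100 * agreed / len(reviewer_a)
print(f"{percent_agreement:.0f}% agreement")  # 90% agreement
```

In practice, chance-corrected statistics (e.g. Cohen’s kappa) are often reported alongside raw percent agreement, though the review itself reports only the latter.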

Key characteristics of included articles

Articles spanned a 13-year period (2006–2019), with the majority (N = 21) published during or after 2014 [18, 47, 57, 58, 62, 88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103] and the remainder (N = 10) published before 2014 [48, 49, 104,105,106,107,108,109,110,111]. Publications originated from 8 countries: the USA (N = 21) [18, 48, 57, 58, 62, 88,89,90, 92,93,94,95,96, 98,99,100,101, 103, 104, 108, 109], Canada (N = 3) [106, 107, 110], Australia (N = 2) [49, 91], the UK (N = 1) [105], Sweden (N = 1) [47], Kenya (N = 1) [97], Germany (N = 1) [102] and Saudi Arabia (N = 1) [111].

Articles reporting on the same CBI

Most of the articles (N = 27) [47,48,49, 57, 58, 62, 88, 89, 91,92,93,94,95,96,97,98,99,100,101,102,103, 105,106,107,108,109, 111] reported on just one CBI, with the remaining articles (N = 4) [18, 90, 104, 110] discussing between 2 and 10 different D&I CBIs. Four CBIs were the focus of more than one article: the Implementation Research Institute (IRI) (4 articles) [18a, 57, 92, 109], the Knowledge Translation Summer Institute (KTSI) (3 articles) [106, 107, 110c], the Mentored Training for Dissemination and Implementation Research in Cancer (MT-DIRC) (2 articles) [18b, 98] and the Training in Dissemination and Implementation Research in Health (TIDIRH) (2 articles) [48, 103]. Counting each article separately, 48 CBIs were reported; after accounting for CBIs reported in more than one article, 41 distinct CBIs remained across the 31 included articles.

Key characteristics of the included D&I CBIs

For the remainder of the results, findings are presented in relation to the number of included D&I CBIs (N = 41). Where CBIs are discussed in multiple articles (e.g. data is drawn from two articles on the TIDIRH) [48, 103], this is reflected in the number of references accompanying each finding.

Of the 41 D&I CBIs identified, a range of ‘types’ of training (as defined by the authors of each of the included articles) were reported: training institutes (N = 4) [18a, 48, 57, 92, 103, 106, 107, 109, 110c]; courses that were part of training programmes (N = 2) [18b, 98, 100] or training initiatives (N = 1) [99]; workshops (N = 4) [49, 93, 95, 111]; seminars (N = 3) [90a, 110a,b]; clerkships (N = 1) [88]; mentorship programmes (N = 1) [90b]; graduate certificates (N = 1) [89]; webinars (N = 1) [90c]; fellowship programmes (N = 1) [97]; master’s programmes (N = 1) [102]; modules integrated into master’s programmes in clinical research (N = 1) [104a], primary health care (N = 1) [105] and public health (N = 1) [62]; PhD programmes (N = 1) [94]; and modules integrated into Doctor of Nursing programmes (N = 2) [101, 108]. The 15 remaining CBIs were termed by the authors as ‘courses’ relating (in part or in full) to D&I science [18c,d, 47, 58, 91, 96, 104b,c, 110d-k].

Fourteen CBIs were delivered face-to-face [47,48,49, 88, 91, 95, 100,101,102,103,104a, 106, 107, 110a-c,i,k, 111], 7 were delivered remotely (either online, over the phone or through video-conferencing) [62, 90a-c, 105, 110a,b], 8 were blended, employing both face-to-face and remote methods [18a,b, 57, 58, 89, 92, 96,97,98, 108, 109], 4 were delivered either face-to-face or remotely (i.e. individuals picked one mode of delivery) [93, 110e,f,h] and for the remainder (N = 8), the mode of delivery was not reported or not clear [18c,d, 94, 104b,c, 110d,g,j]. CBIs ranged in length from hours [93] to day(s) [49, 91, 106, 107, 110g], week(s) [11, 48, 103], month(s) [47, 58, 88, 96, 108] and years [18a,b, 57, 89, 92, 97,98,99, 102, 109].

Tables 2 and 3 provide further information on some of the key selected characteristics of the included CBIs.

Table 2 Characteristics of the D&I CBIs included in the review
Table 3 Additional characteristics of the D&I CBIs included in the review

Eligibility criteria and application process

CBIs were most commonly aimed at individuals who had undertaken or were undertaking a postgraduate qualification (N = 12) [62, 89, 95,96,97, 102, 104a, 105, 110a-g], followed by those at doctoral/postdoctoral level (N = 10) [18a,b, 47, 48, 57, 92, 94, 98, 101, 103, 106,107,108,109] and undergraduate level (N = 1) [88]. For the remaining CBIs (N = 18) [18c,d, 49, 58, 90a-c, 91, 93, 99, 100, 104a,b, 110i,k, 111], this was not reported. CBIs were run through academic institutions (N = 22) [18a-d, 47, 57, 62, 88, 89, 91,92,93,94,95,96,97,98, 100,101,102, 104a-c, 108, 109, 111] or healthcare organisations/institutions working in D&I-related areas (N = 17) [48, 58, 90a-c, 99, 103, 106, 107, 110a-k] and, to a lesser extent, through D&I-related collaboratives (N = 2) [49, 93].

Seventeen of the CBIs focussed the training on a specific context: cancer (N = 4) [18b, 90a-c, 98], public or global health (N = 4) [2, 91, 94, 96], nursing (N = 2) [101, 108], behavioural health (N = 2) [89, 100], cardiology (N = 1) [88], family medicine (N = 1) [111], mental health (N = 1) [18a, 57, 92, 109], dementia (N = 1) [49] and primary care (N = 1) [105]. The remaining CBIs were either not restricted to a health or social care setting (N = 22) [47, 48, 58, 93, 95, 97, 99, 102,103,104a-c, 107, 110a-k] or the setting was not reported (N = 2) [18c,d]. Most of the CBIs were aimed at multiple professions (N = 33) [18a,d, 47,48,49, 57, 58, 62, 90a-c, 91, 92, 95,96,97,98,99,100, 102,103,104a-c, 105,106,107, 109, 110a-k], with fewer confined to specific groups of individuals, including medical students or medics (N = 2) [88, 111] or nurses (N = 2) [101, 108]; for the remainder, it was not reported (N = 4) [18c,d, 93, 94].

Ten of the CBIs provided information on the application and selection process. Requirements included: taking a formative assessment to ensure applicants had the requisite knowledge and skills in evidence-based medicine [111]; providing evidence that applicants had not previously received major research funding in D&I research [48]; writing a 1.5–2 page concept paper describing a D&I research project they would like to undertake as part of the training [48, 57, 98, 109]; detailing prior experience with implementation and/or health science research [48, 57, 58, 102]; producing a cover letter or statement demonstrating motivation to undertake the CBI, pursue a career in D&I and/or their long-term research agenda [48, 57, 58, 97, 98, 102, 109]; obtaining a letter of support or character reference from their workplace [48, 97, 98, 109]; providing evidence of academic grades or research productivity [48, 57, 102, 106, 109]; answering essay questions [106]; and undertaking interviews [88, 102]. One CBI also required individuals to apply in a team (i.e. a joint application involving other individuals), explaining a D&I-related project they would like to implement in their workplace to address a healthcare-related challenge [99].

Additional data on the competitive nature of the application process was provided for six of the CBIs. The TIDIRH [48, 103] had a total of 266 investigators applying in the first year (2011) for 30 places [48], with 1100 applicants over a 5-year evaluation period (2011–2015) for 199 places [103]; in 2019, over 200 applicants applied for 50 available training slots [103]. The IRI [18a, 57, 92, 109] accepted approximately 10 fellows each year, with a total of 31 fellows over the first 3 years (2010–2012) from a total pool of 86 applicants [109]; other data derived across 4 separate occurrences of the IRI training reported a 43/124 acceptance rate [57, 92]. The KTSI [106, 107, 110c] had 150 trainees applying for 30 places [106], while the MT-DIRC [18b, 98] offered 56 fellows a place over the 4 occurrences of the training (2014–2017) [98]. The ‘Action Research Programme’ reported that only 6 students were accepted at the start of the programme [88], and numbers on a master’s programme (in ‘health services research and implementation science’) [102] and a doctoral-level course (on ‘implementation science’) [47] were capped at 20 for practical reasons, despite growing demand for the courses [47].
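Using the applicant/place figures reported above, the competitiveness of these CBIs can be expressed as rough acceptance rates; this is a back-of-envelope calculation on the numbers in the text, not data reported by the original articles.

```python
# Rough acceptance rates computed from the applicant/place figures in the text.
cbis = {
    "TIDIRH 2011":      (30, 266),    # (places, applicants)
    "TIDIRH 2011-2015": (199, 1100),
    "IRI 2010-2012":    (31, 86),
    "KTSI":             (30, 150),
}
rates = {name: 100 * places / applicants for name, (places, applicants) in cbis.items()}
for name, rate in rates.items():
    print(f"{name}: {rate:.0f}%")
```

These figures (roughly 11–36%) are what underpin the later observation that demand for D&I training likely far exceeds supply.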

Content and structure of the CBI

The level of detail on the content and structure of the 41 CBIs included in the review varied considerably (largely due to the differing aims of the included articles). It is beyond the scope of the present review to examine this in detail here, but an individual breakdown of the information supplied relating to the content of the CBI (e.g. weblinks, course handbooks, workshop agendas) can be found in Table 2. While we are not suggesting here that the number of files, tables or supplementary documents each article provides for each CBI should be used as an indicator of the quality of the article, this does serve to illustrate the type of information authors are providing when reporting on D&I CBIs. Further inspection of the content of the CBIs will be explored as part of our larger programme of work on capacity building in D&I.

Evaluation and impact of CBIs

Of the 41 CBIs included in the review, evaluative data was provided for 21 CBIs [47,48,49, 57, 58, 62, 88,89,90a, 91,92,93, 95,96,97,98,99,100, 102, 103, 106,107,108,109,110c, 111]. We provide here a high-level summary of key themes (see also Table 2).

Overall perception

CBIs were rated ‘positively’ by individuals—in terms of the CBI itself and/or the importance of the contents [47, 95, 97, 98, 102, 108], overall satisfaction [95], acceptability and appropriateness [93, 100], usefulness of tools/methods [62], helpfulness [48] and likelihood of recommending the CBI to others [96].

Knowledge and skills

Knowledge and use of D&I principles, as well as confidence in conducting D&I activities, increased as a result of the CBIs [47, 49, 58, 95, 97,98,99,100, 111]. Individuals reported using D&I skills acquired through the training to influence and train peers in D&I [48]; be involved in research networks [48]; deliver educational modules and presentations [110c]; embark on practicums, master’s papers and other projects [62]; and serve as mentors for more junior investigators [109].

Project-based work

Conducting a D&I-related project was reported as one of the main reasons for applying for a CBI [89] and one of the most valuable aspects [96], with individuals reporting that this helped to enhance their understanding of the relationship between evidence and implementation [111]. However, individuals also raised the need for more time to conduct projects and more guidance from faculty on the scope of the projects [89, 96]—while some project ideas were implemented, most did not move beyond conceptualisation due to lack of time or guidance from faculty [88].

Research productivity

Undertaking and completing a CBI was related to research productivity in terms of applying for and/or being awarded funding for D&I research [48, 57, 92, 93, 95, 97, 99, 103, 109, 110c], writing publications [92, 109, 110c] and embarking on D&I-related PhD programmes [97]. Individuals also reported collaborating with other trainees [103] and expressed interest in maintaining relationships and being updated on each other’s work [106] after the completion of training.


Discussion

To the best of our knowledge, this is the first systematic review of its kind to identify and collate the type and range of D&I CBIs relating to teaching and training that have been described and/or appraised in the academic literature. An array of training opportunities from countries across the world was uncovered, aimed at numerous professions, focussed on different contexts and ranging in delivery format, duration, structure and content.

This review was (in part) a response to an editorial calling for more publications on the development and evaluation of D&I training initiatives [52]—we took this call one step further by synthesising the collective evidence published to date. Our research raises several important implications for the development and delivery of future D&I CBIs, as well as for their reporting, discussion and appraisal in academic journals. Here, we discuss some of the most pertinent overarching challenges that we believe should be prioritised in building capacity for teaching and training in D&I, and in how these training endeavours are reported and disseminated for wider use.

Demand and importance of D&I training

While our findings, supported by the wider evidence-base [59, 112], highlight the recognised international demand for and importance of D&I CBIs, we also found an unmet need for D&I training. For some CBIs, enrolment may only occur once a year and/or may involve strict eligibility criteria (e.g. specific qualifications or experience), which significantly limits the pool of individuals who are able to apply. The highly selective nature and low reported acceptance rates of some CBIs [48, 57, 92, 98, 103, 109] also suggest that the demand for training from the wider population is likely much higher—oversubscription to D&I conferences, meetings and initiatives provides further support for this view [7, 17, 59, 75].

Also of note is that many of the CBIs we delineated centred on advancing postgraduate or postdoctoral researchers’ D&I skills. While these CBIs were designed specifically for building research capacity, and so do not inherently restrict options for other kinds of learners (e.g. practitioners, policy makers), there do appear to be fewer training options for individuals newer to the field, which could widen the gap between novices and those already skilled in D&I [59]. Greater emphasis on reaching out to predoctoral individuals, practitioners, policy makers and consumers [14, 16, 59], and publishing findings on such CBIs after they have been evaluated, is required if we are to gain a better understanding of training needs, priorities and challenges from a diverse range of learners. Additional efforts are required to train multidisciplinary teams (not just individuals), who are often critical to the successful design and execution of implementation research and practice [16, 113, 114], and to deliver training in low-resource settings, which encounter unique challenges in implementing evidence-based practices due to limited financial resources and healthcare workforce [115,116,117] (only 3 D&I CBIs in our review focussed on this [97, 99, 111]).

Availability and accessibility of D&I training and resources

Creative approaches to providing support in D&I are required if/when local institutional support is lacking—which may often be the case given the relative infancy of the field [118] and proportionately small pool of experts able to provide senior mentorship [60]. Examples of such creative approaches uncovered in our review include the use of online platforms to provide mentorship support [90b], web-networking to enhance research connections and obtain feedback on research activities [48, 90c] and webinars/online seminar series [90a, 100a,b] to share D&I learnings. More widely, a whole host of additional training opportunities and other resources exist (many of which are free)—including interactive web-based tools [63], networks and discussion forums [42], MOOCs and online courses [119,120,121,122] and numerous guides on D&I methodologies [123,124,125,126,127,128,129,130,131].

Preliminary evidence indicates that individuals are not always aware of the existence of D&I resources, nor of how to access them [132, 133]. Arguably, the D&I community may benefit from more focussed dissemination efforts in order to reach a wider critical mass of individuals interested in learning about D&I—a point of particular importance when no other training option is available (e.g. due to cost or time). While some organisations have taken steps towards providing lists of D&I resources and training opportunities on their websites [134,135,136,137,138], a more general repository [17] where all up-to-date evidence and training could be logged is likely to be of significant merit to the field.

Barriers to effective training

Systematic reviews and research in areas related to D&I (e.g. behavioural sciences, quality improvement and patient safety) report that competing educational demands, time, faculty expertise, motivation and institutional culture are important determinants of successful curriculum implementation and/or completion [139,140,141,142]. Parallels can be drawn with our review, where a lack of time to conduct D&I projects and insufficient guidance on projects were raised as issues by faculty and trainees [88, 89, 96]. More widely in the literature, costs and time constraints are reported as major factors in the decision to undertake knowledge translation training [50, 63], particularly for those from low-resource settings [63]. These findings, while only preliminary, highlight the need to examine the different systems, and the individuals within them, in which D&I curricula will be implemented, alongside the determinants of developing, delivering and accessing curricula within these systems for a variety of learners. Doing this will better enable strategies to be put in place to overcome barriers to the implementation of D&I CBIs and, in turn, help to address the recognised deficit in training opportunities [14, 75, 80, 112].

Standardisation in reporting

Unlike other areas of research, where reporting guidelines exist, e.g. for systematic reviews [143], implementation research [144] and intervention reporting [87], there is no equivalent resource specific to the reporting of D&I CBIs. Systematic reviews on knowledge translation interventions (knowledge translation being a term closely related to, and often used synonymously with, implementation science [65,66,67, 145, 146]) have highlighted how inconsistencies in intervention reporting hamper evidence synthesis [147,148,149]. Similarly, in this review we found that variabilities in the reporting of D&I CBIs (both in describing and in evaluating them) can make literature synthesis problematic. While this challenge is not surprising given the differing aims and focus of our included articles (and so is by no means a criticism of the articles we included), it nonetheless highlights an important issue. If we are to use articles like these to learn from and further build capacity efforts in D&I—a point raised as important by this journal [52]—greater consistency in reporting is required. We consider the importance of standardisation in more detail below by drawing on two key areas: (1) the reporting of the content and structure of D&I CBIs and (2) the reporting of how D&I CBIs are evaluated.

The reporting of the content and structure of D&I CBIs

Due to the extensive scope of this work, we were only able to provide a high-level summary of the content and structure of the CBIs in this paper. However, it was clear when undertaking the review that, despite evident similarities in content (e.g. covering measurement or theory), different topics were covered to varying degrees and no consistent curriculum focussed on interdisciplinary competencies emerged. While initial steps have been taken to reach consensus on D&I curricula expectations and competencies for various learners (both within the CBIs included in our review and more widely [79, 80, 150, 151]), measures and methods are still developing [152,153,154,155,156] and can be difficult to define [65, 67, 156, 157]. Advocating the adoption of a small, common set of terms (which could then also be used when reporting D&I CBIs) is one way in which a better understanding of the evidence-base in D&I could be reached [65, 71, 152, 154, 155]. Progress is already underway in the reporting of some areas of implementation methodology (e.g. implementation outcomes [71], implementation strategies [152, 154], theories, models and frameworks [155]), but we are still a long way from establishing a more comprehensive taxonomy of common terms and methods.

The reporting of how D&I CBIs are evaluated

Appraising existing CBIs is one way to understand individual needs for D&I research and practice [14, 50, 51, 55] and to identify priorities for D&I capacity building [50, 51]. Articles in our review evaluated CBIs to varying degrees. While this is to be expected given our eligibility criteria and the differing aims of the included articles, without clear and consistent reporting of data it is difficult to appraise, synthesise and effectively communicate progress in D&I training across different professions, contexts and purposes. Ideally, evaluations of CBIs would be performed on repeated occurrences of the training (to check consistency in the findings across several cohorts) and longitudinally (to assess the longer-term impact of training), to better establish the effectiveness of D&I curricula in enabling desired outcomes in practice (something which few of the articles included in our review did).

In a field where the evidence (and therefore priorities for teaching) is rapidly changing, standardisation in the reporting of key elements of D&I content and structure, as well as of the evaluation of CBIs, is critical. This understanding and clarity are essential for D&I educators, researchers and implementers to draw meaningful conclusions from the literature on D&I training. In turn, a clearer cumulative assessment of the evidence-base can be reached, so that training successes and challenges, as well as educational gaps in the field, can be identified and addressed.

Review caveats

While our review has much to add to the field of capacity building in D&I, several important caveats should be borne in mind when interpreting our results. First, given the extensive scope and complexity of our review, we were only able to provide a high-level summary of our findings here. We acknowledge that examining the curriculum of each of the D&I CBIs may be of interest to the readership of this journal, as may a detailed synthesis of the evaluation of D&I CBIs like those identified in our review. However, while this is something we plan to undertake in future work, it was beyond the scope and aims of the current paper and would not have been possible without significantly compromising on the level of detail of the other information required to meet our review's aims. Second, given that one of the aims of our review was to show how D&I CBIs are reported in the academic literature and to use these findings to help inform future recommendations on reporting (a point which has been raised as important to explore [52]), we did not include 'grey literature' in our review. We are aware of D&I CBIs in the field that have not been written up for publication [158,159,160,161,162], so acknowledge that our review (intentionally) provides a field-wide perspective on the academic literature only, not the total number of D&I CBIs on offer. Third, to provide a comprehensive account of the literature, we did not exclude articles based on their aims—unsurprisingly, therefore, those that were included differed in focus, ranging from brief or detailed descriptions and/or evaluations of D&I CBIs to general overviews of several initiatives. Fourth, we did not exclude CBIs based on the level of detail authors provided on them. While this was intentional, in order to highlight variabilities in reporting, it undoubtedly meant that less meaningful conclusions could be drawn from those articles where minimal information was included. Finally, it is worth noting that we examined just one route to building capacity in D&I: teaching and training. Providing funding for D&I research networks [163,164,165,166], research proposals [68, 167,168,169], faculty positions and job vacancies [170, 171] and departments and centres [22,23,24,25,26,27,28,29,30,31,32,33,34,35,36], as well as organising D&I-related conferences and meetings [43,44,45,46], are also important avenues for growth.


This review addresses a clear gap in the evidence-base and helps pave the way for future research on building capacity in D&I. Greater investment in education and training is necessary to expand the cadre of D&I scientists and practitioners. Consistent reporting on D&I CBIs is required to enable greater transparency on the type and range of training opportunities, attitudes towards them and the training gaps that need to be prioritised and addressed. Increasing awareness of, and access to, D&I training and resources should also be prioritised. Ultimately, doing this should result in D&I learnings being more clearly communicated, so that the best possible D&I CBIs can be developed to achieve optimal outcomes. Further work examining the evidence on D&I CBIs (both within the academic literature and more widely) is required.

Availability of data and materials

The datasets are available from the authors on reasonable request.



Abbreviations

CBI: Capacity building initiative

D&I: Dissemination and implementation science


  1. Mallonee S, Fowler C, Istre GR. Bridging the gap between research and practice: a continuing challenge. Inj Prev. 2006;12(6):357–9.

  2. Hirschkorn M, Geelan D. Bridging the research-practice gap: research translation and/or research transformation. Alberta J Ed Res. 2008;54(1):1–13.

  3. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. BMJ. 1998;317:465.

  4. Brownson RC, Jones E. Bridging the gap: translating research into policy and practice. Prev Med. 2009;49(4):313–5.

  5. Estabrooks PA, Brownson RC, Pronk NP. Dissemination and implementation science for public health professionals: an overview and call to action. Prev Chronic Dis. 2018;15:e162.

  6. Schillinger D. An introduction to effectiveness, dissemination and implementation research: a resource manual for community-engaged research. Community Engagement Program, Clinical & Translational Science Institute, University of California San Francisco; 2010.

  7. Norton W, Lungeanu A, Chambers DA, Contractor N. Mapping the growing discipline of dissemination and implementation science in health. Scientometrics. 2017;112:1367–90.

  8. Sales AE, Wilson PM, Wensing M. Implementation science and implementation science communications: our aims, scope, and reporting expectations. Implement Sci. 2019;14:77.

  9. Trostle J. Research capacity building in international health: definitions, evaluations and strategies for success. Soc Sci Med. 1992;35(11):1321–4.

  10. Bates I, Akoto AY, Ansong D, Karikari P, Bedu-Addo G, Critchley J, et al. Evaluating health research capacity building: an evidence-based tool. PLoS Med. 2006;3:e299.

  11. Levine R, Russ-Eft D, Burling A, Stephens J, Downey J. Evaluating health services research capacity building programs: implications for health services and human resource development. Eval Program Plan. 2013;37:1–11.

  12. Kislov R, Waterman H, Harvey G, Boaden R. Rethinking capacity building for knowledge mobilisation: developing multilevel capabilities in healthcare organisations. Implement Sci. 2014;9:166.

  13. Imison C, Castle-Clarke S, Watson R. Reshaping the workforce to deliver the care patients need: The Nuffield Trust; 2016.

  14. Tabak RG, Padek M, Kerner JF, Stange KC, Proctor EK, Dobbins MJ, et al. Dissemination and implementation science training needs: Insights from practitioners and researchers. Am J Prev Med. 2017;52(3, Suppl 3):S322–9.

  15. Boyce CA, Barfield W, Curry J, Shero S, Green Parker M, Cox H, et al. Building the next generation of implementation science careers to advance health equity. Ethn Dis. 2019;29(Suppl 1):77–82.

  16. Chambers DA, Pintello D, Juliano-Bult D. Capacity building and training opportunities for implementation science in mental health. Psychiatry Res. 2020;283:112511.

  17. Chambers DA, Proctor EK, Brownson RC, Straus S. Mapping training needs for dissemination and implementation research: lessons from a synthesis of existing D&I research training programs. Transl Behav Med. 2017;7(3):593–601.

  18. Brownson RC, Proctor EK, Luke DA, Baumann AA, Staub M, Brown MT, et al. Building capacity for dissemination and implementation research: one university’s experience. Implement Sci. 2017;12:104.

  19. Demakis JG, McQueen L, Kizer KW, Feussner JR. Quality Enhancement Research Initiative (QUERI): a collaboration between research and clinical practice. Med Care. 2000;38(6):S17–25.

  20. Kilbourne AM, Elwy AR, Sales AE, Atkins D. Accelerating research impact in a learning health care system: VA’s quality enhancement research initiative in the choice act era. Med Care. 2017;55(7 Suppl 1):S4–12.

  21. Stetler CB, Mittman BS, Francis J. Overview of the VA Quality Enhancement Research Initiative (QUERI) and QUERI theme articles: QUERI Series. Implement Sci. 2008;3:8.

  22. Centre for Dissemination and Implementation. Institute for Public Health at Washington University.

  23. National Cancer Institute. Division of Cancer Control and Population Sciences. Implementation Science Centres in Cancer Control.

  24. Penn Implementation Science Centre at Lennard Davis Institute of Health Economics.

  25. Department of Research and Evaluation. Kaiser Permanente.

  26. Fogarty International Centre. National Institute for Health.

  27. Choosing Wisely. Canada Implementation Research Network.

  28. Institute of Health Services and Policy Research. Canadian Institute for Health Research.

  29. Priority Research Centre for Cancer Research, Innovation and Translation. The University of Newcastle, Australia.

  30. Training Institute for Dissemination and Implementation Research in Health Australia.

  31. Centre for Implementation Science. Kings College London. National Institute for Health Research, Applied Research Collaboration.

  32. Wessex Centre for Implementation Science, University of Southampton.

  33. Doctoral Programme in Public Health Intervention and Implementation Research, Institute of Environmental Medicine, Karolinska Institute.

  34. Health Services Research and Implementation Science in Health Systems, University of Heidelberg.

  35. Medicines Implementation Research. Swiss Tropical and Public Health Institute.

  36. Alliance for Health Policy and Systems Research, World Health Organisation.

  37. Knowledge Mobilisation Fellowship Programme. National Institute for Health Research.

  38. Dissemination and Implementation Research in Health. National Cancer Institute, Division for Cancer Control and Population Sciences.

  39. Tinkle M, Kimball R, Haozous EA, Shuster G, Meize-Grochowski R. Dissemination and Implementation Research funded by the US National Institutes of Health, 2005-2012. Nurs Res Pract. 2013; Article ID: 909606.

  40. The National Implementation Research Network. Frank Porter Graham Child Development Institute.

  41. UK Implementation Society.

  42. The Society for Implementation Research Collaboration.

  43. 13th Annual Conference on the Science of Dissemination and Implementation in Health. AcademyHealth.

  44. Global Implementation Conference.

  45. 5th Biennial Society for Implementation Research Conference.

  46. UK Implementation Science Masterclass. 2019.

  47. Carlfjord S, Roback K, Nilsen P. Five years’ experience of an annual course on implementation science: an evaluation among course participants. Implement Sci. 2017;12:101.

  48. Meissner H, Glasgow RE, Vinson CA, et al. The US training institute for dissemination and implementation research in health. Implement Sci. 2013;8:12.

  49. Goodenough B, Fleming R, Young M, Burns K, Jones C, Forbes F. Raising awareness of research evidence among professionals delivering dementia care: are knowledge translation workshops useful? Gerontol Geriatr Educ. 2017;38(4):392–406.

  50. Holmes BJ, Schellenberg M, Schell K, Scarrow G. How funding agencies can support research use in healthcare: an online province-wide study to determine knowledge translation training needs. Implement Sci. 2014;9:71.

  51. Newman MO, Bellare NB, Chakanyuka CM, Oyelade TA, Thom E, Bigirimana F, et al. Building health system capacity through implementation research. Experience of INSPIRE – A multi-country PMTCT implementation research project. JAIDS. 2017;75:S240–7.

  52. Straus SE, Sales A, Wensing M, Michie S, Kent B, Foy R. Education and training for implementation science: our interest in manuscripts describing education and training materials. Implement Sci. 2015;10:136.

  53. Smits PA, Denis JL. How research funding agencies support science integration into policy and practice: an international overview. Implement Sci. 2014;9:28.

  54. Brownson RC, Fielding JE, Green LW. Building Capacity for evidence-based public health: reconciling the pulls of practice and push of research. Annu Rev Public Health. 2018;39:27–53.

  55. Stamatakis KA, Norton WE, Stirman SW, Melvin C, Brownson R. Developing the next generation of dissemination and implementation researchers: insights from initial trainees. Implement Sci. 2013;8:29.

  56. Chambers D, Proctor EK. Advancing a comprehensive plan for dissemination and implementation research training 6th NIH Meeting on Dissemination and Implementation Research in Health: a working meeting on training. In National Cancer Institute Division of Cancer Control & Population Science Implementation Science Webinar Series; January, 28th 2014.

  57. Baumann AA, Carothers BJ, Landsverk J, Kryzer E, Aarons GA, Brownson R, et al. Evaluation of the Implementation Research Institute: trainees’ publications and grant productivity. Admin Pol Ment Health. 2020;47:254–64.

  58. Moore JER, Rashid S, Park JS, Khan S, Straus SE. Longitudinal analysis of a course to build core competencies in implementation practice. Implement Sci. 2018;13:106.

  59. Proctor EK, Chambers DA. Training in dissemination and implementation research: a field-wide perspective. Transl Behav Med. 2017;7(3):624–35.

  60. Darnell D, Dorsey CN, Melvin A, Chi J, Lyon AR, Lewis CC. A content analysis of dissemination and implementation science resource initiatives: what types of resources do they offer to advance the field? Implement Sci. 2017;12:137.

  61. Ginossar T, Heckman KJ, Cragun D, et al. Bridging the chasm: challenges, opportunities, and resources for integrating a dissemination and implementation science curriculum into medical education. J Med Educ Curric Dev. 2018;5:1–11.

  62. Ramaswamy R, Mosnier J, Reed K, Powell BJ, Schenck AP. Building capacity for Public Health 3.0: Introducing implementation science into a MPH curriculum. Implement Sci. 2019;14:18.

  63. Ford BS, Rabin B, Morrato EH, Glasgow RE. Online resources for dissemination and implementation science: meeting the demand and lessons learned. J Clin Transl Sci. 2018;2(5):259–66.

  64. Eccles MP, Mittman BS. Welcome to Implementation Science. Implement Sci. 2006;1:1.

  65. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, et al. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a Tower of Babel? Implement Sci. 2010;5:16.

  66. Colquhoun H, Leeman J, Michie S, Lokker C, Bragge P, Hempel S, et al. Toward a common terminology. A simplified framework of interventions to promote and integrate evidence into health practices systems and policies. Implement Sci. 2014;9:781.

  67. Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL. A glossary for dissemination and implementation research in health. J Public Health Manage Pract. 2008;14(2):117–23.

  68. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health. 2012;102(7):1274–81.

  69. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3:32.

  70. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public impact. Med Care. 2012;50(3):217–26.

  71. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Admin Pol Ment Health. 2011;38:65–76.

  72. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowrey JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

  73. McKee R, Sussman AL, Nelson TM, Russell JC. A quantitative and qualitative needs assessment enhances the relevance of a patient safety curriculum for surgical residents. J Am Coll Surg. 2014;219(4):e160–1.

  74. Maddalena V, Pendergast A, McGrath G. Quality improvement in curriculum development. Leadersh Health Serv (Bradf Engl). 2018;31(4):409–12.

  75. Proctor E, Carpenter C, Brown CH, Neta G, Glasgow R, Grimshaw J. Advancing the science of dissemination and implementation: three “6th NIH Meetings” on training, measures, and methods. 7th Annu Conf Dissemination Implementation Health. 2015;10(Suppl 1):A13.

  76. Neta G. Proceedings from the 10th Annual Conference on the Science of Dissemination and Implementation. Meeting abstracts. Implement Sci. 2018;13(Suppl 4):728.

  77. Dolansky MA, Schexnayder J, Patrician PA, Sales A. Implementation Science: New approaches to integrating quality and safety education for nurses’ competencies in nursing education. Nurse Educ. 2017;42(5S Suppl 1):S12–7.

  78. Fleming WO, Apostolico AA, Mullenix AJ, Starr K, Margolis L. Putting implementation science into practice: lessons from the creation of the National Maternal and Child Workforce Development Centre. Matern Child Health J. 2019;23(6):722–32.

  79. Newman K, Van Eerd D, Powell BJ, Urquhart R, Cornelissen E, Chan V, et al. Identifying priorities in knowledge translation from the perspective of trainees: results from an online survey. Implement Sci. 2015;10:92.

  80. Padek M, Colditz G, Dobbins M, Koscielniak N, Proctor EK, Sales A, et al. Developing educational competencies for dissemination and implementation research training programs: an exploratory analysis using card sorts. Implement Sci. 2015;10:114.

  81. Chambers DA. Advancing the science of implementation: a workshop summary. Admin Pol Ment Health. 2008;35(1-2):3–10.

  82. Bender BG, Krishnan JA, Chambers DA, Cloutier MM, Riekert KA, Rand CS, et al. American Thoracic Society and National Heart, Lung, and Blood Institute Implementation Research Workshop Report. Ann Am Thorac Soc. 2015;12(12):S213–21.

  83. Cornelissen E, Urquahrt R, Chan VW, DeForge RT, Colquhoun HL, Sibbald S, et al. Creating a knowledge translation trainee collaborative: from conceptualization to lessons learned in the first year. Implement Sci. 2011;6:98.

  84. Sorkness CA, Pfund C, Asquith P, Drezner MK. Research Mentor Training: Initiatives of the University of Wisconsin Institute for Clinical and Translational Research. Clin Trans Sci. 2013;6(4):256–8.

  85. Feldman MD, Huang L, Guglielmo BJ, Jordan R, Kahn J, Creasman JM, et al. Training the next generation of research mentors: the University of California, San Francisco, Clinical and Translational Science Institute Mentor Development Program. Clin Transl Sci. 2009;2(3):216–21.

  86. Straus SE, Graham ID, Taylor M, Lockyer J. Development of a mentorship strategy: a knowledge translation case study. J Continuing Ed Health Prof. 2008;28(3):117–22.

  87. Hoffman TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.

  88. Ackerman SL, Boscardin C, Karliner L, Handley MA, Cheng S, Gaither T, et al. The Action Research Program: experiential learning in systems-based practice for first year medical students. Teach Learn Med. 2016;28(2):183–91.

  89. Burton DL, Levin BL, Massey T, Baldwin J, Williamson H. Innovative graduate research education for advancement of implementation science in adolescent behavioural health. J Behav Health Serv Res. 2016;43(2):172–86.

  90. Farrell MM, La Porta M, Gallagher A, Vinson C, Bernal SB. Research to reality: moving evidence into practice through an online community of practice. Prev Chronic Dis. 2014;11:130272.

  91. Jones K, Armstrong R, Pettman T, Waters E. Cochrane Update. Knowledge translation for researchers: developing training to support public health researchers KTE efforts. J Public Health. 2015;37(2):364–6.

  92. Luke DA, Baumann AA, Carothers BJ, Landsverk J, Proctor EK. Forging a link between mentoring and collaboration: a new training model for implementation science. Implement Sci. 2016;11:137.

  93. Marriott BR, Rodriguez AL, Landes SJ, Lewis CC, Comtois KA. A methodology for enhancing implementation science proposals: comparison of face-to-face versus virtual workshops. Implement Sci. 2016;11:62.

  94. Means AR, Phillips DE, Lurton G, Njoroge A, Furere SM, Liu R, et al. The role of implementation science training in global health: from the perspective of graduates of the field’s first dedicated doctoral program. Glob Health Action. 2016;9:31899.

  95. Morrato EH, Rabin B, Proctor J, Cicutto LC, Battaglia CT, Lambert-Kerzner A, et al. Bringing it home: expanding the local reach of dissemination and implementation training via a university-based workshop. Implement Sci. 2015;10:94.

  96. Norton W. Advancing the science and practice of dissemination and implementation in health: a novel course for public health students and academic researchers. Public Health Rep. 2014;129:536–42.

  97. Osanjo GO, Oyugi JO, Kidwage IO, Mwanda WO, Ngugi NN, Otieno FC, et al. Building capacity in implementation science research training at the University of Nairobi. Implement Sci. 2016;11:30.

  98. Padek M, Mir N, Jacob RR, Chambers DA, Dobbins M, Emmons KM, et al. Training scholars in dissemination and implementation research for cancer prevention and control: a mentored approach. Implement Sci. 2018;13:18.

  99. Park JS, Moore JER, Sayal R, Holmes BJ, Scarrow G, Graham ID, et al. Evaluation of the “Foundations in Knowledge Translation” training initiative: preparing end users to practice KT. Implement Sci. 2018;13:63.

  100. Proctor E, Ramsey AT, Brown MT, Malone S, Hooley C, McKay V. Training in Implementation Practice Leadership (TRIPLE): evaluation of a novel practice change strategy in behavioural health organisations. Implement Sci. 2019;14:66.

  101. Riner M. Using implementation science as the core of the Doctor of Nursing Practice Inquiry project. J Prof Nurs. 2015;31(3):200–7.

  102. Ullrich C, Mahler C, Forstner J, Szecsenyi J, Wensing M. Teaching implementation science in a new Master of Science Program in Germany: a survey of stakeholder expectations. Implement Sci. 2017;12:55.

  103. Vinson CA, Clyne M, Cardoza N, Emmons KM. Building capacity: a cross-sectional evaluation of the US Training Institute for Dissemination and Implementation Research in Health. Implement Sci. 2019;14:97.

  104. Gonzalez R, Handley MA, Ackerman S, O’Sullivan PS. Increasing the translation of evidence into practice, policy, and public health improvements: a framework for training professionals in implementation and dissemination science. Acad Med. 2012;87(3):271–8.

  105. Greenhalgh T, Russell J. Promoting the skills of knowledge translation in an online Master of Science course in primary health care. J Cont Educ Health Prof. 2006;26:100–8.

  106. Kho M, Estey E, DeForge RT, Mak L. Riding the knowledge translation roundabout: Lessons learned from the Canadian Institutes of Health Research Summer Institute in Knowledge Translation. Implement Sci. 2009;4:33.

  107. Leung BM, Catallo C, Riediger ND, Cahill NE, Kastner M. The trainees’ perspective on developing an end-of-grant knowledge translation plan. Implement Sci. 2010;5:78.

  108. Moore ER, Watters R. Educating DNP students about critical appraisal and knowledge translation. Int J Nurs Educ. 2013;10(1):237–44.

  109. Proctor EK, Landsverk J, Baumann AA, Mittman BS, Aarons GA, Brownson RC. The implementation research institute: training mental health implementation researchers in the United States. Implement Sci. 2013;8:105.

  110. Straus SE, Brouwers M, Johnson D, Lavis JN, Legare F, Majumdar SR, et al. Core competencies in the science and practice of knowledge translation: description of a Canadian strategic training initiative. Implement Sci. 2011;6:127.

  111. Wahabi HA, Al-Ansary A. Innovative teaching methods for capacity building in knowledge translation. BMC Med Educ. 2011;11:85.

  112. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological and training challenges. Admin Pol Ment Health. 2009;36(1):24–34.

  113. Chambers DA, Azrin ST. Research and service partnerships: partnership: a fundamental component of dissemination and implementation research. Psychiatr Serv. 2013;64(6):509–11.

  114. Hall KL, Ax F, Moser RP, et al. Moving the science of team science forward: collaboration and creativity. Am J Prev Med. 2008;35(2 Suppl 1):S243–9.

  115. Yapa MH, Bärnighausen T. Implementation science in resource poor countries and communities. Implement Sci. 2018;13:154.

  116. Hart JT. The inverse care law. Lancet. 1971;1(7696):405–12.

  117. GBD 2015 Healthcare Access and Quality Collaborators. Healthcare Access and Quality Index based on mortality from causes amenable to personal health care in 195 countries and territories, 1990–2015: a novel analysis from the Global Burden of Disease Study 2015. Lancet. 2017;390(10091):231–66.


  118. Chambers D. Foreword. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2012. p. vii–x.


  119. Massive Open Online Course (MOOC) on implementation research: infectious diseases of poverty. Special Programme for Research and Training in Tropical Diseases.

  120. Inspiring change: creating impact with evidence-based implementation. Canadian Knowledge Transfer and Exchange Community of Practice.

  121. The science of global implementation. Centre Virchow-Villermé via iversity.

  122. Dissemination and implementation science online courses. University of Colorado.

  123. Dissemination and implementation models in health research and practice.

  124. CFIR Technical Assistance Website.

  125. Implementation Science. Research Tools. National Cancer Institute.

  126. Dissemination & Implementation (D&I) Toolkits. Washington University in St Louis.

  127. Implementation Research Toolkit. World Health Organization. 2014.

  128. PCORI Dissemination and Implementation Toolkit. Patient-Centered Outcomes Research Institute. 2015.

  129. Dissemination & Implementation. Tips for getting funded. Colorado Research and Implementation Science Program.

  130. Hull L, Goulding L, Khadjesari Z, et al. Designing high-quality implementation research: development, application, feasibility and preliminary evaluation of the implementation science research development (ImpRes) tool and guide. Implement Sci. 2019;14:70.


  131. Implementation Science Research Toolkit Workbook. World Health Organization. 2014.

  132. Morrato EH, Concannon TW, Meissner P, Shah ND, Turner BJ. Dissemination and implementation of comparative effectiveness evidence: key informant interviews with Clinical and Translational Science Award institutions. J Comp Eff Res. 2013;2(2):185–94.


  133. The Evaluation Center. Colorado Clinical and Translational Sciences Institute (CCTSI) needs assessment: University of Colorado Denver; 2014.

  134. Resources for implementation science researchers. NIH Fogarty International Research Centre.

  135. Resources for Dissemination and Implementation Science. Washington University in St Louis.

  136. Implementation science resources for CRISP. Colorado Research in Implementation Science Research Prevention.

  137. Resources for researchers in implementation science. Global Alliance for Chronic Diseases.

  138. The Centre for Implementation: Our resources.

  139. Tabatabei Z, Yazdani S, Sadeghi R. Barriers to integration of behavioral and social sciences in the general medicine curriculum and recommended strategies to overcome them: a systematic review. J Adv Med Educ Prof. 2016;4(3):111–21.


  140. Regmi K, Jones L. A systematic review of the factors – enablers and barriers – affecting e-learning in health sciences education. BMC Med Educ. 2020;20:91.


  141. Patow C, Karpovich K, Riesenberg LA, et al. Residents’ engagement in quality improvement: a systematic review of the literature. Acad Med. 2009;84(12):1757–64.


  142. Ginsburg LR, Dhingra-Kumar N, Donaldson LJ. What stage are low-income and middle-income countries (LMICs) at with patient safety curriculum implementation and what are the barriers to implementation? A two-stage cross-sectional study. BMJ Open. 2017;7:e016110.


  143. Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: the PRISMA statement. Ann Intern Med. 2009;151(4):264–70.


  144. Pinnock H, Barwick M, Carpenter CR, et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ. 2017;356:i6795.


  145. McKibbon KA, Lokker C, Wilczynski NL, et al. Search filters can find some but not all knowledge translation articles in MEDLINE: an analytic survey. J Clin Epidemiol. 2012;65:651–9.


  146. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43:337–50.


  147. Flodgren G, Parmelli E, Doumit G, Gattellari M, O’Brien MA, Grimshaw J, Eccles MP. Local opinion leaders: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2011:CD000125.

  148. O’Brien MA, Rogers S, Jamtvedt G, et al. Educational outreach visits: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2007:CD000409.

  149. Grilli R, Ramsay C, Minozzi S. Mass media interventions: effects on health services utilisation. Cochrane Database Syst Rev. 2002:CD000389.

  150. Metz A, Louison L, Ward C, Burke K. Implementation Specialist Practice Profile: Skills and competencies for implementation practitioners. Working draft. 2018.


  151. Mallidou AA, Atherton P, Chane L, Frisch N, Glegg S, Scarrow G. Core knowledge translation competencies: a scoping review. BMC Health Serv Res. 2018;18:502.


  152. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.

  153. Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14:42.


  154. Waltz TJ, Powell BJ, Matthieu MM, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10:109.


  155. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53.


  156. Rapport F, Clay-Williams R, Churruca K, Shih P, Hogden A, Braithwaite J. The struggle of translating science into action. J Eval Clin Pract. 2018;24(1):117–26.


  157. Eccles MP, Foy R, Sales A, Wensing M, Mittman B. Implementation Science six years on–our evolving scope and common reasons for rejection without review. Implement Sci. 2012;7:71.


  158. Agile Implementation Science short course, Indiana university.

  159. Summer Institute on Implementation Science, University of North Carolina at Chapel Hill.

  160. Knowledge into Action. University of Oxford.

  161. Specialist Knowledge Translation Training Workshop. Sick Kids.

  162. Innovation and Implementation Science Graduate Certificate. Indiana University.

  163. Agency for Healthcare Research and Quality. Accelerating Change and Transformation in Organizations and Networks (ACTION): fact sheet: field partnerships for applied research. Rockville: Agency for Healthcare Research and Quality; 2009.


  164. National Implementation Research Network.

  165. Dearing J. PS1-11: A dissemination and implementation research agenda for the Cancer Research Network: looking back and looking forward. Clin Med Res. 2013;11(3):127.


  166. UK Implementation Science Society.

  167. Knowledge Mobilisation Fellowship Programme. National Institute for Health Research.

  168. Implementation science. Medical Research Council.

  169. Implementation science grant writing. University of Washington.

  170. Implementation science jobs. Consortium for Implementation Science.

  171. Implementation Science Global Job Board. University of Washington.





RD is supported by the National Institute for Health Research (NIHR) Applied Research Collaboration: South London, at King’s College Hospital NHS Foundation Trust. The views expressed in this publication are those of the author(s) and not necessarily those of the NHS, NIHR or the Department of Health and Social Care.

Author information

Contributions



RD designed the review and was responsible for its conceptualization, drafting and edits. DD provided detailed feedback on all aspects of the review methodology and was involved in editing the manuscript. RD and DD were involved in the screening and identification of relevant articles to be included in the review as well as the data extraction process. Both authors read and approved the final version of the manuscript.

Corresponding author

Correspondence to Rachel Davis.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests


Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1: Data extraction form.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Davis, R., D’Lima, D. Building capacity in dissemination and implementation science: a systematic review of the academic literature on teaching and training initiatives. Implementation Sci 15, 97 (2020).
