Building capacity in dissemination and implementation science: a systematic review of the academic literature on teaching and training initiatives

Background
The field of dissemination and implementation (D&I) science has grown significantly in recent years, and with it the demand for D&I training from researchers and implementers. Research describing and evaluating D&I training opportunities, referred to here as 'capacity building initiatives' (CBIs), can provide an understanding of different methods of training as well as training successes and challenges. However, to gain a more detailed understanding of the evidence base and of how D&I CBIs are reported in publications, a field-wide examination of the academic literature is required.

Methods
A systematic review was conducted to identify the type and range of D&I CBIs discussed and/or appraised in the academic literature. EMBASE, Medline and PsycINFO were searched between January 2006 and November 2019. Articles were included if they reported on a D&I CBI that was developed by the authors (of each included article) or the authors' host institution. Two reviewers independently screened the articles and extracted data using a standardised form.

Results
Thirty-one articles (from a total of 4181) were included. From these, 41 distinct D&I CBIs were identified, focussing on different contexts and professions, from 8 countries across the world. CBIs ranged from short courses to training institutes to components of academic programmes. Nearly half were delivered face-to-face, with the remainder delivered remotely or in a blended format. CBIs often stipulated specific eligibility criteria or strict application processes and/or were oversubscribed. There was considerable variability in the way the D&I CBIs were reported and/or evaluated.

Conclusions
Increasing the number of training opportunities, and broadening their reach to a wider range of learners, would help address the recognised deficit in D&I training. Standardisation in the reporting of D&I CBIs would enable the D&I community to better understand findings across different contexts and scientific professions so that training gaps can be identified and overcome. More detailed examination of publications on D&I CBIs, as well as the wider literature on capacity building, would be of significant merit to the field.

Supplementary information
Supplementary information accompanies this paper at 10.1186/s13012-020-01051-6.


Introduction
The failure to optimally use research to improve population outcomes and reduce service inefficiencies is an endemic challenge to health and social care systems worldwide [1][2][3]. A critical and acknowledged issue is the considerable gap between what we know we should be doing based on the evidence, versus what gets implemented in healthcare settings [3,4]. Dissemination and implementation science (referred to from here on as 'D&I') investigates ways to close 'research to practice' gaps ('implementation science') and to spread knowledge and information to practice settings ('dissemination science') [5,6].
The critical role of D&I in enhancing the application of evidence-based interventions has led to the discipline's rapid advancement in recent years [7,8]. Significant steps have been taken to build D&I capacity (defined as 'a process which leads to greater individual, organisation or system capabilities to conduct and implement high-quality research and practice' [9][10][11][12]) in recognition that a robust and sustainable workforce is required to successfully implement or maintain health and social care interventions of known effectiveness [13,14].
Alongside these efforts, another important way to build D&I capacity is through the development and delivery of teaching and training initiatives, referred to from here on as 'capacity building initiatives' (CBIs). These endeavours may be aimed at individuals conducting research (i.e. 'researchers'), those faced with translating evidence into practice (i.e. 'implementers') [15][16][17][18] or those tasked with training others in D&I principles and methodologies (i.e. 'educators'). Such training endeavours include short courses, workshops and webinars, or they may form part of academic programmes [16][17][18][46][47][48][49]; all of these can be important in ensuring individuals have the requisite knowledge and skill set to successfully implement scientific discoveries across diverse populations [14-17, 50, 51]. Given the value of D&I CBIs, it is of interest to examine the type and range of training opportunities available [15][16][17][18] and how these extend to a wide range of individuals (implementers, researchers and educators) [14,17,[52][53][54][55][56].
In 2015, Implementation Science expressed a renewed interest in research describing and critically appraising D&I training initiatives [52]. Since this editorial, several descriptive and/or evaluative articles on D&I CBIs have been published [18,47,49,57,58], but for those working in D&I to gain a more detailed understanding of the evidence-base, a field-wide perspective of the published literature is required [59]. A useful starting point to address this gap is through the review and documentation of D&I CBIs that have been written up in the academic literature. Examining the way in which training endeavours are reported can help highlight variabilities in reporting and enable comparisons of different CBIs against set criteria (e.g. mode of delivery, duration, target audience) so that gaps in training (and the reporting of training) can be identified.
To date, several articles published between 2013 and 2019 have reviewed (at least in part) D&I CBIs and resources specifically related to teaching and training. In 2013, an article focussed on developing the next generation of implementation researchers highlighted selected D&I training programmes, conferences and resources [55]. In 2017, a mapping exercise of D&I research training initiatives, stemming from the National Institutes of Health's 2013 meeting on training, measurement and reporting, was published, comprising training institutes, academic programmes and courses, webinars and career development awards [17]. In the same year, an expert assessment of training opportunities in D&I was documented [59] together with a content analysis of D&I resources using public, web-based information [60]. More recently (in 2018-2019), studies have identified D&I training initiatives to help inform medical education [61] and training needs in public health [62] and mental health [16], and a review of online D&I resources was performed [63]. Taking this evidence collectively, the value of D&I CBIs in developing and harnessing skills in implementation research and evidence translation can be seen. Taking the evidence individually, however, most of the research is geographically restricted, focussing only on D&I CBIs in the USA [59] or the USA and/or Canada [16,55,61,63]. While one paper considered D&I training efforts on a global level [62], this was not the main aim of the work, and thus information on the characteristics of the CBIs and gaps in training needs was understandably limited.
With these thoughts in mind and heeding the call from Implementation Science on the need for publications on D&I CBIs [52], we present the findings of a systematic review aimed at identifying the type and range of D&I training opportunities reported in the academic literature. This review is part of a larger programme of work aimed at describing and appraising D&I CBIs. The aim of this paper is to provide a detailed description of our review methodology and a high-level summary of the main features and characteristics of the training initiatives and how these are reported. We also reflect on the implications of our findings and put forward recommendations on the future reporting of CBIs in the context of D&I science.

Search strategy
EMBASE, MEDLINE and PsycINFO were searched (using the OVID interface) for relevant articles published between January 2006 and November 2019. The cut-off point was set at 2006 in line with the inception of implementation science [64], when most of the relevant articles identified in our initial scoping of the literature were published. The search strategy was informed by several reviews and discussion papers on D&I-related terms [65][66][67][68][69] together with a brainstorming exercise involving both authors (RD, DD) to generate applicable terminology. Terms relating to (1) implementation science (e.g. 'knowledge translation', 'implementation research') and (2) teaching and training (e.g. 'capacity building', 'curriculum') were included. To avoid a priori assumptions on the type of content (i.e. topics) the CBIs may cover, the search strategy was restricted to generic terms relating to D&I (e.g. 'implementation science') rather than specific terms focussed on D&I methodologies or concepts (e.g. 'hybrid designs' [70], 'implementation outcomes' [71], or theories and frameworks, e.g. 'Consolidated Framework for Implementation Research' [72]). To tighten the search specificity, the search strategy was customised using appropriate wildcards (e.g. course$) and Boolean operators (i.e. OR, AND), and restricted to titles and abstracts. The sensitivity of the search was assessed by forward and backward citation searching of included articles and through handsearching key implementation and behavioural science journals (e.g. Implementation Science, Translational Behavioural Medicine). The final search was conducted on 21st November 2019 (see Table 1 for a full list of search terms).
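The combination of concept blocks described above can be sketched programmatically. This is an illustrative sketch only, not the authors' published strategy (the complete term list appears in Table 1); the terms used are those quoted in the text, and the OVID-style field code `.ti,ab.` reflects the stated restriction to titles and abstracts.

```python
# Illustrative sketch: building an OVID-style query from the two concept
# blocks described in the text. Term lists are abridged to the examples
# quoted above; the full strategy appears in Table 1 of the review.
di_terms = ["implementation science", "implementation research",
            "knowledge translation"]
training_terms = ["capacity building", "curriculum", "course$"]

def ovid_block(terms):
    """OR the terms together and restrict to titles/abstracts (.ti,ab.)."""
    return "(" + " OR ".join(f'"{t}".ti,ab.' for t in terms) + ")"

# The two concept blocks are intersected with AND, mirroring the strategy
# of requiring both a D&I term and a teaching/training term.
query = ovid_block(di_terms) + " AND " + ovid_block(training_terms)
print(query)
```

Note the `$` truncation wildcard (e.g. `course$`), which in OVID matches variants such as 'course' and 'courses'.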

Inclusion criteria
At the first stage of screening (title and abstract), any empirical or review article that discussed CBIs in D&I and/or related areas (e.g. 'improvement science', 'quality improvement', 'translational research') in the context of teaching or training was included. At the second stage of screening (full text), tighter restrictions applied. Articles in which authors discussed or appraised (as a primary or secondary focus) a D&I CBI that they (or their host institution) developed were included; this comprised CBIs where the whole focus of the training was on D&I (e.g. a D&I workshop) or only part of the focus (e.g. a D&I module that formed part of a larger postgraduate programme in public health). Articles were not restricted based on their methodological focus; in other words, we included all D&I CBIs that met our inclusion criteria, irrespective of the type of information provided on the D&I CBI or the level of detail.

Exclusion criteria
Dissertations and doctoral theses, books/book reviews, conference posters/presentations and editorials/commentaries were excluded, as were articles not in English. Review papers were excluded after citation searching them for relevant articles, as were articles focussed on training in other areas of healthcare improvement (e.g. patient safety or quality improvement) that did not include an element of D&I within the training (e.g. [73,74]). Articles were also excluded if they: described D&I-related conferences or conference proceedings (e.g. [56,75,76]), unless there was a specific D&I CBI within the conference (e.g. a workshop) that delegates could register for; examined how D&I methodologies or knowledge translation techniques could be used to better implement training programmes [77] or training centres [78], without the focus of the training itself being on D&I; assessed D&I training needs [14,55,79] or competencies in D&I [80]; or discussed the development of D&I-related research centres [21] without reference to a specific D&I CBI. Equally, articles that provided an overview of meetings to discuss how to advance the field of D&I [81,82], focussed on the development of collaboratives to encourage new research partnerships [83], presented general repositories for D&I resources or training opportunities [60,63] or were calls from journals for work relating to D&I CBIs [23] were excluded.
Finally, articles that focussed on the development of training programmes for mentors working in translational science [84,85] were excluded unless the content of the mentoring was on D&I science (versus more generally on how to be an effective mentor), as were those that explored mentoring approaches as a way of assisting knowledge translation without actually discussing a D&I-related mentoring scheme [86].

Screening of articles
Articles were screened for relevance by the lead author (RD). The second author (DD) independently screened a random selection of 20% of the articles at the first stage of screening (title and abstract) and 100% at the second stage (full text). Discrepancies were resolved through discussions between the authors until consensus was reached.
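The dual-screening arrangement above lends itself to a simple agreement check. The sketch below is a hypothetical illustration of percent agreement between two screeners; the decisions shown are invented for demonstration, not the review's data (the review reports > 90% agreement between RD and DD at both stages).

```python
# Hypothetical example: simple percent agreement between two screeners.
# The decision lists below are invented for illustration only.
def percent_agreement(reviewer_a, reviewer_b):
    """Percentage of records on which both reviewers made the same call."""
    assert len(reviewer_a) == len(reviewer_b)
    agreed = sum(a == b for a, b in zip(reviewer_a, reviewer_b))
    return 100 * agreed / len(reviewer_a)

rd = ["include", "exclude", "exclude", "include", "exclude"]
dd = ["include", "exclude", "exclude", "exclude", "exclude"]
print(percent_agreement(rd, dd))  # 80.0 on this toy data
```

In practice, chance-corrected statistics (e.g. Cohen's kappa) are often preferred over raw agreement, though the review itself reports only the agreement level.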

Data extraction
A standardised form was developed to extract data from the included articles and to help synthesise the data in the review. The 'Template for Intervention Description and Replication' (TIDieR) checklist [87], which specifies the clear reporting of interventions (in our context, 'training interventions'), was used as a starting point to identify which items would be relevant to the review aims. Additional criteria of potential relevance were identified by the lead author (RD) and agreed by the second author (DD) by searching Google Scholar and electronic databases (e.g. PubMed) and consulting the EQUATOR Network website for relevant guidelines (https://www.equator-network.org/). Operational definitions for each criterion were developed and tested across all included articles to ensure reliability, validity and consistency in the data extraction process (see Suppl. file 1 for the data extraction form).
For articles where authors discussed more than one D&I CBI they had developed (i.e. they presented a suite of D&I CBIs that were independent of one another, such as workshops or postgraduate courses), data was extracted on each CBI separately. Data was extracted by both authors (RD, DD) across all articles to ensure consistency and accuracy in the reporting of findings.

Quality assessment
The eligibility criteria for articles in our review led to the inclusion of heterogeneous research in terms of aims and methodological focus. It was therefore deemed inappropriate to appraise the methodological quality of the articles. Instead, we used the data extraction form (Suppl. file 1) to describe key characteristics of the D&I CBIs, delineate commonalities and differences between these and highlight key learnings when taking the evidence collectively.

Search results
The search retrieved 5564 articles, with a total of 4181 remaining after the removal of duplicates (N = 1383). A further 3938 articles were excluded at the title and abstract stage, resulting in 243 full-text articles assessed for eligibility. Of these, 212/243 articles were disregarded (see Fig. 1 for reasons), leaving 31 articles relevant for inclusion.
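The screening flow above can be verified with a minimal arithmetic check of the reported numbers:

```python
# A minimal consistency check of the screening numbers reported above,
# following the usual PRISMA-style flow of a systematic review.
retrieved = 5564
duplicates = 1383
excluded_title_abstract = 3938
excluded_full_text = 212

after_deduplication = retrieved - duplicates
assessed_full_text = after_deduplication - excluded_title_abstract
included = assessed_full_text - excluded_full_text

print(after_deduplication, assessed_full_text, included)  # 4181 243 31
```

Each stage matches the counts reported in the text and in Fig. 1.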
There was a high level of agreement (> 90%) between the reviewers (RD and DD) regarding inclusion at both stages of screening, and disagreements were quickly and easily resolved. Some CBIs were reported in more than one article (e.g. the TIDIRH) [48,103]: considering the articles separately, 48 CBIs were reported, but taking the articles collectively and accounting for CBIs reported in more than one article, 41 distinct CBIs were identified across the 31 included articles.

Key characteristics of the included D&I CBIs
For the remainder of the results, findings are presented in relation to the number of included D&I CBIs (N = 41). Where CBIs are discussed in multiple articles (e.g. data on the TIDIRH is drawn from two articles) [48,103], this is reflected in the number of references accompanying each finding. Across the 41 D&I CBIs identified, a range of 'types' of training (as defined by the authors of each included article) were reported, including: training institutes (N = 4) [18 a, 48, 57, 92, 103, 106, 107, 109, 110 c]; courses [102]; modules integrated into master's programmes, including clinical research (N = 1) [104 a], primary health care (N = 1) [105] and public health (N = 1) [62]; PhD programmes (N = 1) [94]; and modules integrated into Doctor of Nursing programmes (N = 2) [101,108]. The duration of the CBIs ranged from days [11,48,103] and months [47,58,88,96,108] to years [18 a,b, 57, 89, 92, 97-99, 102, 109]. Tables 2 and 3 provide further information on key characteristics of the included CBIs.

Eligibility criteria and application process
The majority of CBIs were aimed at individuals who had undertaken or were undertaking a postgraduate qualification (N = 12) [88,111] or at nurses (N = 2) [101,108]; for the remainder, the target audience was not reported (N = 4) [18 c,d, 93, 94].
Ten of the CBIs provided information on the application and selection process. This ranged from individuals taking a formative assessment to ensure they had the requisite knowledge and skills in evidence-based medicine [111]; providing evidence that they had not received major research funding in D&I research before [48]; writing a 1.5-2 page concept paper describing a D&I research project they would like to undertake as part of the training [48,57,98,109]; detailing prior experience with implementation and/or health science research [48,57,58,102]; producing a cover letter or statement to demonstrate a motivation to undertake the CBI, pursue a career in D&I and/or their long-term research agenda [48,57,58,97,98,102,109]; obtaining a letter of support or character reference from their workplace [48,97,98,109]; providing evidence of academic grades or research productivity [48,57,102,106,109]; answering essay questions [106]; and undertaking interviews [88,102]. One CBI also required individuals to apply in a team (i.e. a joint application involving other individuals) whereby they had to explain a D&I-related project they would like to implement in their workplace to address a healthcare-related challenge [99].
Additional data on the competitive nature of the application process was referred to for six of the CBIs. The TIDIRH [48,103] had a total of 266 investigators applying in the first year (2011) for 30 places [48], with 1100 applicants over a 5-year evaluation period (2011-2015) for 199 places [103], and in 2019, over 200 applicants applied for 50 available training slots [103]. The IRI [18 a, 57, 92, 109] accepted approximately 10 fellows each year, with a total of 31 fellows over the first 3 years (2010-2012) from a pool of 86 applicants [109]; data from 4 separate occurrences of the IRI training reported an acceptance rate of 43/124 [57,92]. The KTSI [106, 107, 110 c] had 150 trainees apply for 30 places [106], while the MT-DIRC [18 b, 98] offered 56 fellows a place over the 4 occurrences of the training (2014-2017) [98]. The 'Action Research Programme' reported that only 6 students were accepted at the start of the programme [88], and numbers on a master's course (in 'health services research and implementation science') [102] and a doctoral-level course (on 'implementation science') [47] were capped at 20 for practical reasons, despite growing demand for the courses [47].

Content and structure of the CBI
The level of detail on the content and structure of the 41 CBIs included in the review varied considerably (see Table 2). For example, one nine-month clerkship (delivered face-to-face) comprised experiential learning in cardiology clinics (weekly, in sequential rotations of 9 consecutive weeks), reflective writing after each clinic rotation, weekly seminars on systems-based practice and implementation science during the first 3 months of training, and a clinic-based project focussed on strategies to improve the quality and efficiency of clinical operations (with the amount of time dedicated to this at the students' discretion); this CBI also included a mentoring component from clinicians.
Knowledge and skills
Knowledge and use of D&I principles, as well as confidence in conducting D&I activities, increased as a result of the CBIs [47,49,58,95,[97][98][99][100]111]. Individuals reported using the D&I skills they acquired through the training to influence and train peers in D&I [48]; be involved in research networks [48]; deliver educational modules and presentations [110 c]; embark on practicums, master's papers and other projects [62]; and serve as mentors for more junior investigators [109].
Project-based work
Conducting a D&I-related project was reported as one of the main reasons for applying for a CBI [89] and one of the most valuable aspects of training [96], with individuals reporting that it helped to enhance their understanding of the relationship between evidence and implementation [111]. However, individuals also raised the need for more time to conduct projects and more guidance from faculty on their scope [89,96]; while some project ideas were implemented, most did not move beyond conceptualisation due to lack of time or guidance from faculty [88].

Text in bold denotes relevant information about the CBI that was not described in the article but was described in another article in the review that focussed on the same CBI; the linked article(s) are highlighted in bold in the 'description' column. Where articles are numbered 'A', 'B' (e.g. Brownson, 2017A, 2017B), these are CBIs discussed in the same article that are distinct from one another, so are included as separate entries in the table. Each CBI is given a unique identifier to show the findings relating to each CBI. The level of detail in Table 1 varies depending on what was reported in each article; cells coded 'NR' indicate information that was not reported or not clear. The 'type' of D&I CBI is defined as the way in which the author(s) of each included article refer to their CBI. The types of information provided on each CBI are listed in the table; this is only a high-level summary and should not be used as an indicator of article quality (the content and structure of these CBIs will be examined in follow-up work). A high-level summary of evaluative findings on each CBI (where reported) is provided; a more detailed analysis will be conducted in follow-up work.

The Institute for Translational Research in Adolescent Behavioural Health
In total 28 scholars were recruited in the first 2 years. Preliminary results from surveys revealed that gaining research experience through real-world service-learning opportunities was a key factor in the decision to apply for the graduate certificate. The online method for presenting coursework proved difficult and required additional time and effort from faculty to help navigate technology. Academic mentors felt the design of the program was beneficial but that they needed more guidance on their role as mentors and the scope of the projects.

Knowledge Translation Workshop
This was a one-day workshop on KT in dementia that also included a seminar on KT methods and practices (which the article focusses on). The article states the response rate for the evaluation survey but not the number of delegates who attended the KT seminar. Delegates were emailed a survey 6 months post-workshop.
Results were compared between those who did and did not opt for the KT seminar as part of the workshop. The KT group reported the highest median number of overall uses of workshop information in daily practice compared with those who only participated in the clinical seminars (7.5 vs. 6, p > 0.05). There was a correlation (p < 0.05) between the total number of 'kinds of research use' (e.g. changed a practice, changed your beliefs) and individual mean scores (averaged across 5 items) for conceptual research uses (e.g. 'gave you knowledge on how to care for residents'), and this was stronger for those who attended the KT seminar. Three items stood out: changing a practice, changing a procedure and creating a new policy/guideline. Six separate one-day workshops were held in total.

Greenhalgh, 2006 [105]
Primary health/multiple. Description in text.

Getting Research into Practice and Policy
The article briefly summarises students' evaluations: students highlighted that the online environment provided the opportunity to rehearse and modify potential

Practising Knowledge Translation
Seventeen participants were enrolled on the course, with data collected at 3, 6 and 12 months. Attendees reported significant positive effects in terms of increased use of implementation theories, models and frameworks, and increased knowledge of developing evidence-informed programmes, evidence implementation, evidence evaluation, sustainability, scale and spread, and context assessment (with self-efficacy also increasing across these measures).

Moore, 2013 [108]
Nursing/nurses. Description in text and a table of competencies.

EBP 11: Evaluating and Applying Evidence
Numbers on the course increased steadily from 32 in 2009 to 64 in 2013. No specific evaluation data relating to the EBP 11 module was provided. Students rated the overall CBI highly and identified several strengths, including exposure to different research article critique instruments and group interactions.

Morrato, 2015 [95]
Non-specific/multiple. Description in text, a link to the website, and tables on the agenda, faculty and D&I CBI resources.

Introduction to Dissemination and Implementation
Sixty-eight delegates attended day one and 11 also attended the optional half-day on day two. Data collected 1 week after the CBI (34/68 responses) revealed that 100% 'strongly agreed' they were satisfied with the training and 97% felt the workbook was a valuable resource. Delegates who attended the day 2 mentoring session 'strongly agreed' that working closely with faculty/experts increased their confidence. At 6-month follow-up, evidence of 23 new manuscripts and grant proposals was found.
Norton, 2014 [96]
Public health/multiple. Description in text and a table of weekly topics.

Dissemination and Implementation in Health
A total of 24 students enrolled in the course and 19 faculty researchers participated. Students strongly agreed that they would recommend the course to other students, that they enjoyed it and that they were able to apply what they learnt to their D&I project. Faculty rated it highly too, strongly agreed that they would recommend participation to other faculty and were interested in learning about D&I from students. The collaborative learning projects were rated by both groups as one of the most valuable aspects. Students' suggestions for improvement centred on course logistics, more meetings to discuss the collaborative project, and more time between the start of the course and meeting their faculty partners. Faculty reported the need for clearer expectations for the collaborative learning project and the opportunity to attend lectures.
Osanjo, 2016 [97]
Non-specific/multiple. Description in text and a table on curriculum.

Implementation Science Fellowship Program
There were 5 trainees across the two cohorts that undertook the course. A survey (in years 1 and 2) revealed a high degree of satisfaction with most aspects of the CBI, including content, duration and attachment sites. Fellows expressed high satisfaction with the mentorship programme and would prefer the existing mentorship arrangement to be extended. Some fellows indicated they were already applying the skills gained at their home institutions. Fellows have embarked on PhD programmes in dissemination and implementation (N = 4), secured funding (N = 3), and most (85%) identify implementation science as a component of their work activity.
Padek, 2018 [98]
Cancer/multiple. Description in text, tables on faculty and mentoring, a weblink to training, and an online supplementary file on the agenda.

Forty-three dissemination and implementation science competencies were assessed; all improved from baseline to 6 months and 18 months. The effect was apparent across beginner, intermediate and advanced fellows. Mentoring was rated very highly by fellows (and more highly than by the mentors). The importance of different mentoring activities was linked to fellows' satisfaction with those activities. This CBI is also discussed in Brownson, 2017B.

Park, 2018 [99]
Non-specific/multiple. Description in text and an online supplementary file on the agenda.

Foundations in Knowledge Translation
A total of 46 participants across two cohorts have completed the training (16 teams ranging in size from 2 to 4 people). Surveys (at 6, 12, 18 and 24 months) revealed that attendees' self-efficacy in evidence-based practices, KT activities and using evidence to inform practice increased over time.
Focus groups and interviews indicated that confidence in using KT increased from baseline to 24 months and that the training helped attendees achieve their KT objectives, plan their projects and solve problems. High self-reported team capacity and commitment to implement projects, and 'buy-in' from upper management that resulted in securing funding and resources, were stated as important to achieving goals. Sustained spread of KT practice was observed in 5 teams at 24 months.

Some trainees also went on to embark on D&I-related PhD programmes [97]. Individuals reported collaborating with other trainees [103] and expressed interest in maintaining relationships and being updated on each other's work [106] after completing training.

Discussion
To the best of our knowledge, this is the first systematic review of its kind to identify and collate the type and range of D&I CBIs relating to teaching and training that have been described and/or appraised in the academic literature. An array of training opportunities from countries across the world was uncovered, aimed at numerous professions, focussed on different contexts and ranging in delivery format, duration, structure and content. This review was (in part) a response to an editorial calling for a greater number of publications on the development and evaluation of D&I training initiatives [52]; we took this call one step further by synthesising the collective evidence published to date. Our research has raised several important implications for the development and delivery of future D&I CBIs as well as for their reporting, discussion and appraisal in academic journals. Here, we discuss some of the most pertinent overarching challenges we believe should be prioritised in building capacity in D&I teaching and training and in how these training endeavours are reported and disseminated for wider use.

Demand and importance of D&I training
While our findings, supported by the wider evidence base [59,112], highlight the recognised international demand for, and importance of, D&I CBIs, we also found an unmet need for D&I training. For some CBIs, enrolment may only occur once a year and/or may be subject to strict eligibility criteria (e.g. specific qualifications or experience), which significantly limits the pool of individuals who are able to apply. The highly selective nature and low reported acceptance rates of some CBIs [48,57,92,98,103,109] also suggest that the demand for training from the wider population is likely much higher; oversubscription to D&I conferences, meetings and initiatives provides further support for this view [7,17,59,75]. Also of note is that many of the CBIs we delineated centred on advancing postgraduate or postdoctoral researchers' D&I skills. While these CBIs were designed specifically for building research capacity, and so do not by nature restrict options for other kinds of learners (e.g. practitioners, policy makers), there do appear to be fewer training options for individuals newer to the field, which could widen the gap between novices and those already skilled in D&I [59]. Greater emphasis on reaching out to predoctoral individuals, practitioners, policy makers and consumers [14,16,59], and on publishing findings on such CBIs once they have been evaluated, is required if we are to gain a better understanding of training needs, priorities and challenges across a diverse range of learners. Additional efforts are required to train multidisciplinary teams (not just individuals), who are often critical to the successful design and execution of implementation research and practice [16,113,114], and to deliver training in low-resource settings, which encounter unique challenges in implementing evidence-based practices due to limited financial resources and healthcare workforce [115][116][117] (only 3 D&I CBIs in our review focussed on this [97,99,111]).

Availability and accessibility of D&I training and resources
Creative approaches to providing support in D&I are required if/when local institutional support is lacking, which may often be the case given the relative infancy of the field [118] and the proportionately small pool of experts able to provide senior mentorship [60]. Examples of such creative approaches uncovered in our review include the use of online platforms to provide mentorship support [90b], web-networking to enhance research connections and obtain feedback on research activities [48, 90c] and webinars/online seminar series [90a, 100a,b] to share D&I learnings. More widely, a whole host of additional training opportunities and other resources exist (many of which are free), including interactive web-based tools [63], networks and discussion forums [42], MOOCs and online courses [119][120][121][122] and numerous guides on D&I methodologies [123][124][125][126][127][128][129][130][131].
Preliminary evidence indicates that individuals are not always aware of the existence of D&I resources, nor of how to access them [132,133]. Arguably, the D&I community may benefit from more focussed dissemination efforts in order to reach a wider critical mass of individuals interested in learning about D&I, a point of particular importance when no other training option is available (e.g. due to cost or time). While some organisations have taken steps towards providing lists of D&I resources and training opportunities on their websites [134][135][136][137][138], a more general repository [17], where all up-to-date evidence and training could be logged, is likely to be of significant merit to the field.

Barriers to effective training
Systematic reviews and research in areas related to D&I (e.g. behavioural sciences, quality improvement and patient safety) report that competing educational demands, time, faculty expertise, motivation and institutional culture are important determinants of successful curriculum implementation and/or completion [139][140][141][142]. Parallels can be drawn with our review, where a lack of time to conduct D&I projects and insufficient guidance on projects were raised as issues by faculty and trainees [88,89,96]. More widely in the literature, costs and time constraints are reported as major factors in the decision to undertake knowledge translation training [50,63], particularly for those from low-resource settings [63]. These findings, while only preliminary, highlight the need to examine the different systems and individuals within which D&I curricula will be implemented, alongside the determinants of developing, delivering and accessing curricula within these systems for a variety of learners. Doing this will better enable strategies to be put in place to overcome barriers to the implementation of D&I CBIs and, in turn, help to address the recognised deficit in training opportunities [14,75,80,112].

Standardisation in reporting
Unlike other areas of research where reporting guidelines exist, e.g. for systematic reviews [143], implementation research [144] and intervention reporting [87], there is no equivalent resource specific to the reporting of D&I CBIs. Systematic reviews on knowledge translation interventions (knowledge translation being a term related to, and often used synonymously with, implementation science [65-67, 145, 146]) have highlighted how inconsistencies in intervention reporting hamper evidence synthesis [147][148][149]. Similarly, in this review we found that variability in the reporting of D&I CBIs (in terms of both description and evaluation) can make literature synthesis problematic. While this challenge is not surprising given the differing aims and focus of our included articles (and so is by no means a criticism of the articles we included), it nonetheless highlights an important issue. If we are to use articles like these to learn from and further build capacity efforts in D&I, a point raised as important by this journal [52], greater consistency in reporting is required. We consider the importance of standardisation in more detail here by drawing on two key areas: (1) the reporting of the content and structure of D&I CBIs and (2) the reporting of how D&I CBIs are evaluated.

The reporting of the content and structure of D&I CBIs
Due to the extensive scope of this work, we were only able to provide a high-level summary of the content and structure of the CBIs in this paper. However, it was clear when undertaking the review that, despite evident similarities in content (e.g. covering measurement or theory), different topics were covered to varying degrees, and no consistent curriculum focussed on interdisciplinary competencies emerged. While initial steps have been taken to reach consensus on D&I curricula expectations and competencies for various learners (both within the CBIs included in our review and more widely [79,80,150,151]), measures and methods are still developing [152][153][154][155][156] and can be difficult to define [65,67,156,157]. Advocating the adoption of a small, common set of terms (which could then also be used when reporting D&I CBIs) is one way in which a better understanding of the evidence-base in D&I could be reached [65,71,152,154,155]. Progress is already underway in the reporting of some areas of implementation methodology (e.g. implementation outcomes [71], implementation strategies [152,154], and theories, models and frameworks [155]), but we are still a long way from being able to establish a more comprehensive taxonomy of common terms and methods.

The reporting of how D&I CBIs are evaluated
Appraising existing CBIs is one way to help understand individual needs for D&I research and practice [14,50,51,55] and to identify priorities for D&I capacity building [50,51]. Articles in our review evaluated CBIs to varying degrees. While this would be expected given our eligibility criteria and the articles' differing aims, without clear and consistent reporting of data it is difficult to appraise, synthesise and effectively communicate progress in D&I training across different professions, contexts and purposes. Ideally, evaluations of CBIs would be performed on repeated occurrences of the training (to check consistency of findings across several cohorts) and longitudinally (to assess the longer-term impact of training) in order to better establish the effectiveness of D&I curricula in enabling desired outcomes in practice; few of the articles included in our review did this.
In a field where the evidence (and therefore the priorities for teaching) is rapidly changing, standardisation in the reporting of key elements of D&I CBI content and structure, as well as of their evaluation, is critical. Such understanding and clarity are essential for D&I educators, researchers and implementers to draw meaningful conclusions from the literature on D&I training. In turn, a clearer cumulative assessment of the evidence-base can be reached, so that training successes and challenges, as well as educational gaps in the field, can be identified and addressed.

Review caveats
While our review has much to add to the field of capacity building in D&I, several important caveats should be borne in mind when interpreting our results. First, given the extensive scope and complexity of our review, we were only able to provide a high-level summary of our findings here. We acknowledge that examining the curriculum of each of the D&I CBIs may be of interest to the readership of this journal, as may a detailed synthesis of the evaluations of D&I CBIs like those identified in our review. However, while this is something we plan to undertake in future work, it was beyond the scope and aims of the current paper and would not have been possible without significantly compromising the level of detail of other information required to meet our review's aims. Second, because one aim of our review was to show how D&I CBIs are reported in the academic literature and to use these findings to help inform future recommendations on reporting (a point which has been raised as important to explore [52]), we did not include 'grey literature' in our review. We are aware of D&I CBIs in the field that have not been written up for publication [158][159][160][161][162], so acknowledge that our review, while intentional in this scope, provides a field-wide perspective on the academic literature only, not the total number of D&I CBIs on offer. Third, to provide a comprehensive account of the literature, we did not exclude articles based on their aims; unsurprisingly, therefore, those included differed in focus, ranging from brief or detailed descriptions and/or evaluations of D&I CBIs to general overviews of several initiatives. Fourth, we did not exclude CBIs based on the level of detail authors provided on them. While this was intentional, in order to highlight variability in reporting, it undoubtedly meant that less meaningful conclusions can be drawn from those articles where minimal information was included.
Finally, it is worth noting that we examined just one way to build capacity in D&I (teaching and training initiatives). Providing funding for D&I research networks [163][164][165][166], research proposals [68,[167][168][169], faculty positions and job vacancies [170,171], and departments and centres [22][23][24][25][26][27][28][29][30][31][32][33][34][35][36], as well as organising D&I-related conferences and meetings [43][44][45][46], are also important avenues for growth.

Conclusions
This review addresses a clear gap in the evidence-base and helps pave the way for future research on building capacity in D&I. Greater investment in education and training is necessary to increase the cadre of D&I scientists and practitioners. Consistent reporting on D&I CBIs is required to enable greater transparency on the type and range of training opportunities, attitudes towards them and the training gaps that need to be prioritised and addressed. Increasing awareness of, and access to, D&I training and resources should also be prioritised. Ultimately, doing this should result in D&I learnings being more clearly communicated, so that the best possible D&I CBIs can be developed to achieve optimal outcomes. Further work examining the evidence on D&I CBIs (both within the academic literature and more widely) is required.
Abbreviations CBI: Capacity building initiative; D&I: Dissemination and implementation science