
Capacity building for evidence-based decision making in local health departments: scaling up an effective training approach

Abstract

Background

There are few studies describing how to scale up effective capacity-building approaches for public health practitioners. This study tested local-level evidence-based decision making (EBDM) capacity-building efforts in four U.S. states (Michigan, North Carolina, Ohio, and Washington) with a quasi-experimental design.

Methods

Partners within the four states delivered a previously established Evidence-Based Public Health (EBPH) training curriculum to local health department (LHD) staff. They worked with the research team to modify the curriculum with local data and examples while remaining attentive to course fidelity. Pre- and post-assessments of course participants (n = 82) and an external control group (n = 214) measured importance, availability (i.e., how available a skill is when needed, either within the skillset of the respondent or among others in the agency), and gaps in ten EBDM competencies. Simple and multiple linear regression models assessed the differences between pre- and post-assessment scores. Course participants also assessed the impact of the course on their work.

Results

Course participants reported greater increases in the availability of, and greater decreases in the gaps in, EBDM competencies at post-test relative to the control group. In adjusted models, significant differences (p < 0.05) were found in 'action planning,' 'evaluation design,' 'communicating research to policymakers,' 'quantifying issues (using descriptive epidemiology),' and 'economic evaluation.' Nearly 45% of participants indicated that EBDM increased within their agency since the training. Course benefits included becoming better leaders and making scientifically informed decisions.

Conclusions

This study demonstrates the potential for improving EBDM capacity among LHD practitioners using a train-the-trainer approach involving diverse partners. This approach allowed for local tailoring of strategies and extended the reach of the EBPH course.


Background

An evidence-based decision making (EBDM) process in public health involves making use of the best available scientific evidence, engaging communities in assessment and decision making, applying planning frameworks, conducting sound evaluations, and disseminating results through appropriate channels [1],[2]. In recent years, efforts have been made to establish more uniform guidelines related to EBDM for public health practitioners and agencies. For example, based on recommendations of the Institute of Medicine, Core Competencies for Public Health Professionals emerged to define 'a set of skills desirable for the broad practice of public health' [3]. Additionally, the Public Health Accreditation Board (PHAB) is leading a voluntary accreditation effort in the United States to establish national achievement standards for health departments, including such requirements as 'maintain a competent public health workforce' (Domain 8) and 'contribute to and apply the evidence base of public health' (Domain 10) [4]. Funders are increasingly interested in supporting projects that are evidence-based and may soon prioritize funding accredited health departments to ensure effective use of their funds [5]-[7].

Based on literature in the emerging field of dissemination and implementation research [8],[9], the scale-up of effective workforce capacity-building approaches is a key need for research and practice [10]. The public health workforce is transdisciplinary by nature and represents diverse educational backgrounds and job types [11]-[14]. There is a need for comprehensive training programs that build and maintain common skillsets and language among public health practitioners to accomplish EBDM goals [15],[16]. The Prevention Research Center in St. Louis (PRC-StL) developed an Evidence-Based Public Health (EBPH) training course in 1997 with support from the Centers for Disease Control and Prevention and the World Health Organization. To date, the EBPH course has been offered to over 1,240 participants by faculty associated with the PRC-StL. Course content aligns closely with core competencies of public health [2],[3],[17] and covers specific skills to improve public health practice [18].

A series of mixed methods evaluations have shown that the EBPH course is effective in improving self-reported measures of knowledge, skill, and ability [16],[19],[20]. The present study represents the first evaluation of this course curriculum with a quasi-experimental design. A train-the-trainer approach was used to engage partners in four states in efforts to improve EBDM capacity among local health department (LHD) practitioners. Much of the research on improving EBDM has been focused on state-level practitioners, even though gaps in skills are higher at the local level [21],[22].

Methods

Selection of intervention states

Intervention activities were delivered in four U.S. states: Michigan, North Carolina, Ohio, and Washington. Prevention Research Centers (PRCs) in these states formed partnerships with either Public Health Practice Based Research Networks (PBRNs) or Public Health Training Centers (PHTCs) to conduct capacity-building activities for their state’s local health departments. For a PRC to be eligible for this study, the following criteria had to be met: a PBRN and/or PHTC existed in the same state; the PRC-PBRN or PRC-PHTC pair had a track record of productive collaboration; there were at least 30 LHDs in the state; the PRC had a strong mission and track record in training public health practitioners; and it had not already conducted extensive trainings in EBPH with LHD practitioners. The following PRC-PBRN/PHTC pairs were chosen:

1. University of Michigan PRC of Michigan; Michigan PHTC.
2. University of North Carolina at Chapel Hill Center for Health Promotion and Disease Prevention; Southeast PHTC.
3. Case Western Reserve University PRC for Healthy Neighborhoods; Ohio PBRN.
4. University of Washington Health Promotion Research Center; Northwest Center for Public Health Practice, PHTC.

Development of intervention activities

The intervention primarily involved the delivery of the EBPH training course. However, each PRC-PBRN/PHTC team was also expected to provide at least one additional capacity-building activity for training attendees, based on the needs of their course's participants (e.g., technical assistance with community assessment, grant proposal development, program development, implementation, or evaluation; practicum opportunities for public health/preventive medicine students and LHDs).

The EBPH curriculum consists of nine modules (see next section for a list of modules and learning objectives) and adheres to adult learning principles (i.e., learning through problem solving and active involvement, integrating the experiences of faculty and participants into course discussions) [14],[23]. Seven of the nine modules (excluding Modules 1 and 6) include interactive exercises in which participants work in small groups (e.g., using local data to develop a concise problem statement, searching PubMed for literature on a specific topic, developing an action plan based on a logic model).

At least two representatives from each state traveled to St. Louis in November 2012 for a 2.5-day 'train-the-trainer' workshop conducted by members of the research team. The workshop included review of the EBPH course curriculum developed by the PRC-StL. In collaboration with previous EBPH trainers, new trainers discussed sources of local data and examples of successful programs and policies to be used to modify the curriculum. Attendees also received detailed information on the administrative process for planning and conducting a successful training (e.g., registration processes, site selection, preparation of course materials). Over the next six months, the research team provided state partners with technical assistance as they modified the curriculum for local relevance while being attentive to course fidelity, ensuring consistency with the original curriculum and with what was delivered in other states.

One course was conducted in each of the four states during the months of April-June 2013, and 130 participants completed the course. North Carolina and Ohio conducted 3.5-day in-person trainings. To reduce travel costs and the burden of time away from the office for attendees, the other states opted to deliver three of the nine modules via interactive webinars (Michigan: Modules 3, 5, 7; Washington: Modules 1, 5, 7) with the remaining modules delivered in two days of in-person sessions. The PRC-PBRN/PHTC partners, with help from their state health departments, recruited participants through website postings, announcements and flyers at conferences, and emails to various public health electronic mailing lists. Each state had a waiting list for their training course.

EBPH modules and learning objectives

Module 1: Introduction:

1. Understand the basic concepts of evidence-based decision making.
2. Introduce some sources and types of evidence.
3. Describe several applications within public health practice that are based on strong evidence and several that are based on weak evidence.
4. Define some barriers to evidence-based decision making in public health settings.

Module 2: Community assessment:

1. Understand the importance of conducting a community assessment.
2. Understand the types of data that are appropriate for assessing the needs and assets of the population/community of interest.
3. Understand the major steps in the community assessment process.

Module 3: Quantifying the issue:

1. Measure and characterize disease frequency in defined populations using principles of descriptive epidemiology and surveillance.
2. Find and use disease surveillance data presently available on the Internet.

Module 4: Developing a concise statement of the issue:

1. Understand the overall strategic planning process for setting priorities in public health.
2. Understand the criteria for the components of a sound problem statement.
3. Develop a concise written statement of the public health problem, issue, or policy under consideration in a measurable manner.

Module 5: Searching and summarizing scientific literature:

1. Understand the process used in systematic reviews and identify a key source (e.g., the Community Guide).
2. Use recommended guidelines for searching the scientific literature.

Module 6: Developing and prioritizing options:

1. Identify methods for prioritizing program and policy options (Types 1, 2, and 3).
2. Explore the role of creativity and group processes in developing intervention options.
3. Understand when and how to adapt interventions for different communities, cultures, and settings.

Module 7: Economic evaluation:

1. Know the differences between types of economic evaluations: cost-benefit, cost-utility, and cost-effectiveness analysis.
2. Understand key terms in economic analysis.
3. Be able to use economic evaluation studies to justify, prioritize, and implement prevention and treatment strategies.

Module 8: Developing an action plan and building a logic model:

1. Identify key characteristics and principles in successful action planning, including the role of coalitions/partnerships.
2. Identify the steps in program planning.
3. Understand the purpose and use of logic models.
4. Describe steps used in constructing logic models.

Module 9: Evaluating the program or policy:

1. Understand the basic components of program evaluation.
2. Understand the various types of evaluation designs useful in program evaluation.
3. Understand the concepts of measurement validity and reliability.
4. Understand the contributions of both qualitative and quantitative data to the evidence-based process.
5. Understand some of the methods used in qualitative evaluation.
6. Understand organizational issues in evaluation.

Selection of control group

Control group selection began with a merged database of two national surveys previously conducted by the research team. In October-December 2012, a random sample of 1,067 U.S. LHDs was drawn from the database of 2,565 LHDs maintained by the National Association of County and City Health Officials, resulting in available pre-test data from 517 LHD directors or their designees (54% response rate) [24]. Respondents of this survey identified program managers within their same LHD, resulting in the collection of 332 additional responses from December 2012 to February 2013 (67% response rate) [25]. The focus of these surveys was to identify evidence-based training, practices, and related decision-making activities.

A subsample of the merged directors’ and program managers’ surveys (n = 849) was selected to be retested to serve as the control group. Because baseline surveys found that governance structure and population of jurisdiction were significantly related to administrative evidence-based practices [24],[25], we used these variables, along with job position, to guide our sample selection. Because all LHDs in the four intervention states are locally governed, the sample was first restricted to respondents whose LHD followed a localized (decentralized) governance structure. Next, we eliminated anyone who attended or had a colleague who attended the EBPH training. Finally, we stratified the remaining group by job position and population of jurisdiction and selected participants to best parallel the intervention group’s stratification at a 3:1 ratio. Despite the improved balance across control and intervention groups, they still significantly differed (p < 0.05) on these matching variables, as there were not enough controls to match in the higher population categories and the lower job positions. These differences were therefore controlled for in the analysis. Of those invited to the control group (n = 330), 40% came from the directors’ survey and 60% from the program managers’ survey.
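To make the selection procedure concrete, the sketch below shows one way a stratified 3:1 control draw of this kind could be implemented in Python with pandas. The column names ('governance', 'attended_ebph', 'job_position', 'population_cat') and the rule for handling undersized strata are illustrative assumptions, not the study's actual code or variable names.

```python
import pandas as pd

def select_controls(candidates: pd.DataFrame,
                    intervention: pd.DataFrame,
                    ratio: int = 3,
                    seed: int = 0) -> pd.DataFrame:
    """Draw controls to parallel the intervention group's strata at a fixed ratio.

    Strata are defined by job position and population of jurisdiction;
    the column names used here are hypothetical placeholders.
    """
    # Restrict to locally governed LHDs and exclude anyone exposed to the training.
    pool = candidates[(candidates["governance"] == "local") &
                      (~candidates["attended_ebph"])]

    # Target `ratio` controls per intervention participant in each stratum.
    targets = (intervention.groupby(["job_position", "population_cat"])
                           .size() * ratio)

    selected = []
    for (job, pop), n_target in targets.items():
        stratum = pool[(pool["job_position"] == job) &
                       (pool["population_cat"] == pop)]
        # Take whatever is available when a stratum is too small, which mirrors
        # the imperfect matching reported for larger jurisdictions above.
        n_take = min(n_target, len(stratum))
        selected.append(stratum.sample(n=n_take, random_state=seed))

    return pd.concat(selected, ignore_index=True)
```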

Questionnaire development and testing

Baseline surveys were identical for control and intervention groups; development of this instrument is described previously [24],[25]. From this baseline instrument, the post-test questionnaire retested a set of questions related to perceived importance and availability of EBDM competencies. This set of questions was originally informed by a previous study that rated competencies for evidence-based cancer control [26] and has been used in other assessments of state and local public health practitioners [22],[Jacob RR, Allen P, Baker EA, Dodson EA, Duggan K, Fields R, Sequeira S, Brownson RC: Training needs and supports for evidence-based decision making among the public health workforce in the United States, submitted]. The 10 EBDM competencies, along with their descriptions as provided on the survey tool, are listed in Table 1.

Table 1 Local health department practitioners’ importance and availability ratings of ten evidence-based decision making (EBDM) competencies

The entire baseline survey instrument underwent cognitive response testing (n = 12) and test-retest processes (n = 38) for refinement and to document validity and reliability. Cronbach's alpha values were 0.94 and 0.89 for the EBDM importance and availability questions, respectively, with 8 of the 10 importance questions having substantial reliability and 7 of the 10 availability questions rated with substantial or nearly perfect reliability [27].
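As a point of reference for readers assessing the internal consistency of their own instruments, the sketch below shows the standard Cronbach's alpha calculation that values like these are based on; the response matrix here is randomly generated for illustration and is not the study's data.

```python
import numpy as np

def cronbachs_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a matrix of shape (respondents, items).

    alpha = k / (k - 1) * (1 - sum of item variances / variance of total score)
    """
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Example with made-up data: 10 items rated 0-10 by 38 respondents.
rng = np.random.default_rng(1)
scores = rng.integers(6, 11, size=(38, 10)).astype(float)
print(round(cronbachs_alpha(scores), 2))
```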

Additionally, the intervention group’s post-test questionnaire asked participants to assess how frequently they used EBDM skills and to rate benefits and barriers to using course content. These questions have been used in previous evaluations of the EBPH course [16],[19].

Data collection

All data were collected using Qualtrics survey software [28]. A unique link was emailed to each participant, and non-respondents received email and phone call reminders to bolster response rates. For the control group, baseline data were collected from October 2012 to February 2013 and repeated in October-December 2013. Baseline data were collected from course attendees prior to their trainings and were repeated six months after each training (October - December 2013). Respondents were offered a $20 Amazon gift card incentive for completing the pre-test and a $10 Amazon gift card for completing the post-test. The median pre-test administration time was 14 minutes, and the median post-test time was five minutes. Human participant approval was obtained from the Washington University Institutional Review Board.

Data analysis

An average of 33 participants completed each EBPH course (Michigan: 27; North Carolina: 32; Ohio: 33; Washington: 38). Among those invited to complete a post-test (330 controls, 130 intervention participants), data were collected from 236 controls (response rate 72%) and 112 intervention subjects (response rate 86%). Excluding participants who no longer worked at the same organization or who had an undeliverable email address (22 controls, 6 intervention participants), response rates were 77% and 90%, respectively. Efforts were made to update any undeliverable email addresses by contacting the LHDs and by conducting Internet searches for the individual, but survey invitations were not forwarded if the individual was working for a new organization. Although unique survey links should have ensured that the same person completed the pre- and post-test, we compared demographic data from pre- to post-test to determine if survey links were shared without our knowledge. This resulted in the exclusion of 11 cases from the control group. Another 11 controls who did not answer the majority of the EBDM competency questions were also excluded. Among the intervention group, 14 represented state health departments or other non-LHD organizations; they are excluded from all analyses. An additional 16 intervention subjects did not complete a pre-test or did not answer the majority of EBDM competency questions. A total of 214 control and 82 intervention subjects were used in the quasi-experimental analysis (Tables 1 and 2), while the previously mentioned 16 intervention subjects were retained for the analysis represented in Table 3 (n = 98).

Table 2 Characteristics of the sample of local health department practitioners, United States, 2012-2013
Table 3 Local health department respondents’ use of Evidence-Based Public Health (EBPH) course content (n = 98)

Respondents rated the perceived importance and then the availability of each EBDM competency. Availability was defined as 'how available you feel each skill is to you when you need it (either in your own skillset or among others in your agency).' Importance and availability were measured on a continuous 11-point scale in which only the endpoints were defined (0 = unimportant/not available, 10 = very important/available). A 'gap' score was computed by subtracting each availability score from its corresponding importance score. A net difference was calculated for importance, availability, and gap scores by subtracting the pre-test score from the post-test score for each respondent. Difference scores were normally distributed and were used as the outcome variable in simple linear regression models. The estimated regression coefficient of a group assignment variable (coded as intervention = 1, control = 0) represented the average change in the outcome variable associated with the intervention. Standard multiple linear regression models adjusted for job position, population of jurisdiction, highest degree, gender, age, state, and years of public health experience. Frequency of EBPH skill use was measured as weekly, monthly, quarterly, and seldom/never. Benefits and barriers were measured on a 5-point Likert scale, and combined 'agree' and 'strongly agree' categories are reported. Chi-square tests assessed differences between categorical groups.
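A minimal sketch of this analysis pipeline is shown below, assuming Python with pandas and statsmodels (the manuscript does not report its analysis software) and hypothetical column names: gap and net-difference scores are computed per respondent, and the coefficient on the group indicator in an ordinary least squares model estimates the average intervention-associated change.

```python
import pandas as pd
import statsmodels.formula.api as smf

def model_competency(df: pd.DataFrame, adjusted: bool = False):
    """Fit the (un)adjusted difference-score model for one EBDM competency.

    `df` is assumed to hold one row per respondent with pre/post importance
    and availability columns plus covariates; all names are placeholders.
    """
    d = df.copy()
    # Gap = importance minus availability, computed at each time point.
    d["gap_pre"] = d["importance_pre"] - d["availability_pre"]
    d["gap_post"] = d["importance_post"] - d["availability_post"]
    # Net difference (post minus pre) is the outcome in the regression model.
    d["gap_diff"] = d["gap_post"] - d["gap_pre"]

    formula = "gap_diff ~ group"  # group coded 1 = intervention, 0 = control
    if adjusted:
        formula += (" + C(job_position) + C(population_cat) + C(degree)"
                    " + C(gender) + age + C(state) + years_experience")
    fit = smf.ols(formula, data=d).fit()
    # The `group` coefficient is the average between-group difference in change.
    return fit.params["group"], fit.pvalues["group"]
```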

Results

Table 2 shows post-test demographic characteristics of the control and intervention respondents used in the quasi-experimental analysis. In general, controls had higher-level jobs, were more likely to be older and male, and had more years of public health experience than intervention subjects (p < 0.01). Population of jurisdiction was roughly balanced between the two groups (p = 0.26). Over one-half of both groups had attained post-graduate degrees (53% of controls, 65% of intervention subjects). Control and intervention respondents did not significantly differ from non-respondents on any of the variables listed in Table 2 (at the p < 0.05 level).

Controls (n = 214) represented 32 U.S. states, averaging 6.9 respondents per state (SD = 4.9) and including respondents from the four intervention states who were unassociated with the training (Michigan: 4; North Carolina: 7; Ohio: 10; Washington: 7). All 27 states in which all LHDs are locally governed were represented, and locally governed LHDs from 5 of 13 mixed governance states were represented. Intervention states were represented approximately equally in the quasi-experimental analysis (total: 82; Michigan: 22; North Carolina: 21; Ohio: 22; Washington: 17).

All pre-test mean importance scores for the 10 EBDM competencies were 8.0 or greater on the 0-10 scale for both groups, leaving little room for improvement (more so in the intervention group, which had higher pre-test means than the control group for all 10 competencies) (Table 1). While nearly all mean importance scores improved from pre-test to post-test in both groups, negative mean difference scores indicate a greater increase in control scores relative to intervention scores. No adjusted scores, and only one unadjusted score ('prioritization'), showed significant differences between groups.

Availability of EBDM competencies increased more for the intervention group, relative to the control group, for the unadjusted and adjusted measures of all 10 competencies (except the unadjusted measure of 'community assessment'). The overall post-test availability means of the 10 competencies were equivalent between groups, with the intervention group starting lower at pre-test. Adjusted mean differences were significant (p < 0.05) for 'action planning,' 'communicating research to policy makers,' 'evaluation design,' 'quantifying the issue,' and the overall mean availability score. The smallest between-group differences in availability gains were for 'community assessment' and 'prioritization.'

Gaps between the importance and availability of the EBDM competencies decreased more in the intervention group than in the control group for all 10 competencies and for the overall mean, with significant (p < 0.05) decreases found in 'evaluation design,' 'action planning,' 'communicating research to policy makers,' 'economic evaluation,' and the overall mean. The adjusted estimates for 'quantifying the issue' and 'quantitative evaluation' approached significance (p = 0.07). The smallest between-group differences in gap decreases were for 'community assessment' and 'qualitative evaluation.'

Over 60% of EBPH course attendees reported using EBPH materials and skills at least quarterly in planning, modifying and evaluating programs, in searching scientific literature, and in referring to course readings. Between 22% and 36% of EBPH course attendees reported using course materials or skills on at least a monthly basis in these same five categories (Table 3). In three categories (planning, modifying, and evaluating programs), participants without post-graduate degrees were more likely to report monthly use (p < 0.05). The majority of participants indicated agreement with 11 of the 12 benefits statements (excluding only obtaining funding, 39.8%). Highest rated benefits were: acquiring new knowledge and seeing applications for it in their work, becoming better leaders, and making scientifically informed decisions. The largest barriers to using course content included lack of time for implementation, lack of funding to continue training, and co-workers not being similarly trained. Importantly, only 17.3% of participants did not use course content because they lacked sufficient skill to do so.

Nearly 45% of participants indicated that EBDM had increased within their agency since completing the EBPH training. An open-ended survey question solicited examples, and common themes included: selecting new programs based on scientific literature, epidemiologic data, and tools such as The Guide to Community Preventive Services; critically evaluating current programs and modifying or eliminating programs as necessary; writing grants to secure new funding; conducting evaluation, community health assessments, and strategic planning; supporting health department accreditation processes; and providing a framework for talking with leaders. One participant noted:

'It helped raise awareness about evidence based decision-making among agency leadership, paving the way for those of us who completed the training to discuss, promote and facilitate integration of it in our public health programming, services, grant writing etc. and receive increased support to do so. It assisted in it becoming part of a common organizational language.'

Discussion

This study shows the potential for improving LHD practitioners’ capacity in EBDM using a train-the-trainer approach involving diverse partners. The EBPH course, developed by the PRC-StL, has been previously evaluated [16],[19],[20], but this quasi-experimental design (pre/post with external comparison group) improves the quality of the evidence [29], examining the potential effects of the training while accounting for secular trends and other external factors.

Partners within four states tailored and delivered a previously established EBPH curriculum and provided technical assistance to course participants. Both control and intervention groups saw mean increases in importance and availability scores (and decreases in gap scores), possibly reflecting the increased focus on EBDM from other sources, such as funding and accreditation agencies. However, the intervention group consistently saw greater gains in the availability of EBDM competencies and greater decreases in the gaps between importance and availability, particularly in the areas of 'action planning,' 'communicating research to policy makers,' 'evaluation design,' 'quantifying issues (using descriptive epidemiology),' and 'economic evaluation.' Importance of EBDM competencies showed little change between pre- and post-assessments, likely because ratings were already high at baseline, which is consistent with the procedure by which these competencies were developed (i.e., competencies were originally selected and prioritized as those that were important) [26].

Across four surveys of state and local health department practitioners (including the baseline surveys from which the control subjects of the current study were selected), and consistent with previous research [22], the largest gaps between the importance and availability of EBDM competencies have been the same: 'economic evaluation,' 'communicating research to policy makers,' 'evaluation designs,' and 'adapting interventions' [Jacob RR, Allen P, Baker EA, Dodson EA, Duggan K, Fields R, Sequeira S, Brownson RC: Training needs and supports for evidence-based decision making among the public health workforce in the United States, submitted]. The current evaluation showed significant decreases in the gaps for the first three, indicating that the EBPH course targets the areas of EBDM most in need of improvement. Participants in this multi-state intervention also reported similar use of skills, and similar agreement with the benefits of and barriers to using course material, as did almost 500 previous course participants who were taught primarily by faculty associated with the course's original developers [16],[19].

Lessons learned

Based on this evaluation, EBPH training courses can effectively improve the availability of several skills essential to EBDM among LHD practitioners. With the development of successful partnerships and the availability of experienced trainers, such a course can be tailored and replicated in nearly any environment. Based on the experiences of the trainers and on participants’ onsite evaluations of the course, we share below some lessons learned from the adaptation and implementation of the EBPH course in this train-the-trainer model:

Some participants found components of the curriculum to be too elementary while others with less experience or formal training learned new skills. Efforts should be made to assess the audience’s level of knowledge during the planning phases of the course and adapt course content to the appropriate level of knowledge and expertise. However, a heterogeneous group supports networking among individuals in different roles (e.g., evaluators, surveillance staff, health educators), and this heterogeneity also reflects the realities of staff expertise within departments and programs. Not every practitioner must possess every EBDM skill. Rather, as a whole, the team should be able to conduct an EBDM process. More experienced participants could be asked to self-identify and support less experienced participants during vital program exercises.

Two states incorporated web-based technology to deliver three course modules, and the majority of their participants found the webinars to be useful and to enhance learning. Webinar formats can increase reach and sustainability, and participants appreciated the flexibility they afford. However, strengths of in-person training as identified by the participants (e.g., interacting with new peers, working through examples face-to-face, hearing about best practices from other counties) are difficult to recreate in web-based formats.

Similar to previous evaluations of the course [16],[20], participants requested more specific examples of how to apply an evidence-based process to practical work, more tailored materials (to their specific program areas), and more problem sharing amongst course participants. They appreciated hands-on activities and exposure to new resources and take-home tools. If possible, it is recommended to have previous course attendees share experiences in using the new knowledge and making changes within their agency.

Participants consistently requested more guidance on economic evaluation. This competency also had the lowest mean pre- and post-test availability scores among both control and intervention groups. Participants may benefit from a more simplified approach to presenting this content, with a greater focus on accessing, rather than conducting, economic evaluations.

Curriculum related to the competencies with low availability gains and small decreases in gap scores (e.g., 'community assessment,' 'qualitative evaluation') should be reviewed for opportunities to incorporate new tools, exercises, or teaching points. In some cases, low availability gains may reflect existing training efforts in that area (e.g., a state health department has invested in community assessment trainings), and the EBPH curriculum should be coordinated with those existing efforts.

Trainings were strengthened by the participation of trainers with a diversity of experience and expertise and by coordination among presenters in advance of the training to ensure consistent messaging and localization of data and concepts.

Having teams of two or three individuals from an agency attend the course together creates a 'critical mass' of trained staff in an agency [30],[31] and enhances the likelihood of influencing the agency's decision-making processes.

A focus on training leaders with targeted or more advanced EBDM sessions is also important. Leadership buy-in is critical when building skills, fostering expectations for EBDM, and conducting participatory decision-making [32]-[34].

Next steps

With promising results from the implementation of this intervention, a next step is to identify practices for further scaling up EBDM capacity-building efforts among the nation’s 2,565 LHDs. Health departments, particularly those applying for PHAB accreditation, need to enhance their workforce’s capacity to implement EBDM. The effectiveness of webinar formats should be investigated, as they can be an efficient way of addressing the increasing demands placed on public health professionals as they face declining government funding, staff reductions, and travel restrictions.

Our study was not designed to test webinar effectiveness. Only two EBDM competencies could be related to EBPH modules delivered via webinar (Module 3: Quantifying the Issue for Michigan and Module 7: Economic Evaluation for Michigan and Washington). An assessment of the differences in importance, availability, and gaps for the related competencies among participants from these states versus the others yielded no significant differences at the p < 0.05 level. While we cannot draw conclusions due to small sample sizes, these findings may imply that webinars were as effective as in-person training. It is currently unknown whether web-based public health training is as effective as in-person training, and further research is indicated.
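As a rough illustration only, the sketch below shows how a subgroup check of this kind could be run: change scores for a webinar-delivered competency are compared between participants from webinar states and those from in-person states with a two-sample test. The column names and the choice of Welch's t-test are assumptions for illustration, not a description of the study's actual analysis.

```python
import pandas as pd
from scipy import stats

def webinar_vs_in_person(df: pd.DataFrame, competency: str = "economic_evaluation"):
    """Compare post-minus-pre change scores between webinar and in-person states.

    `df` is assumed to hold one row per intervention participant with a boolean
    'webinar_state' flag and '<competency>_diff' change-score columns; both
    names are hypothetical placeholders.
    """
    webinar = df.loc[df["webinar_state"], f"{competency}_diff"].dropna()
    in_person = df.loc[~df["webinar_state"], f"{competency}_diff"].dropna()
    # Welch's t-test avoids assuming equal variances in two small groups.
    t_stat, p_value = stats.ttest_ind(webinar, in_person, equal_var=False)
    return t_stat, p_value
```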

Effective webinar development can incorporate adult learning principles, focusing on scenario-based (rather than lecture-based) learning and thereby increasing participants’ engagement and ability to apply lessons to their work [35]. Maintaining the local tailoring of course material for webinar development may sustain some of the advantages (e.g., locally relevant examples and credible, familiar trainers) experienced in this trial.

Limitations

Some limitations of this study should be noted. Ideally, control and intervention groups would have been retested within the same time intervals; the timeframe of this research project did not allow for that. Training course participants may have been more biased towards socially desirable responses than control subjects. Intervention and control groups could have differed on more demographic variables than those measured and accounted for in adjusted models. This study was restricted to localized, or decentralized, governance structures, and results do not necessarily apply to other types of LHDs (i.e., those that are part of state government).

Conclusions

This evaluation shows the value and effectiveness of an EBDM capacity building course among local public health practitioners using a train-the-trainer approach and thus extending the reach of the course. The PRC-PHTC/PBRN partnership network covers LHDs in 28 states, expanding the potential reach of a scaled-up version of this project. This approach allows for local tailoring of strategies, examples and exercises, and it provides familiar and credible trainers who remain available to participants for technical assistance.

Authors’ contributions

RCB initiated the research and supervised all aspects of the study. JAJ conducted analyses and drafted the manuscript. KD and CS coordinated the study and collected data. PE served as consultant to the research team and provided scientific input on the study. EB, JC, LDA, SFK, SF, PH, JL, and AM served as state trainers and coordinators of the EBPH training. All authors contributed substantially to the interpretation of data and revision of the manuscript.

References

1. Kohatsu ND, Robinson JG, Torner JC: Evidence-based public health: an evolving concept. Am J Prev Med. 2004, 27 (5): 417-421.

2. Brownson RC, Fielding JE, Maylahn CM: Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009, 30: 175-201. 10.1146/annurev.publhealth.031308.100134.

3. Core Competencies for Public Health Professionals. 2001, Public Health Foundation, Washington, DC

4. Public Health Accreditation Board: Public Health Accreditation Board Standards: An Overview. 2011, Public Health Accreditation Board, Alexandria, VA

5. Riley WJ, Bender K, Lownik E: Public health department accreditation implementation: transforming public health department performance. Am J Public Health. 2012, 102 (2): 237-242. 10.2105/AJPH.2011.300375.

6. Liebman JB: Building on recent advances in evidence-based policymaking. 2013, Results for America and the Brookings Institution, New York, NY and Washington, DC

7. Jacobs JA, Jones E, Gabella BA, Spring B, Brownson RC: Tools for implementing an evidence-based approach in public health practice. Prev Chronic Dis. 2012, 9: E116.

8. Dissemination and Implementation Research in Health: Translating Science to Practice. 2012, Oxford University Press, New York, NY

9. Norton WE, Mittman BS: Scaling up health promotion/disease prevention programs in community settings: barriers, facilitators, and initial recommendations. 2010, Patrick and Catherine Weldon Donaghue Medical Research Foundation, West Hartford, CT

10. Brownson RC, Allen P, Duggan K, Stamatakis KA, Erwin PC: Fostering more-effective public health by identifying administrative evidence-based practices: a review of the literature. Am J Prev Med. 2012, 43 (3): 309-319. 10.1016/j.amepre.2012.06.006.

11. Turnock BJ: Public Health: What It Is and How It Works. 2009, Jones and Bartlett Publishers, Sudbury, MA

12. 2013 National Profile of Local Health Departments. 2014, National Association of County and City Health Officials, Washington, DC

13. Who Will Keep the Public Healthy? Educating Public Health Professionals for the 21st Century. 2003, National Academies Press, Washington, DC

14. Koo D, Miner K: Outcome-based workforce development and education in public health. Annu Rev Public Health. 2010, 31: 253-269. 10.1146/annurev.publhealth.012809.103705.

15. Evashwick CJ: Educating the public health workforce. Front Public Health. 2013, 1: 20.

16. Gibbert WS, Keating SM, Jacobs JA, Dodson E, Baker E, Diem G, Giles W, Gillespie KN, Grabauskas V, Shatchkute A, Brownson RC: Training the workforce in evidence-based public health: an evaluation of impact among US and international practitioners. Prev Chronic Dis. 2013, 10: E148. 10.5888/pcd10.130120.

17. Slonim A, Wheeler FC, Quinlan KM, Smith SM: Designing competencies for chronic disease practice. Prev Chronic Dis. 2010, 7 (2): A44.

18. EBPH Course Information. [http://prcstl.wustl.edu/training/Pages/EBPH-Course-Information.aspx]

19. Dreisinger M, Leet TL, Baker EA, Gillespie KN, Haas B, Brownson RC: Improving the public health workforce: evaluation of a training course to enhance evidence-based decision making. J Public Health Manag Pract. 2008, 14 (2): 138-143. 10.1097/01.PHH.0000311891.73078.50.

20. Baker EA, Brownson RC, Dreisinger M, McIntosh LD, Karamehic-Muratovic A: Examining the role of training in evidence-based public health: a qualitative study. Health Promot Pract. 2009, 10 (3): 342-348. 10.1177/1524839909336649.

21. Brownson RC, Ballew P, Brown KL, Elliott MB, Haire-Joshu D, Heath GW, Kreuter MW: The effect of disseminating evidence-based interventions that promote physical activity to health departments. Am J Public Health. 2007, 97 (10): 1900-1907. 10.2105/AJPH.2006.090399.

22. Jacobs JA, Clayton PF, Dove C, Funchess T, Jones E, Perveen G, Skidmore B, Sutton V, Worthington S, Baker EA, Deshpande AD, Brownson RC: A survey tool for measuring evidence-based decision making capacity in public health agencies. BMC Health Serv Res. 2012, 12: 57. 10.1186/1472-6963-12-57.

23. Bryan RL, Kreuter MW, Brownson RC: Integrating adult learning principles into training for public health practice. Health Promot Pract. 2009, 10 (4): 557-563. 10.1177/1524839907308117.

24. Brownson RC, Reis RS, Allen P, Duggan K, Fields R, Stamatakis KA, Erwin PC: Understanding administrative evidence-based practices: findings from a survey of local health department leaders. Am J Prev Med. 2013, 46 (1): 49-57. 10.1016/j.amepre.2013.08.013.

25. Erwin PC, Harris JK, Smith C, Leep CJ, Duggan K, Brownson RC: Evidence-based public health practice among program managers in local public health departments. J Public Health Manag Pract. 2014, 20 (5): 472-480. 10.1097/PHH.0000000000000027.

26. Brownson RC, Ballew P, Kittur ND, Elliott MB, Haire-Joshu D, Krebill H, Kreuter MW: Developing competencies for training practitioners in evidence-based cancer control. J Cancer Educ. 2009, 24 (3): 186-193. 10.1080/08858190902876395.

27. Reis RS, Duggan K, Allen P, Stamatakis KA, Erwin PC, Brownson RC: Developing a tool to assess administrative evidence-based practices in local health departments. Front Public Health Serv Syst Res. 2014, 3 (3): 2.

28. Qualtrics: Survey Research Suite. [http://www.qualtrics.com/]

29. Briss PA, Zaza S, Pappaioanou M, Fielding J, Wright-De Aguero L, Truman BI, Hopkins DP, Mullen PD, Thompson RS, Woolf SH, Carande-Kulis VG, Anderson L, Hinman AR, McQueen DV, Teutsch SM, Harris JR: Developing an evidence-based Guide to Community Preventive Services--method. The Task Force on Community Preventive Services. Am J Prev Med. 2000, 18 (1 Suppl): 35-43. 10.1016/S0749-3797(99)00119-1.

30. Aarons GA, Hurlburt M, Horwitz SM: Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011, 38 (1): 4-23. 10.1007/s10488-010-0327-7.

31. Klein KJ, Sorra JS: The challenge of innovation implementation. Acad Manage Rev. 1996, 21 (4): 1055-1080.

32. Erwin PC: The performance of local health departments: a review of the literature. J Public Health Manag Pract. 2008, 14 (2): E9-E18. 10.1097/01.PHH.0000311903.34067.89.

33. Hyde JK, Shortell SM: The structure and organization of local and state public health agencies in the U.S.: a systematic review. Am J Prev Med. 2012, 42 (5 Suppl 1): S29-S41. 10.1016/j.amepre.2012.01.021.

34. Orton L, Lloyd-Williams F, Taylor-Robinson D, O'Flaherty M, Capewell S: The use of research evidence in public health decision making processes: systematic review. PLoS One. 2011, 6 (7): e21704. 10.1371/journal.pone.0021704.

35. Ballew P, Castro S, Claus J, Kittur N, Brennan L, Brownson RC: Developing web-based training for public health practitioners: what can we learn from a review of five disciplines?. Health Educ Res. 2012, 28 (2): 276-287. 10.1093/her/cys098.


Acknowledgements

This study was supported by the Robert Wood Johnson Foundation's grant no. 69964 (Public Health Services and Systems Research). This article is a product of a Prevention Research Center and was also supported by Cooperative Agreement Number U48/DP001903 from the Centers for Disease Control and Prevention. The findings and conclusions in this article are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention. We also thank members of our research team: Carolyn Leep (National Association of County and City Health Officials), Dr. Beth Baker (Saint Louis University), Dr. Rodrigo Reis (the Pontifical Catholic University of Paraná and the Federal University of Paraná), Kathleen Wojceihowski (Missouri Institute for Community Health), and Carol Brownson (Washington University in St. Louis).

Author information

Corresponding author

Correspondence to Ross C Brownson.

Additional information

Competing interests

The authors declare that they have no competing interests.

Rights and permissions

Open Access  This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.

The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.

The Creative Commons Public Domain Dedication waiver (https://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Jacobs, J.A., Duggan, K., Erwin, P. et al. Capacity building for evidence-based decision making in local health departments: scaling up an effective training approach. Implementation Sci 9, 124 (2014). https://doi.org/10.1186/s13012-014-0124-x
