
Implementation strategies in suicide prevention: a scoping review

Abstract

Background

Implementation strategies can be a vital leverage point for enhancing the implementation and dissemination of evidence-based suicide prevention interventions and programming. However, much remains unknown about which implementation strategies are commonly used and which are effective for supporting suicide prevention efforts.

Methods

In light of the limited available literature, a scoping review was conducted to evaluate implementation strategies present in current suicide prevention studies. We identified studies that were published between 2013 and 2022 that focused on suicide prevention and incorporated at least one implementation strategy. Studies were coded by two independent coders who showed strong inter-rater reliability. Data were synthesized using descriptive statistics and a narrative synthesis of findings.

Results

Overall, we found that studies most commonly utilized strategies related to iterative evaluation, training, and education. The majority of studies did not include direct measurement of suicide behavior outcomes, and there were few studies that directly tested implementation strategy effectiveness.

Conclusion

Implementation science strategies remain an important component for improving suicide prevention and intervention implementation. Future research should consider the incorporation of more type 3 hybrid designs as well as increased systematic documentation of implementation strategies.

Trial registration

 < de-identified > 


Background

Suicide remains a leading cause of death worldwide [1]. Although suicide rates have decreased in certain regions of the world, rates within the USA have remained elevated over the past 20 years and have continued to rise across demographic groups [1]. The Socioecological Model of Suicide Prevention posits that suicide risk is multi-factorial and impacted by factors ranging from the individual level (e.g., mental health symptoms, financial challenges) through to the societal level (e.g., health policy, stigma) [2]. Accordingly, suicide prevention and intervention programming has been developed to address risk across these levels. For example, one such multi-component intervention approach with demonstrated effectiveness was developed through the Garrett Lee Smith Memorial Act program funded by the Substance Abuse and Mental Health Services Administration [3]. This program supports multi-component state and tribal suicide prevention initiatives that not only address those with known risk but also increase the capacity of systems to identify and support those at risk [3]. Unsurprisingly, multi-component prevention programs carry an inherent level of complexity requiring multiple strategies for implementation support. Indeed, research shows this program is effective in decreasing suicide deaths over multiple years, with greater effectiveness associated with more years of active implementation support, highlighting the importance of implementation strategies for suicide prevention efforts [4].

Systematic reviews have identified several promising interventions for decreasing suicide attempts and deaths [5,6,7]. However, adoption of these interventions remains limited, and their effectiveness varies considerably, which may be secondary to implementation challenges. A recent review identified several implementation barriers that impact suicide prevention programming, including but not limited to high levels of complexity and cost as well as insufficient tailoring to patient needs [8]. It remains unknown, however, which implementation strategies may be most helpful for addressing these needs and enhancing the reach and effectiveness of promising suicide prevention programming.

In light of the need to better understand the types of implementation strategies that may enhance suicide prevention efforts, a recent systematic review described the implementation strategies used in complex interventions and found that their use was inconsistent [9]. However, that review focused only on complex suicide prevention interventions (i.e., those with more than two components operating at different levels of intervention [e.g., individual, community]) and excluded studies implementing only a single intervention component (e.g., only suicide screening or suicide safety planning). Because single-component studies are common among quality improvement and implementation research projects, this narrower scope may have underrepresented the breadth of suicide prevention programming. The current scoping review expands upon this work by exploring the implementation strategies used across a broader range of suicide prevention interventions and programs.

Methods

Approach

The protocol for this scoping review was prospectively published online on PROSPERO (< de-identified >). A completed Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) Checklist for this manuscript is available in Additional file 1. The research questions for this review were (1) what are the current implementation strategies being used for promoting suicide prevention programming as described in the literature (see “Eligibility criteria” section for further information)?; (2) how effective are these implementation strategies for promoting the use of suicide prevention programming?; and (3) What organizational factors may moderate the effectiveness of these implementation strategies? We were unable to evaluate research questions 2 and 3 due to a low volume of eligible studies and underreporting of necessary information (e.g., explicit descriptions of barriers and facilitators, site- and setting-specific information; issues identified in previous literature and discussed below) [9,10,11]. Additional protocol modifications, described below where applicable, included conducting two additional literature searches, suspending the USA-only eligibility criterion, implementing collaborative full-text screening, and electing to explore the studies’ usage of best practices instead of conducting a standardized quality assessment. We made these modifications to increase the inclusivity of our sample and to address challenges with the limited information present in both abstracts and full-text manuscripts.

Searches

The search strategy (see Additional file 1) was developed in collaboration with a health sciences education and research librarian following an initial review of relevant articles (e.g., [12]). The strategy was designed to cover a broad range of topics related to suicide prevention implementation research (e.g., program development, quality improvement). Articles were obtained by searching PubMed, Scopus, PsycInfo, and the EBSCO Psychology and Behavioral Sciences Collection. The search was initially conducted in October 2019. Two additional searches were conducted in June 2021 and October 2022 due to a low volume of eligible articles from the first search.

Eligibility criteria

To be included in the review, articles were required to have been published between January 1, 2013, and October 25, 2022 (date of the final search), be written in English, describe the implementation of a suicide prevention or intervention program (i.e., not a theory or concept paper), and describe the use of at least one implementation strategy as defined by the Consolidated Framework for Implementation Research (CFIR) [13]. Randomized controlled trials that focused only on establishing the initial effectiveness of an intervention (and not its implementation), clinical case studies, editorials, opinion pieces, newspaper articles, and other forms of popular media were excluded. During the first round of screening, reviewers decided to include studies conducted outside of the USA due to the low number of eligible studies.

Study selection

After the removal of duplicates, two reviewers collaboratively screened the full texts of all articles for inclusion in the review. Full-text screening was used due to the limited ability to identify the use of implementation strategies from titles, abstracts, and keywords. As the use of at least one implementation strategy was required for inclusion, full-text screening was conducted collaboratively to prevent false negatives. Incongruence between reviewers was resolved by joint consensus.

Data extraction and synthesis

The following study characteristics were initially extracted: author(s), publication year, population(s), intervention/program type, and intervention and implementation outcome(s) assessed. Data extraction was carried out primarily by one reviewer (BR) and checked for accuracy by the other (JC). Following the coding of two training studies [14, 15] to establish initial reliability, both reviewers coded implementation strategies from each article independently using a spreadsheet tool. A round of coding was conducted after each of the three literature searches. Discrepancies were resolved by joint consensus. Subsequently, reviewers collaboratively explored adherence to study conduct and reporting best practices based on the extant literature (e.g., clarity of implementation activities, assessment of implementation strategy fidelity) [11]. This protocol modification was utilized in lieu of planned quality assessment tools [16] to better fit the included studies and the implementation science context as well as the limited information available within included studies (e.g., many quality assessment domains could not be coded due to lack of information). The hybrid effectiveness-implementation study type was also determined via joint consensus at this stage based on standardized definitions from the literature [17, 18].
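To illustrate how agreement on this independent coding step can be quantified (batch-level percent agreement is reported in Table 2), the following minimal Python sketch computes percent agreement from two coders’ article-by-strategy codes. The article labels, strategy subset, and codes are hypothetical placeholders for illustration only, not the review’s actual coding sheets.

# Hypothetical sketch: per-decision percent agreement between two independent coders.
# The strategy subset, article labels, and codes below are illustrative only.

STRATEGIES = [
    "conduct educational meetings",
    "develop educational materials",
    "purposefully reexamine the implementation",
    "conduct local needs assessment",
]

# Each coder records, per article, the set of strategies judged present.
coder_a = {
    "article_01": {"conduct educational meetings", "conduct local needs assessment"},
    "article_02": {"develop educational materials"},
}
coder_b = {
    "article_01": {"conduct educational meetings"},
    "article_02": {"develop educational materials",
                   "purposefully reexamine the implementation"},
}

def percent_agreement(a, b, strategies):
    """Share of article-by-strategy present/absent decisions on which both coders agree."""
    agree = total = 0
    for article in a:
        for strategy in strategies:
            total += 1
            agree += int((strategy in a[article]) == (strategy in b[article]))
    return agree / total

print(f"Percent agreement: {percent_agreement(coder_a, coder_b, STRATEGIES):.0%}")

In practice, agreement would be computed over the full set of ERIC strategies and separately for each batch of articles.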

During implementation strategy coding, a single implementation activity that involved more than one implementation strategy was counted toward all applicable strategies. This was necessary because CFIR implementation strategy definitions were often more granular than the narrative descriptions of study activities; for example, it was uncommon for any study to develop educational materials without also distributing them. This approach also helped avoid underrepresenting strategies that commonly co-occur.

To facilitate data synthesis, reporting, and interpretation, implementation strategies were clustered based on prior publications from the Expert Recommendations for Implementing Change (ERIC) study [19, 20] (see Table 1). Clusters ranged in size from 3 to 17 strategies. Revised cluster assignments (e.g., for previously unassigned strategies, a new cluster focused on messaging-based strategies) were developed by joint consensus.

Table 1 CFIR implementation strategy clusters and implementation strategies

Results

Following initial deduplication, full texts of 174 articles were screened. Thirty-two studies were included in the review following full-text screening [12, 14, 15, 21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49]. The most common reason for exclusion was the absence of any reported implementation activities (e.g., no intervention implemented; see Fig. 1). Study characteristics are provided in Table 3. Most studies were conducted in the USA (n = 26) and were single-site (i.e., implementation took place in a single organizational unit, such as one clinic; n = 23). Multi-site studies ranged from 3 to 65 sites. Half (n = 16) of the included studies described the implementation of suicide risk screening and/or risk identification, such as in settings that did not previously have such protocols. Half of the included studies utilized a hybrid effectiveness-implementation design, testing both an intervention’s effectiveness and its implementation with at least one implementation strategy [17, 18]. Of those, most (n = 9) were coded as type 1 hybrid effectiveness-implementation studies (i.e., focused mostly on an intervention’s effectiveness while also exploring its implementation). There were five type 2 studies (focused roughly equally on implementation and effectiveness) and two type 3 studies (focused mostly on formally testing implementation strategies while also exploring effectiveness).

Fig. 1 Study selection flow diagram

Table 2 Coding percent agreements by batch of articles

Intervention and implementation outcomes were not regularly distinguished by authors among the included studies. Some outcomes appeared to serve both roles depending on an intervention’s scope. For example, if training is being conducted to screen for suicide risk, training is the implementation strategy, screening is the clinical intervention, and the screening rate can be considered an implementation outcome (e.g., provider adoption) as well as a secondary intervention outcome (with patient-level suicidality the primary outcome). As such, outcomes were categorized by content domain, rather than strictly as intervention or implementation outcomes, to avoid misrepresenting how outcomes were used by the authors in practice.

Outcomes related to general organizational factors (e.g., intervention adoption, costs, fidelity, leadership support) were most common (n = 21), followed by education- and training-related outcomes (e.g., knowledge, awareness, attitudes; n = 19). Studies also commonly reported effectiveness outcomes such as risk identification outcomes (e.g., screening rates; n = 17) and follow-up care outcomes (e.g., referral rates, appointments, psychiatric medication usage; n = 15). The least commonly measured were outcomes related to suicidal behavior (e.g., suicide attempts, deaths; n = 7) and feedback from patients (n = 4). Three studies provided narrative reflections on implementation processes without structured quantitative or qualitative measurement of outcomes.

Most articles adhered to at least some study conduct and reporting best practices described in the extant literature [11]. For example, most studies included some definition of their implementation outcomes (e.g., a new definition or some reference to the extant literature) and included at least some quantitative or qualitative measurement of their outcomes with clear specification of data sources (e.g., clinician feedback, electronic health record integration).

Several gaps were identified in the reporting of implementation activities. For example, several studies did not clearly describe their implementation processes and data collection timelines (i.e., in enough detail to discern the order of events and support replication). Of the 9 multi-site studies, only Luci et al. [33] provided information on setting-level variations in the implementation process and disaggregated data by setting. Additionally, only three of the 32 included studies reported fidelity to at least one of their implementation strategies [24, 25, 34]. Overall, implementation strategies were not regularly referred to as implementation strategies (with or without citation of the ERIC framework) and were not regularly distinguished from intervention activities.

Use of implementation strategies

Percent agreement for independent implementation strategy coding was good (see Table 2).

Table 4 provides definitions, cluster assignments, and observed frequencies for all implementation strategies (i.e., the raw number of times each strategy was consensus-coded across all studies). Seventeen of the ERIC implementation strategies were not identified among the included studies. Strategies that were utilized appeared 5.11 times on average (SD = 5.09) across studies, suggesting that studies typically employed multiple implementation strategies. ‘Purposefully reexamining the implementation’, a strategy focused on monitoring implementation progress to inform ongoing quality improvement, was most common (n = 20). Figure 2 shows the raw utilization of each of the individual strategies included in each cluster (i.e., the sum of all individual strategy frequencies within a cluster). Strategies from the ‘train and educate stakeholders’ cluster (e.g., ‘conduct educational meetings’, ‘develop educational materials’) were utilized most often (n = 109). Relative to the number of strategies in each cluster (i.e., total strategy utilizations divided by cluster size), the ‘use evaluative and iterative strategies’ cluster (e.g., ‘purposefully reexamine the implementation’, ‘conduct local needs assessment’) was the most popular.

Table 3 Included studies (N = 32), study characteristics, and total strategies coded per study (descending)
Fig. 2 Total utilizations of strategies from each cluster across studies. Legend: Cluster sizes (number of strategies included in a cluster) are shown next to cluster names. Cluster sizes and utilization counts add to more than the total strategies and utilizations due to strategies assigned to more than one cluster (see Additional file 2)

Figure 3 shows the count of studies that utilized at least one strategy from each cluster. The ‘train and educate stakeholders’ (n = 28) and ‘use evaluative and iterative strategies’ (n = 28) clusters were the most broadly used by this metric. Conversely, the ‘support clinicians’ (n = 6) and ‘utilize financial strategies’ (n = 4) clusters were used in the fewest studies. Reviewers identified the use of 10.63 implementation strategies per study on average (SD = 6.07; see Table 3 for counts per study and Additional file 2). On average, studies utilized strategies from 4.97 of the 10 strategy clusters (SD = 1.82; see Additional file 2). These results are partially attributable to frequently co-occurring strategies and to strategies that belonged to more than one cluster, respectively. For example, studies that developed and evaluated a training described utilizing multiple implementation strategies that were inherent to implementing suicide prevention training (e.g., identifying barriers and facilitators, developing education materials, and making training dynamic).
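To make these tallies concrete, the sketch below shows how the metrics behind Figs. 2 and 3 (total strategy utilizations per cluster, utilization relative to cluster size, and the number of studies using at least one strategy from each cluster), along with per-study strategy counts, could be derived from a consensus coding sheet such as Additional file 2. The studies, strategy names, cluster assignments, and cluster sizes below are hypothetical placeholders, not the review’s coded data.

# Hypothetical sketch of the cluster-level tallies behind Figs. 2 and 3.
# Strategy names, cluster assignments, cluster sizes, and coded data are placeholders.
from collections import Counter

coded = {  # consensus-coded strategies per study (cf. Additional file 2)
    "study_A": {"conduct educational meetings", "purposefully reexamine the implementation"},
    "study_B": {"develop educational materials", "conduct educational meetings"},
    "study_C": {"conduct local needs assessment"},
}
cluster_map = {  # a strategy may be assigned to more than one cluster
    "conduct educational meetings": ["train and educate stakeholders"],
    "develop educational materials": ["train and educate stakeholders"],
    "purposefully reexamine the implementation": ["use evaluative and iterative strategies"],
    "conduct local needs assessment": ["use evaluative and iterative strategies"],
}
cluster_size = {  # number of strategies assigned to each cluster (illustrative values)
    "train and educate stakeholders": 2,
    "use evaluative and iterative strategies": 2,
}

total_utilization = Counter()    # Fig. 2 metric: summed strategy utilizations per cluster
studies_per_cluster = Counter()  # Fig. 3 metric: studies using at least one strategy per cluster

for study, strategies in coded.items():
    clusters_in_study = set()
    for strategy in strategies:
        for cluster in cluster_map[strategy]:
            total_utilization[cluster] += 1
            clusters_in_study.add(cluster)
    studies_per_cluster.update(clusters_in_study)

per_strategy_rate = {c: total_utilization[c] / n for c, n in cluster_size.items()}
mean_strategies_per_study = sum(len(s) for s in coded.values()) / len(coded)

print(dict(total_utilization), dict(studies_per_cluster), per_strategy_rate, mean_strategies_per_study)

Applied to the full consensus coding sheet, the same tallies would yield the cluster-level counts summarized above.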

Fig. 3 Total studies utilizing at least one strategy from each cluster

Table 4 Implementation strategies, cluster assignments, definitions, and observed frequencies (descending)

Discussion

Overall, our review identified several current patterns in the use of implementation strategies in suicide prevention as well as several gaps in the literature. Consistent with past reviews [9, 10], few manuscripts clearly delineated or described implementation strategies in a comprehensive manner (e.g., implementation details were spread across different sections of the paper, details were limited). On average, we captured fewer strategies than those reported by Rudd and colleagues, who were able to identify additional strategies through surveying authorship teams [9]. However, the most common strategy clusters noted by Rudd and colleagues were consistent with our findings. It is possible that authors were unaware they were utilizing implementation strategies and thus could not describe them in detail. As the majority of papers reviewed were not published in implementation science-oriented journals, authors may have also limited the inclusion of detailed implementation strategy information to accommodate the journal audience.

The ‘train and educate stakeholders’ and ‘use evaluative and iterative strategies’ clusters were the most broadly utilized; all but four studies utilized at least one strategy from each of these clusters (Fig. 3). Strategies from the ‘train and educate stakeholders’ cluster (e.g., ‘conduct educational meetings’, ‘distribute educational materials’) were also the most frequently utilized overall (Fig. 2). Similarly, education- and training-related outcomes (e.g., knowledge, awareness, attitudes) were the second most common outcome domain. This is unsurprising as the majority of suicide prevention interventions and programs focus on promoting awareness and skill-building among stakeholders [6]. However, fewer studies utilized strategies for supporting active, sustained learning (e.g., ‘provide clinical supervision’, ‘create an online learning collaborative’, ‘make training dynamic’). Additionally, 13 of the 28 studies that utilized at least one training or education strategy did not utilize any strategies from the ‘provide interactive assistance’ cluster (e.g., providing ongoing support). This is of concern, as past research shows that increased knowledge and skills from suicide training initiatives are not sustained long-term, which may, in turn, decrease the overall effectiveness of suicide prevention programming over time [4, 50].

The ‘use evaluative and iterative strategies’ cluster was also commonly reported within our sample. This is congruent with the core principles of implementation science focused on understanding and adapting to organizational contexts to enhance the adoption and maintenance of evidence-based strategies [51]. The most commonly utilized strategy from this cluster was to ‘purposefully reexamine the implementation’—a critical aspect of quality improvement emphasized across several relevant frameworks and models (e.g., Plan-Do-Study-Act [52]). Interestingly, ‘identification of early adopters’ was among the least commonly used strategies within this cluster, which may have been secondary to the limited number of studies with multiple sites in our sample.

The ‘support clinicians’ (e.g., resource sharing agreements to support clinics) and ‘utilize financial strategies’ (e.g., financial disincentives) clusters were also among the least utilized in our sample. As these strategies often require financial resources, it is possible they are more difficult to implement in light of financial challenges among healthcare systems [53]. In addition, several recent commentaries have raised concerns regarding the impact of the COVID-19 pandemic on the financial resources of hospitals, which may further limit the ability to utilize implementation strategies requiring funding [54, 55]. More popular than these strategies were those aimed toward making use of existing resources, such as those from the ‘change infrastructure’ cluster (e.g., ‘assess and redesign workflow’, ‘change record systems’) to support implementation.

Few studies reported suicide behavior outcomes. While several studies were focused only on implementation, type 1 and type 2 hybrid studies remain interested in an intervention's effectiveness while exploring or formally testing its implementation and can offer vital information for informing future dissemination and implementation [17, 18]. A broader range of suicide-related outcomes would better enable such studies to evaluate whether promising interventions remain effective in practice, a key advantage of hybrid study designs. Funding agencies may wish to encourage the incorporation of type 1 hybrid study procedures (e.g., qualitative inquiry on barriers and facilitators post-implementation of interventions) to ensure research studies collect sufficient information regarding implementation processes to increase future adoption and uptake of findings. Additionally, past literature has highlighted tailoring to patient needs as a key facilitator in the implementation of suicide prevention interventions [8]. However, outcomes involving feedback from patients were the least common in our sample. Similarly, the ‘engage consumers’ strategy cluster was among the least popular clusters (see Figs. 2 and 3).

Limited systematic reporting of implementation strategies and their corresponding outcomes, as well as a lack of type 3 hybrid studies (focused on formal implementation testing), limited our ability to explore the relative effectiveness of individual implementation strategies for improving suicide prevention programming (Research Question 2). Our literature search captured only studies from a 10-year period, reflecting our aim to report on the most recent research available, and therefore excluded pertinent studies outside this window with more systematic reporting of implementation strategies. Additionally, there was an overall low volume of multi-site studies. Among them, information necessary to explore organizational factors that could moderate the effectiveness of implementation strategies was mostly absent or unclear (e.g., specific barriers and facilitators, site-specific procedures, disaggregated data; Research Question 3).

It is possible that this information, as well as the breadth of implementation strategies, was underreported in the text of the reviewed manuscripts. Similar challenges have been reported by other reviews focused on narrower sets of suicide prevention studies [9, 10]. Rudd and colleagues found that direct outreach to authors was required to get a more comprehensive understanding of implementation science strategies present in a given study [10]. Future manuscripts may wish to utilize existing reporting guidelines, such as the Standards for Reporting Implementation Studies (StaRI) checklist in combination with frameworks that guided this review (e.g., CFIR), to help ensure implementation strategies are appropriately documented to inform the broader field [56].

Conclusion

Implementation science remains an important and promising area of research for increasing sustainable adoption and deployment of evidence-based suicide prevention interventions and programming. Although we identified commonly used implementation science strategies and current gaps in the literature, our review was limited by the inconsistent reporting of implementation strategies within our sample. Future implementation science studies in suicide prevention should consider clearer, systematic documentation of implementation strategies utilized and associated outcomes to better inform the broader suicide prevention field. For example, journals accepting manuscripts on the implementation of suicide prevention programming may encourage the use of a common lexicon of implementation science terms or provide explicit reporting requirements. In addition, direct testing of implementation strategies through type 3 hybrid studies remains necessary to enhance the effectiveness of implementation and dissemination of suicide prevention programming.

Availability of data and materials

The datasets used and analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

CFIR:

Consolidated Framework for Implementation Research

ERIC:

Expert Recommendations for Implementing Change

References

  1. World Health Organization. Suicide worldwide in 2019: global health estimates. 2021. https://www.who.int/publications/i/item/9789240026643

  2. Cramer RJ, Kapusta ND. A social-ecological framework of theory, assessment, and prevention of suicide. Front Psychol. 2017;8:1756. https://doi.org/10.3389/fpsyg.2017.01756.

  3. Goldston DB, Walrath CM, McKeon R, Puddy RW, Lubell KM, Potter LB, et al. The Garrett Lee Smith memorial suicide prevention program. Suicide Life Threat Behav. 2010;40(3):245–56. https://doi.org/10.1521/suli.2010.40.3.245.


  4. Godoy Garraza L, Kuiper N, Goldston D, McKeon R, Walrath C. Long-term impact of the Garrett Lee Smith Youth Suicide Prevention Program on youth suicide mortality, 2006–2015. J Child Psychol Psychiatry. 2019;60(10):1142–7. https://doi.org/10.1111/jcpp.13058.


  5. Calati R, Courtet P. Is psychotherapy effective for reducing suicide attempt and non-suicidal self-injury rates? Meta-analysis and meta-regression of literature data. J Psychiatr Res. 2016;1(79):8–20. https://doi.org/10.1016/j.jpsychires.2016.04.003.


  6. Hofstra E, van Nieuwenhuizen C, Bakker M, Özgül D, Elfeddali I, de Jong SJ, et al. Effectiveness of suicide prevention interventions: A systematic review and meta-analysis. Gen Hosp Psychiatry. 2020;1(63):127–40. https://doi.org/10.1016/j.genhosppsych.2019.04.011.


  7. Peterson K, Parsons N, Vela K, Denneson LM, Dobscha KS. Compendium: Systematic Reviews on Suicide Prevention Topics. Washington, DC: Health Services Research and Development Service, Office of Research and Development; 2019. Report No.: VA ESP Project #09–199. https://www.hsrd.research.va.gov/centers/core/SPRINT-Compendium-Reviews.pdf

  8. Kasal A, Táborská R, Juríková L, Grabenhofer-Eggerth A, Pichler M, Gruber B, et al. Facilitators and barriers to implementation of suicide prevention interventions: scoping review. Camb Prisms Glob Ment Health. 2023;10:e15. https://doi.org/10.1017/gmh.2023.9. (2023/03/13 ed).


  9. Krishnamoorthy S, Mathieu S, Armstrong G, Ross V, Francis J, Reifels L, et al. Utilisation and application of implementation science in complex suicide prevention interventions: A systematic review. J Affect Disord. 2023;1(330):57–73. https://doi.org/10.1016/j.jad.2023.02.140.


  10. Rudd BN, Davis M, Doupnik S, Ordorica C, Marcus SC, Beidas RS. Implementation strategies used and reported in brief suicide prevention intervention studies. JAMA Psychiatry. 2022;79(8):829–31. https://doi.org/10.1001/jamapsychiatry.2022.1462.


  11. Lengnick-Hall R, Gerke DR, Proctor EK, Bunger AC, Phillips RJ, Martin JK, et al. Six practical recommendations for improved implementation outcomes reporting. Implement Sci. 2022;17(1):16. https://doi.org/10.1186/s13012-021-01183-3.


  12. Roaten K, Johnson C, Genzel R, Khan F, North CS. Development and Implementation of a Universal Suicide Risk Screening Program in a Safety-Net Hospital System. Jt Comm J Qual Patient Saf. 2018;44(1):4–11. https://doi.org/10.1016/j.jcjq.2017.07.006.


  13. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21. https://doi.org/10.1186/s13012-015-0209-1.


  14. Donald M, Dower J, Bush R. Evaluation of a suicide prevention training program for mental health services staff. Community Ment Health J. 2013;49(1):86–94. https://doi.org/10.1007/s10597-012-9489-y.


  15. Chugani CD. Dialectical behavior therapy in college counseling centers: practical applications and theoretical considerations. 2017; http://libproxy.chapman.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&AuthType=ip,uid&db=psyh&AN=2016-47712-095&site=eds-live

  16. National Heart, Lung, and Blood Institute. Study Quality Assessment Tools. 2023. https://www.nhlbi.nih.gov/health-topics/study-quality-assessment-tools Cited 2023 Sep 3

  17. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26. https://doi.org/10.1097/MLR.0b013e3182408812.


  18. Landes SJ, McBain SA, Curran GM. An introduction to effectiveness-implementation hybrid designs. Psychiatry Res. 2019;280:112513. https://doi.org/10.1016/j.psychres.2019.112513.


  19. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10(1):109. https://doi.org/10.1186/s13012-015-0295-0.


  20. Perry CK, Damschroder LJ, Hemler JR, Woodson TT, Ono SS, Cohen DJ. Specifying and comparing implementation strategies across seven large implementation interventions: a practical application of theory. Implement Sci. 2019;14(1):32. https://doi.org/10.1186/s13012-019-0876-4.


  21. Belhumeur J, Butts E, Michael KD, Zieglowsky S, Decoteau D, Four Bear D, et al. Adapting crisis intervention protocols: rural and tribal voices from Montana. In: Jameson JP, editor., et al., Handbook of Rural School Mental Health. 1st ed. Springer; 2017. p. 307–21. https://doi.org/10.1007/978-3-319-64735-7_20.


  22. Blake C. Depression screening implementation: quality improvement project in a primary care clinic for first responders. Workplace Health Saf. 2022. https://doi.org/10.1177/21650799221119147.


  23. Bose J, Zeno R, Warren B, Sinnott LT, Fitzgerald EA. Implementation of universal adolescent depression screening: quality improvement outcomes. J Pediatr Health Care. 2021;35(3):270–7. https://doi.org/10.1016/j.pedhc.2020.08.004. (2021/02/15 ed).


  24. Boudreaux ED, Haskins BL, Larkin C, Pelletier L, Johnson SA, Stanley B, et al. Emergency department safety assessment and follow-up evaluation 2: an implementation trial to improve suicide prevention. Contemp Clin Trials. 2020;95:106075 (2020/06/23 ed).


  25. Boudreaux ED, Larkin C, Sefair AV, Mick E, Clements K, Pelletier L, et al. Studying the implementation of zero suicide in a large health system: challenges, adaptations, and lessons learned. Contemp Clin Trials Commun. 2022;30:100999. https://doi.org/10.1016/j.cct.2020.106075.


  26. Cramer RJ, Judah MR, Badger NL, Holley AM, Judd S, Peterson M, et al. Suicide on college campuses: a public health framework and case illustration. J Am Coll Health. 2022;70(1):1–8. https://doi.org/10.1080/07448481.2020.1739053. (2020/03/25 ed).


  27. Day SC, Day G, Keller M, Touchett H, Amspoker AB, Martin L, et al. Personalized implementation of video telehealth for rural veterans(PIVOT-R). mHealth. 2021;7:24. https://doi.org/10.21037/mhealth.2020.03.02.


  28. Garner MS, Kunkel DE. Quality improvement of pastoral care for major depression in the community of an African American Religious Organization. Issues Ment Health Nurs. 2020;41(7):568–73. https://doi.org/10.1080/01612840.2019.1701155.


  29. Horowitz LM, Bridge JA, Tipton MV, Abernathy T, Mournet AM, Snyder DJ, et al. Implementing suicide risk screening in a pediatric primary care setting: from research to practice. Acad Pediatr. 2022;22(2):217–26. https://doi.org/10.1016/j.acap.2021.10.012.


  30. Kabatchnick R. Training nursing staff to recognize and respond to suicidal ideation in a nursing home. 2018. https://cdr.lib.unc.edu/concern/dissertations/5999n445d?locale=en

  31. Lai CCS, Law YW, Shum AKY, Ip FWL, Yip PSF. A community-based response to a suicide cluster: A Hong Kong experience. Crisis. 2020;41(3):163–71. https://doi.org/10.1027/0227-5910/a000616.


  32. Landes SJ, Jegley SM, Kirchner JE, Areno JP, Pitcock JA, Abraham TH, et al. Adapting Caring Contacts for Veterans in a Department of Veterans Affairs Emergency Department: Results From a Type 2 Hybrid Effectiveness-Implementation Pilot Study. Front Psychiatry. 2021;12:746805. https://doi.org/10.3389/fpsyt.2021.746805.


  33. Luci K, Simons K, Hagemann L, Jacobs ML, Bower ES, Eichorst MK, et al. SAVE-CLC: an intervention to reduce suicide risk in older veterans following discharge from VA nursing facilities. Clin Gerontol. 2020;43(1):118–25. https://doi.org/10.1080/07317115.2019.1666444. (2019/09/17 ed)


  34. Marshall E, York J, Magruder K, Yeager D, Knapp R, De Santis ML, et al. Implementation of online suicide-specific training for VA providers. Acad Psychiatry. 2014;38(5):566–74. https://doi.org/10.1007/s40596-014-0039-5. (2014/02/25ed).


  35. McManus JQ. School nurses identifying at-risk adolescents for depression. Gd Canyon Univ ProQuest Diss Publ. 2021; https://www.proquest.com/openview/038681fc92427985a9ecd30fd73841b1/1?pq-origsite=gscholar&cbl=18750&diss=y

  36. Mokkenstorm J, Franx G, Gilissen R, Kerkhof A, Smit JH. Suicide prevention guideline implementation in specialist mental healthcare institutions in the Netherlands. Int J Environ Res Public Health. 2018;15(5):910. https://doi.org/10.3390/ijerph15050910.


  37. Mueller KL, Naganathan S, Griffey RT. Counseling on Access to Lethal Means-Emergency Department (CALM-ED): a quality improvement program for firearm injury prevention. West J Emerg Med. 2020;21(5):1123–30. https://doi.org/10.5811/westjem.2020.5.46952. (2020/09/25 ed).


  38. Noelck M, Velazquez-Campbell M, Austin JP. A quality improvement initiative to reduce safety events among adolescents hospitalized after a suicide attempt. Hosp Pediatr. 2019;9(5):365–72. https://doi.org/10.1542/hpeds.2018-0218. (2019/04/07 ed).


  39. Powell N, Dalton H, Perkins D, Considine R, Hughes S, Osborne S, et al. Our healthy clarence: a community-driven wellbeing initiative. Int J Environ Res Public Health. 2019;16(19):3691. https://doi.org/10.3390/ijerph16193691.


  40. Riblet NB, Varela M, Ashby W, Zubkoff L, Shiner B, Pogue J, et al. Spreading a strategy to prevent suicide after psychiatric hospitalization: results of a quality improvement spread initiative. Jt Comm J Qual Patient Saf. 2022;48(10):503–12. https://doi.org/10.1016/j.jcjq.2022.02.009.


  41. Rudd BN, George JM, Snyder SE, Whyte M, Cliggitt L, Weyler R, et al. Harnessing quality improvement and implementation science to support the implementation of suicide prevention practices in juvenile detention. Psychotherapy. 2022;59(2):150–6. https://doi.org/10.1037/pst0000377.


  42. Ryan K, Tindall C, Strudwick G. Enhancing Key Competencies of health professionals in the assessment and care of adults at risk of suicide through education and technology. Clin Nurse Spec. 2017;31(5):268–75. https://doi.org/10.1097/nur.0000000000000322. (2017/08/15 ed).


  43. Siau CS, Wee LH, Ibrahim N, Visvalingam U, Yeap LLL, Wahab S. Gatekeeper suicide training’s effectiveness among malaysian hospital health professionals: a control group study with a three-month follow-up. J Contin Educ Health Prof. 2018;38(4):227–34. https://doi.org/10.1097/ceh.0000000000000213. (2018/07/24 ed).


  44. Snyder DJ, Jordan BA, Aizvera J, Innis M, Mayberry H, Raju M, et al. From pilot to practice: implementation of a suicide risk screening program in hospitalized medical patients. Jt Comm J Qual Patient Saf. 2020;46(7):417–26. https://doi.org/10.1016/j.jcjq.2020.04.011. (2020/06/01 ed).


  45. Sullivant SA, Brookstein D, Camerer M, Benson J, Connelly M, Lantos J, et al. Implementing universal suicide risk screening in a pediatric hospital. Jt Comm J Qual Patient Saf. 2021;47(8):496–502. https://doi.org/10.1016/j.jcjq.2021.05.001. (2021/06/15 ed).


  46. Tennant J. Implementation of the signs of suicide prevention program with 9th grade students in a public school setting. 2017; https://researchrepository.wvu.edu/etd/6782/

  47. Vaughan B. Implementation and evaluation of the P4 suicide screening tool among sexual assault nurse examiners: a suicide prevention and intervention strategy. 2019; https://www.proquest.com/openview/fe6c57e3aa478424e76679074c354d09/1.pdf?pq-origsite=gscholar&cbl=18750&diss=y

  48. Wright-Berryman J, Hudnall G, Bledsoe C, Lloyd M. Suicide concern reporting among Utah youths served by a school-based peer-to-peer prevention program. Child Sch. 2019;41(1):35–44. https://doi.org/10.1093/cs/cdy026.


  49. Yeung K, Richards J, Goemer E, Lozano P, Lapham G, Williams E, et al. Costs of using evidence-based implementation strategies for behavioral health integration in a large primary care system. Health Serv Res. 2020;55(6):913–23. https://doi.org/10.1111/1475-6773.13592. (2020/12/02 ed).


  50. Holmes G, Clacy A, Hermens DF, Lagopoulos J. The long-term efficacy of suicide prevention gatekeeper training: a systematic review. Arch Suicide Res. 2021;25(2):177–207. https://doi.org/10.1080/13811118.2019.1690608.


  51. Handley MA, Gorukanti A, Cattamanchi A. Strategies for implementing implementation science: a methodological overview. Emerg Med J. 2016;33(9):660–4. https://doi.org/10.1136/emermed-2015-205461. (2016/02/20 ed).


  52. Langley G, Moen R, Nolan K, Nolan T, Norman C, Provost L. The improvement guide: a practical approach to enhancing organizational performance. 2nd ed. San Francisco, CA: Jossey-Bass Publishers; 2009.


  53. Bai G, Yehia F, Chen W, Anderson GF. Varying trends in the financial viability of US rural hospitals, 2011–17. Health Aff (Millwood). 2020;39(6):942–8. https://doi.org/10.1377/hlthaff.2019.01545.


  54. Khullar D, Bond AM, Schpero WL. COVID-19 and the Financial Health of US Hospitals. JAMA. 2020;323(21):2127–8. https://doi.org/10.1001/jama.2020.6269.


  55. Barnett ML, Mehrotra A, Landon BE. Covid-19 and the upcoming financial crisis in health care. NEJM Catal Non-Issue Content. 2022 Apr 29. https://doi.org/10.1056/CAT.20.0153

  56. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ. 2017;6(356):i6795. https://doi.org/10.1136/bmj.i6795.



Acknowledgements

We would like to express our appreciation to the following individuals for their support with this manuscript: Basia Delawska-Elliot, MLS for providing technical support in the development of the literature search strategy; Devan Kansagara, MD, MCR for providing initial feedback early in the development of this manuscript; the VA Partnerships in Implementation and Evaluation (PIE) Lab for early conceptual feedback; Riley Murphy, BA for helping with data collection and organization for this manuscript.

Funding

This project was funded by a VA Health Services Research & Development Career Development Award (CDA 18–185; PI: Chen). This material is the result of work supported with resources and the use of facilities at the VA Portland Health Care System, Portland, OR. The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs or the United States government.

Author information

Authors and Affiliations

Authors

Contributions

JIC contributed to the conception, design, data collection, analysis, interpretation, and writing of this manuscript. BR contributed to data collection, analysis, interpretation, and writing. SKD contributed to the conception, design, interpretation, drafting, and revising of the manuscript. JCL contributed to the conception, design, data collection, analysis, interpretation, and writing of this manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Jason I. Chen.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors report having no competing interests in relation to this manuscript.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Contains the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) Checklist and this review’s full search strategy.

Additional file 2.

Contains two worksheets. 1. “Coded Strategies by Study” provides the raw, consensus data from implementation strategy coding. 2. “Cluster Assignments” specifies the applicable strategy cluster(s) for each implementation strategy and whether this review changed the cluster assignment (e.g., new assignment, reassignment) from the original assignments in the literature.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Chen, J.I., Roth, B., Dobscha, S.K. et al. Implementation strategies in suicide prevention: a scoping review. Implementation Sci 19, 20 (2024). https://doi.org/10.1186/s13012-024-01350-2
