
Study protocol for a type III hybrid effectiveness-implementation trial of strategies to implement firearm safety promotion as a universal suicide prevention strategy in pediatric primary care

Abstract

Background

Insights from behavioral economics, or how individuals’ decisions and behaviors are shaped by finite cognitive resources (e.g., time, attention) and mental heuristics, have been underutilized in efforts to increase the use of evidence-based practices in implementation science. Using the example of firearm safety promotion in pediatric primary care, which addresses an evidence-to-practice gap in universal suicide prevention, we aim to determine: is a less costly and more scalable behavioral economic-informed implementation strategy (i.e., “Nudge”) powerful enough to change clinician behavior or is a more intensive and expensive facilitation strategy needed to overcome implementation barriers?

Methods

The Adolescent and child Suicide Prevention in Routine clinical Encounters (ASPIRE) hybrid type III effectiveness-implementation trial uses a longitudinal cluster randomized design. We will test the comparative effectiveness of two implementation strategies to support clinicians’ use of an evidence-based firearm safety practice, S.A.F.E. Firearm, in 32 pediatric practices across two health systems. All pediatric practices in the two health systems will receive S.A.F.E. Firearm materials, including training and cable locks. Half of the practices (k = 16) will be randomized to receive Nudge; the other half (k = 16) will be randomized to receive Nudge plus 1 year of facilitation to target additional practice and clinician implementation barriers (Nudge+). The primary implementation outcome is parent-reported clinician fidelity to the S.A.F.E. Firearm program. Secondary implementation outcomes include reach and cost. To understand how the implementation strategies work, the primary mechanism to be tested is practice adaptive reserve, a self-report practice-level measure that includes relationship infrastructure, facilitative leadership, sense-making, teamwork, work environment, and culture of learning.

Discussion

The ASPIRE trial will integrate implementation science and behavioral economic approaches to advance our understanding of methods for implementing evidence-based firearm safety promotion practices in pediatric primary care. The study answers a question at the heart of many practice change efforts: which strategies are sufficient to support change, and why? Results of the trial will offer valuable insights into how best to implement evidence-based practices that address sensitive health matters in pediatric primary care.

Trial registration

ClinicalTrials.gov, NCT04844021. Registered 14 April 2021.

Background

Implementation science focuses on clinician behavior change within organizational constraints as a key target to improve care quality and patient outcomes [1]. A range of approaches from many disciplines, including organizational theory [2] and systems science [3], have been applied to understand how to change clinician behavior within organizations. One current limitation of the field is the assumption that clinicians are rational actors who maximize utility when making clinical decisions [4]. Behavioral economics focuses on how context and an individual’s limited resources (e.g., time, attention) shape decisions and behavior [5], and has identified common, predictable cognitive heuristics or shortcuts that people use in making decisions [6,7,8]. These heuristics can be harnessed through choice architecture, which involves changing the environment to facilitate the desired choice [9]. Implementation strategies informed by behavioral economics have been underused in efforts to increase the use of evidence-based practices. Deployment of these approaches through the electronic health record (EHR) can guide medical decision-making in ways that do not disrupt workflow and can be effective and low cost [10,11,12,13]. Given that more than 90% of hospitals, healthcare systems, and clinical practices in the United States (US) use an EHR [14], choice architecture strategies deployed in the EHR (e.g., a Best Practice Alert reminding clinicians to engage in evidence-based care)—hereafter called “Nudges”—are also highly scalable. EHR-delivered behavioral economic strategies have been used to change clinician practice in multiple areas of medicine and are highly promising [10, 15,16,17,18]. However, in the case of interventions targeting sensitive topics, such as firearm safety, sexual health behavior, or mental health and substance use, additional strategies may be needed to address clinician and practice factors such as clinician comfort with the intervention or leadership endorsement [19].

One promising strategy to address these barriers is implementation or practice facilitation (hereafter referred to as facilitation), an evidence-based implementation approach in which trained facilitators collaborate with local stakeholders to identify and address site-specific implementation barriers with the goal of building organizational capacity for improvement and increasing uptake and sustainment of the desired practice [20,21,22]. Although facilitation has been associated with increased clinician adoption of evidence-based practices and patient reach [23], it is resource-intensive, which may limit its scalability. Both scalability and effectiveness are key considerations when designing strategies to implement interventions addressing major public health problems, such as youth suicide by firearm. As such, in this trial, we will compare two approaches to implementing a firearm safety program in pediatric primary care as a universal suicide prevention strategy. Specifically, we will answer: is the less costly and scalable EHR-based “Nudge” powerful enough or is more intensive and expensive facilitation needed to overcome implementation barriers? We will also test the mechanisms through which our implementation strategies operate. We use firearm safety promotion as an example given the public health need [24], existing evidence-to-practice gap [25], and momentum nationally for health systems to play a role in reducing pediatric firearm injury and mortality [26].

Research-to-practice gap: safe firearm storage program in pediatric primary care as a universal suicide prevention strategy

The US is experiencing a rise in youth suicide deaths. Firearms are the most common and lethal method of suicide attempt [27]. Reducing access to firearms is a promising yet underused suicide prevention strategy [28]. Addressing firearm storage is critical to suicide prevention efforts [29, 30] given that firearms are present in one in three US homes [31]. Recent research has found that seven in 10 firearm-owning families with children do not store all firearms in their home locked and unloaded as recommended by leading organizations including the American Academy of Pediatrics [32] and National Shooting Sports Foundation [33]. This means that approximately 4.6 million US children live in homes in which at least one firearm is stored unlocked and/or loaded [34]. Given that the presence of firearms in the home is a robust risk factor for suicide [35], safe storage of firearms in the home is imperative for reducing youth suicide attempts and death. Simulation research has found that even a modest increase in safer firearm storage could prevent as many as 32% of youth firearm deaths due to suicide and accidents [36]. Thus, efforts to increase implementation of interventions to improve secure firearm storage could save young lives nationally from suicide and unintentional injury.

The evidence-based practice

Safety Check is an evidence-based pediatric primary care program targeting parental firearm storage as part of a bundle of violence prevention strategies that was originally developed for parents of youth ages 2–11 years [37]. The program, which is delivered by pediatricians and informed by a harm reduction approach aiming to reduce firearm injury, includes (1) screening for presence of firearms, firearm storage, and parental concerns about firearm injuries where children live and/or play; (2) counseling using brief motivational interviewing; and (3) providing firearm safe storage tools, such as cable locks, to parents. A large clinical trial found that parents receiving Safety Check reported double the odds of safe firearm storage (OR = 2.0, p < .001) compared to the control group. The intervention group showed a 10% increase in parent-reported use of cable locks, while the control group showed a 12% decrease. These results led major professional organizations to recommend use of Safety Check, but it has not been routinely implemented [38].

To increase our understanding of how best to implement Safety Check as a universal suicide prevention strategy [39], we conducted pre-implementation work in two health systems, guided by the Consolidated Framework for Implementation Research (CFIR) [40]. This work allowed us to gather key information about determinants (i.e., barriers and facilitators) to the implementation of Safety Check within the current national context, including clinician attitudes about discussing firearm safety with parents and the perspectives of firearm stakeholders (e.g., firearm safety course instructors) [38, 41]. This has paved the way for adaptations to Safety Check using an established adaptation framework [42, 43]. The adaptations made include expanding the reach to a broader age range (i.e., parents of children ages 5–17), changing the entry point of the counseling conversation from an identified parental concern to universal counseling for all parents, clarifying that firearm ownership status will not be documented in the EHR but that documentation may note that a conversation about firearm safe storage took place, offering additional resources from credible sources such as brochures or website links, and changing the program name. Based on crowdsourced feedback from parents, the program is now called S.A.F.E. (Suicide and Accident Prevention through Family Education) Firearm [43]. Our preliminary work also led to the development of implementation strategies using implementation mapping [44] to be tested in the proposed trial.

Study contributions

The proposed research draws on multiple streams of evidence to maximize impact in the context of an urgent and sensitive topic and incorporates the latest advances in implementation science by merging behavioral economics and implementation science approaches. This offers an opportunity to test the support needed for implementation of S.A.F.E. Firearm and will also provide unique insights into implementation of sensitive evidence-based practices in primary care more broadly. Testing these implementation strategies in the context of a hybrid effectiveness-implementation trial may also reduce youth suicide and unintentional injury deaths. Additionally, despite the proliferation of conceptual frameworks [45, 46] and hypothesized determinants of practice within implementation science [47], little is known about which of the hypothesized determinants are causally related to implementation of evidence-based practices [1, 48] because very few trials test mechanisms or the processes responsible for change [49]. Our analysis of implementation strategy mechanisms will be critical to understanding how the strategies work and key to future efforts to optimize the effectiveness of our approaches. We will also gather information on associated implementation strategy costs to inform national scale-up efforts.

Methods/design

This manuscript adheres to the Standards for Reporting Implementation Studies (StaRI) Statement (Additional file 1) [50].

The Adolescent and child Suicide Prevention in Routine clinical Encounters (ASPIRE) trial is a hybrid type III effectiveness-implementation trial [51] with a longitudinal cluster randomized design [52,53,54]. We will answer questions related to implementation strategy effectiveness in 32 pediatric and/or family medicine practices (henceforth referred to as “pediatric practices”) nested within two health systems within the Mental Health Research Network (MHRN), a National Institute of Mental Health-funded practice-based research network of 21 health systems. This study will be conducted in Henry Ford Health System (HFHS) and Kaiser Permanente Colorado (KPCO). During the active implementation period, 32 pediatric practices in the two health systems will receive S.A.F.E. Firearm materials, including brief training and cable locks. Half of the practices (k = 16) will be randomized to receive Nudge; the other half (k = 16) will be randomized to receive Nudge plus 1 year of facilitation to target additional clinician and practice implementation barriers (Nudge+). Trial recruitment will start in 2022.

Regulatory approvals

The ASPIRE trial was registered on ClinicalTrials.gov on April 14, 2021 (NCT04844021). The University of Pennsylvania institutional review board (IRB) serves as the single IRB (sIRB); reliance agreements were completed by both participating health systems. The study was approved on December 2, 2020 (#844327). The study is overseen by a data safety and monitoring board (DSMB) composed of experts in implementation science methods, suicide prevention, and firearm safety promotion. The DSMB had an introductory meeting in February 2021 and will convene annually.

Study team and governance

The study team includes an interdisciplinary group of researchers, clinicians, and health system stakeholders with expertise in implementation science, behavioral economics, firearm safety promotion, suicide prevention, biostatistics, mixed methods, and pediatric clinical care. The following consultants also contribute expertise to the study: the original developer of Safety Check, the developer of the hybrid design approach, and firearm safety experts (i.e., master firearm safety course instructors) who provide perspectives on the broader firearm landscape to ensure ecological validity of the work.

Implementation framework, targets, and mechanisms

Our research is guided by two implementation science frameworks: the Proctor et al. framework and CFIR [40, 55]. The Proctor et al. framework guides the relationship between our implementation strategies and implementation outcomes, listed in Fig. 1. Fidelity, operationalized as parent-reported clinician delivery of the two components of S.A.F.E. Firearm (brief counseling around firearm safe storage, offering cable locks), is the primary study outcome. Secondary outcomes include reach (EHR-documented program delivery) and acceptability (i.e., parent- and clinician-report of acceptability via online survey) of S.A.F.E. Firearm as well as implementation strategy costs [55]. CFIR guides our understanding of mechanisms related to inner setting factors (i.e., clinician and practice factors) that may mediate and/or moderate the relationship between implementation strategies and fidelity. Our primary mechanism of interest is practice adaptive reserve, a self-report practice-level measure composed of six factors: relationship infrastructure, facilitative leadership, sense-making, teamwork, work environment, and culture of learning.

Fig. 1 Guiding implementation frameworks. The figure depicts the contextual factors that will be examined in relation to S.A.F.E. Firearm implementation, guided by the Consolidated Framework for Implementation Research (CFIR) [40] (left), and the trial outcomes, guided by the Proctor et al. framework [55] (right)

Study aims and approach

Setting

We will conduct the proposed study in two geographically diverse MHRN systems that serve urban, suburban, and rural communities to maximize generalizability of our findings. HFHS (Michigan) includes the Detroit metro area and serves over 1.25 million patients per year, 38% of whom are racial or ethnic minorities. This is important given evidence of racial and ethnic disparities in suicide generally [56, 57] and firearm injury and mortality specifically [57, 58]. HFHS includes seven hospitals and more than 50 ambulatory care practices, 14 of which are pediatric practices. Our second partner, KPCO, serves approximately 600,000 members across Colorado including urban, suburban, and rural samples. It has 27 ambulatory care practices, including 24 pediatric practices (some stand alone, some are multi-specialty clinics), of which we will purposively choose 18 representative practices to participate. Thus, we will include 32 practices across the two sites. (Please see Fig. 2, CONSORT diagram.) Both health systems use the Epic electronic health record system. Recent estimates indicate that 45% of households in Colorado and 40% of households in Michigan owned firearms [59], putting Colorado above the national average of ownership [31].

Fig. 2 CONSORT diagram

Participants

Participants will include parents of youth seen in pediatric primary care, pediatric and family medicine clinicians (hereafter referred to as “clinicians”), and health system leaders. Clinicians delivering the S.A.F.E. Firearm program will include physicians (MD, DO) and advanced practice clinicians (nurse practitioner, physician assistant) who regularly conduct well-child visits with children and work in pediatric or family medicine departments.

Parents of youth seen in pediatric primary care

We will include parents and/or legal guardians (hereafter referred to as “parents”) at participating pediatric practices who have a child ages 5–17 years who attends a well-child visit. At least one parent must attend to be eligible. Our target age range of youth reflects the fact that suicide is the second leading cause of death among youth ages 10 and over [60], and rates are increasing in youth ages 5–12, particularly among Black or African American children [61,62,63]. Our upper limit is based on the age when most young people transition out of pediatrics in the participating health care systems. To optimize ecological validity, there are no exclusion criteria. We expect an N of approximately 58,866 eligible youth over the course of one year.

Clinicians and health system leaders

There are currently 137 physicians and 14 non-physician clinicians within the two systems who see young people within pediatrics and family medicine. Leaders (n = 20) include practice and department chiefs and health plan directors.

Evidence-based practice/intervention

Safety Check was developed using social cognitive theory [64] and uses a harm reduction approach to meet parents where they are with regard to their storage behavior [65, 66]. For this study, we will deploy an adapted version of Safety Check which maintains the key components of the original intervention (i.e., counseling and offering a cable lock) [37, 67] but extends its reach and acceptability [19, 38, 41]. Drawing on the ADAPT-ITT framework [42], we collaborated with parents, firearm safety experts, clinicians, and health system leaders [19, 38, 41, 43] to adapt Safety Check to reach a broader age group (i.e., youth < 18) and to serve as a universal suicide prevention strategy in pediatric primary care. Parents were involved in the selection of the name and logo (see Fig. 3); the program is now renamed S.A.F.E. Firearm. Both firearm-owning and non-firearm-owning parents reported high acceptability of the adapted program [68, 69].

Fig. 3 S.A.F.E. Firearm name and logo, identified based on crowdsourced feedback from firearm-owning and non-firearm-owning parents [43]

Implementation strategies

Prior to randomization, all 32 practices will receive S.A.F.E. Firearm materials and training. Clinicians will be strongly encouraged by pediatric leadership to access the brief online training prior to trial launch [70, 71]. The training video will include targeted information on how to counsel parents about firearm safety using motivational interviewing, an evidence-based approach that takes a nonjudgmental stance.

Nudge

All participating practices will receive the Nudge, which will be delivered via the EHR. During the study’s preparation phase, we will work with pediatric practice leadership and Epic information technology specialists to refine the design and functioning of our Nudge. We will prototype and pilot the Nudge to ensure it is consistent with current workflow, effective, and unobtrusive. We have decided to use an EHR SmartList, a pre-defined list of choices that clinicians can select with a mouse or keyboard. SmartLists are particularly helpful for documenting values that a clinician enters repeatedly, saving time and keystrokes. SmartLists are already used for other types of visit documentation in both health systems, which means clinicians are familiar with their functionality. We will add a default SmartList to the standard “Well-Child Visit” documentation template to serve as a Nudge and allow for tracking of S.A.F.E. Firearm implementation. The clinician will be asked to select a value from a drop-down list (e.g., “Discussed safe firearm storage” or “Did not discuss safe firearm storage;” “Offered a cable lock” or “Did not offer a cable lock”). Clinicians will be trained in how to document intervention delivery as part of annual training requirements. Our Nudge condition is informed by behavioral economic theory: it enables choice and brings the desired behavior to the attention of the clinician [72]. While a hard stop in the EHR requiring a decision, or a default in which the desired behavior is preselected, would likely be more powerful [18], our approach is responsive to health system stakeholder preferences.
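For illustration only, the sketch below shows one way the example drop-down values above could be collapsed into binary indicators for tracking delivery; the mapping and field names are assumptions, not the actual Epic SmartList build.

```python
# Illustrative sketch only: maps the example SmartList selections from the protocol text
# to binary indicators for monitoring S.A.F.E. Firearm delivery. Field names are assumed.
SMARTLIST_MAP = {
    "Discussed safe firearm storage": ("counseled", 1),
    "Did not discuss safe firearm storage": ("counseled", 0),
    "Offered a cable lock": ("lock_offered", 1),
    "Did not offer a cable lock": ("lock_offered", 0),
}

def code_well_child_visit(selections):
    """Collapse a clinician's SmartList selections for one well-child visit.

    Returns 1 = documented as done, 0 = documented as not done,
    None = not documented (treated as non-delivery in the reach analysis).
    """
    indicators = {"counseled": None, "lock_offered": None}
    for value in selections:
        if value in SMARTLIST_MAP:
            field, flag = SMARTLIST_MAP[value]
            indicators[field] = flag
    return indicators
```

Treating undocumented visits as non-delivery is consistent with the conservative analytic assumption described later in the limitations.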

Nudge+

Practices randomized to this condition will receive the Nudge as described above, as well as 12 months of facilitation [73]. The role of the facilitator is to engage with study practices, to assist each practice in setting change and performance goals around the implementation of S.A.F.E. Firearm, and to troubleshoot implementation barriers.

Our approach to facilitation is informed by established facilitation manuals (i.e., the Veterans Health Administration Quality Enhancement Research Initiative [QUERI] facilitation manual [21, 74] and the Agency for Healthcare Research and Quality [AHRQ] practice facilitation manual [22]) and includes six stages. First, facilitators will engage in an informal pre-implementation readiness assessment with each practice to identify potential implementation barriers and to develop relationships with stakeholders. Second, facilitators will support practices in addressing these barriers and launching the implementation strategy activities. These activities include identifying where in the workflow S.A.F.E. Firearm can be implemented, when S.A.F.E. Firearm will be delivered during the well-child visit, who in the practice will be responsible for storing the cable locks, where the locks will be stored, and other workflow matters. In keeping with behavioral economic principles, we will pay close attention to cable lock storage locations so locks can serve as visual reminders of the program (e.g., in baskets by documentation stations). Third, in the first 3 months of the active implementation period, facilitators will work with practices to set goals and establish metrics to monitor S.A.F.E. Firearm implementation. During this period, the facilitator will regularly engage with practice leadership and clinicians. In addition, facilitators will begin to develop a sustainment plan in collaboration with stakeholders. Fourth, in months 3–9, the facilitators will continue to work with practices to address barriers identified in the pre-implementation phase as well as new barriers that emerge as clinicians and practices begin implementing. This will include established implementation strategies such as Plan-Do-Study-Act cycles [75] and audit and feedback [76]. Fifth, in months 9–12, facilitators will engage in continued efforts to maintain gains and begin to enact the sustainment plan in preparation for the end of facilitation. Sixth, in month 12, facilitation activities will end, and the practices will transition to the formal sustainment period. Over the course of the active implementation period, facilitators (i.e., master's- and doctoral-level members of the study team trained in facilitation) will offer expert consultation (i.e., webinars and technical assistance via email and phone as needed) and regular peer-to-peer calls where practices can share their experiences. All activities will be tracked via logs [21, 74] to ensure the ability to measure which strategies are delivered via facilitation (i.e., implementation fidelity).

Randomization

We will randomize practices to the active implementation conditions (Nudge [k = 16] or Nudge+ [k = 16]) using covariate-constrained randomization [77]. Covariate-constrained randomization enumerates a large number of possible assignments of the strategies to the practices and, for each one, quantifies the balance across arms with respect to a set of pre-specified covariates. Then, from the subset of possible assignments that achieve adequate balance, one is randomly chosen as the final allocation of strategies for the study. We will implement this randomization procedure to achieve balance with respect to three practice-level covariates: health system, practice size, and percent of the patient panel that lives in a rural (i.e., non-metropolitan) [78] area based on geocoded patient home address.
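As a concrete illustration of the procedure, the sketch below samples candidate 16/16 allocations, scores balance on the three pre-specified covariates, and randomly selects one allocation from the best-balanced subset. The balance metric, the 10% acceptance cutoff, and the numeric covariate coding are illustrative assumptions, not the study's exact algorithm.

```python
# Minimal sketch of covariate-constrained randomization (illustrative, not the study's code).
import random
import statistics

COVARIATES = ("health_system", "practice_size", "pct_rural")  # assumed numeric (e.g., 0/1) coding

def balance_score(practices, arm_a):
    """Smaller is better: sum of squared standardized differences in covariate means between arms."""
    score = 0.0
    for cov in COVARIATES:
        vals = [p[cov] for p in practices]
        sd = statistics.pstdev(vals) or 1.0
        mean_a = statistics.mean(p[cov] for i, p in enumerate(practices) if i in arm_a)
        mean_b = statistics.mean(p[cov] for i, p in enumerate(practices) if i not in arm_a)
        score += ((mean_a - mean_b) / sd) ** 2
    return score

def constrained_randomize(practices, n_candidates=100_000, keep_fraction=0.10, seed=2021):
    """Sample candidate half/half splits, keep the best-balanced 10%, choose one at random."""
    rng = random.Random(seed)
    half = len(practices) // 2
    candidates = {frozenset(rng.sample(range(len(practices)), half)) for _ in range(n_candidates)}
    scored = sorted(candidates, key=lambda arm_a: balance_score(practices, arm_a))
    acceptable = scored[: max(1, int(len(scored) * keep_fraction))]
    arm_nudge_plus = set(rng.choice(acceptable))
    arm_nudge = set(range(len(practices))) - arm_nudge_plus
    return arm_nudge_plus, arm_nudge
```

With 32 practices, fully enumerating all labeled 16/16 splits (roughly 6 × 10^8) is possible but unnecessary; sampling a large number of candidates, as here, is a common shortcut.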

Study timeline

Year 1 will be devoted to carefully planning and piloting our procedures to optimize our approach, including the collection of our primary outcome. In Year 2, we will begin collecting parent-reported clinician fidelity to allow us to capture baseline rates. The trial will launch in Year 2 and run for 12 months. During this period, both systems will deploy the EHR Nudge in all practices. Practices randomized to the Nudge+ condition will also receive facilitation. In Years 3 and 4, the Nudge will continue in all practices, but facilitation will be discontinued in the Nudge+ practices; we will continue to collect data from all practices to assess sustainment for 1 year. We will collect survey, interview, practice log, and EHR data to answer study questions and test hypotheses. Aim 1 will examine the effects of Nudge+ relative to Nudge on parent-reported clinician fidelity, reach, cable lock distribution, acceptability, and implementation cost [55, 79]. See Fig. 4.

Fig. 4 Study timeline. Timeline depicting study phases (pre-trial “pre-implementation” phase, trial “active implementation” phase, and post-trial “sustainment” phase) and study activities

Aim 1: Examine the effects of Nudge vs. Nudge+ on implementation outcomes

Primary outcome

Fidelity is defined as a patient-level outcome indexing whether the patient received S.A.F.E. Firearm as prescribed by the program model; we call this “target S.A.F.E. Firearm.” The achievement of this outcome requires the patient’s clinician to follow both intervention steps (i.e., counseling and offering cable locks). Patients’ receipt of target S.A.F.E. Firearm will be measured via the following yes/no questions on a parent survey: (a) did someone on the healthcare team talk to you about firearm storage during your child’s recent visit? and (b) were free cable firearm locks made available to you during your child’s recent visit? Patients will receive a binary fidelity score indicating whether the clinician completed both (a) and (b) with them. In addition, we will separately code whether each step occurred, for use in supplemental analyses.
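A minimal sketch of how the two parent-survey items could be coded into the primary binary fidelity outcome and the component indicators used in supplemental analyses (field names are illustrative):

```python
# Illustrative coding of the primary fidelity outcome from the two parent-survey items.
def code_fidelity(talked_about_storage: bool, offered_cable_lock: bool) -> dict:
    """1/0 indicators for each S.A.F.E. Firearm component and for 'target' fidelity (both delivered)."""
    return {
        "counseled": int(talked_about_storage),
        "lock_offered": int(offered_cable_lock),
        "target_fidelity": int(talked_about_storage and offered_cable_lock),
    }
```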

Secondary outcomes

Reach, or the number of parent-youth dyads who receive the intervention divided by the number of eligible parent-youth dyads [79], will be extracted from EHR data, based on clinician documentation in the EHR. EHR data collection represents an exceptional opportunity to understand clinician behavior with all parents of youth rather than restricting data collection to a subset of clinicians who self-select to provide self-report or allow observation of their behavior [80], and we will be able to determine the entire clinical population denominator rather than the sample denominator.

As an additional measure of reach, the number of cable locks distributed in each practice will be recorded by research staff on a monthly basis. Because families will be permitted to take more than one lock, this metric will offer a proxy for the maximum number of firearms that may have been secured due to the intervention.

Acceptability will be measured from the perspective of both parents and clinicians. The parent survey will inquire about the acceptability of each S.A.F.E. Firearm program component separately with a single yes/no item (i.e., I found/would have found it acceptable to talk about firearm storage during my child’s visit; I found/would have found it acceptable to have free cable firearm locks made available to me during my child’s visit). Clinicians will rate the acceptability of each S.A.F.E. Firearm program component and implementation strategy separately via a single item rated on a six-point Likert scale (strongly disagree to strongly agree). This approach is based on our previous work assessing clinician acceptability of firearm safety programming [38].

To collect fidelity and acceptability data, all eligible parents in both health systems will be contacted within 2 weeks of their completed well-child visit, via email, mail, patient portal message, text message, or phone call by research specialists employed by their respective health system. The message will invite them to complete a survey via REDCap, a secure, web-based application for collecting and managing survey data that can be completed via computer or mobile device [81]. Follow-up contacts (e.g., phone calls, texts) will be made up to approximately 4 weeks after the well-child visit to enhance response rates. Follow-up recruitment strategies will differ and will be informed by best practices at each respective health system. Participants will be eligible for an incentive via lottery for survey completion (e.g., $100 gift card). We anticipate that we will be able to obtain responses from approximately 18,665 individuals using these methods.

To collect acceptability data, clinicians (N = 151) will be contacted via email using the Dillman Tailored Design Method [82] to boost response rates. Clinician participants will receive gift cards/gifts each time they complete a survey if allowed by their health system. Alternatively, an altruistic incentive will be used where the study will contribute to a charitable organization for each returned survey.

Cost will be measured using a pragmatic method to capture all resources needed to deploy the implementation strategies [83,84,85]. The primary objective of the cost analysis is to estimate the cost of each strategy at the system level to gather information that will allow other decision makers to assess the resources needed to take this approach to scale within their systems. We will capture these costs by prospectively and pragmatically using spreadsheet-based templates on a monthly basis consistent with our previous studies [83, 84, 86]. These templates provide the framework for capturing costs related to each component of the implementation strategy (e.g., Epic build and maintenance; facilitation training and activities).
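As a purely illustrative example (the actual templates follow the team's prior studies [83, 84, 86] and are not reproduced here), a monthly cost record might capture fields like the following; all field names and values below are assumptions.

```python
# Illustrative (assumed) structure for one monthly implementation-cost record;
# the real spreadsheet templates from the cited prior work may differ.
monthly_cost_record = {
    "month": "2022-07",
    "health_system": "HFHS",
    "strategy_component": "EHR Nudge",        # e.g., "EHR Nudge" or "Facilitation"
    "activity": "Epic SmartList maintenance",
    "personnel_hours": 4.0,
    "hourly_rate_usd": 85.0,                  # placeholder rate
    "non_personnel_cost_usd": 0.0,            # e.g., materials, cable locks
}
total_usd = (monthly_cost_record["personnel_hours"] * monthly_cost_record["hourly_rate_usd"]
             + monthly_cost_record["non_personnel_cost_usd"])
```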

Hypotheses

We will compare the effects of two active implementation conditions, Nudge (EHR SmartList) vs. Nudge+ (EHR SmartList + facilitation) at the end of the implementation period as well as at the end of a 1-year sustainment period. We will test a total of four related hypotheses:

  1) Change in the probability of target fidelity from the pre-implementation period to the active implementation period will be equivalent in Nudge vs. Nudge+.

  2) Change in the probability of target fidelity from the pre-implementation period to the active implementation period will be superior in Nudge+ relative to Nudge.

  3) Change in the probability of target fidelity from the pre-implementation period to the sustainment period will be equivalent in Nudge vs. Nudge+.

  4) Change in the probability of target fidelity from the pre-implementation period to the sustainment period will be superior in Nudge+ relative to Nudge.

These hypotheses will also be tested with regard to the secondary implementation outcomes of reach, acceptability, and cost. Finally, we will descriptively evaluate each arm separately to determine the magnitude of change in the probability of target fidelity and other implementation outcomes over time.

Aim 2: Use mixed methods to identify implementation strategy mechanisms

Our understanding of the mechanisms through which the implementation strategies work is informed by previous research [73, 87] describing the practice-level mechanism, practice adaptive reserve, through which facilitation, a practice-level implementation strategy, operates. We hypothesize that facilitation will increase practice adaptive reserve, or the ability to make and sustain change at the practice level, because it will allow for problem-solving and tailoring specific to the individual practice. Previous research [73, 87] suggests that facilitation improves practice relationship infrastructure; aligns management functions so that clinical care, practice operations, and financial functions share a consistent vision; strengthens leadership and teamwork; and improves the work environment to create a culture of learning [87]. These are all components of adaptive reserve.

Participants and procedure

Participants will include clinicians and health system leaders (e.g., practice directors, department chairs, and health plan directors) in the two systems. In addition to surveys assessing the hypothesized mechanism at pre-implementation and active implementation as described in Aim 1, we will also conduct qualitative interviews with a subset of clinicians (n = 24) and leaders (n = 14) at the end of the active implementation period.

Primary mediator

We will measure practice-level adaptive reserve using the Practice Adaptive Reserve Scale [87], a self-report practice-level measure that is completed by practice staff and aggregated into an organizational construct composed of six factors that include relationship infrastructure, facilitative leadership, sense-making, teamwork, work environment, and culture of learning. The tool has high internal consistency, has been found to be associated with greater implementation in previous cross-sectional research [88], and is sensitive to change due to facilitation [87].

Moderators

We will measure clinician attitudes towards firearm safety promotion in pediatric healthcare settings using questions from the American Academy of Pediatrics Periodic Survey [89, 90]. We will also examine patient demographic variables (e.g., race, ethnicity, gender identity) as potential moderators.

Qualitative interviews

We will conduct brief interviews with a purposive sample of clinician survey respondents (equally distributed across health system and arm) to obtain more detailed information from those demonstrating high (n = 12 [6 per arm]) and low (n = 12 [6 per arm]) fidelity measured via EHR documentation. The purpose of these interviews will be to identify additional mechanisms through which implementation strategies might operate such as motivation, self-efficacy [91], and psychological safety (i.e., safe environment for risk taking) [92]. The interview guide will be developed using the Consolidated Framework for Implementation Research [40]. We will oversample for clinicians who report firearm ownership on the survey. We will interview all leaders who agree to participate (total N = 20; anticipated n = 14). Participants will receive $25 or an equivalent gift for participation as allowed by their health system as denoted above.

Aim 3: Examine the effects of the adapted intervention on clinical outcomes

The objective of this exploratory aim is to examine clinical outcomes to assess the public health impact of wide-scale health system implementation.

Participants and procedures

As described in Aim 1, we will survey all eligible parents in the participating practices within two weeks following their child’s well-child visit.

Exploratory effectiveness outcomes

We will assess parent-reported firearm storage behavior, as well as youth suicide attempts, deaths, and unintentional firearm injuries, as exploratory outcomes. Firearm storage behavior will be assessed with two questions on the parent survey that ask parents (1) whether they have made firearms less accessible to their children since their child’s recent visit and, if so, what changes they have made; and, if not, (2) whether they intend to make firearms less accessible to their children. The Theory of Planned Behavior informed the development of these questions [93]. Questions were piloted with parents to ensure sensitivity and appropriateness. Responses to the intention question will be rated on a five-point Likert scale ranging from strongly disagree to strongly agree.

Youth suicide attempts, deaths, and unintentional firearm injury and mortality data will be extracted from administrative data from each health system. Relevant events will be identified via ICD-10 codes and will include all codes typically used to identify suicide attempts (including non-firearm suicide attempts) as well as official state and federal mortality records that have already been matched to health system patient records.

Sample size calculation

Sample sizes differ by aim and approach. For quantitative outcomes, we powered on our primary implementation outcome of fidelity (i.e., parent-reported clinician delivery of the program). After accounting for non-response, we expect to include data from 18,556 parents of youth within 32 practices. Power calculations, implemented in PASS Power Analysis and Sample Size Software (NCSS, LLC, 2019), were based on a GEE test for two proportions in a cluster randomized design. Assuming an average practice size of 730 patients and an ICC of 0.03, we will have at least 89% power to detect a difference of .1 in the probability of fidelity between Nudge and Nudge+ in the active implementation period. For qualitative data, we will use purposive sampling until thematic saturation is reached (in the case of clinicians) or until all individuals within the group who agree to participate have been interviewed (in the case of leaders) [94].
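As a rough cross-check of the reported power (the study team used PASS and a GEE-based test, so this is only an approximation), the sketch below applies the standard design-effect adjustment for clustering and a two-proportion normal approximation. The baseline fidelity probability of 0.50 is an assumption chosen as the most conservative (maximum-variance) value and is not stated in the protocol.

```python
# Back-of-the-envelope check of the cluster-adjusted power calculation (approximate).
from scipy.stats import norm

k_per_arm = 16          # practices per arm
m = 730                 # average patients per practice (from the protocol)
icc = 0.03              # intracluster correlation (from the protocol)
delta = 0.10            # detectable difference in fidelity probability
p1 = 0.50               # ASSUMED baseline fidelity probability (worst case, max variance)
p2 = p1 + delta
alpha = 0.05

design_effect = 1 + (m - 1) * icc          # variance inflation due to clustering
n_eff = k_per_arm * m / design_effect      # effective sample size per arm

se = ((p1 * (1 - p1) + p2 * (1 - p2)) / n_eff) ** 0.5
z = delta / se
power = norm.cdf(z - norm.ppf(1 - alpha / 2))
print(f"design effect ~ {design_effect:.1f}, effective n/arm ~ {n_eff:.0f}, power ~ {power:.2f}")
# ~0.90 under these assumptions, in line with the >=89% power reported in the protocol
```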

Data analysis

In Aim 1, the primary dependent variable is parent-reported fidelity. For each observation period (pre-implementation, active implementation, sustainment) and for each implementation condition (Nudge, Nudge+), we will describe the proportion of parents who reported having received the intervention with fidelity. We will calculate fidelity using three binary outcomes that will be modeled separately: received counseling (yes/no), offered lock (yes/no), both (yes/no). For each fidelity outcome, we will fit a single model to simultaneously examine differences between the pre-implementation and active implementation periods for both conditions as well as differences between Nudge and Nudge+. For comparing the change in the log-odds of fidelity from pre-implementation to active implementation between Nudge and Nudge+, we will use a three-sided test to simultaneously test for equivalence and superiority (as well as non-inferiority) of Nudge+ relative to Nudge [95]. Based on input from leadership in the two health systems and a review of the literature [96,97,98], we established that in order for Nudge+ to be considered meaningfully superior to Nudge, the difference between arms in the change in the probability of fidelity relative to pre-implementation would need to be at least .1. All analyses will be repeated using the sustainment period outcomes in place of the active implementation period outcomes. We will also repeat these analyses for parent-reported safe storage and exploratory effectiveness variables, including youth suicide attempts, deaths, and unintentional firearm injury and mortality. Additionally, we will conduct a sensitivity analysis to explore whether intervention effectiveness varies significantly by health system.
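One plausible working specification of the model described above, for a given binary fidelity outcome, is shown below; the exact parameterization and the GEE working correlation structure are the study team's choices and are assumed here for illustration.

```latex
% Illustrative GEE logistic model for parent i in practice j (not the study's exact specification)
\operatorname{logit}\!\big(\Pr(Y_{ij}=1)\big)
  = \beta_0
  + \beta_1\,\mathrm{Period}_{ij}
  + \beta_2\,\mathrm{Arm}_{j}
  + \beta_3\,\big(\mathrm{Period}_{ij} \times \mathrm{Arm}_{j}\big)
```

Here Period = 1 for the active implementation period (0 for pre-implementation), Arm = 1 for Nudge+ (0 for Nudge), within-practice correlation is handled through the GEE working correlation, and β3 is the between-arm difference in the change in log-odds of fidelity evaluated by the three-sided test.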

Mediation will be tested using the product of coefficients method [99,100,101]. In this approach, the total effect of Nudge+ relative to Nudge will be parsed into direct and indirect effects through the mediator, practice adaptive reserve. Models will test (a) the effect of Nudge+ relative to Nudge on practice adaptive reserve and (b) the effect of practice adaptive reserve on the log-odds of fidelity, controlling for Nudge+ versus Nudge. All models will include covariates to address potential mediator-outcome confounding, including baseline values of the mediator and outcome variables. We will also conduct sensitivity analyses to test for an exposure-mediator interaction and will model it if appropriate. An unbiased estimate of the indirect effect will be derived via the product of coefficients from the two models, and confidence intervals for the indirect effect will be generated using Monte Carlo methods [100,101,102,103]. We will test the statistical significance of the indirect effect using the joint significance test [103].
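A minimal sketch of the Monte Carlo confidence interval for the product-of-coefficients indirect effect, assuming the two path estimates and their standard errors have already been extracted from the fitted models (the numeric values below are placeholders, not study results):

```python
# Illustrative Monte Carlo interval for the indirect effect a*b (placeholder estimates).
import numpy as np

def monte_carlo_ci(a_hat, se_a, b_hat, se_b, n_draws=20_000, alpha=0.05, seed=2021):
    """Product-of-coefficients indirect effect with a percentile Monte Carlo interval."""
    rng = np.random.default_rng(seed)
    a_draws = rng.normal(a_hat, se_a, n_draws)   # path a: strategy -> practice adaptive reserve
    b_draws = rng.normal(b_hat, se_b, n_draws)   # path b: adaptive reserve -> log-odds of fidelity
    indirect = a_draws * b_draws
    lower, upper = np.percentile(indirect, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return a_hat * b_hat, (lower, upper)

# Example with placeholder path estimates:
effect, ci = monte_carlo_ci(a_hat=0.30, se_a=0.12, b_hat=0.50, se_b=0.20)
```

The percentile interval accommodates the non-normal sampling distribution of the product of two coefficients, which is why Monte Carlo methods are preferred over a simple normal-theory interval.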

Variables that potentially modify the effect of Nudge+ relative to Nudge will be tested separately by adding terms for each moderator and its interaction with the exposure to the Aim 1 models for the active implementation period. These models will estimate the conditional relationships between Nudge+ (relative to Nudge) and implementation outcomes across different values of the putative moderators.
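For a putative moderator M measured in the active implementation period, one assumed (illustrative) form of the corresponding model is:

```latex
% Illustrative moderation model for the active implementation period
\operatorname{logit}\!\big(\Pr(Y_{ij}=1)\big)
  = \gamma_0
  + \gamma_1\,\mathrm{Arm}_{j}
  + \gamma_2\,M_{ij}
  + \gamma_3\,\big(\mathrm{Arm}_{j} \times M_{ij}\big)
```

where γ3 captures how the effect of Nudge+ relative to Nudge varies across values of the moderator.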

Qualitative analysis and mixed methods

Text answers from open-ended survey questions with parents from Aims 1 and 3, and digitally recorded and transcribed interviews with clinicians and leaders on the mechanisms of the implementation strategies, will be loaded into NVivo qualitative data analysis software [104]. Analysis will be guided by an integrated approach [105], which outlines a rigorous, systematic method for analyzing qualitative data using an inductive and deductive process of iterative coding to identify recurrent themes, categories, and relationships. The structure of our mixed methods approach is sequential (quantitative data are primarily collected before qualitative data, and quantitative data are weighted more strongly than qualitative; QUAN>qual). The function is “complementarity” (to elaborate upon the quantitative findings to understand the how of implementation), and the process is connecting (having the qualitative data set build upon the quantitative data set) [106]. To integrate the quantitative and qualitative results, we will follow guidelines for best practices in mixed methods [107].

Discussion

The ASPIRE trial is a hybrid type III effectiveness-implementation trial with a longitudinal cluster randomized design. This research is a collaborative effort to combine insights from behavioral economics, diverse firearm safety stakeholders, clinicians, and health systems to test strategies to implement firearm safety as a universal suicide prevention strategy. This will be the first large-scale multi-health system study testing behavioral economic-informed implementation strategies. Both health systems included in our study indicated that they would adopt the Nudge if we can demonstrate its effectiveness in this trial, suggesting the sustainability of the proposed work. The health systems have indicated that practice facilitation (Nudge+) would need to show strong cost-effectiveness outcomes compared to Nudge for widespread adoption, considering the higher costs associated with facilitation. The evidence and insights generated can be taken to scale in the MHRN which includes 21 closely integrated health systems.

This study has several strengths. First, we use principles of behavioral economics and compare the effectiveness of a low cost, highly scalable implementation strategy to a more intensive strategy intended to address implementation barriers. Second, we have carefully designed our implementation strategies and adapted program based on end-user feedback, particularly firearm stakeholders who have often not been included in the conversation around firearm safety promotion in health care settings [41]. Lack of stakeholder input can be detrimental to the eventual viability of programs and implementation strategies [1, 108]. Third, we will assess the costs of the implementation strategies, which have been understudied to date [109, 110]. Fourth, we will explore mechanisms of our implementation strategies; mechanistic research represents the next frontier of implementation research [48, 110]. Fifth, a strong partnership between our research team and health system stakeholders directly drives the research, and we leverage the strong foundation of the MHRN. Sixth, we include an active comparison condition that is a true comparator and leverage the power of a cluster randomized trial to maximize methodological rigor.

There are also limitations. First, we did not include a control condition because our health system partners felt the public health urgency of our study topic required all practices to be assigned to an active implementation condition. Second, our reliance on EHR data to measure program reach may not fully capture program delivery, since it is possible for a clinician to deliver the intervention without documenting it. However, our analytic strategy of assuming that non-documentation reflects non-delivery will, if anything, lead to conservative conclusions; true program reach may be greater than our analyses indicate. Third, using the number of cable locks taken from practices as a proxy for the maximum number of firearms secured after receipt of the S.A.F.E. Firearm program is limited: the program could prompt parents to secure their firearms with other locking devices that this metric would not capture, and the count could be an overestimate because parents may take locks that they never intend to use or plan to use and do not. Fourth, emerging evidence suggests that quick access safes are a preferred firearm storage mechanism for handguns [111], the most common type of firearm in the US, because they enable storage of loaded firearms for protection purposes [112]. However, quick access safes are more expensive and impractical to distribute at practices due to their size. Furthermore, cable locks work on nearly all types of firearms, including many handguns and long guns [113]; this contributes to their appropriateness for the present study, since youth may be more likely than adults to use long guns in suicide. Offering cable locks is effective [67], and we will also distribute resources that assist parents in obtaining additional, alternative safe storage options. We will continue to work closely with lock manufacturers, potential partner organizations, and health systems to identify sustainable ways to source cable locks. Given that large health systems such as the Veterans Health Administration are moving towards purchasing and stocking free cable locks, we are confident that we will be able to work with the health systems to identify a sustainable plan for after the trial ends.

The ASPIRE trial will integrate implementation science and behavioral economic approaches to implement an evidence-based practice for firearm safety promotion in pediatric primary care. The study uses sophisticated methods to answer a question at the heart of many practice change efforts: which strategies are sufficient to support change, and why? Furthermore, the work can provide support for approaches that can bear significant outcomes for little cost. If successful, the proposed study will offer valuable insights into how best to implement evidence-based practices in pediatric primary care, particularly those that are sensitive in nature. We must act now to understand how best to implement evidence-based firearm safety programs to save the lives of American youth.

Availability of data and materials

No study data has been collected yet. Upon study completion, any datasets used and/or analyzed during the current study will be available from the corresponding author (RSB) on reasonable request.

Abbreviations

EHR:

Electronic health record

US:

United States

CFIR:

Consolidated Framework for Implementation Research

S.A.F.E. Firearm:

Suicide and Accident Prevention through Family Education Firearm

StaRI:

Standards for Reporting Implementation Studies

ASPIRE:

Adolescent and child Suicide Prevention in Routine clinical Encounters

MHRN:

Mental Health Research Network

HFHS:

Henry Ford Health System

KPCO:

Kaiser Permanente Colorado

IRB:

Institutional Review Board

sIRB:

Single Institutional Review Board

DSMB:

Data safety and monitoring board

QUERI:

Quality Enhancement Research Initiative

AHRQ:

Agency for Healthcare Research and Quality

GEE:

Generalized Estimating Equation

ICD-10:

International Statistical Classification of Diseases and Related Health Problems – 10

References

  1. Williams NJ, Beidas RS. Annual research review: the state of implementation science in child psychology and psychiatry: a review and suggestions to advance the field. J Child Psychol Psychiatry. 2019;60(4):430–50. https://doi.org/10.1111/jcpp.12960.


  2. Aarons GA, Ehrhart MG, Moullin JC, Torres EM, Green AE. Testing the Leadership and Organizational Change for Implementation (LOCI) intervention in substance abuse treatment: a cluster randomized trial study protocol. Implement Sci. 2017;12(1):29. https://doi.org/10.1186/s13012-017-0562-3.


  3. Zimmerman L, Lounsbury DW, Rosen CS, Kimerling R, Trafton JA, Lindley SE. Participatory system dynamics modeling: increasing stakeholder engagement and precision to improve implementation planning in systems. Adm Policy Ment Health. 2016;43(6):834–49. https://doi.org/10.1007/s10488-016-0754-1.


  4. Beidas RS, Buttenheim AM, Mandell DS. Transforming mental health care delivery through implementation science and behavioral economics. JAMA Psychiatry. 2021. https://doi.org/10.1001/jamapsychiatry.2021.1120.

  5. Fiske ST, Taylor SE. Social cognition: from brains to culture. 2nd ed. Thousand Oaks: Sage; 2013. https://doi.org/10.4135/9781446286395.


  6. Tversky A, Kahneman D. The framing of decisions and the psychology of choice. Science. 1981;211(4481):453–8. https://doi.org/10.1126/science.7455683.


  7. Kahneman D, Tversky A. Prospect theory: an analysis of decision under risk. Econometrica. 1979;47(2):263–92. https://doi.org/10.2307/1914185.


  8. Kahneman D, Tversky A, editors. Choices, values, and frames. New York: Russell Sage Foundation; 2000. https://doi.org/10.1017/CBO9780511803475.


  9. Beidas RS, Volpp KG, Buttenheim AN, Marcus SC, Olfson M, Pellecchia M, et al. Transforming mental health delivery through behavioral economics and implementation science: protocol for three exploratory projects. JMIR Res Protoc. 2019;8(2):e12121. https://doi.org/10.2196/12121.


  10. Patel MS, Day SC, Halpern SD, Hanson CW, Martinez JR, Honeywell S, et al. Generic medication prescription rates after health system-wide redesign of default options within the electronic health record. JAMA Intern Med. 2016;176(6):847–8. https://doi.org/10.1001/jamainternmed.2016.1691.


  11. Meeker D, Linder JA, Fox CR, Friedberg MW, Persell SD, Goldstein NJ, et al. Effect of behavioral interventions on inappropriate antibiotic prescribing among primary care practices: a randomized clinical trial. JAMA. 2016;315(6):562–70. https://doi.org/10.1001/jama.2016.0275.


  12. Patel MS, Volpp KG, Small DS, Wynne C, Zhu J, Yang L, et al. Using active choice within the electronic health record to increase influenza vaccination rates. J Gen Intern Med. 2017;32(7):790–5. https://doi.org/10.1007/s11606-017-4046-6.


  13. Patel MS, Volpp KG. Leveraging insights from behavioral economics to increase the value of health-care service provision. J Gen Intern Med. 2012;27(11):1544–7. https://doi.org/10.1007/s11606-012-2050-4.


  14. Hsiao CJ, Hing E. Use and characteristics of electronic health record systems among office-based physician practices: United States, 2001-2012. NCHS Data Brief. 2012;(111):1-8.

  15. Doshi JA, Lim R, Li P, Young PP, Lawnicki VF, State JJ, et al. A synchronized prescription refill program improved medication adherence. Health Aff. 2016;35(8):1504–12. https://doi.org/10.1377/hlthaff.2015.1456.


  16. Patel MS, Kurtzman GW, Kannan S, Small DS, Morris A, Honeywell S, et al. Effect of an automated patient dashboard using active choice and peer comparison performance feedback to physicians on statin prescribing: the PRESCRIBE cluster randomized clinical trial. JAMA Netw Open. 2018;1(3):e180818. https://doi.org/10.1001/jamanetworkopen.2018.0818.

  17. Patel MS, Volpp KG, Asch DA. Nudge units to improve the delivery of health care. N Engl J Med. 2018;378(3):214–6. https://doi.org/10.1056/NEJMp1712984.


  18. Last BS, Buttenheim AM, Timon CE, Mitra N, Beidas RS. Systematic review of clinician-directed nudges in healthcare contexts. BMJ Open. 2021;11(7):e048801. https://doi.org/10.1136/bmjopen-2021-048801.


  19. Wolk CB, Van Pelt AE, Jager-Hyman S, Ahmedani BK, Zeber JE, Fein JA, et al. Stakeholder perspectives on implementing a firearm safety intervention in pediatric primary care as a universal suicide prevention strategy: a qualitative study. JAMA Netw Open. 2018;1(7):e185309. https://doi.org/10.1001/jamanetworkopen.2018.5309.

  20. Baskerville NB, Liddy C, Hogg W. Systematic review and meta-analysis of practice facilitation within primary care settings. Ann Fam Med. 2012;10(1):63–74. https://doi.org/10.1370/afm.1312.


  21. Ritchie MJ, Dollar KM, Miller CJ, Oliver KA, Smith JL, Lindsay JA, et al. Using implementation facilitation to improve care in the Veterans Health Administration (version 2). Veterans Health Administration, Quality Enhancement Research Initiative (QUERI) for Team-Based Behavioral Health. 2017. https://www.queri.research.va.gov/tools/implementation/Facilitation-Manual.pdf. Accessed 3 Aug 2021.

  22. Agency for Healthcare Research and Quality. The practice facilitation handbook: training modules for new facilitators and their trainers. 2013. https://www.ahrq.gov/sites/default/files/publications/files/practicefacilitationhandbook.pdf. Accessed 3 Aug 2021.

  23. Ritchie MJ, Kirchner JE, Parker LE, Curran GM, Fortney JC, Pitcock JA, et al. Evaluation of an implementation facilitation strategy for settings that experience significant implementation barriers. Implement Sci. 2015;10(Suppl 1). https://doi.org/10.1186/1748-5908-10-S1-A46.

  24. Cunningham RM, Walton MA, Carter PM. The major causes of death in children and adolescents in the United States. N Engl J Med. 2018;379(25):2468–75. https://doi.org/10.1056/NEJMsr1804754.


  25. Cunningham RM, Carter PM, Ranney ML, Walton M, Zeoli AM, Alpern ER, et al. Prevention of firearm injuries among children and adolescents: consensus-driven research agenda from the Firearm Safety Among Children and Teens (FACTS) consortium. JAMA Pediatr. 2019;173(8):780-9. https://doi.org/10.1001/jamapediatrics.2019.1494.

  26. National Academy of Sciences. Health systems interventions to prevent firearm injuries and death: proceedings of a workshop. 2019. http://nationalacademies.org/hmd/Reports/2019/health-systems-interventions-prevent-firearm-injuries-death.aspx. Accessed 3 Aug 2021.

  27. Web-Based Injury Statistics Query and Reporting System (WISQARS). Centers for Disease Control and Prevention, National Center for Injury Prevention and Control. 2005. www.cdc.gov/injury/wisqars. Accessed 3 Aug 2021.

  28. Glenn CR, Franklin JC, Nock MK. Evidence-based psychosocial treatments for self-injurious thoughts and behaviors in youth. J Clin Child Adolesc Psychol. 2015;44(1):1–29. https://doi.org/10.1080/15374416.2014.945211.

  29. Compressed Mortality File 1999-2010 – CDC WONDER Online Database. Centers for Disease Control and Prevention. https://wonder.cdc.gov/controller/datarequest/D140. Accessed 3 Aug 2021.

  30. Spicer RS, Miller TR. Suicide acts in 8 states: incidence and case fatality rates by demographics and method. Am J Public Health. 2000;90(12):1885–91. https://doi.org/10.2105/ajph.90.12.1885.

  31. Miller M, Azrael D, Barber C. Suicide mortality in the United States: the importance of attending to method in understanding population-level disparities in the burden of suicide. Annu Rev Public Health. 2012;33(1):393–408. https://doi.org/10.1146/annurev-publhealth-031811-124636.

  32. American Academy of Pediatrics. Addressing gun violence at the practice level. 2021. https://www.aap.org/en-us/advocacy-and-policy/aap-health-initiatives/Pages/Unintentional-Injury-in-Practice.aspx. Accessed 3 Aug 2021.

  33. National Shooting Sports Foundation. Safety. 2021. https://www.nssf.org/safety/. Accessed 3 Aug 2021.

  34. Azrael D, Cohen J, Salhi C, Miller M. Firearm storage in gun-owning households with children: results of a 2015 national survey. J Urban Health. 2018;95(3):295–304. https://doi.org/10.1007/s11524-018-0261-7.

  35. Anglemyer A, Horvath T, Rutherford G. The accessibility of firearms and risk for suicide and homicide victimization among household members: a systematic review and meta-analysis. Ann Intern Med. 2014;160(2):101–10. https://doi.org/10.7326/M13-1301.

  36. Monuteaux MC, Azrael D, Miller M. Association of increased safe household firearm storage with firearm suicide and unintentional death among US youths. JAMA Pediatr. 2019;173(7):657–62. https://doi.org/10.1001/jamapediatrics.2019.1078.

  37. Barkin SL, Finch SA, Ip EH, Scheindlin B, Craig JA, Steffes J, et al. Is office-based counseling about media use, timeouts, and firearm storage effective? Results from a cluster-randomized, controlled trial. Pediatrics. 2008;122(1):e15–25. https://doi.org/10.1542/peds.2007-2611.

  38. Beidas RS, Jager-Hyman S, Becker-Haimes E, Wolk C, Ahmedani B, Zeber J, et al. Acceptability and use of evidence-based practices for firearm storage in pediatric primary care. Acad Pediatr. 2019;19(6):670–6. https://doi.org/10.1016/j.acap.2018.11.007.

  39. Wolk CB, Jager-Hyman S, Marcus SC, Ahmedani BK, Zeber JE, Fein JA, et al. Developing implementation strategies for firearm safety promotion in paediatric primary care for suicide prevention in two large US health systems: a study protocol for a mixed-methods implementation study. BMJ Open. 2017;7(6):e014407. https://doi.org/10.1136/bmjopen-2016-014407.

  40. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1). https://doi.org/10.1186/1748-5908-4-50.

  41. Jager-Hyman S, Wolk CB, Ahmedani BK, Zeber JE, Fein JA, Brown GK, et al. Perspectives from firearm stakeholders on firearm safety promotion in pediatric primary care as a suicide prevention strategy: a qualitative study. J Behav Med. 2019;42(4):691–701. https://doi.org/10.1007/s10865-019-00074-9.

  42. Wingood GM, DiClemente RJ. The ADAPT-ITT model: a novel method of adapting evidence-based HIV interventions. J Acquir Immune Defic Syndr. 2008;47(Suppl 1):S40–6. https://doi.org/10.1097/QAI.0b013e3181605df1.

  43. Davis M, Johnson C, Pettit AR, Barkin S, Hoffman BD, Jager-Hyman S, et al. Adapting safety check as a universal suicide prevention strategy in pediatric primary care. Acad Pediatr. 2021. https://doi.org/10.1016/j.acap.2021.04.012.

  44. Fernandez ME, Ten Hoor GA, van Lieshout S, Rodriguez SA, Beidas RS, Parcel G, et al. Implementation mapping: using intervention mapping to develop implementation strategies. Front Public Health. 2019;7. https://doi.org/10.3389/fpubh.2019.00158.

  45. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50. https://doi.org/10.1016/j.amepre.2012.05.024.

  46. Strifler L, Cardoso R, McGowan J, Cogo E, Nincic V, Khan PA, et al. Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. J Clin Epidemiol. 2018;100:92–102. https://doi.org/10.1016/j.jclinepi.2018.04.008.

  47. Krause J, Van Lieshout J, Klomp R, Huntink E, Aakhus E, Flottorp S, et al. Identifying determinants of care for tailoring implementation in chronic diseases: an evaluation of different methods. Implement Sci. 2014;9(1):102. https://doi.org/10.1186/s13012-014-0102-3.

  48. Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6:136. https://doi.org/10.3389/fpubh.2018.00136.

  49. Kazdin AE. Mediators and mechanisms of change in psychotherapy research. Annu Rev Clin Psychol. 2007;3(1):1–27. https://doi.org/10.1146/annurev.clinpsy.3.022806.091432.

  50. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for reporting implementation studies (StaRI) statement. BMJ. 2017;356:i6795. https://doi.org/10.1136/bmj.i6795.

  51. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26. https://doi.org/10.1097/MLR.0b013e3182408812.

  52. Liang K-Y, Zeger SL. Longitudinal data analysis of continuous and discrete responses for pre-post designs. Sankhya Ser B. 2000;62:134–48. https://doi.org/10.2307/25053123.

  53. Localio AR, Berlin JA, Ten Have TR. Longitudinal and repeated cross-sectional cluster-randomization designs using mixed effects regression for binary outcomes: bias and coverage of frequentist and Bayesian methods. Stat Med. 2006;25(16):2720–36. https://doi.org/10.1002/sim.2428.

  54. Diggle P, Heagerty P, Liang K-Y, Zeger S. Analysis of longitudinal data. Oxford: Oxford University Press; 2002.

  55. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76. https://doi.org/10.1007/s10488-010-0319-7.

  56. Anderson LM, Lowry LS, Wuensch KL. Racial differences in adolescents’ answering questions about suicide. Death Stud. 2015;39(10):600–4. https://doi.org/10.1080/07481187.2015.1047058.

  57. Bridge JA, Asti L, Horowitz LM, Greenhouse JB, Fontanella CA, Sheftall AH, et al. Suicide trends among elementary school-aged children in the United States from 1993 to 2012. JAMA Pediatr. 2015;169(7):673–7. https://doi.org/10.1001/jamapediatrics.2015.0465.

  58. Martin CA, Unni P, Landman MP, Feurer ID, McMaster A, Dabrowiak M, et al. Race disparities in firearm injuries and outcomes among Tennessee children. J Pediatr Surg. 2012;47(6):1196–203. https://doi.org/10.1016/j.jpedsurg.2012.03.029.

  59. Schell TL, Peterson S, Vegetabile BG, Scherling A, Smart R, Morral AR. State-level estimates of household firearm ownership. RAND Corporation. 2020. https://www.rand.org/pubs/tools/TL354.html. Accessed 3 Aug 2021.

  60. Curtin SC, Heron M, Miniño AM, Warner M. Recent increases in injury mortality among children and adolescents aged 10-19 years in the United States: 1999-2016. Natl Vital Stat Rep. 2018;67(4):1–16.

  61. Plemmons G, Hall M, Doupnik S, Gay J, Brown C, Browning W, et al. Hospitalization for suicide ideation or attempt: 2008–2015. Pediatrics. 2018;141(6):e20172426. https://doi.org/10.1542/peds.2017-2426.

  62. Bridge JA, Horowitz LM, Fontanella CA, Sheftall AH, Greenhouse J, Kelleher KJ, et al. Age-related racial disparity in suicide rates among US youths from 2001 through 2015. JAMA Pediatr. 2018;172(7):697–9. https://doi.org/10.1001/jamapediatrics.2018.0399.

  63. NIMH Office of Behavioral and Social Sciences Research. Identifying research priorities in child suicide risk. Bethesda; 2019. https://www.nimh.nih.gov/news/events/announcements/identifying-research-priorities-in-child-suicide-risk.shtml.

  64. Bandura A. Social foundations of thought and action: a social cognitive theory. Englewood Cliffs: Prentice-Hall; 1986.

  65. Haught K, Grossman D, Connell F. Parents' attitudes toward firearm injury prevention counseling in urban pediatric clinics. Pediatrics. 1995;96(4):649–53.

  66. Leslie KM. Canadian Paediatric Society, Adolescent Health Committee. Harm reduction: an approach to reducing risky health behaviours in adolescents. Paediatr Child Health. 2008;13(1):53–6. https://doi.org/10.1093/pch/13.1.53.

  67. Rowhani-Rahbar A, Simonetti JA, Rivara FP. Effectiveness of interventions to promote safe firearm storage. Epidemiol Rev. 2016;38(1):111–24. https://doi.org/10.1093/epirev/mxv006.

  68. Hoskins K, Johnson C, Davis M, Pettit A, Barkin S, Cunningham R, et al. Applying the ADAPT-ITT framework to the Safety Check safe firearm program to optimize acceptability and effectiveness. 13th Annual Conference on the Science of Dissemination and Implementation (virtual); 2020.

  69. Johnson CA, Davis M, Pettit AR, Barkin SL, Cunningham R, Hemenway D, et al. Adaptation of the Safety Check safe firearm storage program using the ADAPT-ITT framework. 2nd Annual Firearm Safety Among Children and Teens (FACTS) Annual Symposium (virtual); 2020.

  70. Reed S, Shell R, Kassis K, Tartaglia K, Wallihan R, Smith K, et al. Applying adult learning practices in medical education. Curr Probl Pediatr Adolesc Health Care. 2014;44(6):170–81. https://doi.org/10.1016/j.cppeds.2014.01.008.

  71. Rahm AK, Price D, Beck A, Martin C, Boggs J, Backer T, et al. CC2-02: feasibility of implementing Screening, Brief Intervention, and Referral to Treatment (SBIRT) at Kaiser Permanente Colorado. Clin Med Res. 2012;10(3):143–98. https://doi.org/10.3121/cmr.2012.1100.cc2-02.

  72. Harrison JD, Patel MS. Designing nudges for success in health care. AMA J Ethics. 2020;22(9):796–801. https://doi.org/10.1001/amajethics.2020.796.

  73. Shelley DR, Ogedegbe G, Anane S, Wu WY, Goldfeld K, Gold HT, et al. Testing the use of practice facilitation in a cluster randomized stepped-wedge design trial to improve adherence to cardiovascular disease prevention guidelines: HealthyHearts NYC. Implement Sci. 2016;11(1):88. https://doi.org/10.1186/s13012-016-0450-2.

  74. Kirchner JE, Ritchie MJ, Pitcock JA, Parker LE, Curran GM, Fortney JC. Outcomes of a partnered facilitation strategy to implement primary care–mental health. J Gen Intern Med. 2014;29(4):904–12. https://doi.org/10.1007/s11606-014-3027-2.

  75. Agency for Healthcare Research and Quality. Training program summary: Millard Fillmore College Practice Facilitator Certificate Program. 2014. https://www.ahrq.gov/ncepcr/tools/case-studies/fillmore.html. Accessed 3 Aug 2021.

  76. Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;(6):CD000259. https://doi.org/10.1002/14651858.CD000259.pub3.

  77. Moulton LH. Covariate-based constrained randomization of group-randomized trials. Clin Trials. 2004;1(3):297–305. https://doi.org/10.1191/1740774504cn024oa.

  78. National Center for Health Statistics. NCHS urban-rural classification scheme for counties. 2013. https://www.cdc.gov/nchs/data_access/urban_rural.htm. Accessed 6 Aug 2021.

  79. Gaglio B, Shoup JA, Glasgow RE. The RE-AIM framework: a systematic review of use over time. Am J Public Health. 2013;103(6):e38–46. https://doi.org/10.2105/AJPH.2013.301299.

  80. Prusaczyk B, Fabbre V, Carpenter CR, Proctor E. Measuring the delivery of complex interventions through electronic medical records: challenges and lessons learned. eGEMs. 2018;6(1):1–12. https://doi.org/10.5334/egems.230.

  81. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42(2):377–81. https://doi.org/10.1016/j.jbi.2008.08.010.

  82. Dillman DA, Smyth JD, Christian LM. Internet, phone, mail, and mixed-mode surveys: the Tailored Design Method. Hoboken, NJ: John Wiley & Sons; 2014.

  83. Ritzwoller DP, Sukhanova A, Gaglio B, Glasgow RE. Costing behavioral interventions: a practical guide to enhance translation. Ann Behav Med. 2009;37(2):218–27. https://doi.org/10.1007/s12160-009-9088-5.

  84. Ritzwoller DP, Glasgow RE, Sukhanova AY, Bennett GG, Warner ET, Greaney ML, et al. Economic analyses of the Be Fit Be Well program: a weight loss program for community health centers. J Gen Intern Med. 2013;28(12):1581–8. https://doi.org/10.1007/s11606-013-2492-3.

  85. Ritzwoller DP, Sukhanova AS, Glasgow RE, Strycker LA, King DK, Gaglio B, et al. Intervention costs and cost-effectiveness for a multiple-risk-factor diabetes self-management trial for Latinas: economic analysis of ¡Viva Bien! Transl Behav Med. 2011;1(3):427–35. https://doi.org/10.1007/s13142-011-0037-z.

  86. Boggs JM, Ritzwoller DP, Beck A, Dimidjian S, Segal ZV. Cost-effectiveness of a web-based program for residual depressive symptoms: Mindful Mood Balance. Psychiatr Serv. 2021. https://doi.org/10.1176/appi.ps.202000419.

  87. Nutting PA, Crabtree BF, Stewart EE, Miller WL, Palmer RF, Stange KC, et al. Effect of facilitation on practice outcomes in the National Demonstration Project model of the patient-centered medical home. Ann Fam Med. 2010;8(Suppl. 1):S33–44. https://doi.org/10.1370/afm.1119.

  88. Tu S-P, Young V, Coombs LJ, Williams R, Kegler M, Kimura A, et al. Practice adaptive reserve and colorectal cancer screening best practices at community health center clinics in seven states. Cancer. 2015;121(8):1241–8. https://doi.org/10.1002/cncr.29176.

  89. Olson LM, Christoffel KK, O’Connor KG. Pediatricians’ involvement in gun injury prevention. Inj Prev. 2007;13(2):99–104. https://doi.org/10.1136/ip.2006.012401.

  90. American Academy of Pediatrics. Periodic Survey of Fellows. 2021. https://www.aap.org/en-us/professional-resources/Research/pediatrician-surveys/Pages/Periodic-Survey-of-Fellows.aspx. Accessed 6 Aug 2021.

  91. Grol R, Wensing M. What drives change? Barriers to and incentives for achieving evidence-based practice. Med J Aust. 2004;180(S6):S57–60. https://doi.org/10.5694/j.1326-5377.2004.tb05948.x.

  92. Edmondson A. Psychological safety and learning behavior in work teams. Adm Sci Q. 1999;44(2):350–83. https://doi.org/10.2307/2666999.

  93. Fishman J, Lushin V, Mandell DS. Predicting implementation: comparing validated measures of intention and assessing the role of motivation when designing behavioral interventions. Implement Sci Commun. 2020;1(1):81. https://doi.org/10.1186/s43058-020-00050-4.

  94. Guest G, Namey E, Chen M. A simple method to assess and report thematic saturation in qualitative research. PLoS One. 2020;15(5):e0232076. https://doi.org/10.1371/journal.pone.0232076.

  95. Goeman JJ, Solari A, Stijnen T. Three-sided hypothesis testing: simultaneous testing of superiority, equivalence and inferiority. Stat Med. 2010;29(20):2117–25. https://doi.org/10.1002/sim.4002.

  96. Rusticus SA, Lovato CY. Applying tests of equivalence for multiple group comparisons: demonstration of the confidence interval approach. Pract Assess Res Eval. 2011;16. https://doi.org/10.7275/d5wf-5p77.

  97. Bauer BW, Martin RL, Allan NP, Fink-Miller EL, Capron DW. An investigation into the acquired capability for suicide. Suicide Life Threat Behav. 2019;49(4):1105–18. https://doi.org/10.1111/sltb.12502.

  98. Lewis I, Watson B, White KM. Internet versus paper-and-pencil survey methods in psychological experiments: equivalence testing of participant responses to health-related messages. Aust J Psychol. 2009;61(2):107–16. https://doi.org/10.1080/00049530802105865.

  99. Krull JL, MacKinnon DP. Multilevel modeling of individual and group level mediated effects. Multivariate Behav Res. 2001;36(2):249–77. https://doi.org/10.1207/S15327906MBR3602_06.

  100. Pituch KA, Murphy DL, Tate RL. Three-level models for indirect effects in school- and class-randomized experiments in education. J Exp Educ. 2009;78(1):60–95. https://doi.org/10.1080/00220970903224685.

  101. Zhang Z, Zyphur MJ, Preacher KJ. Testing multilevel mediation using hierarchical linear models: problems and solutions. Organ Res Methods. 2009;12(4):695–719. https://doi.org/10.1177/1094428108327450.

  102. Glisson C, Williams NJ, Hemmelgarn A, Proctor E, Green P. Aligning organizational priorities with ARC to improve youth mental health service outcomes. J Consult Clin Psychol. 2016;84(8):713–25. https://doi.org/10.1037/ccp0000107.

  103. MacKinnon DP, Lockwood CM, Hoffman JM, West SG, Sheets V. A comparison of methods to test mediation and other intervening variable effects. Psychol Methods. 2002;7(1):83–104. https://doi.org/10.1037/1082-989X.7.1.83.

  104. NVivo qualitative data analysis software [computer program]. QSR International; 2012.

  105. Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Serv Res. 2007;42(4):1758–72. https://doi.org/10.1111/j.1475-6773.2006.00684.x.

  106. Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Adm Policy Ment Health. 2011;38(1):44–53. https://doi.org/10.1007/s10488-010-0314-z.

  107. Creswell JW, Klassen AC, Clark VLP, Smith KC. Best practices for mixed methods research in the health sciences. National Institutes of Health Office of Behavioral and Social Sciences. 2011. https://obssr.od.nih.gov/wp-content/uploads/2016/02/Best_Practices_for_Mixed_Methods_Research.pdf. Accessed 3 Aug 2021.

  108. Mittman BS. Implementation science in health care. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2012. p. 400–18. https://doi.org/10.1093/acprof:oso/9780199751877.003.0019.

  109. Raghavan R, Bright CL, Shadoin AL. Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implement Sci. 2008;3(1). https://doi.org/10.1186/1748-5908-3-26.

  110. Eisman AB, Kilbourne AM, Dopp AR, Saldana L, Eisenberg D. Economic evaluation in implementation science: making the business case for implementation strategies. Psychiatry Res. 2020;283:112433. https://doi.org/10.1016/j.psychres.2019.06.008.

  111. Ditty MS, Landes SJ, Doyle A, Beidas RS. It takes a village: a mixed method analysis of inner setting variables and dialectical behavior therapy implementation. Adm Policy Ment Health. 2015;42(6):672–81. https://doi.org/10.1007/s10488-014-0602-0.

  112. Stroebe W, Leander NP, Kruglanski AW. Is it a dangerous world out there? The motivational bases of American gun ownership. Pers Soc Psychol Bull. 2017;43(8):1071–85. https://doi.org/10.1177/0146167217703952.

  113. Nestadt PS, MacKrell K, McCourt AD, Fowler DR, Crifasi CK. Prevalence of long gun use in Maryland firearm suicides. Inj Epidemiol. 2020;7(1):4. https://doi.org/10.1186/s40621-019-0230-y.

Acknowledgements

◦ Data Safety and Monitoring Board: Jeff Bridge, PhD (Chair); Daniel Almirall, PhD; Marian (Emmy) Betz, MD, MPH

◦ Consultants: Shari Barkin, MD; Geoffrey Curran, PhD; Ken Lewis, MBA; Jose Morales; Amy Pettit, PhD

◦ FACTS Consortium for supporting the pilot project to adapt S.A.F.E. Firearm

◦ Mental Health Research Network

Funding

The National Institute of Mental Health funded this study (R01 MH123491: PI, Rinad S. Beidas, PhD). Molly Davis and Katelin Hoskins are supported by a National Institute of Mental Health Training Fellowship (T32 MH109433; MPI, David S. Mandell, ScD, Rinad S. Beidas, PhD). Kaiser Permanente Colorado and Henry Ford Health System are both part of the National Institute of Mental Health Mental Health Research Network (U19 MH092201; PI, Gregory E. Simon, MD, MPH). The funders had no role in the design of the study and will have no role in data collection, analysis, interpretation, or manuscript writing.

Author information

Contributions

RSB designed the study, secured funding, and drafted the manuscript. JMB and BKA are the site PIs who contributed to study design and grant-writing and will lead all implementation activities and evaluation at their respective sites. KAL is the lead study biostatistician. MM, AMB, ALB, BM, KH, MEE, MD, MFD, SJH, DPR, DSS, CBW, and SCM are co-investigators and/or significant contributors who provided input into study design, data collection, and data analytic procedures. CJ, JW, LW, and AL are senior study staff who provided input into study design, data collection, and data analytic procedures. NJW is a consultant who provided input into mediational analyses. All authors read, revised, and approved the final manuscript.

Authors’ information

◦ RSB is Director of the Penn Medicine Nudge Unit and Director of the Penn Implementation Science Center at the Leonard Davis Institute of Health Economics; Associate Director of the Center for Health Incentives and Behavioral Economics; and Associate Professor of Psychiatry, Medical Ethics and Health Policy, and Medicine at the University of Pennsylvania. She is a Senior Fellow at the Leonard Davis Institute of Health Economics.

◦ BKA is Director of the Center for Health Policy and Health Services Research and Director of Behavioral Health Research at Henry Ford Health System and Chair of the Mental Health Research Network Suicide Prevention Scientific Interest Group.

◦ JMB is a Scientific Research Associate at the Institute for Health Research at Kaiser Permanente Colorado.

◦ KAL is an Assistant Professor of Biostatistics, Epidemiology, and Informatics at the Perelman School of Medicine at the University of Pennsylvania.

◦ CJ is a Senior Research Coordinator at the Penn Medicine Nudge Unit and at the University of Pennsylvania Perelman School of Medicine.

◦ MM is an Investigator and Assistant Scientist at the Center for Health Policy and Health Services Research at Henry Ford Health System.

◦ JW is a Project Coordinator at the Center for Health Policy and Health Services Research at Henry Ford Health System.

◦ LW is a Project Manager and Community Research Liaison at the Institute for Health Research at Kaiser Permanente Colorado.

◦ ALB is a Senior Investigator in the Institute for Health Research at Kaiser Permanente Colorado; Kaiser Permanente Colorado lead investigator for the Mental Health Research Network; an Associate Professor of Family Medicine at the University of Colorado School of Medicine; and Chair Emeritus of the Board of Directors for the Jefferson Center for Mental Health.

◦ AMB is Scientific Director at the Center for Health Incentives and Behavioral Economics and Associate Professor of Nursing and Health Policy at Penn Nursing and Perelman School of Medicine. She is a Senior Fellow of the Leonard Davis Institute of Health Economics.

◦ MFD is a Senior Investigator at the Institute for Health Research and a practicing pediatrician at Kaiser Permanente Colorado; an Associate Professor in Pediatrics at the University of Colorado School of Medicine; and a Fellow in the American Academy of Pediatrics.

◦ MD is a Research Psychologist in the Department of Child and Adolescent Psychiatry and Behavioral Sciences at the Children’s Hospital of Philadelphia.

◦ SJH is Assistant Professor in the Department of Psychiatry at the University of Pennsylvania.

◦ KH is a Postdoctoral Fellow at the Penn Medicine Nudge Unit and at the University of Pennsylvania Perelman School of Medicine; and a Psychiatric-Mental Health Nurse Practitioner at the Children’s Hospital of Philadelphia.

◦ AL is Director of Research Operations at the Penn Medicine Nudge Unit and at the University of Pennsylvania Perelman School of Medicine.

◦ DPR is an Economist and Senior Investigator at the Institute for Health Research at Kaiser Permanente Colorado and an Adjunct Professor in the Department of Health Systems, Management, and Policy at the University of Colorado School of Public Health.

◦ DSS is the Universal Furniture Professor of Statistics and Data Science and Department Chair at the University of Pennsylvania.

◦ CBW is an Assistant Professor in the Department of Psychiatry; a Senior Fellow at the Leonard Davis Institute of Health Economics; and an Associate Fellow at the Center for Public Health Initiatives at the University of Pennsylvania.

◦ NJW is an Associate Professor in the School of Social Work at Boise State University and a Licensed Clinical Social Worker.

◦ SCM is a Research Associate Professor in the School of Social Policy and Practice at the University of Pennsylvania.

◦ MEE is an American Board of Pediatrics-certified pediatrician at Henry Ford Health System.

◦ BM is an American Board of Pediatrics-certified pediatrician at Henry Ford Health System.

Corresponding author

Correspondence to Rinad S. Beidas.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the University of Pennsylvania institutional review board (IRB; protocol #844327). In compliance with National Institutes of Health policy, all participating health systems have agreed to rely on a single IRB (sIRB); the University of Pennsylvania IRB is the designated sIRB. Reliance agreements were completed by all participating sites. Informed consent will be obtained from all participants for surveys and qualitative interviews.

Consent for publication

Not applicable.

Competing interests

The authors declare the following competing interests: RSB receives royalties from Oxford University Press. She has served as a consultant to Camden Coalition of Healthcare Providers. She provides consultation to United Behavioral Health. She serves on the Clinical and Scientific Advisory Board for Optum Behavioral Health.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Standards for Reporting Implementation Studies: the StaRI checklist for completion.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Beidas, R.S., Ahmedani, B., Linn, K.A. et al. Study protocol for a type III hybrid effectiveness-implementation trial of strategies to implement firearm safety promotion as a universal suicide prevention strategy in pediatric primary care. Implementation Sci 16, 89 (2021). https://doi.org/10.1186/s13012-021-01154-8
