Alarm with care—a de-implementation strategy to reduce fall prevention alarm use in US hospitals: a study protocol for a hybrid 2 effectiveness-implementation trial

Abstract

Background

Fall prevention alarms are commonly used in US hospitals as a fall prevention strategy despite limited evidence of effectiveness. Further, fall prevention alarms are harmful to healthcare staff (e.g., alarm fatigue) and patients (e.g., sleep disturbance, mobility restriction). There is a need for research to develop and test strategies for reducing use of fall prevention alarms in US hospitals.

Methods

To address this gap, we propose testing the effectiveness and implementation of Alarm with Care, a de-implementation strategy to reduce fall prevention alarm use, in a stepped-wedge cluster-randomized controlled trial among 30 adult medical or medical-surgical units from nonfederal US acute care hospitals. Guided by the Choosing Wisely De-Implementation Framework, we will (1) identify barriers to fall prevention alarm de-implementation and develop tailored de-implementation strategies for each unit and (2) compare the implementation and effectiveness of high- versus low-intensity coaching to support site-specific de-implementation of fall prevention alarms. We will evaluate effectiveness and implementation outcomes and examine the effect of multi-level (e.g., hospital, unit, and patient) factors on effectiveness and implementation. Rate of fall prevention alarm use is the primary outcome. Balancing measures will include fall rates and fall-related injuries. Implementation outcomes will include feasibility, acceptability, appropriateness, and fidelity.

Discussion

Findings from this line of research could be used to support scale-up of fall prevention alarm de-implementation in other healthcare settings. Further, research generated from this proposal will advance the field of de-implementation science by determining the extent to which low-intensity coaching is an effective and feasible de-implementation strategy.

Trial registration

ClinicalTrials.gov identifier: NCT06089239. Date of registration: October 17, 2023.

Background

Falls among hospitalized patients are common, costly, and serious. Each year, up to one million patient falls occur in US hospitals [1]. Patient falls can lead to harm, including injury and death, and increased healthcare costs for patients, payers, and healthcare systems [2,3,4,5,6]. Given the serious consequences, patient falls and fall-related injuries are considered key quality measures for US hospitals. The National Quality Forum recommends that patient falls and fall-related injuries be used as quality measures for nursing-sensitive care [7, 8]. Many healthcare systems monitor total falls and fall-related injuries through reporting systems, such as the National Database of Nursing Quality Indicators® (NDNQI®) [9, 10]. Fall-related injury is also used by the Centers for Medicare and Medicaid Services (CMS) for payment policy. CMS prohibits reimbursement for fall-related injuries during hospitalization [11, 12] and imposes financial penalties for hospitals with poor performance on fall-related injuries [13]. Therefore, hospitals are under significant financial pressure to reduce falls. In response, hospitals have increasingly relied upon unproven strategies, such as fall prevention alarms, since the passage of CMS’s falls-related payment policies [14].

Fall prevention alarms are commonly used in US hospitals as a fall prevention strategy despite limited evidence of effectiveness. A study examining fall prevention practices among 59 acute care nursing units found that over one-third of observed patients (531 out of 1489) had a fall prevention alarm in the “on” position during an audit conducted by nursing staff [15]. Prior studies estimate that 90–99% of US hospitals use fall prevention alarms [16, 17]. While fall prevention alarm use is common, there is limited evidence to support their use and evidence that alarms may be harmful. Several randomized controlled trials have demonstrated that fall prevention alarms do not reduce patient falls or fall-related injuries and are not a cost-effective strategy for falls prevention [18, 19]. Fall prevention alarms have a high false-positive rate (ranging from 50 to 99%) and contribute to alarm fatigue among healthcare staff [20,21,22,23,24]. Further, fall prevention alarms lead to mobility restriction, agitation, and sleep disturbance among patients [20, 25,26,27]. The harm caused by fall prevention alarms has led CMS to restrict their use in nursing homes [25]. Similar efforts are needed to encourage de-implementation of fall prevention alarms in US hospitals.

Few studies have examined barriers to fall prevention alarm de-implementation in US hospitals. Nursing staff often report pressure from hospital leadership to use fall prevention alarms and fear reprimand or punishment if a patient falls during their shift [23, 28]. However, there is limited research on how to overcome barriers to fall prevention alarm de-implementation. To address this gap, we propose testing the effectiveness and implementation of a fall prevention alarm de-implementation strategy, Alarm with Care, using a stepped-wedge cluster-randomized controlled trial (SW-CRT) among 30 medical or medical-surgical units from nonfederal US acute care hospitals. Our approach will be guided by the Choosing Wisely De-Implementation Framework [29]. In Aim 1, we will identify barriers to fall prevention alarm de-implementation and develop tailored de-implementation strategies for each unit. In Aim 2, we will compare the effectiveness and implementation of high- versus low-intensity coaching to support site-specific de-implementation of fall prevention alarms. We will evaluate implementation and effectiveness outcomes and examine the effect of multi-level (e.g., hospital, unit, and patient) factors on effectiveness and implementation. Fall prevention alarm use is the effectiveness outcome. Changes in fall rates, including total falls and fall-related injuries, will be tracked as balancing measures. Implementation outcomes will include feasibility, acceptability, appropriateness, and fidelity. Findings from this line of research could be used to support scale-up of fall prevention alarm de-implementation in other care settings. Further, research generated from this proposal will advance the field of de-implementation science by determining whether coaching is an effective and feasible strategy to de-implement low-value care.

Methods

Trial design

This hybrid 2 effectiveness-implementation trial will be conducted in two aims. In Aim 1, the study team will employ a mixed-methods approach that includes group concept mapping and focus groups to identify barriers to fall prevention alarm de-implementation and to tailor a core set of de-implementation strategies (education, audit and feedback, and local opinion leaders) to each unit. In Aim 2, we will use a stepped-wedge cluster-randomized trial (SW-CRT) to compare the effectiveness of high- and low-intensity coaching for de-implementing hospital fall prevention alarms using site-specific strategies among 30 medical or medical-surgical units.

Trial registration and funding

This multi-site trial is funded by a grant (R01 AG073408-01A1) from the National Institute on Aging (NIA). This trial has been approved by the Ohio State University Institutional Review Board (study number: 2022B0262). This protocol is reported in accordance with the Standards for Reporting Implementation Studies (StaRI) statement [30] and the SPIRIT guidelines [31]. This trial meets the NIH definition of a clinical trial and was registered on ClinicalTrials.gov (NCT06089239).

Data safety and monitoring

A data safety monitoring board (DSMB) of independent experts in fall prevention, geriatrics, and implementation science will be convened to oversee data and safety for the current study. The DSMB will review the study’s data safety monitoring plan, manual of procedures, study protocol, and informed consent documents. Further, the DSMB will be responsible for reviewing periodic progress reports on the study, including assessments of data quality, participant recruitment, accrual, and retention, and performance at each study site (e.g., intervention adherence). The DSMB will meet at least every 6 to 9 months. Additionally, the study team will monitor study progress, including recruitment, accrual, retention, data quality, and study site performance, on an ongoing basis through biweekly meetings.

Participants

We will target nonfederal acute care hospitals that participate in NDNQI. The following inclusion criteria will be applied: (1) hospitals that have collected, reported, and monitored falls data through the NDNQI, managed by Press Ganey, for the 24 months prior to study enrollment; (2) hospitals that report using fall prevention alarms as a part of their fall prevention protocol; (3) hospitals with an adult medical or medical-surgical unit; (4) completion of a signed commitment letter from two senior hospital administrators verifying willingness to participate in the de-implementation intervention; (5) willingness to participate in de-implementation activities (e.g., coaching sessions) and data collection efforts (e.g., fall prevention alarm reporting); and (6) ability to identify a team leader and additional stakeholders to form an interdisciplinary de-implementation team. We will exclude federal hospitals, specialty hospitals (e.g., rehabilitation facilities), and specialty units (e.g., cardiac units), which may have distinct protocols and procedures related to falls prevention compared with general medical or medical-surgical units.

Participant recruitment

We will employ methods used in our prior studies to recruit NDNQI-participating hospitals [15, 17]. First, key partners from Press Ganey will send an invitation to potentially eligible NDNQI hospitals that describes the study purpose, team, and required research activities. Second, we will host webinars on varying days/times to describe the study purpose, team, and required research activities in more detail and to answer any questions that potential participants may have. Eligible hospitals that agree to participate will be asked to identify one medical or medical-surgical unit to participate.

Overview of de-implementation strategy

Implementation coaching

Coaching is an effective strategy for supporting implementation of evidence-based practices [32, 33]. Coaches often support practice change through additional implementation strategies, such as training and audit and feedback, and can tailor their approach to account for differences in context across settings (e.g., available resources, leadership support) [34]. Coaching is particularly useful for interventions that involve complex change (e.g., change at multiple levels of an organization) and may be well-suited for de-implementation, which may encounter more resistance to change compared with implementation [35].

De-implementation of fall prevention alarms may be affected by several barriers. First, there may be limited awareness that fall prevention alarms are low value given how routinely they are used in hospital settings. Second, the decision to use fall prevention alarms is driven by complex factors (e.g., leadership expectations, fear of reprimand if a patient falls, assessments about patients’ fall risk) [23, 28], requiring deliberate and slow thinking. Habits informed by this type of cognition may be harder to “unlearn” than habits informed by automatic cognition [35]. Third, nurses often work within a hierarchical system and may encounter resistance to change at several levels within hospital settings (e.g., leadership, clinicians from other disciplines). To address these complex barriers, a high-intensity coaching model may be needed that is paired with additional de-implementation strategies including provider education, local opinion leaders, and audit and feedback.

Provider education and local opinion leaders

Given that awareness about the evidence for fall prevention alarms may be low, provider education may be a necessary de-implementation strategy. Prior studies suggest that provider education, coupled with other de-implementation strategies, is particularly effective for de-implementing low-value nursing care [36]. Prior studies have also demonstrated that nurses report low awareness about low-value care and difficulty with assessing the evidence for an intervention [37, 38], suggesting that provider education may be needed to raise awareness about low-value care. For successful nursing education efforts, prior qualitative studies have recommended strategies such as designating an individual to oversee education of nursing teams and ensuring that education efforts target vertical key stakeholders (e.g., frontline staff and nursing administration) and horizontal key stakeholders (e.g., nurses, physicians, and other clinicians who play a role in fall prevention efforts) [36, 39]. Local opinion leaders, who are viewed as credible sources of information, can be effective at helping to disseminate information across and within organizations [40]. Therefore, provider education may be most impactful with support from a local de-implementation team and an opinion leader at each site.

Audit and feedback

While it is important to increase knowledge about low-value care, education alone may not be sufficient to change behavior. Prior studies have shown that increased knowledge about low-value care does not decrease the use of low-value nursing care [41]. Audit and feedback has demonstrated effectiveness in decreasing the use of low-value care through several behavior change strategies (e.g., setting and monitoring goals) [42, 43]. Prior studies suggest that audit and feedback may be optimized by ensuring that comparisons are made with similar organizations (e.g., similar resource levels) and that the delivery of feedback is individualized and supportive rather than punitive [44]. Further, audit and feedback may be more effective when strategies are taken to help reduce barriers to practice change [45]. Therefore, audit and feedback may work optimally as a strategy paired with health coaches who can provide technical assistance with overcoming site-specific barriers to practice change and who have experience using best practices for delivering feedback to healthcare organizations (e.g., grouping like organizations, providing supportive feedback) [46].

Tailoring implementation strategies to each site

The design of each implementation strategy (provider education, local opinion leaders, audit and feedback) may work best if it is tailored to each site. While all sites are likely to encounter a common set of de-implementation barriers (e.g., staff resistance to change), there will likely be site-specific barriers to de-implementation (e.g., hospital protocol encouraging use of fall prevention alarms). Therefore, a tailored approach that considers local barriers at each site may be valuable for de-implementation of fall prevention alarms.

Alarm with Care: a multicomponent de-implementation strategy

Alarm with Care combines each of these strategies (coaching, provider education, local opinion leaders, audit and feedback) using a tailored approach to support fall prevention alarm de-implementation. To support tailoring, units will (1) identify and prioritize barriers using group concept mapping and (2) achieve within-hospital consensus around targeted barriers and implementation strategies through site-specific focus groups.

During the concept mapping, participating team members for each unit will be asked to identify and prioritize barriers to fall prevention alarm de-implementation using the following prompt, “In order to reduce the use of fall prevention alarms on our unit, we would need to [list suggestions].” The study team will refine the statements (e.g., ensure one idea is captured in each statement, remove irrelevant statements), and unit team members will be asked to group similar statements into piles and rate each statement based on feasibility and importance using a 7-point scale. Concept mapping will be facilitated by the Concept Systems Global Max© GroupWisdom™ secure web-based platform [47].

After concept mapping is complete, one focus group will be scheduled per site. During the focus groups, unit members will discuss barriers to fall prevention alarm de-implementation identified during concept mapping. Sites will review the feasibility and importance ratings for identified barriers and come to a consensus on which barriers will be prioritized. Sites will review the core set of de-implementation strategies (e.g., provider education, audit and feedback, local champions) and provide feedback on how these strategies may need to be tailored to their site. Sites will also provide feedback on whether additional strategies may be needed (e.g., patient education) based on the prioritized barriers and will be given the flexibility to select additional implementation strategies if needed. Sites will be asked to come to a consensus on which de-implementation strategies will be included at their site. Focus groups, approximately 60–90 min in length, will be facilitated via videoconference by three study team members (K. T., M. M., L. M.) who have expertise in qualitative methods. Focus groups will be recorded and transcribed verbatim. Data from the concept mapping and focus groups will be used to develop a site-specific de-implementation plan that includes standard operating procedures and a timeline. The plan will be shared with each site, and feedback will be integrated as needed.

Once each site has an established de-implementation plan, hospital units will be randomized to receive either high- or low-intensity coaching services provided by the Helene Fuld National Institute for Evidence-Based Practice in Nursing and Healthcare (i.e., “the Fuld Institute,” located at the Ohio State University College of Nursing). This design was selected to determine whether high-intensity coaching is needed to support successful de-implementation of fall prevention alarms. Coaching sessions will be delivered virtually, and the content will include (1) an introduction to behavior and organizational change theory and principles, (2) implementation facilitation and motivational interviewing for facilitation, (3) building and leveraging teams and opinion leaders, (4) the state of evidence on effective and ineffective fall prevention practices, (5) selection of site-specific de-implementation strategies to address local site barriers, and (6) audit and feedback on fall prevention alarm de-implementation.

Coaching sessions will be led by individuals with a doctorate-level degree who are trained in implementation science. Prior studies have demonstrated the effectiveness of the Fuld Institute coaching services on evidence-based practice implementation including falls prevention [48,49,50]. Prior to the coaching sessions, all groups will receive educational materials on the state of evidence for effective and ineffective fall prevention practices. While the content of the educational materials will be consistent across sites, the method of distribution of these educational materials will be tailored according to local needs. Once the educational materials have been distributed, coaching sessions will commence.

The amount of coaching will depend upon group assignment. Units randomized to the low-intensity group will receive an initial 2-h orientation session and access to monthly office hours for discussion about group progress and troubleshooting barriers. Units randomized to the high-intensity group will receive an initial 4-h orientation; weekly coaching sessions for the first month of implementation and monthly coaching sessions thereafter; access to office hours; and access to “on-call” assistance to troubleshoot problems in real time. Both groups will receive instruction on use of the Fuld Institute Implementation for Sustainability Toolkit [51], which allows sites to create a tailored implementation plan and timeline for de-implementation.

Randomization

The randomization scheme for this SW-CRT is presented in Table 1. Recruitment will be stratified by hospital size (< 300 beds vs. ≥ 300 beds) and teaching status (teaching vs. non-teaching). Hospitals within each of the four strata will be randomly selected from the hospitals that sign up to participate. Within each stratum, hospitals will be matched based on fall prevention alarm use, and matched pairs will be randomly assigned, using a random allocation sequence, to one of three waves and to either the high- or low-intensity coaching condition. All sites will be identified and recruited prior to randomization. The allocation sequence will be concealed until all clusters are enrolled and assigned to the intervention. This design was chosen to ensure equivalence across conditions based on baseline fall prevention alarm use and organizational characteristics that may affect resources available for de-implementation efforts (e.g., hospital size, teaching status). Due to the nature of the intervention, it is not possible to blind study participants or interventionists (e.g., coaches). Outcome assessors will be blinded to units’ assigned study condition.

Table 1 Randomization scheme for stepped-wedge cluster-randomized controlled trial, N = 30 hospital units
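For illustration only, the sketch below shows one way the stratified, matched-pair allocation described above could be generated in code. It reflects a simplified reading of the scheme, not the trial’s actual allocation procedure: the hypothetical unit data, the rule of pairing adjacent units after sorting on baseline alarm use, the split of each pair across coaching conditions, and the unbalanced wave draw are all our own assumptions.

```python
import random

# Hypothetical unit records: stratum = (hospital size, teaching status);
# baseline_alarm_rate stands in for observed fall prevention alarm use.
units = [
    {"id": f"U{i:02d}",
     "stratum": (random.choice(["<300 beds", ">=300 beds"]),
                 random.choice(["teaching", "non-teaching"])),
     "baseline_alarm_rate": round(random.uniform(0.15, 0.55), 2)}
    for i in range(30)
]

def allocate(units, n_waves=3, seed=2023):
    """Stratify, match on baseline alarm use, then randomize pairs to wave and condition."""
    rng = random.Random(seed)
    strata, allocation = {}, []
    for u in units:
        strata.setdefault(u["stratum"], []).append(u)
    for stratum_units in strata.values():
        # Match on baseline alarm use by sorting and pairing adjacent units;
        # an odd leftover unit in a stratum is simply not allocated in this sketch.
        stratum_units.sort(key=lambda u: u["baseline_alarm_rate"])
        for i in range(0, len(stratum_units) - 1, 2):
            pair = stratum_units[i:i + 2]
            wave = rng.randint(1, n_waves)   # pair enters the stepped wedge together
            rng.shuffle(pair)                # split the pair across coaching conditions
            for unit, condition in zip(pair, ["high-intensity", "low-intensity"]):
                allocation.append({**unit, "wave": wave, "condition": condition})
    return allocation

for row in allocate(units):
    print(row["id"], row["stratum"], "wave", row["wave"], row["condition"])
```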

Data collection and measures

Effectiveness outcome

The effectiveness outcome will be measured by assessing the monthly prevalence of fall prevention alarm use using an investigator-developed observation form. The form collects information on the number of observed patients and the number of observed patients with a fall prevention alarm in the “on” position. Observations will be conducted by a nurse data collector, a method that has been used in our prior studies [15, 17, 52].

Balancing measures

We will also measure change in total patient fall rates and fall-related injuries using NDNQI measures, which have been previously validated [53, 54]. Fall rates and fall-related injuries are reported as number of incidents per 1000 patient days. Data on NDNQI measures will be obtained from the NDNQI database managed by Press Ganey. Fall rates and fall-related injuries will be reported at the hospital unit level.
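To make the two quantitative measures above concrete, the short sketch below (with made-up counts) shows the arithmetic these definitions imply as we read them: alarm-use prevalence as the proportion of observed patients with an alarm turned on, and NDNQI fall measures as events per 1000 patient days.

```python
def alarm_prevalence(patients_with_alarm_on: int, patients_observed: int) -> float:
    """Monthly prevalence of fall prevention alarm use on a unit (proportion)."""
    return patients_with_alarm_on / patients_observed

def rate_per_1000_patient_days(events: int, patient_days: int) -> float:
    """NDNQI-style rate: falls or fall-related injuries per 1000 patient days."""
    return events / patient_days * 1000

# Hypothetical unit-month: 8 of 24 observed patients had an alarm in the "on"
# position; 3 falls (1 with injury) occurred over 620 patient days.
print(round(alarm_prevalence(8, 24), 2))             # 0.33
print(round(rate_per_1000_patient_days(3, 620), 2))  # 4.84 falls per 1000 patient days
print(round(rate_per_1000_patient_days(1, 620), 2))  # 1.61 injurious falls per 1000 patient days
```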

Implementation outcomes

Implementation outcomes will be measured by assessing staff perceptions about the feasibility, acceptability, and appropriateness of the de-implementation process (e.g., implementation coaching and other de-implementation strategies) using previously validated measures [55]. Additionally, we will measure staff perceptions about the acceptability and appropriateness of the practice (i.e., whether staff find it agreeable to reduce fall prevention alarm use) using similar measures. Frontline workers responsible for fall prevention (e.g., nurses) and organizational leaders responsible for setting expectations and providing support for falls prevention (e.g., directors of nursing) will be surveyed. Scores will be aggregated at the hospital level. Fidelity will be measured using a checklist documenting which components of the implementation plan are delivered at each site, to ensure consistency across coaching sessions. Intervention adherence will be measured as the percentage of coaching sessions attended and adherence to the implementation plan (e.g., the percentage of core de-implementation strategies completed by each site). Intervention quality will be measured using items developed by the study team that assess satisfaction with coaching services (e.g., content, delivery, facilitator, scheduling/timing).

Additional measures

Data from the NDNQI database and clinician-level surveys will be used to capture multi-level factors that may influence the effectiveness and/or implementation of the de-implementation strategy. Patient-level data will include socio-demographics. Unit-level data will include unit type (e.g., medical or medical-surgical), staffing characteristics (e.g., staffing ratios), and use of fall prevention interventions that may affect fall prevention alarm use (e.g., availability of companions/sitters). Hospital-level data will include hospital characteristics that may affect fall prevention alarm use (e.g., hospital size, teaching status, rural location).

Sample size

The primary outcome of interest in this study is reduction in fall prevention alarm use. To determine an adequate sample size for estimating a clinically significant difference in alarm use, we conducted a simulation study. We defined a 10% reduction in fall prevention alarm use from baseline in the low-intensity coaching condition and a 15% reduction in the high-intensity coaching condition as clinically significant differences. Based on our prior studies, the following assumptions were made for the simulation: (1) an intra-unit correlation of 0.5, (2) an average of 24 patients assessed per unit each month, and (3) a 33% baseline rate of alarm use. Using a mixed logistic regression model, the odds of alarm use in a unit-month were determined by study condition (low intensity, high intensity) and a random unit intercept. After calculating each unit’s probability of alarm use, we conducted a random binomial draw for the number of alarms in use, assuming a 20% attrition rate. Power was defined as the proportion of simulated datasets with P < 0.05 for the intervention effect test. The required sample size was determined to be 30 units (26 units after attrition). The power to detect a 10% difference in the low-intensity condition was 96%.
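The sketch below re-creates the data-generating logic of this simulation under our own simplifying assumptions and is not the study team’s simulation code: it translates the 0.5 intra-unit correlation to a latent logistic random-intercept variance, compares baseline with post-intervention unit-months directly rather than reproducing the full stepped-wedge layout, and uses a crude paired test on unit-level proportions in place of the mixed logistic regression, so its power estimate will not match the 96% reported above.

```python
import math
import random
from statistics import mean, stdev

def simulate_power(n_units=26, months_pre=6, months_post=6, n_obs=24,
                   p_control=0.33, p_intervention=0.23,
                   icc=0.5, n_sims=500, seed=1):
    """Estimate power for a 10-point reduction in alarm use (sketch assumptions only)."""
    rng = random.Random(seed)
    # One common convention: icc = sigma_u^2 / (sigma_u^2 + pi^2 / 3) on the latent scale.
    sigma_u = math.sqrt(icc / (1 - icc) * math.pi ** 2 / 3)
    base_logit = math.log(p_control / (1 - p_control))
    effect = math.log(p_intervention / (1 - p_intervention)) - base_logit

    def inv_logit(x):
        return 1 / (1 + math.exp(-x))

    significant = 0
    for _ in range(n_sims):
        diffs = []
        for _ in range(n_units):
            u = rng.gauss(0, sigma_u)            # random unit intercept
            p_pre = inv_logit(base_logit + u)
            p_post = inv_logit(base_logit + effect + u)
            pre = sum(rng.random() < p_pre for _ in range(months_pre * n_obs))
            post = sum(rng.random() < p_post for _ in range(months_post * n_obs))
            diffs.append(post / (months_post * n_obs) - pre / (months_pre * n_obs))
        # Crude paired comparison of unit-level pre/post proportions, a stand-in
        # for the mixed logistic regression described in the protocol.
        t = mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))
        significant += abs(t) > 1.96             # normal approximation to the critical value
    return significant / n_sims

print(simulate_power())  # power under this sketch's assumptions, not the protocol's 96%
```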

Data analysis plan

Study results will be presented in accordance with the CONSORT extension for SW-CRTs [56]. We will use an intent-to-treat analytic approach. We will compare the characteristics of participating hospitals with those of nonparticipating NDNQI hospitals to examine for potential selection bias. We will compare hospitals that withdrew from the study early with those that completed the study to examine for potential attrition bias. We will use multiple imputation methods to address missing data if necessary.

Odds of alarm use will be modeled as a function of study condition using a generalized logistic mixed model with a random unit intercept to account for potential clustering. A similar approach will be used for comparing implementation outcomes across study conditions. Additional models will be estimated to identify potential correlates of fall prevention alarm use (e.g., patient, unit, hospital characteristics). For example, implementation outcomes (e.g., unit adherence to de-implementation strategy) may be associated with fall prevention alarm use.
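As an illustration of what this modeling step could look like in code, the snippet below fits a logistic model with a random unit intercept to hypothetical unit-month observation data. The variable names, the simulated data, and the choice of statsmodels’ Bayesian binomial mixed GLM are our own assumptions; the protocol does not specify the software or exact model syntax the study team will use.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical long-format data: one row per observed patient, recording the unit,
# coaching condition, and whether a fall prevention alarm was in the "on" position.
rng = np.random.default_rng(0)
rows = []
for j in range(10):
    unit = f"U{j:02d}"
    condition = "high" if j < 5 else "low"
    p = 0.18 if condition == "high" else 0.23                # assumed post-intervention rates
    p = float(np.clip(p + rng.normal(0, 0.05), 0.01, 0.99))  # crude between-unit variation
    for _ in range(24):                                      # ~24 observed patients per unit-month
        rows.append({"unit": unit, "condition": condition,
                     "alarm_on": int(rng.random() < p)})
df = pd.DataFrame(rows)

# Odds of alarm use as a function of study condition, with a random unit intercept,
# in the spirit of the generalized logistic mixed model described above.
model = BinomialBayesMixedGLM.from_formula(
    "alarm_on ~ condition",      # fixed effect of coaching condition
    {"unit": "0 + C(unit)"},     # variance component: random intercept per unit
    df,
)
result = model.fit_vb()          # variational Bayes approximation
print(result.summary())
```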

The data analysis plan will also cover the mixed-methods data collected through group concept mapping and focus groups. For the sorted statements generated through concept mapping, we will create a similarity matrix to quantify the similarity between every possible pair of statements and develop a point map to visually display the similarity between statements. Additional visual displays, such as cluster rating maps and pattern matches, will be developed to display average feasibility and importance ratings and to compare average scores across groups (e.g., nurses’ versus physicians’ ratings of importance). We will use a hybrid approach integrating inductive and deductive coding to analyze focus group data, an approach commonly used in implementation science qualitative methods [57, 58]. We will develop deductive codes based on topics identified in the focus group guide [59] and additional inductive codes based on themes that emerge from the data [60]. Independent coders will code the data and come to a consensus regarding code application.
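As a sketch of the first concept-mapping step described above: from each participant’s sorting of statements into piles, a statement-by-statement similarity matrix can be built by counting how often each pair of statements is placed in the same pile. The pile data below are invented, and the study itself will rely on the GroupWisdom platform’s own routines; this only shows the kind of matrix that would feed the point map and cluster displays.

```python
from itertools import combinations
import numpy as np

# Hypothetical sorts: each participant groups statement IDs (0-5) into piles.
sorts = [
    [{0, 1, 2}, {3, 4}, {5}],
    [{0, 1}, {2, 3, 4}, {5}],
    [{0, 1, 5}, {2, 3}, {4}],
]

n_statements = 6
similarity = np.zeros((n_statements, n_statements), dtype=int)
for piles in sorts:
    for pile in piles:
        for i, j in combinations(sorted(pile), 2):
            similarity[i, j] += 1   # count co-occurrence in the same pile
            similarity[j, i] += 1

print(similarity)
# Downstream, a matrix like this would feed multidimensional scaling (point map)
# and cluster analysis (cluster and cluster rating maps).
```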

Discussion

The overall goal of this study is to examine the effectiveness and implementation of Alarm with Care, a de-implementation strategy to reduce the use of fall prevention alarms in US hospitals. While fall prevention alarms are widely used, there is no evidence demonstrating their effectiveness, and studies suggest alarms may be harmful to healthcare staff and patients [18, 19]. There has been limited study of barriers to fall prevention alarm de-implementation and little development or testing of strategies to reduce fall prevention alarm use. The Alarm with Care strategy seeks to address this research gap by testing a first-in-kind fall prevention alarm de-implementation initiative in US hospitals. Findings from this line of research could be scaled up to additional healthcare settings where fall prevention alarms are used (e.g., specialty hospitals). Further, information from the study will contribute to the growing literature on de-implementation science by determining whether coaching, paired with provider education and feedback, is an effective strategy for reducing low-value care.

Availability of data and materials

Due to collection of sensitive, protected health information, data from this study cannot be made publicly available.

Abbreviations

CMS: Centers for Medicare and Medicaid Services

DSMB: Data Safety Monitoring Board

NDNQI: National Database of Nursing Quality Indicators

NIA: National Institute on Aging

NIH: National Institutes of Health

SW-CRT: Stepped-wedge cluster-randomized trial

US: United States

References

  1. Currie L. Fall and injury prevention. In: Hughes RG, editor. Patient safety and quality: an evidence-based handbook for nurses (prepared with support from the Robert Wood Johnson Foundation) AHRQ Publication NO08-0043. Rockville: Agency for Healthcare Research and Quality; 2008.

  2. LeLaurin JH, Shorr RI. Preventing falls in hospitalized patients: state of the science. Clin Geriatr Med. 2019;35(2):273–83.

  3. Bouldin EL, Andresen EM, Dunton NE, Simon M, Waters TM, Liu M, Daniels MJ, Mion LC, Shorr RI. Falls among adult patients hospitalized in the United States: prevalence and trends. J Patient Saf. 2013;9(1):13–7.

  4. Burns Z, Khasnabish S, Hurley AC, Lindros ME, Carroll DL, Kurian S, Alfieri L, Ryan V, Adelman J, Bogaisky M, et al. Classification of injurious fall severity in hospitalized adults. J Gerontol A Biol Sci Med Sci. 2020;75:e138.

  5. Wong CA, Recktenwald AJ, Jones ML, Waterman BM, Bollini ML, Dunagan WC. The cost of serious fall-related injuries at three midwestern hospitals. Jt Comm J Qual Patient Saf. 2011;37(2):81–7.

  6. Spetz J, Brown DS, Aydin C. The economics of preventing hospital falls: demonstrating ROI through a simple model. J Nurs Adm. 2015;45(1):50–7.

  7. Nursing Sensitive Measures. NQF # 0203, restraint prevalence (vest and limb only). Status: endorsed on: August 05, 2009; Steward (s): The Joint Commission. Washington: National Quality Forum; 2009.

  8. National Voluntary Consensus Standards for Nursing-Sensitive Care. an initial performance measure set: a consensus report. Washington, D.C.: National Quality Forum; 2004.

  9. Montalvo I. The National Database of Nursing Quality Indicators™ (NDNQI®). OJIN: Online J Issues Nurs. 2007;12(3). Manuscript 2. nursingworld.org.

  10. Garrard L, Boyle DK, Simon M, Dunton N, Gajewski B. Reliability and validity of the NDNQI(R) injury falls measure. West J Nurs Res. 2016;38(1):111–28.

  11. Mattie AS, Webster BL. Centers for medicare and medicaid services’ “never events”: an analysis and recommendations to hospitals. Health News. 2008;27(4):338–49.

  12. Inouye SK, Brown CJ, Tinetti ME. Medicare nonpayment, hospital falls, and unintended consequences. N Engl J Med. 2009;360(23):2390–3.

  13. Hospital-Acquired Condition Reduction Program (HACRP) [https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/HAC-Reduction-Program.html]

  14. Fehlberg EA, Lucero RJ, Weaver MT, McDaniel AM, Chandler AM, Richey PA, Mion LC, Shorr RI. Impact of the CMS no-pay policy on hospital-acquired fall prevention related practice patterns. Innov Aging. 2017;1(3):igx036–igx036.

  15. Staggs VS, Turner K, Potter C, Cramer E, Dunton N, Mion LC, Shorr RI. Unit-level variation in bed alarm use in US hospitals. Res Nurs Health. 2020;43(4):365–72.

  16. Shever LL, Titler MG, Mackin ML, Kueny A. Fall prevention practices in adult medical-surgical nursing units described by nurse managers. West J Nurs Res. 2011;33(3):385–97.

  17. Turner K, Staggs V, Potter C, Cramer E, Shorr R, Mion LC. Fall prevention implementation strategies in use at 60 United States hospitals: a descriptive study. BMJ Qual Saf. 2020;29:1000.

  18. Sahota O, Drummond A, Kendrick D, Grainge MJ, Vass C, Sach T, Gladman J, Avis M. REFINE (REducing Falls in In-patieNt Elderly) using bed and bedside chair pressure sensors linked to radio-pagers in acute hospital care: a randomised controlled trial. Age Ageing. 2014;43(2):247–53.

  19. Shorr RI, Chandler AM, Mion LC, Waters TM, Liu M, Daniels MJ, Kessler LA, Miller ST. Effects of an intervention to increase bed alarm use to prevent falls in hospitalized patients: a cluster randomized trial. Ann Intern Med. 2012;157(10):692–9.

  20. Schoen MW, Cull S, Buckhold FR. False bed alarms: a teachable moment. JAMA Intern Med. 2016;176:741.

  21. Capezuti E, Brush BL, Lane S, Rabinowitz HU, Secic M. Bed-exit alarm effectiveness. Arch Gerontol Geriatr. 2009;49(1):27–31.

  22. Hughes RG, editor. Patient safety and quality: an evidence-based handbook for nurses. Rockville: Agency for Healthcare Research and Quality (US); 2008. PMID: 21328752.

  23. Fehlberg EA, Cook CL, Bjarnadottir RI, McDaniel AM, Shorr RI, Lucero RJ. Fall prevention decision making of acute care registered nurses. J Nurs Adm. 2020;50(9):442–8.

  24. Brusco NK, Hutchinson AM, Mitchell D, Jellett J, Boyd L, Webb-St Mart M, Raymond M, Clayton D, Farley A, Botti M, et al. Mobilisation alarm triggers, response times and utilisation before and after the introduction of policy for alarm reduction or elimination: a descriptive and comparative analysis. Int J Nurs Stud. 2021;117: 103769.

  25. State Operations Manual: Appendix PP - guidance to surveyors for long term care facilities (Rev. 173, 11–22–17) [https://www.cms.gov/Medicare/Provider-Enrollment-and-Certification/GuidanceforLawsAndRegulations/Nursing-Homes.html]

  26. Delaney LJ, Currie MJ, Huang HC, Lopez V, Van Haren F. “They can rest at home”: an observational study of patients’ quality of sleep in an Australian hospital. BMC Health Serv Res. 2018;18(1):524.

  27. Radecki B, Reynolds S, Kara A. Inpatient fall prevention from the patient’s perspective: a qualitative study. Appl Nurs Res. 2018;43:114–9.

  28. King B, Pecanac K, Krupp A, Liebzeit D, Mahoney J. Impact of fall prevention on nurses and care of fall risk patients. Gerontologist. 2018;58(2):331–40.

  29. Grimshaw JM, Patey AM, Kirkham KR, Hall A, Dowling SK, Rodondi N, Ellen M, Kool T, van Dulmen SA, Kerr EA, et al. De-implementing wisely: developing the evidence base to reduce low-value care. BMJ Qual Saf. 2020;29(5):409–17.

  30. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, Rycroft-Malone J, Meissner P, Murray E, Patel A, et al. Standards for Reporting Implementation Studies (StaRI) statement. BMJ. 2017;356: i6795.

  31. Chan AW, Tetzlaff JM, Altman DG, Laupacis A, Gøtzsche PC, Krleža-Jerić K, Hróbjartsson A, Mann H, Dickersin K, Berlin JA, et al. SPIRIT 2013 statement: defining standard protocol items for clinical trials. Ann Intern Med. 2013;158(3):200–7.

  32. Walunas TL, Ye J, Bannon J, Wang A, Kho AN, Smith JD, Soulakis N. Does coaching matter? Examining the impact of specific practice facilitation strategies on implementation of quality improvement interventions in the Healthy Hearts in the Heartland study. Implement Sci. 2021;16(1):33.

  33. Ballengee LA, Rushton S, Lewinski AA, Hwang S, Zullig LL, Ricks KAB, Ramos K, Brahmajothi MV, Moore TS, Blalock DV, et al. Effectiveness of quality improvement coaching on process outcomes in health care settings: a systematic review. J Gen Intern Med. 2022;37(4):885–99.

  34. Penney LS, Bharath PS, Miake-Lye I, Leng M, Olmos-Ochoa TT, Finley EP, Chawla N, Barnard JM, Ganz DA. Toolkit and distance coaching strategies: a mixed methods evaluation of a trial to implement care coordination quality improvement projects in primary care. BMC Health Serv Res. 2021;21(1):817.

  35. Helfrich CD, Rose AJ, Hartmann CW, van Bodegom-Vos L, Graham ID, Wood SJ, Majerczyk BR, Good CB, Pogach LM, Ball SL, et al. How the dual process model of human cognition can inform efforts to de-implement ineffective and harmful clinical practices: a preliminary model of unlearning and substitution. J Eval Clin Pract. 2018;24(1):198–205.

  36. Rietbergen T, Spoon D, Brunsveld-Reinders AH, Schoones JW, Huis A, Heinen M, Persoon A, van Dijk M, Vermeulen H, Ista E, et al. Effects of de-implementation strategies aimed at reducing low-value nursing procedures: a systematic review and meta-analysis. Implement Sci. 2020;15(1):38.

  37. Bourgault AM, Upvall MJ. De-implementation of tradition-based practices in critical care: a qualitative study. Int J Nurs Pract. 2019;25(2): e12723.

  38. Bourgault AM, Upvall MJ, Nicastro S, Powers J. Challenges of de-implementing feeding tube auscultation: a qualitative study. Int J Nurs Pract. 2022;28(2): e13026.

  39. Hartmann CW, Gillespie C, Sayre GG, Snow AL. De-implementing and sustaining an intervention to eliminate nursing home resident bed and chair alarms: interviews on leadership and staff perspectives. Implement Sci Commun. 2021;2(1):91.

  40. Flodgren G, O’Brien MA, Parmelli E, Grimshaw JM. Local opinion leaders: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2019;6(6):CD000125.

  41. Bourgault AM, Powers J, Aguirre L, Hines RB, Sebastian AT, Upvall MJ. National survey of feeding tube verification practices: an urgent call for auscultation deimplementation. Dimens Crit Care Nurs. 2020;39(6):329–38.

  42. Colla CH, Mainor AJ, Hargreaves C, Sequist T, Morden N. Interventions aimed at reducing use of low-value health services: a systematic review. Med Care Res Rev. 2017;74(5):507–50.

  43. Davey P, Brown E, Fenelon L, Finch R, Gould I, Hartman G, et al. Interventions to improve antibiotic prescribing practices for hospital inpatients. Cochrane Database Syst Rev. 2005;(4):CD003543. https://doi.org/10.1002/14651858.CD003543.pub2. Update in: Cochrane Database Syst Rev. 2013;4:CD003543.

  44. Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;(6):CD000259. https://doi.org/10.1002/14651858.CD000259.pub3. PMID: 22696318.

  45. Foy R, Skrypak M, Alderson S, Ivers NM, McInerney B, Stoddart J, Ingham J, Keenan D. Revitalising audit and feedback to improve patient care. BMJ. 2020;368: m213.

  46. Hysong SJ, Best RG, Pugh JA. Audit and feedback and clinical practice guideline adherence: making feedback actionable. Implement Sci. 2006;1:9.

  47. Group Wisdom [https://groupwisdom.com/groupwisdom?gclid=CjwKCAjwjOunBhB4EiwA94JWsASE_PatDXa0hyNEXXUqwceXRNNj39o5zqUClt4F5OnV8OIK66wWJBoC6EEQAvD_BwE]

  48. Gallagher-Ford L, Koshy Thomas B, Connor L, Sinnott LT, Melnyk BM. The effects of an intensive evidence-based practice educational and skills building program on EBP competency and attributes. Worldviews Evid Based Nurs. 2020;17(1):71–81.

  49. Gorsuch CRPF, Gallagher Ford L, Koshy Thomas B, Melnyk BM, Connor L. Impact of a formal educational skill-building program based on the ARCC model to enhance evidence-based practice competency in nurse teams. Worldviews Evid Based Nurs. 2020;17(4):258–68.

  50. Melnyk BM, Gallagher-Ford L, Zellefrow C, Tucker S, Van Dromme L, Thomas BK. Outcomes from the first Helene Fuld Health Trust National Institute for Evidence-Based Practice in Nursing and Healthcare Invitational Expert Forum. Worldviews Evid Based Nurs. 2018;15(1):5–15.

  51. McNett M, Gorsuch PF, Gallagher-Ford L, Thomas B, Mazurek Melnyk B, Tucker S. Development and evaluation of the fuld institute evidence-based implementation and sustainability toolkit for health care settings. Nurs Adm Q. 2023;47(2):161–72.

  52. Turner K, Staggs VS, Potter C, Cramer E, Shorr RI, Mion LC. Fall prevention practices and implementation strategies: examining consistency across hospital units. J Patient Saf. 2022;18(1):e236–42.

  53. Simon MB, Gajewski B, Garrard L. Injury fall reliability and validity study: study 3. Kansas City: National Database of Nursing Quality Indicators; 2010.

  54. Simon MK, Gajewski S, Dunton B. Falls reliability study: final report. Kansas City: National Database of Nursing Quality Indicators; 2010.

  55. Weiner BJ, Amick H, Lee SY. Conceptualization and measurement of organizational readiness for change: a review of the literature in health services research and other fields. Med Care Res Rev. 2008;65(4):379–436.

  56. Hemming K, Taljaard M, Grimshaw J. Introducing the new CONSORT extension for stepped-wedge cluster randomised trials. Trials. 2019;20(1):68.

  57. Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019;280: 112516.

  58. Fereday J, Muir-Cochrane E. Demonstrating rigor using thematic analysis: a hybrid approach of inductive and deductive coding and theme development. Int J Qual Methods. 2006;5(1):80–92.

  59. Boyatzis RE. Transforming qualitative information: thematic analysis and code development. Thousand Oaks: Sage Publications, Inc; 1998.

  60. Crabtree B, Miller W. A template approach to text analysis: developing and using codebooks. In: Crabtree B, Miller W, editors. Doing qualitative research. Newbury Park: Sage Publications Inc; 1999. p. 163–77.

Acknowledgements

Not applicable.

Funding

This study was funded by a grant (R01 AG073408-01A1) from the National Institute on Aging (NIA).

Author information

Contributions

KT, conceptualization, methodology, and writing—original draft. MM, conceptualization, methodology, and writing—review and editing. CP, data curation, resources, and writing—review and editing. EC, formal analysis and writing—review and editing. MAT, resources, project administration, and writing—review and editing. RS, conceptualization, methodology, project administration, funding acquisition, and writing—review and editing. LCM, conceptualization, methodology, project administration, funding acquisition, and writing—review and editing.

Corresponding author

Correspondence to Kea Turner.

Ethics declarations

Ethics approval and consent to participate

This trial has been approved by the Ohio State University Institutional Review Board (study number: 2022B0262).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Turner, K., McNett, M., Potter, C. et al. Alarm with care—a de-implementation strategy to reduce fall prevention alarm use in US hospitals: a study protocol for a hybrid 2 effectiveness-implementation trial. Implementation Sci 18, 70 (2023). https://doi.org/10.1186/s13012-023-01325-9
