
Assessing the Veterans Health Administration’s response to intimate partner violence among women: protocol for a randomized hybrid type 2 implementation-effectiveness trial

Abstract

Background

Intimate partner violence (IPV) against women in the United States (US) remains a complex public health crisis. Women who experience IPV are among the most vulnerable patients seen in primary care. Screening increases the detection of IPV and, when paired with appropriate response interventions, can mitigate the health effects of IPV. The Department of Veterans Affairs (VA) has encouraged evidence-based IPV screening programs since 2014, yet adoption is modest and questions remain regarding the optimal ways to implement these practices, which are not yet available within the majority of VA primary care clinics.

Methods/design

This paper describes the planned evaluation of VA’s nationwide implementation of IPV screening programs in primary care clinics through a randomized hybrid type 2 implementation-effectiveness trial. With the support of our VA operational partners, we propose a stepped wedge design to compare the impact of two implementation strategies of differing intensities (toolkit + implementation as usual vs. toolkit + implementation facilitation) and to investigate the clinical effectiveness of IPV screening programs. Using balanced randomization, 16–20 VA Medical Centers will be assigned to receive implementation facilitation in one of three waves, with implementation support lasting 6 months. Implementation facilitation in this effort consists of the coordinated efforts of two types of facilitators: external and internal. Implementation facilitation is compared to dissemination of a toolkit plus implementation as usual. We propose a mixed methods approach to collect quantitative (clinical records data) and qualitative (key informant interviews) implementation outcomes, as well as quantitative (clinical records data) clinical effectiveness outcomes. We will supplement these data collection methods with provider surveys to assess discrete implementation strategies used before, during, and following implementation facilitation. The integrated-Promoting Action on Research Implementation in Health Services (i-PARIHS) framework will guide the qualitative data collection and analysis. Summative data will be analyzed using the Reach Effectiveness Adoption Implementation Maintenance (RE-AIM) framework.

Discussion

This research will advance national VHA efforts by identifying the practices and strategies useful for enhancing the implementation of IPV screening programs, thereby ultimately improving services for and health of women seen in primary care.

Trial registration

NCT04106193. Registered on 23 September 2019.


Background

Intimate partner violence (IPV) against women, defined as psychological, physical, and sexual aggression from a past or current intimate partner, is a complex public health problem. Although men also experience IPV, women are more likely to experience severe violence and to face more physical and mental health-related impacts [1, 2]. Nearly 7 million women are physically assaulted, raped, or stalked by an intimate partner in the USA annually [1], and IPV is strongly associated with poorer physical, psychological, and social health [3, 4]. Physical health problems range from injuries directly caused by physical and sexual assaults, to other chronic nervous system, cardiovascular, and reproductive conditions [5, 6]. Psychological problems include posttraumatic stress disorder, depression, anxiety, substance abuse, and suicidality [5,6,7]. Women who experience IPV are 2.4 times more likely to attempt suicide than those who do not experience IPV [8]. Social consequences include homelessness, financial insecurity, and unemployment [9,10,11,12]. The burden of IPV on women and society underscores the need for a feasible and effective health care response [13, 14].

Women who experience IPV present frequently to primary care [15,16,17], which is recognized as an ideal setting for safely identifying women who experience IPV and offering them resources and referrals to health and social services [18]. The United States Preventive Services Task Force [18] and recent research have found evidence that routine screening—paired with an appropriate response to disclosure—can reduce IPV and physical and mental health harms in women of childbearing age [19, 20]. One study found that women who talked to a provider about IPV were four times more likely to use an intervention and 2.6 times more likely to exit the relationship [21].

Routine screening for IPV in primary care is important for the US Department of Veterans Affairs (VA) medical centers, as nearly one in five (19%) Women Veterans (WVs) seen in primary care have experienced IPV in the past year [17] and IPV is more prevalent among WVs compared to women who have not served in the military [22]. Screening WVs for IPV is vital because formative research demonstrates that WVs want to be asked and are more likely to disclose when asked directly [23, 24]. Thus, VA recommends an evidence-based, trauma-informed, and patient-centered approach to IPV screening programs, including respect for patient privacy and autonomy, with an emphasis on three components: IPV screening on an annual basis, brief risk assessment and provision of resources for WVs who report experiencing IPV, and psychosocial service referrals (when desired by patients) [25].

A growing number of VA-based women-specific primary care clinics have established IPV screening practices consistent with these recommendations [26, 27]. Yet about two thirds of WVs accessing primary care services in VA do so in mixed-gender clinics or in clinics that share space with clinics that predominantly treat men; these clinics have been slower to adopt evidence-based IPV screening practices. As such, there is a disparity in access to IPV screening programs for women seen in mixed-gender settings. To meet this need, in 2018, the study investigators launched a partnership with VA’s Office of Women’s Health Services and the IPV Assistance Program of Care Management and Social Work Services to evaluate the implementation of IPV screening practices for WVs in mixed-gender (model 1) and shared-space (model 2) primary care clinics. In this manuscript, we describe the recently funded randomized program evaluation trial stemming from these efforts.

Methods

Overview

We have designed a stepped wedge hybrid type 2 implementation-effectiveness cluster randomized trial [28] to investigate both implementation and clinical effectiveness outcomes associated with the rollout of evidence-based IPV screening programs for WVs in VA-based mixed-gender and shared-space primary care clinics. We have received approval from the VA Boston Healthcare System’s Institutional Review Board (IRB) for all study procedures to evaluate this operations-led effort. The integrated-Promoting Action on Research Implementation in Health Services (i-PARIHS) framework [29] will guide the qualitative data collection and analysis. Summative data will be analyzed using the Reach Effectiveness Adoption Implementation Maintenance (RE-AIM) framework [30].

Aims

For this study, we will compare two distinct implementation strategies to support the uptake of IPV screening programs. First, all sites will receive a toolkit designed by the IPV Assistance Program, to be distributed to local staff responsible for encouraging IPV screening in primary care (toolkit plus implementation as usual [IAU]). Second, consistent with our stepped wedge design, each site will be assigned (in a staggered fashion) to cross over during the study to more intensive implementation support in the form of implementation facilitation [31]. Implementation facilitation will entail external Office of Women’s Health Services experts working directly with the local primary care staff at participating sites to encourage IPV screening uptake and will also involve toolkit dissemination (toolkit plus implementation facilitation).

An additional aim of the study will be to compare two IPV screening tools. Specifically, a 5-item screener has been validated to detect probable IPV in primary care settings [9, 32], but feedback from the field to Women's Health Services leadership suggests the length of the 5-item screener may represent an implementation barrier in busy practices. Thus, as a secondary aim, we will compare the utility of the 5-item screener to a 1-item screener. Consistent with the RE-AIM Framework, the proposed research has the following specific aims:

  1. Estimate the degree of reach, adoption, implementation fidelity, and maintenance achieved using two different implementation strategies (toolkit + IAU vs. toolkit + implementation facilitation) (implementation aim)

  2. Evaluate the clinical effectiveness of IPV screening programs, as evidenced by disclosure rates and post-screening psychosocial service use (i.e., social work and mental health services uptake) (clinical effectiveness aim)

     2a. Compare the clinical effectiveness of two IPV screening tools (5-item vs. 1-item screener) in terms of disclosure rates and post-screening psychosocial service use

  3. Identify multi-level barriers to and facilitators of IPV screening program implementation and sustainment

Stepped wedge trial design

We will use a stepped wedge controlled trial design [33], such that all sites will start with the less intensive implementation strategy (toolkit + IAU) before receiving the more intensive implementation support (toolkit + implementation facilitation) in a staggered fashion (Fig. 1). Stepped wedge designs have their roots in balanced incomplete block designs [34]. A stepped wedge has the advantage of minimizing burden on implementation support personnel because start times are staggered, with all sites ultimately receiving the implementation facilitation strategy. We aim to recruit 16–20 VA Medical Centers (VAMCs) as sites, with about six sites assigned to each of three waves whose start times are staggered by 6-month intervals. All sites will start with 3 months in the less intensive toolkit + IAU condition to allow the collection of baseline data. When each site switches to the more intensive intervention, it will receive 6 months of active implementation facilitation followed by 6 months of step-down.

Fig. 1 Stepped wedge design and approximate timing of data collection activities (light gray cells denote toolkit + implementation as usual [IAU]; dark gray cells denote implementation facilitation [IF] support)
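
To make the design concrete, the sketch below builds the wave-by-study-month condition matrix implied by the schedule described above and shown in Fig. 1. The 24-month horizon and exact month offsets are illustrative assumptions consistent with the text (3-month baseline, 6-month stagger between waves, 6 months of active facilitation, 6 months of step-down), not the study's actual calendar.

```python
# Illustrative sketch of the stepped wedge schedule (assumed month offsets).
import pandas as pd

N_MONTHS = 24
wave_start = {1: 3, 2: 9, 3: 15}  # assumed month at which each wave crosses over to IF

def condition(month: int, start_if: int) -> str:
    """Return the implementation condition a wave is in at a given study month."""
    if month < start_if:
        return "toolkit + IAU"
    if month < start_if + 6:
        return "toolkit + IF (active)"
    if month < start_if + 12:
        return "toolkit + IF (step-down)"
    return "sustainment"

schedule = pd.DataFrame(
    {f"wave {w}": [condition(m, start) for m in range(N_MONTHS)]
     for w, start in wave_start.items()},
    index=pd.Index(range(N_MONTHS), name="study month"),
)
print(schedule.iloc[::3])  # show every third month for a compact view
```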

Our stepped wedge design is novel due to the inclusion of two types of sites in each wave. This design will accommodate aim 2a above, in that sites within each wave will be assigned to use either the 5-item or the 1-item IPV screener. This structure, however, creates novel challenges in assigning sites to study waves. Specifically, we had previously developed a balancing algorithm to minimize imbalance on key facility-level characteristics between waves in a stepped wedge while retaining some of the benefits of randomization [35, 36]. For the current study, we incorporated our partners’ input in developing a similar algorithm featuring different facility-level variables of contextual relevance to IPV screening program implementation in primary care (Table 1). However, for the current study, we also need to ensure balance between the sites assigned to the 5-item vs. 1-item screener. Thus, our balancing algorithm will identify the least imbalanced combinations of site assignments across the three study waves and the two screener types on the variables included in Table 1. We will then randomly select from among the 2% least imbalanced site assignment combinations.

Table 1 Variables in site assignment balancing algorithm
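
The sketch below illustrates one way a balancing step like the one described above could work: score many candidate assignments of sites to the three waves and two screener types, keep the 2% least imbalanced, and select one at random. The facility covariates, the imbalance metric, and the candidate-sampling scheme are hypothetical placeholders for illustration; they are not the study's actual algorithm or the Table 1 variables.

```python
# Illustrative balancing sketch with hypothetical facility covariates.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical facility-level covariates for 18 candidate VAMCs
sites = pd.DataFrame({
    "annual_wv_volume": rng.integers(500, 5000, 18),
    "baseline_screening_rate": rng.uniform(0.0, 0.3, 18),
    "ipv_coordinator_fte": rng.uniform(0.0, 1.0, 18),
})
z = (sites - sites.mean()) / sites.std()  # standardize so covariates contribute comparably

def imbalance(waves, screeners):
    """Sum, over covariates, of the spread of group means across the
    wave-by-screener cells (lower = more balanced)."""
    df = z.assign(wave=waves, screener=screeners)
    return sum(float(df.groupby(["wave", "screener"])[c].mean().std()) for c in z.columns)

# Sample candidate assignments: 6 sites per wave, screener types alternating
candidates = []
for _ in range(2000):
    order = rng.permutation(len(sites))
    waves = np.empty(len(sites), dtype=int)
    screeners = np.empty(len(sites), dtype=int)
    for position, site in enumerate(order):
        waves[site] = position // 6      # wave 0, 1, or 2
        screeners[site] = position % 2   # 0 = 5-item, 1 = 1-item screener
    candidates.append((imbalance(waves, screeners), waves, screeners))

# Retain the 2% least-imbalanced assignments, then pick one at random
candidates.sort(key=lambda c: c[0])
top = candidates[: max(1, len(candidates) // 50)]
score, chosen_waves, chosen_screeners = top[rng.integers(len(top))]
print(f"selected assignment; imbalance score = {score:.3f}")
```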

Implementation procedures

Toolkit + IAU

All participating VAMCs will initially receive a toolkit, developed by the IPV Assistance Program, to support the uptake of IPV screening programs. The toolkit will include the following tools to encourage the adoption and tailoring of IPV screening programs at individual facilities: (1) VHA’s recommendations for IPV screening programs; (2) VHA’s protocol for screening, response, and referrals; (3) templated notes for screening, response, and disposition; (4) resources to guide the clinical response following screening (e.g., IPV brochure, risk assessments, and safety planning tool) and community resources; (5) documentation guidance addressing issues of privacy and safety; and (6) training tools (e.g., PowerPoint slides). In addition, as a condition for participation, all sites must designate a local staff person (e.g., a member of the women’s health Patient Aligned Care Team (PACT)) responsible for launching IPV screening for WVs in primary care at that site. This designated local staff person will fill the internal facilitator role once the more intensive toolkit + implementation facilitation condition begins, as described next.

Toolkit + implementation facilitation

Based on our stepped wedge design, all participating VAMCs will be assigned to start receiving implementation facilitation support in wave 1, 2, or 3; this implementation facilitation will be tailored to support either the 5-item or 1-item screener, as appropriate (Fig. 1). Implementation facilitation is defined as a multifaceted process of enabling and supporting individuals and groups and has been widely used in primary care settings as an umbrella strategy for overcoming barriers and leveraging strengths to foster implementation of evidence-based interventions and care delivery models [31, 37, 38]. It involves the coordinated efforts of two types of facilitators: external and internal [39]. The core components of implementation facilitation have been specified and will be assessed during the proposed study [39].

External facilitators are located outside of the local VAMC implementing the innovation and provide high-level implementation expertise and support. External facilitation will be provided by two Office of Women’s Health Services external facilitators who are clinical experts in IPV and have completed an intensive implementation facilitation training through VHA’s Behavioral Health Quality Enhancement Research Initiative (QUERI) Program’s Implementation Facilitation Training Hub. Operational partners in the VHA IPV Assistance Program provide additional consultation to the Office of Women’s Health Services consultants. Internal facilitators are located within the VAMC and provide boots-on-the-ground knowledge to assist with the implementation. These internal facilitators will collaborate closely with the external facilitators (e.g., through regular email correspondence and a minimum of monthly phone meetings) to support the local implementation of the core components of IPV screening programs. Internal facilitators are most often expected to be a member of a women’s health Patient Aligned Care Team (i.e., a women’s health physician, nurse, and/or medical director) working in collaboration with the facility’s IPV Assistance Program Coordinator. Congress recently funded these coordinator positions nationwide, paving the way for these individuals to play a vital role in supporting primary care clinics in implementing IPV screening programs.

Quantitative evaluation of the implementation outcomes

We propose to use RE-AIM as an evaluation framework to guide our quantitative analyses [30]. RE-AIM examines five dimensions: reach into the target population, effectiveness of the intervention, adoption by the setting, implementation fidelity, and maintenance (i.e., degree of sustainment over time). Of these, one (effectiveness of the intervention) pertains to clinical effectiveness (aim 2), while the remaining four relate to implementation (aims 1 and 3). Thus, RE-AIM is ideal for structuring the proposed hybrid implementation-effectiveness evaluation. Table 2 describes the specific outcome measures and data sources for each RE-AIM dimension. Given our hybrid type 2 design, we are equally interested in implementation and clinical effectiveness outcomes. For this program evaluation, our primary implementation outcome will be reach (i.e., the proportion of WVs eligible for IPV screening who receive the screening). Our primary clinical effectiveness outcomes will be disclosure rates and post-screening psychosocial care use (i.e., the proportion of women with a positive IPV screen who accept a referral and use psychosocial services in the ensuing 2 months).

Table 2 RE-AIM guides evaluation of the impact and clinical effectiveness of IPV screening programs

We will extract relevant administrative and clinical data for all VAMCs from the VA Corporate Data Warehouse (CDW) for all five dimensions of RE-AIM (see Table 2 for details). The templated note or clinical reminder is required for the study and includes health factors for IPV screening (i.e., disclosure), response (i.e., resources provided), and referrals offered (i.e., accepted or declined IPV Assistance Program Coordinator referral; accepted or declined psychosocial services referral). Psychosocial care use, and time to first post-screening psychosocial care use, will also come from the CDW based on clinic stop codes for social work, psychology, psychiatry, primary care mental health integration, drug or alcohol treatment, and housing services.

Our operational partners are interested in testing whether a brief 1-item screener can be as effective as a more comprehensive (but lengthier) 5-item screener, a modified Hurt, Insult, Threaten, Scream (HITS) tool [40], which has been validated for use with the WV population [9, 32]. The tool asks individuals to indicate how often in the past year a current or past partner has done any of the following: “insulted or talked down to you,” “screamed or cursed at you,” “threatened you with harm,” “physically hurt you,” or “pressured or forced you to have sexual activities.” Response options for each item range from 1 (never) to 5 (frequently). Endorsement of any item indicates a positive screen. A more efficient screening tool could potentially enhance the uptake of IPV screening programs. However, prior to formal adoption of a 1-item tool, it is critical to evaluate the effectiveness of the tool in eliciting IPV disclosures (sub-aim 2a). The 1-item screener was developed by a panel of IPV and women’s health clinical experts and researchers and consists of “In the past 12 months, have you experienced insulting, screaming, threatening, hitting, or unwanted sexual activity by a former or current partner?”
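
As a simple illustration of the scoring rules described above, the sketch below treats any 5-item response above 1 (“never”) as an endorsement yielding a positive screen, and a “yes” on the single item as a positive screen; this interpretation of “endorsement,” along with the function names, is an assumption for illustration rather than the study's templated-note logic.

```python
# Illustrative scoring sketch for the two screeners (assumed interpretation of "endorsement").
from typing import Sequence

def positive_5item(responses: Sequence[int]) -> bool:
    """Modified HITS: five items scored 1 (never) to 5 (frequently);
    any response above 1 is treated here as an endorsement (positive screen)."""
    if len(responses) != 5 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected five responses, each between 1 and 5")
    return any(r > 1 for r in responses)

def positive_1item(response_yes: bool) -> bool:
    """Single yes/no item covering the same behaviors over the past 12 months."""
    return bool(response_yes)

# Example: "screamed or cursed at you" endorsed as rarely (2) -> positive screen
print(positive_5item([1, 2, 1, 1, 1]))  # True
print(positive_1item(False))            # False
```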

We will use repeated measures generalized estimating equations (GEE) [41,42,43] analyses to address our study aims while controlling for site characteristics (between-site effects) and calendar time, which reflects secular trends (within-site effects). GEE quantifies and apportions the variance in outcomes among relevant factors, thus isolating the change in outcome due to the primary contrast of interest. It extends the traditional general linear model with a continuous outcome (e.g., linear regression) to accommodate binary outcomes for each subject and count outcomes for each site. GEE also accommodates repeated measures (within-subject correlation), random effects (subject), moderate imbalance among independent factors (sites), and various types of missing data. We will include relevant demographic and clinical variables at the patient level (e.g., age, mental health diagnoses, recent psychosocial service use) as covariates. Furthermore, GEE will allow us to explore the results for patterns of unequal variance, relevant correlation structures, and variance component models to ensure that our results are robust. Our planned sample size will also support the exploration of many site-specific effects by adding site-interaction terms to the model. This method will also allow us to explore the study outcomes using a nested approach (WVs within VAMCs within study conditions). We will use the same GEE-based approach for each of the five RE-AIM outcome domains included in Table 2 above. Specifically, each primary outcome listed in Table 2 can be expressed as a proportion, so columns for numerators and denominators are included in the table. Analyses in each of these domains within the GEE framework will consist of logistic regression with random effects.
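
For illustration, the sketch below fits the kind of GEE logistic model described above using the Python statsmodels library, with sites as clusters and an exchangeable working correlation. The simulated data and column names (condition, calendar_time, age, mh_dx, screened) are placeholders rather than the study's CDW variables, and the model is a simplified marginal GEE rather than the full planned specification.

```python
# Illustrative GEE logistic model of a binary RE-AIM outcome, clustered by site.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "site": rng.integers(0, 18, n),            # 18 hypothetical VAMCs
    "condition": rng.integers(0, 2, n),        # 0 = toolkit + IAU, 1 = toolkit + IF
    "calendar_time": rng.integers(0, 8, n),    # study quarter, to capture secular trends
    "age": rng.normal(45, 12, n),
    "mh_dx": rng.integers(0, 2, n),            # any mental health diagnosis
})
# Simulate the binary outcome (whether the patient was screened)
logit = -2.0 + 0.6 * df["condition"] + 0.05 * df["calendar_time"] + 0.4 * df["mh_dx"]
df["screened"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.gee(
    "screened ~ condition + calendar_time + age + mh_dx",
    groups="site",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),   # within-site correlation
)
result = model.fit()
print(result.summary())
print("facilitation vs. IAU odds ratio:", float(np.exp(result.params["condition"])))
```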

We will repeat these analyses to compare the clinical effectiveness of the 5-item and 1-item tools as well. To evaluate the extent to which IPV screening programs facilitate timely access to psychosocial services, we will compare the median number of days to the first new psychosocial service visit following a positive screen for women who accepted referrals during and after the toolkit + implementation facilitation period. In sum, our GEE analyses will allow us to determine the extent to which the quantitative outcome measures described in Table 2 change relative to the period prior to implementation facilitation along two dimensions: (1) during and after the implementation period and (2) based on the use of the 5-item vs. 1-item screener.
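
As a small illustration of the timeliness comparison, the sketch below computes the median number of days from a positive screen to the first new psychosocial visit, grouped by implementation period; the data frame columns and values are hypothetical, not CDW field names.

```python
# Illustrative median days-to-first-visit comparison (hypothetical data).
import pandas as pd

visits = pd.DataFrame({
    "period": ["during_IF", "during_IF", "post_IF", "post_IF"],
    "screen_date": pd.to_datetime(["2021-01-05", "2021-02-10", "2021-09-01", "2021-09-15"]),
    "first_psychosocial_visit": pd.to_datetime(["2021-01-20", "2021-03-01", "2021-09-10", "2021-10-30"]),
})
visits["days_to_visit"] = (visits["first_psychosocial_visit"] - visits["screen_date"]).dt.days
print(visits.groupby("period")["days_to_visit"].median())
```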

We will supplement these quantitative data, derived from the CDW, with a 50-item electronic survey administered to all internal facilitators at the pre-facilitation, post-facilitation, and sustainment phases to assess the use of discrete implementation strategies at each site during each study period. The survey assesses 50 of the 73 strategies described by the Expert Recommendations for Implementing Change (ERIC) study [44], which have shown validity when assessed via survey [45]. To reduce participant burden, only the 50 strategies most germane to IPV screening program implementation are queried (e.g., financial incentive strategies were excluded because they are not applicable in VA), and we also inquire about the perceived effectiveness of each strategy on a Likert-type scale. The responses will inform a deeper dive into the use and perceived effectiveness of implementation strategies in subsequent internal facilitator interviews (see the Qualitative evaluation of the implementation outcomes and adaptations made to IPV screening programs section).

Power analyses

Given our hybrid type 2 design, we conducted power analyses for the implementation outcomes (aim 1) and the clinical effectiveness outcomes (aim 2). For aim 1, we specifically estimated power related to the RE-AIM dimension of reach (the proportion of WVs receiving IPV screening under each of the two implementation conditions, toolkit + IAU vs. toolkit + implementation facilitation). Based on previous research examining VA healthcare use among women screened for IPV [46, 47] and our estimated sample size, we are powered above 90% to detect the expected differences between conditions, allowing a type I error rate of 5%. For aim 2, we are also powered above 90% to detect the expected differences in post-screening psychosocial visits, again allowing a type I error rate of 5%.
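
For illustration only, the sketch below shows a generic two-proportion power calculation of the kind described above, using statsmodels. The reach proportions and per-condition sample size are placeholders rather than the study's actual assumptions, and the calculation ignores clustering by site (the design effect), which the study's power analysis would need to account for.

```python
# Illustrative two-proportion power calculation (placeholder inputs; clustering ignored).
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

p_iau, p_if = 0.15, 0.30                       # hypothetical reach under IAU vs. facilitation
effect = proportion_effectsize(p_if, p_iau)    # Cohen's h
analysis = NormalIndPower()

power = analysis.power(effect_size=effect, nobs1=500, alpha=0.05, ratio=1.0)
print(f"power with 500 eligible WVs per condition: {power:.2f}")

# Or solve for the per-condition sample size needed to achieve 90% power
n_needed = analysis.solve_power(effect_size=effect, power=0.90, alpha=0.05, ratio=1.0)
print(f"per-condition n for 90% power: {n_needed:.0f}")
```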

Qualitative evaluation of the implementation outcomes and adaptations made to IPV screening programs

To contextualize our RE-AIM findings and address aim 3, we will conduct interviews with 2–3 key informants at each site, including clinicians, administrators, and internal facilitators. Key informant interviews are an ideal method for the proposed research as they elicit in-depth information from individuals with first-hand knowledge of the factors influencing local IPV screening programs [48]. The interviews will follow a semi-structured interview guide, with open-ended questions and prompts to elicit organic feedback. Interviews will occur following the toolkit + implementation facilitation phase and 1 year later to assess sustainment. We will base our qualitative analyses on the integrated-Promoting Action on Research Implementation in Health Services (i-PARIHS) framework, a determinant implementation framework that will enable us to characterize and explain the ways in which IPV screening programs have and have not been successfully implemented and sustained (aim 3) [29]. Consistent with the core i-PARIHS constructs, our interview guide and qualitative analyses will assess how factors specific to the local clinical context, the IPV screening procedures, the implementation facilitation, and the recipients of the innovation influence the success of the implementation effort, as in our formative research on IPV screening programs [26]. We expect relevant recipients to include staff within the primary care clinic (e.g., nurses, PCPs, and social workers) as well as collaborating services, particularly social work and mental health.

Furthermore, guided by the Wiltsey-Stirman framework [49], we will use our qualitative interview results to assess the adaptations made to IPV screening programs. Identifying those adaptations, and their impacts on IPV screening, will allow us to make recommendations to our partners that encourage the range of acceptable adaptations and discourage those that are not. The qualitative interviews will include additional probes about the use of core implementation facilitation activities and of the implementation strategies defined by the Expert Recommendations for Implementing Change project [44]. This will inform the time and skills our operational partners will need to facilitate the implementation of IPV screening programs at other VAMCs.

Rapid content analysis [50, 51] using a hybrid inductive-deductive approach will efficiently reveal IPV screening program practices, adaptations, implementation strategies, and multi-level barriers to and facilitators of implementation and sustainability of IPV screening programs. We will transfer interview and site summaries into matrices and use matrix analysis methods to examine our key domains of (1) IPV screening, response, and referral practices; (2) adaptations; (3) implementation strategies by time, including core implementation facilitation activities and Expert Recommendations for Implementing Change defined strategies; (4) toolkit engagement; (5) barriers to and facilitators of IPV screening program implementation; and (6) barriers to and facilitators of IPV screening program maintenance. Matrices systematically note the similarities, differences, and trends in responses across sites, thereby expediting synthesis and summary of findings [52]. We will use a hybrid deductive and inductive analytic approach [50, 51], where prior constructs and assumptions are evaluated against the data and new themes are incorporated into the coding scheme [53]. For aim 3, we will characterize barriers to and facilitators of implementation and sustainability within and across VAMCs per the RE-AIM domain of maintenance (Table 2). Other sources of quantitative data (e.g., survey responses) will be triangulated with the matrix analysis to provide additional context for findings.

Advisory board

We will convene an advisory board of operations partners and key stakeholders in WVs’ health care, the IPV Assistance Program, primary care program implementation, and implementation science, as well as WVs themselves. The use of an advisory board is an established community-based participatory research strategy that will help frame and monitor the progress of the study while providing guidance on values and practices to enhance the feasibility and acceptability of future implementation efforts [54].

Limitations and anticipated challenges

A limitation of the proposal is the lack of information about patients’ experiences with the IPV screening program. For example, we will not capture patient-level factors that might affect willingness to be screened for IPV or to accept psychosocial referrals and engage in follow-up psychosocial services. To address this issue, we have ensured that WVs are represented on our Advisory Board. In addition, our analyses of service use are limited by the reliance on clinical reminder/note templates and the types of psychosocial services accessed in VA. Currently, it is not possible to ascertain whether WVs who experienced IPV access community services (e.g., the National Domestic Violence Hotline). This type of information will be queried generally during key informant interviews, but examining community partnerships and care coordination for IPV is a step for future research. We also recognize that unforeseeable circumstances experienced by our operational partners or participants (e.g., turnover) may impact the execution of the rollout; we have overpowered the study for both of our primary study aims to ensure that our results will be robust even if our original recruitment or site participation goals are not met. Finally, despite its advantages, our use of a nested stepped wedge design—technically a quasi-experimental design—has certain limitations: it precludes subject-level randomization, introduces possible time trends, and means that we do not have a true control group as would be the case in a traditional parallel groups randomized controlled trial. However, our use of a balancing algorithm should minimize time trends, and our GEE analytic approach will allow us to identify such trends. Furthermore, a traditional randomized controlled trial was not appropriate from the perspective of our clinical partners and was not practical from a resource management perspective.

Discussion

There is an urgent need to better support women who are experiencing IPV. VA-based primary care clinics are an ideal setting in which to implement evidence-based IPV screening programs that can lead to the provision of appropriate healthcare services and other resources for WVs who may be experiencing IPV. The study protocol described in this manuscript—a stepped wedge hybrid type 2 implementation-effectiveness trial—was developed in close partnership with relevant operations partners in VA and will use state-of-the-art evaluation methods to answer key questions regarding how best to implement and sustain such IPV screening programs. Specifically, our mixed quantitative and qualitative data collection will allow us to develop clear guidance for our operations partners regarding context-sensitive implementation strategies that address multi-level barriers to program implementation and sustainment. Finally, we will make recommendations to help facilitate acceptable adaptations in clinical practices and avoid those that are unacceptable. This will help ensure the effectiveness and efficiency of ongoing and future efforts to address IPV.

Availability of data and materials

Not applicable, as this manuscript does not contain any data.

Abbreviations

CDW:

Corporate Data Warehouse

GEE:

Generalized estimating equations

HSR&D:

Health Services Research and Development

IAU:

Implementation as usual

IF:

Implementation facilitation

IPV:

Intimate partner violence

RE-AIM:

Reach Effectiveness Adoption Implementation Maintenance

SDR:

Service Directed Research

VA:

US Department of Veterans Affairs

VAMCs:

VA Medical Centers

VHA:

Veterans Health Administration

WVs:

Women Veterans

References

  1. Basile KC, Black MC, Breiding MJ, Chen J, Merrick MT, Smith SG, et al. National Intimate Partner and Sexual Violence Survey: 2010 summary report; 2011.


  2. Carbone-López K, Kruttschnitt C, Macmillan R. Patterns of intimate partner violence and their associations with physical health, psychological distress, and substance use. Public Health Rep. 2006;121(4):382–92.


  3. Miller E, McCaw B. Intimate partner violence. N Engl J Med. 2019;380(9):850–7.


  4. Gerber MR, Iverson KM, Dichter ME, Klap R, Latta RE. Women veterans and intimate partner violence: current state of knowledge and future directions. J Women's Health. 2014;23(4):302–9.


  5. Campbell JC. Health consequences of intimate partner violence. Lancet. 2002;359(9314):1331–6.


  6. Dutton MA. Pathways linking intimate partner violence and posttraumatic disorder. Trauma Violence Abuse. 2009;10:211–24.


  7. Dillon G, Hussain R, Loxton D, Rahman S. Mental and physical health and intimate partner violence against women: a review of the literature. Int J Family Med. 2013;2013:313909.


  8. Iverson KM, Dick A, McLaughlin KA, Smith BN, Bell ME, Gerber MR, et al. Exposure to interpersonal violence and its associations with psychiatric morbidity in a U.S. national sample: a gender comparison. Psychol Violence. 2013;3:273–87.


  9. Iverson KM, King M, Resick PA, et al. Clinical utility of an intimate partner violence screening tool for female VHA patients. J Gen Intern Med. 2013;28(10):1288–93.


  10. Kimerling R, Iverson KM, Dichter ME, et al. Prevalence of intimate partner violence among women veterans who utilize Veterans Health Administration primary care. J Gen Intern Med. 2016;31(8):1–7.


  11. Montgomery AE, Sorrentino AE, Cusack MC, et al. Recent intimate partner violence and housing instability among women veterans. Am J Prev Med. 2018;54(4):584–90.


  12. Dichter ME, Haywood TN, Butler AE, Bellamy SL, Iverson KM. Intimate partner violence screening in the Veterans Health Administration: Demographic and military service characteristics. Am J Prev Med. 2017;52(6):761–8.


  13. Ghandour RM, Campbell JC, Lloyd J. Screening and counseling for intimate partner violence: a vision for the future. J Women's Health. 2015;24(1):57–61.


  14. Miller E, McCaw B, Humphreys BL, Mitchell C. Integrating intimate partner violence assessment and intervention into healthcare in the United States: a systems approach. J Women's Health. 2015;24(1):92–9.


  15. Coker AL, Reeder CE, Fadden MK, Smith PH. Physical partner violence and Medicaid utilization and expenditures. Public Health Rep. 2004;119(6):557–67.


  16. Rivara FP, Anderson ML, Fishman P, Bonomi AE, Reid RJ, Carrell D, et al. Healthcare utilization and costs for women with a history of intimate partner violence. Am J Prev Med. 2007;32(2):89–96.


  17. Kimerling R, Iverson KM, Dichter ME, Rodriguez A, Wong A, Pavao J. Prevalence of intimate partner violence among women veterans who utilize Veterans Health Administration primary care. J Gen Intern Med. 2016;31(8):888–94.


  18. Curry SJ, Krist AH, Owens DK, Barry MJ, Caughey AB, Davidson KW, et al. Screening for intimate partner violence, elder abuse, and abuse of vulnerable adults: US Preventive Services Task Force final recommendation statement. JAMA. 2018;320(16):1678–87.


  19. Bair-Merritt MH, Lewis-O’Connor A, Goel S, Amato P, Ismailji T, Jelley M, et al. Primary care-based interventions for intimate partner violence: a systematic review. Am J Prev Med. 2014;46(2):188–94.


  20. Feltner C, Wallace I, Berkman N, Kistler CE, Middleton JC, Barclay C, et al. Screening for intimate partner violence, elder abuse, and abuse of vulnerable adults: evidence report and systematic review for the US Preventive Services Task Force. JAMA. 2018;320(16):1688–701.


  21. McCloskey LA, Lichter E, Williams C, Gerber M, Wittenberg E, Ganz M. Assessing intimate partner violence in health care settings leads to women’s receipt of interventions and improved health. Public Health Rep. 2006;121(4):435–44.


  22. Dichter ME, Cerulli C, Bossarte RM. Intimate partner violence victimization among women veterans and associated heart health risks. Womens Health Issues. 2011;21(4 Suppl):S190–4.


  23. Dichter ME, Wagner C, Goldberg E, Iverson KM. Intimate partner violence detection and care in the Veterans Health Administration: patient and provider perspectives. Womens Health Issues. 2015;25(5):555–60.


  24. Iverson KM, Huang K, Wells SY, Wright JD, Gerber MR, Wiltsey-Stirman S. Women veterans’ preferences for intimate partner violence screening and response procedures within the Veterans Health Administration. Res Nurs Health. 2014;37:302–11.


  25. Veterans Health Administration. VHA Directive 1198. Intimate Partner Violence Assistance Program; 2018.

  26. Iverson KM, Adjognon O, Grillo AR, Dichter ME, Gutner CA, Hamilton AB, et al. Intimate partner violence screening programs in the Veterans Health Administration: informing scale-up of successful practices. J Gen Intern Med. 2019;34(11):2435–42.


  27. Portnoy GA, Iverson KM, Haskell SG, Czarnogorski M, Gerber MR. A multisite quality improvement initiative to enhance the adoption of intimate partner violence screening practices into routine primary care for Women Veterans. (Under Review); 2019.


  28. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26.


  29. Harvey G, Kitson A. Implementing evidence-based practice into healthcare: a facilitation guide. London: Taylor & Francis; 2015.


  30. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89:1322–7.


  31. Ritchie MJ, Dollar KM, Miller CJ, Oliver KA, Smith JL, Lindsay JA, et al. Using implementation facilitation to improve care in the Veterans Health Administration (Version 2). 2017. Available from: https://www.queri.research.va.gov/tools/implementation/Facilitation-Manual.pdf.


  32. Iverson KM, King M, Gerber MR, Resick PA, Kimerling R, Street AE, Vogt D. Accuracy of an intimate partner violence screening tool for female VHA patients: a replication and extension. J Trauma Stress. 2015;28(1):79–82.


  33. Brown CA, Lilford RJ. The stepped wedge trial design: a systematic review. BMC Med Res Methodol. 2006;6(1):54.


  34. Suresh K. An overview of randomization techniques: an unbiased assessment of outcome in clinical research. J Hum Reprod Sci. 2011;4:8–11.


  35. Lew RA, Miller CJ, Kim B, Wu H, Stolzmann K, Bauer MS. A method to reduce imbalance for site-level randomization stepped wedge implementation trial designs. Implement Sci. 2019;14(1):46.


  36. Pocock SJ, Simon R. Sequential treatment assignment with balancing for prognostic factors in the controlled clinical trial. Biometrics. 1975;31:103–15.

  37. Baskerville NB, Liddy C, Hogg W. Systematic review and meta-analysis of practice facilitation within primary care settings. Ann Fam Med. 2012;10(1):63–74.


  38. Smith J, Ritchie M, Kim B, Miller CJ, Chinman M, Landes S, Kirchner J. Scoping review to identify the spectrum of activities applied in implementation facilitation strategies. In: Poster presentation at the AcademyHealth Annual Research Meeting, Seattle WA, June 2018; 2018.


  39. Kirchner JE, Ritchie MJ, Pitcock JA, Parker LE, Curran GM, Fortney JC. Outcomes of a partnered facilitation strategy to implement primary care-mental health. J Gen Intern Med. 2014;29(Suppl 4):904–12.


  40. Sherin KM, Sinacore JM, Li XQ, Zitter RE, Shakil A. HITS: A short domestic violence screening tool for use in a family practice setting. Fam Med. 1998;30:508–12.


  41. Verbeke G, Molenberghs G. Linear mixed models for longitudinal data. New York: Springer; 2009.


  42. Daniels MJ, Hogan W. Missing data in longitudinal studies. Boca Raton: Chapman & Hall; 2008.


  43. Rybin D, Doros G, Rosenheck R, Lew R. The impact of missing data on the results of a schizophrenia study. Pharm Stat. 2015;14:4–10.


  44. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Proctor EK, Kirchner JE. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(21):1–14.


  45. Rogal SS, Yakovchenko V, Waltz TJ, Powell BJ, Kirchner JE, Proctor EK, et al. The association between implementation strategy use and the uptake of hepatitis C treatment in a national sample. Implement Sci. 2017;12(1):60.


  46. Dichter ME, Sorrentino AE, Haywood TN, Bellamy SL, Medvedeva E, Roberts CB, et al. Women’s healthcare utilization following routine screening for past-year intimate partner violence in the Veterans Health Administration. J Gen Intern Med. 2018;33(6):936–41.


  47. Iverson KM, Sorrentino AE, Bellamy SL, et al. Adoption, penetration, and effectiveness of a secondary risk screener for intimate partner violence screening practices in integrated care settings. Gen Hosp Psychiatry. 2018;51:79–84.


  48. Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019;280:112516.


  49. Wiltsey Stirman S, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14(1):58.


  50. Fereday J, Muir-Cochrane E. Demonstrating rigor using thematic analysis: a hybrid approach of inductive and deductive coding and theme development. Int J Qual Methods. 2006;5(1):80–92.


  51. Hamilton A. Qualitative methods in rapid turn-around health services research, in spotlight on Women’s Health: VA HSR&D; 2013.


  52. Averill JB. Matrix analysis as a complementary analytic strategy in qualitative inquiry. Qual Health Res. 2002;12(6):855–66.


  53. Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Bryman A, Burgess R, editors. Analysing Qualitative Data. London: Routledge; 1993. p. 173–94.


  54. Poleshuck E, Mazzotta C, Resch K, Rogachefsky A, Bellenger K, Raimondi C, et al. Development of an innovative treatment paradigm for intimate partner violence victims with depression and pain using community-based participatory research. J Interpers Violence. 2018;33(17):2704–24.



Acknowledgements

This study was funded by the Department of Veterans Affairs, Office of Research and Development, Health Services Research and Development (HSR&D) Services (SDR 18-150: Iverson & Miller). We acknowledge VHA’s Office of Women’s Health Services, VHA’s National IPV Assistance Program, VHA's Office of Primary Care, and VHA's Office of Mental Health and Suicide Prevention. Drs. Iverson and Miller gratefully acknowledge the VHA Behavioral Health Quality Enhancement Research Initiative Program, especially Drs. Mark Bauer and JoAnn Kirchner, as well as their fellowships with the Implementation Research Institute (IRI), at the George Warren Brown School of Social Work, Washington University in St. Louis, through an award from the National Institute of Mental Health (5R25MH08091607).

Funding

This work was supported by VA HSR&D SDR 18-150 (Iverson & Miller, Co-PIs).

Author information


Contributions

All authors read and approved the final manuscript. KMI and CJM contributed to the concept, design, data analysis, interpretation, and paper preparation. RAL and KS contributed to the data analysis and interpretation. MED, MRG, LEB, and GAP contributed to the concept and design. OLA contributed to the paper preparation.

Corresponding author

Correspondence to Katherine M. Iverson.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the VA Boston Healthcare System Institutional Review Board, through the Research and Development Committee (R&D #3253-X).

Consent for publication

Not applicable, as this manuscript does not contain any individual person’s data.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Iverson, K.M., Dichter, M.E., Stolzmann, K. et al. Assessing the Veterans Health Administration’s response to intimate partner violence among women: protocol for a randomized hybrid type 2 implementation-effectiveness trial. Implementation Sci 15, 29 (2020). https://doi.org/10.1186/s13012-020-0969-0


