
NIATx-TI versus typical product training on e-health technology implementation: a clustered randomized controlled trial study protocol

Abstract

Background

Substance use disorders (SUDs) lead to tens of thousands of overdose deaths and other forms of preventable death in the USA each year. This results in over $500 billion per year in societal and economic costs, as well as considerable grief for the loved ones of affected individuals. Despite these health and societal consequences, only a small percentage of people seek treatment for SUDs, and the majority of those who seek help fail to achieve long-term sobriety. E-health applications in healthcare have proven effective at sustaining treatment and reaching patients whom traditional treatment pathways would have missed. However, e-health adoption and sustainment rates in healthcare are poor, especially in the SUD treatment sector. Implementation engineering can address this gap in the e-health field by augmenting existing implementation models, which explain organizational and individual e-health behaviors retrospectively, with prospective resources that can guide implementation.

Methods

This cluster randomized controlled trial is designed to compare two implementation strategies for adopting an evidence-based mobile e-health technology for SUD treatment. The proposed e-health implementation model is the Network for the Improvement of Addiction Treatment–Technology Implementation (NIATx-TI) framework. This project, based in Iowa, will compare a control condition (a typical software product training approach that includes in-person staff training followed by access to on-line support) to software implementation using NIATx-TI, which includes change management training followed by coaching on how to implement and use the mobile application. While e-health spans many modalities and health disciplines, this project will focus on implementing the Addiction Comprehensive Health Enhancement Support System (A-CHESS), an evidence-based SUD treatment recovery app framework. This trial will be conducted in Iowa at 46 organizational sites within 12 SUD treatment agencies. The control arm consists of 23 individual treatment sites based at five organizations, and the intervention arm consists of 23 individual SUD treatment sites based at seven organizations.

Discussion

This study addresses an issue of substantial public health significance: enhancing the uptake of the growing inventory of patient-centered evidence-based addiction treatment e-health technologies.

Trial registration

ClinicalTrials.gov, NCT03954184. Posted 17 May 2019


Background

Every day, on average, 96 Americans die of opioid overdoses and another six die of alcohol poisoning, making substance use disorders (SUDs) the third-leading cause of preventable death in the USA [1]. Many injuries and diseases (e.g., cancer, diabetes, cardiovascular problems, cirrhosis, and HIV/AIDS) are caused or exacerbated by substance misuse [2]. Substance misuse results in tens of thousands of overdose deaths and other forms of preventable death [3] and more than $500 billion in societal and economic costs each year [4]. Despite the health and societal consequences of SUDs, just 9.2% of those who need treatment receive it [5], and 75% of those treated fail to achieve long-term sobriety [6]. New, more efficient approaches are required to fill the gaps in both the use and effectiveness of SUD treatment.

Patient-centered mobile e-health technologies offer innovative ways to improve treatment and recovery supports for SUDs [7]. In February 2018, 81% of the adult population and 92% of persons aged 18–34 in the USA owned a smartphone [8]. Additionally, 75% of US adults below the median US income level owned a smartphone [8]. Several meta-analyses of mobile behavioral e-health interventions have found superior treatment outcomes compared with control groups [9,10,11,12,13,14]. The combination of consumer interest and research evidence will continue to place positive pressure on the use of e-health. In this study, we use a mobile application because daily mobile technology use is nearly ubiquitous.

Despite the promise of e-health technologies, the benefits are far from being realized [15,16,17]. A prominent example of this is in SUD treatment services. Despite the availability of evidence-based e-health tools such as the Drinker’s Check-up [18, 19], Therapeutic Education System (TES) [20], Computer-Based Training for Cognitive Behavioral Therapy (CBT4CBT) [21], and Addiction Comprehensive Health Enhancement Support System (A-CHESS) [22], a 2012 survey showed all of these technologies together were used by < 1% of SUD treatment providers [23]. Since the onset of COVID-19, researchers and practitioners have focused on e-health technologies to deliver virtual SUD services while in-person treatment has become limited [24,25,26]. This study tests whether an evidence-based technology implementation framework can reduce the gap between patient-centered e-health evidence and practice.

Extensive research supports building models to speed diffusion and sustain technology adoption [27]. Technology adoption research began with models of behavioral change [28] that explained why users abandon traditional practices in favor of new technologies [29]. Over time, the use of these models has expanded because of the fundamental role organizational practices and climate play in individual decisions to adopt and continue to use a technology [30, 31]. These newer frameworks include the key organizational factors of management support, clinician satisfaction, clinical workflow, and financial resources for technology purchase, implementation, and use [32, 33]. Financial resources such as start-up grants and reimbursement supporting technology use are important factors in technology adoption. However, reviews by Brooks et al. [32] and Jeyaraj et al. [33] demonstrate that reimbursement cannot be the only issue considered. Many models describe the barriers to technology adoption [34, 35], but not how to address them. This study examines the competencies and processes that have remained unexplored in previous technology implementation research, to help health centers and patients participate actively in mobile e-health.

The organizational planning field has used two relatively stable theoretical planning approaches for the past two decades: deliberate/prescriptive planning and emergent/descriptive planning [36]. Porter’s Five Forces Framework [37] created a foundation for deliberate planning. His “look before you leap” planning approach relies on a centralized planning function and development of “the plan”. Concerns with this methodology led Mintzberg and Waters [38] to assert that Porter’s deliberate (sometimes called prescriptive) planning for organizational strategy and change was necessary, but not sufficient: simply too much occurred after planning to rely on a single “plan”. They instead developed an emergent (sometimes called descriptive) planning approach, which used data that emerged from the implementation process to improve the plan throughout implementation. This approach moved planning from a purely centralized (top-down) process to include a more emergent (staff-engaged) approach [39,40,41]. These concepts from the literature on pre-implementation (deliberate) and post-implementation (emergent) planning, along with our authors’ field experiences [42,43,44,45], led to the creation of the Network for the Improvement of Addiction Treatment–Technology Implementation (NIATx-TI) framework.

NIATx-TI was piloted in the Iowa Rural Health Information Technology Initiative (IRHIT) with 14 of Iowa’s 105 SUD treatment sites and resulted in a twofold increase in patients receiving virtual treatment. The framework’s deliberate component includes using an organizational change management assessment (OCM) [46]. This tool identifies assets and barriers to technology implementation and adoption. Results from the OCM can be incorporated into the technology’s implementation protocol. The framework’s emergent component includes using a project team to uncover and prioritize implementation barriers as they arise, develop changes to address identified barriers, and monitor selected adoption measures while receiving monthly coaching.

This research will test the technology adoption framework (NIATx-TI) in increasing the use of the Addiction Comprehensive Health Enhancement Support System (A-CHESS), an evidence-based technology platform developed for treating SUD [22, 47,48,49,50,51]. A-CHESS is based on the CHESS platform, which has been studied in 16 randomized control trials (RCTs) across several technology platforms [52,53,54]. These RCTs, each involving hundreds of people, found that the CHESS platform improved key dimensions of quality of life [52, 55], health behavior adherence (e.g., smoking cessation and risky drinking) [56,57,58], and outcomes (e.g., lung cancer death rates and sobriety). A-CHESS is consistent with well-respected health behavior change models such as self-determination theory (SDT) [59] and Marlatt’s cognitive-behavioral model of relapse prevention [60]. The CHESS platform has also been an effective method for clinicians to interact with their patients [61]. For example, A-CHESS provides a clinician dashboard that reports worrisome changes in symptoms collected from the patient during self-monitoring of risk-related items (i.e., sleeping problems, depression, urge, risky situation, and relationship troubles) [62]. The dashboard is intended to bring emerging symptoms to the clinician’s attention quickly. A-CHESS keeps patients in treatment longer by bridging the gap between face-to-face SUD treatment sessions through clinic communications, peer support, and reliable, vetted resources accessible 24/7 [22, 48,49,50]. The Recovering Iowans Supporting Each other (RISE-Iowa) mobile application is an A-CHESS technology, as it contains all of the features of the A-CHESS platform. Having a mobile application dedicated to this study will allow for a controlled on-line discussion environment consisting only of consenting patients from participating organizations.

Product training and on-line support have been the customary approach to technology adoption [63, 64]. In this two-arm study, we will compare a control condition (using a typical product training approach to software implementation followed by access to on-line support) to the intervention condition (the typical product training combined with NIATx-TI). Iowa was selected as the test site for this proposed study. Cluster randomization was used in order to have similar organizations, in terms of the assessed criteria, in both study arms.

Methods

Study objectives

The primary aim of this cluster randomized controlled trial (RCT) is to compare the effectiveness of NIATx-TI (arm 2 only) and product training in implementing an evidence-based platform (A-CHESS), delivered via RISE-Iowa. Impact will be measured using the RE-AIM framework [65]:

  • Reach or participation (primary outcome): (a) the percent and representativeness of eligible patients who create an account on the RISE-Iowa app, and (b) average RISE-Iowa use, in days of use.
  • Effectiveness: treatment retention rates of sites, compared between the two arms.
  • Adoption: the percent and representativeness (by age, gender, and education level) of counselors using RISE-Iowa.
  • Implementation: (a) organizational readiness for, and (b) fidelity of, RISE-Iowa implementation, and (c) qualitative analyses to develop a better understanding of the implementation processes for delivering NIATx-TI compared with product training/on-line support alone.
  • Maintenance: RISE-Iowa use (e.g., number of RISE-Iowa accounts and days of use) during the sustainability phase.

All objectives will be compared at the cluster level.

Trial design

This research extends over 30 months (an 18-month test period and a 12-month sustainability period). We are using a mixed-methods approach to compare the effectiveness of the control condition, product training alone (arm 1), with the intervention condition, product training plus NIATx-TI (arm 2), in increasing RISE-Iowa use among adults (> 18 years) receiving outpatient SUD care. Both the intervention and control arms will complete the same data collection instruments. For randomization purposes, a cluster is a pair of organizations with similar characteristics; the members of each pair are randomly assigned to opposite arms of the study.

Participants

We selected Iowa as a partner because of its mix of urban and rural settings, the presence of minority populations, the strength of its data collection and reporting systems, and the limited use of A-CHESS platforms.

Recruitment focused on organizations funded by the Iowa Department of Public Health (IDPH) and licensed to provide outpatient treatment services. We assessed eligible organizations for their readiness for implementation, number of outpatient sites, and mix of urban and rural populations. Each Iowa SUD organization consists of one to seven outpatient sites. We aimed to enroll 40 addiction treatment sites (20 in each arm) and estimated this would require 16 organizations. Of the 20 eligible organizations, 11 Iowa organizations comprising 43 addiction treatment centers were initially enrolled in the study. Using cluster randomization, we randomized the 11 organizations into two groups: five organizations with 23 addiction treatment centers into the control arm and six organizations with 20 addiction treatment centers into the intervention arm. After randomization, one organization in the intervention arm dropped out of the study, and two additional organizations enrolled. The two new organizations were assigned to the intervention arm to achieve a roughly equal number of control and intervention sites. Figure 1 shows the CONSORT diagram: a total of 12 organizations with 46 addiction treatment centers, with five organizations (23 addiction treatment centers) in the control arm and seven organizations (23 addiction treatment centers) in the intervention arm.

Fig. 1 CONSORT diagram: RISE-Iowa study recruitment

Randomization

The first step of the stratified randomization (blocking with matched pairs) was to determine matchings for the 11 organizations that originally agreed to participate. Organizations were matched on (a) readiness for technology adoption score (assessed by IDPH based on a previous web portal implementation study), (b) the number of rural sites, and (c) the number of outpatient sites per organization. We applied urn randomization to these three traits to achieve balance between arms. Because the study had an odd number of organizations (n = 11), one matched set consisted of two intervention organizations and one control organization. Blinding or concealment of participants and researchers was not logistically feasible. Post-randomization, a second matched set pairing two intervention organizations with one control organization was created to account for the late enrollment of one organization. The second late enrollee replaced the organization that had dropped out of the intervention arm and had been part of the first 2:1 matched set. These non-randomized additions were made to keep the total number of SUD treatment centers in each arm equal and the pairs as similar as possible on the original randomization matching criteria.
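To make the matching and assignment logic concrete, the following Python sketch shows a simplified version of the procedure: organizations are ranked on the three matching traits, grouped into matched pairs, and one member of each pair is randomly assigned to each arm. This is an illustration only, not the urn-randomization algorithm actually used in the study, and all organization names and scores are hypothetical.

```python
import random

# Hypothetical organization records: readiness score, rural sites, outpatient sites.
orgs = [
    {"name": "Org A", "readiness": 7.2, "rural_sites": 3, "outpatient_sites": 4},
    {"name": "Org B", "readiness": 7.0, "rural_sites": 2, "outpatient_sites": 4},
    {"name": "Org C", "readiness": 5.1, "rural_sites": 1, "outpatient_sites": 2},
    {"name": "Org D", "readiness": 5.4, "rural_sites": 1, "outpatient_sites": 3},
]

# Rank organizations on the three matching traits so that adjacent organizations are
# the most similar, then take them two at a time as matched pairs.
ranked = sorted(orgs, key=lambda o: (o["readiness"], o["rural_sites"], o["outpatient_sites"]))
pairs = [ranked[i:i + 2] for i in range(0, len(ranked), 2)]

random.seed(2019)  # reproducible illustration only
assignments = {}
for pair in pairs:
    random.shuffle(pair)  # coin flip within the matched pair
    assignments[pair[0]["name"]] = "control"
    assignments[pair[1]["name"]] = "intervention"

print(assignments)
```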

Staff and patient recruitment and consent procedures

A member of the management team at each participating organization completed Klein’s Financial Resource Availability Inventory [66] and the Organizational Change Management (OCM) leadership survey [46]. Additionally, a site leader at each location identified up to seven staff to complete the OCM. No staff member at the organization had access to the survey data, nor did they receive any data from the surveys. The consent form that precedes the surveys states that completing the survey constitutes consent. For those who do not complete the survey, no response is recorded, and no documentation of those who decline to participate is kept. Staff completing surveys are treated as human subjects, informed of possible risks, offered the opportunity to decline participation in the survey, and afforded the protections of our data safety and monitoring plan.

All outpatient counselors and peer recovery support specialists at participating organizations were invited to be part of the study. All clients at participating organizations who have a SUD, are > 18 years of age, understand English, and have access to a smartphone are introduced to RISE-Iowa during regular treatment at the participating organization. In both arms of the study, counselors give the patient their site location’s unique code to create an account on the RISE-Iowa mobile application. Each organization will create its own protocol for introducing patients to RISE-Iowa. Therefore, how counselors and staff introduce RISE-Iowa to patients and disseminate their site’s unique code for creating an account will differ between organizations and potentially between sites within an organization. Once patients are introduced to RISE-Iowa, they can download the mobile application through the Apple App Store or Google Play Store. For this study, patient consent is part of the account creation process when the patient downloads and registers on RISE-Iowa. Patients are enrolled in the study after they download RISE-Iowa, create an account, and verify their informed consent.

Interventions

The control arm for this study includes:

  1) A RISE-Iowa demo for executive and clinical leadership, conducted by the RISE-Iowa product trainer, takes place during the project setup phase. This initial RISE-Iowa demo (a) describes the features of RISE-Iowa (tutorial), (b) explains the study design and stipend information, (c) informs leadership how to choose an appropriate RISE-Iowa site coordinator for each site location, and (d) helps our team understand how the organization operates and whether special considerations are needed.

  2) A RISE-Iowa demo for site leaders, the organization’s IT staff, and the RISE-Iowa site coordinator, conducted by the RISE-Iowa product trainer, takes place during the project setup phase, after the initial RISE-Iowa demo. This secondary RISE-Iowa demo (a) describes the features of RISE-Iowa (tutorial), (b) provides organizational case examples of how RISE-Iowa has been implemented, and (c) provides examples of how to recruit and prepare clinicians and patients for RISE-Iowa.

  3) RISE-Iowa product training is conducted by the RISE-Iowa product trainer and an IT assistant. The product training occurs at the organizational sites and marks the start date of that organization’s 18-month intervention period. All interested counselors and staff at the organization are invited to participate in the product training, which consists of (a) meeting with organizational leadership to discuss the RISE-Iowa implementation plan after the on-site visit, (b) demonstrating RISE-Iowa features and how RISE-Iowa allows patients to self-manage their recovery in conjunction with their counselors, (c) demonstrating how to use the RISE-Iowa clinician dashboard, and (d) providing promotional brochures, posters, and flyers.

  4) On-line technical support will occur for the 18 months following the product training. Patients and counselors can call or e-mail questions regarding the RISE-Iowa mobile application. Examples include password/account recovery, account deletion, bug reporting, notification of inappropriate app use, and suggestions for improving the app.

In addition to the above product training and on-line support, the second study arm receives the NIATx-TI framework and the NIATx-TI structural components. These components include the deliberate and emergent planning features suggested in the planning literature, which allow us to develop an implementation plan that incorporates the organization’s barriers and facilitators to adopting the technology and uses a validated organizational change model. Figure 2 presents the NIATx-TI framework:

  a) Deliberate (planning) phase: Step 1. Define aims: executive briefing on project setup and discussion of strategic and implementation aims for the organization; this occurs at the initial RISE-Iowa demo. Step 2. Assess assets and barriers via the organizational change management (OCM) survey; results are discussed in detail with the change leader for the intervention arm during the secondary RISE-Iowa demo.

  b) Emergent phase: Step 3. Change leader and change team training and planning on how to modify organizational processes to address barriers to RISE-Iowa implementation (combined with the secondary RISE-Iowa demo and the RISE-Iowa product training visit). Step 4. Monitor RISE-Iowa implementation: review progress on aims and gather real-time user feedback; the study team will provide weekly enrollment reports to the site coordinator on the number of patients enrolled in RISE-Iowa. Step 5. Run change cycles (pilot tests) to overcome barriers to implementing RISE-Iowa. Step 6. Sustain changes: implement a plan to institutionalize gains. Additionally, approximately a year after RISE-Iowa is implemented, the coaches will conduct the sustainability survey verbally with the change team. The survey will help identify the sustainability of the implementation and lead to more productive change team meetings.

Fig. 2 Product training and NIATx-TI framework overview

The NIATx-TI structural components involve:

  a) NIATx coach: an expert versed in technology adoption and organizational change who helps treatment organizations make, sustain, and spread RISE-Iowa adoption efforts. A NIATx-TI coach assists organizations in applying the NIATx-TI framework, provides the training for this intervention, and provides feedback on the assessments during the session (Fig. 2, step 4) for arm 2 only. Coaches conduct monthly calls with the RISE-Iowa implementation change leader to discuss data monitoring, including the weekly RISE-Iowa enrollment report and monthly RISE-Iowa usage report, and to help problem-solve implementation issues.

  b) Change team: includes the change leader, clinical leadership, and select staff, and oversees RISE-Iowa implementation (Fig. 2, steps 5–7). The executive leader typically assigns the change leader. Together, they determine and recruit change team members whose job functions interact with the RISE-Iowa app. Change teams ideally meet weekly. The change team’s job is to be constantly vigilant for barriers to implementing RISE-Iowa, develop Plan-Do-Study-Act (PDSA) pilot tests to address identified barriers, and then implement any resulting changes to the RISE-Iowa implementation protocol.

The OCM and Klein’s Financial Resource Availability Inventory are administered to both arms through on-line surveys. However, the aggregated results of the OCM will be provided only to the NIATx-TI arm, during protocol development (Fig. 2, step 3) and progress monitoring (Fig. 2, step 5). Additionally, the sustainability survey will be conducted only with the NIATx-TI arm during the sustain changes step (Fig. 2, step 7).

Timeline of study

During project months 1 to 16, the study team sought ethics approval, created materials for the intervention, recruited and collected baseline data on the organizations, and hired NIATx-TI coaches. In months 13 to 45, the study team implements the interventions and observes the organizations over an 18-month test period. In months 45–57, the organizations will be observed during the sustainability period. Months 55–60 will be used for data analysis and dissemination of research findings.

Sample size

To determine the sample size, we fit a linear mixed-effects model to monthly results for the performance measures: percentage of eligible RISE-Iowa users and frequency of RISE-Iowa use. These performance measures are calculated for each participating organizational site and clustered by organization to estimate the NIATx-TI framework and product training effects. The power of the study design is determined by the anticipated standardized effect size, based on effects observed in a previous study on e-health adoption in Iowa conducted by one of our authors. That study found that NIATx-TI increased client technology use rates by 12.4% (an increase in intervention success), which yields a Cohen’s d of 0.352 [67]. Additionally, intraclass correlation (ICC) among sites affects the power of cluster randomized trials [68]; the ICC is estimated to be approximately 0.10 [22]. With five organizations (control) or seven organizations (intervention), 23 organizational sites in each arm, and an average of 62 eligible patients per site, the study will achieve a power of 0.92 with a type I error rate of 0.05.
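As a rough check on this calculation, the sketch below re-derives the power using the standard design-effect adjustment for clustering at the site level, a simplification of the mixed-model analysis described above (it ignores the organization level and treats the comparison as a two-sample t test). The input numbers come from the protocol; the formula and library call are our own illustration.

```python
from statsmodels.stats.power import TTestIndPower

d = 0.352               # anticipated standardized effect size (Cohen's d)
icc = 0.10              # assumed intraclass correlation among sites
sites_per_arm = 23      # organizational sites in each arm
patients_per_site = 62  # average eligible patients per site
alpha = 0.05

# Design effect (variance inflation) for cluster randomization:
# DEFF = 1 + (m - 1) * ICC, where m is the average cluster size.
deff = 1 + (patients_per_site - 1) * icc

# Effective sample size per arm after deflating for clustering.
n_effective = sites_per_arm * patients_per_site / deff

power = TTestIndPower().power(effect_size=d, nobs1=n_effective, alpha=alpha, ratio=1.0)
print(f"DEFF = {deff:.2f}, effective n per arm = {n_effective:.0f}, power = {power:.2f}")
# Power comes out in the neighborhood of the 0.92 reported in the protocol.
```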

Data collection and measures

Table 1 shows the frequency and data sources for all outcome measures. Our primary outcomes are (a) the percentage of eligible patients who create an account on the RISE-Iowa app and (b) the frequency of use per app user (in days). For the percentage who create RISE-Iowa accounts, data will be collected during the 18-month test period and the 12-month sustainability period for each organization. Frequency-of-use data will be collected during the 18 months following the initial use of RISE-Iowa. The denominator for primary measure (a) will be the number of adult (age > 18) SUD patients, supplied by the Iowa Department of Public Health (IDPH) central data repository. The numerator for primary measure (a) will be the number of unique consumers who create accounts on the RISE-Iowa app. These counts, along with the frequency of RISE-Iowa use for primary measure (b), are collected in time-stamped log files that record when a patient accesses RISE-Iowa and the service(s) selected.
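The sketch below illustrates, under assumed column names and a small hypothetical log extract, how the two primary measures could be computed from the time-stamped RISE-Iowa log files and the IDPH eligible-patient counts.

```python
import pandas as pd

# Hypothetical extracts: RISE-Iowa log events and IDPH eligible-patient counts per site.
logs = pd.DataFrame({
    "site_id":    ["S1", "S1", "S1", "S2", "S2"],
    "patient_id": ["p1", "p1", "p2", "p3", "p3"],
    "timestamp":  pd.to_datetime([
        "2021-03-01 09:15", "2021-03-02 18:40", "2021-03-01 11:05",
        "2021-03-05 08:20", "2021-03-05 21:10",
    ]),
})
eligible = pd.Series({"S1": 62, "S2": 70}, name="eligible_patients")

# Primary measure (a): percent of eligible patients who created a RISE-Iowa account,
# approximated here by the count of unique patient IDs appearing in the logs per site.
accounts = logs.groupby("site_id")["patient_id"].nunique()
percent_reach = (accounts / eligible * 100).round(1)

# Primary measure (b): days of use per app user (distinct calendar days with any activity).
days_of_use = (
    logs.assign(day=logs["timestamp"].dt.date)
        .groupby("patient_id")["day"].nunique()
)

print(percent_reach)
print(days_of_use)
```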

Table 1 Measures, sources, and tool frequency

Within the two study conditions, we have the additional secondary measures of effectiveness, adoption, and mediation. Effectiveness: retention rates will be measured using the Simpson measure of admission date to last therapeutic visit [69], because this length-of-stay measure has been positively associated with outpatient outcomes [70]. Adoption: the percentage of counselors using RISE-Iowa will be measured from the RISE-Iowa log-in data and compared against counselor numbers stated in the organizational survey, because counselor adoption of a technology has been found to cause variation in an organization’s ability to apply it [32, 33, 66, 71]. Mediation: organizational readiness (for technology use) will be monitored through the OCM tool [46]. Financial resource availability will be assessed through completion of Klein’s Financial Resource Availability Inventory (α = .93) [66]. This inventory assesses the organization’s purchasing, training, and implementation resources. Additional questions will be added regarding reimbursement.

Data analysis

A preliminary analysis of organizational data will compare baseline characteristics (admissions, rural vs. urban, and gender/age/ethnicity) between the organizations in the two intervention arms. The analysis will use the National Survey of Substance Abuse Treatment Services (N-SSATS) [72] and Treatment Episode Data Set (TEDS) [73] data definitions. These data will be collected from the Iowa Department of Public Health (IDPH) central data repository. Fisher’s exact and t tests (or Mann-Whitney tests) will be used to test for statistically significant baseline differences.
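A minimal sketch of these baseline comparisons, using hypothetical organization-level values and the corresponding SciPy tests, is shown below.

```python
import numpy as np
from scipy.stats import fisher_exact, ttest_ind, mannwhitneyu

# Hypothetical baseline data for the two arms.
# Categorical characteristic (e.g., rural vs. urban sites) as a 2x2 table:
#                rural  urban
# control          14      9
# intervention     12     11
odds_ratio, p_cat = fisher_exact([[14, 9], [12, 11]])

# Continuous characteristic (e.g., annual admissions per organization):
control_admissions = np.array([120, 95, 143, 110, 88])
intervention_admissions = np.array([101, 132, 97, 125, 140])
t_stat, p_t = ttest_ind(control_admissions, intervention_admissions)

# Nonparametric alternative when normality is questionable:
u_stat, p_u = mannwhitneyu(control_admissions, intervention_admissions, alternative="two-sided")

print(f"Fisher exact p = {p_cat:.3f}, t-test p = {p_t:.3f}, Mann-Whitney p = {p_u:.3f}")
```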

Initial exploratory analyses of RISE-Iowa use will assess standard summary statistics and graphical presentations at each level and across levels [74]. The data design represents a multi-site cluster randomized trial, and mixed-effects models (random effects due to organization; fixed effects due to study arm and time) will be used for data analysis. Mixed-effects models are also known as multilevel models. Primary outcomes (a) and (b) are nested within cluster (organizational site), and clusters are nested within organization. In a similar multilevel modeling framework, separate models will be applied for primary outcomes (a) and (b). Significant organizational, counselor, and patient differences between study conditions will be included in the models, along with factors included in the matched pairs sampling, as covariates to estimate the effect of NIATx-TI properly and efficiently. We will examine the NIATx-TI effect at each time point using cross-sectional multilevel models. We will also implement growth curve models across time that include the NIATx-TI effect, time, and NIATx-TI*time, as well as time-varying covariates. The best-fitting covariance structure will be determined before analysis based on the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) [75].
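The protocol does not prescribe specific software for these multilevel models; the sketch below shows one way such a growth-curve model (arm, time, and arm-by-time fixed effects with organization- and site-level random intercepts) could be specified in Python's statsmodels. The file and column names are hypothetical, and the information criteria are computed by hand as a rough comparison aid.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data set: one row per site per month, with columns
# organization, site, arm (0 = control, 1 = NIATx-TI), month, and usage (RISE-Iowa use).
data = pd.read_csv("rise_iowa_monthly.csv")  # placeholder file name

# Growth-curve mixed-effects model: fixed effects for arm, time, and their interaction;
# a random intercept for organization (groups) and a variance component for site
# nested within organization.
model = smf.mixedlm(
    "usage ~ arm * month",
    data=data,
    groups="organization",
    re_formula="~1",
    vc_formula={"site": "0 + C(site)"},
)
result = model.fit(reml=False)  # ML fit so information criteria are comparable across models
print(result.summary())

# Rough AIC/BIC for comparing alternative specifications (parameters counted by hand for
# this sketch: fixed effects + organization variance + site variance + residual variance).
n_params = len(result.fe_params) + 3
aic = -2 * result.llf + 2 * n_params
bic = -2 * result.llf + np.log(len(data)) * n_params
print(f"AIC = {aic:.1f}, BIC = {bic:.1f}")
```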

We will examine the mediating effects of three organizational factors (the percentage of counselors using RISE-Iowa, organizational readiness via the OCM, and financial resource availability via Klein’s inventory) through a causal mediation analysis, using data from the mid-test, end-of-test, and end-of-sustainability (post-test) periods. These factors will be analyzed using mixed-effects models. Through a mediation analysis [76, 77], we can estimate the direct effects (RISE-Iowa use under NIATx-TI and product training) and indirect effects (adoption and implementation factors) of each implementation arm on RISE-Iowa use. We will test each potential mediator’s mediation effect at each time point and across time. The R package “mediation” will be used to estimate the causal mediation effects, examine moderated mediation effects, and conduct sensitivity analyses [78].
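The protocol specifies the R package “mediation” for this step; as an analogous illustration only, the sketch below sets up a single-mediator analysis (arm to counselor adoption to RISE-Iowa use) with the Mediation class in Python's statsmodels. The file and variable names are hypothetical, and the sketch ignores clustering for simplicity.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.mediation import Mediation

# Hypothetical site-level data with columns: arm (0 = control, 1 = NIATx-TI),
# counselor_adoption (proposed mediator), and usage (RISE-Iowa use outcome).
data = pd.read_csv("rise_iowa_mediation.csv")  # placeholder file name

# Mediator model: does the implementation arm shift counselor adoption?
mediator_model = smf.ols("counselor_adoption ~ arm", data=data)

# Outcome model: does counselor adoption carry part of the arm effect on RISE-Iowa use?
outcome_model = smf.ols("usage ~ arm + counselor_adoption", data=data)

med = Mediation(outcome_model, mediator_model, exposure="arm",
                mediator="counselor_adoption")
med_result = med.fit(method="parametric", n_rep=1000)

# Summarizes the average causal mediation effect (ACME), average direct effect (ADE),
# total effect, and proportion mediated.
print(med_result.summary())
```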

The qualitative portion of the study will support the specific aims by exploring how NIATx-TI and product training are implemented, how implementation affects RISE-Iowa use, and how staff users react to NIATx-TI, product training, and RISE-Iowa. During the study, interviews at each of the organizations (n = 12) will occur at months 40 to 45 (the end of the test phase) and months 53 to 57 (the end of the sustainability phase). A randomly selected site from each organization will be used for the interviews in each phase to gain a retrospective understanding of what occurred during implementation. The interviews will include closed- and open-ended questions designed to explore participants’ experiences within each arm and the perceived effectiveness of each approach in assisting with implementing RISE-Iowa. The closed-ended questions will be based on elements of the Consolidated Framework for Implementation Research, developed by Damschroder et al. [79], addressing (1) intervention characteristics, (2) outer setting, (3) inner setting, and (4) process. The open-ended questions will allow for the discovery of other factors affecting NIATx-TI, product training, and RISE-Iowa implementation. The information collected will inform our understanding of implementation assets, barriers, and potential revisions for each approach. Interviews of up to 30 min will be conducted with the organization’s executive director, the site’s clinical director, the RISE-Iowa site coordinator, and a randomly selected full-time counselor who has been at the organization for at least 6 months. Four interviews per organization will result in n = 48 interviews per phase. The qualitative analysis will produce a description of the technology implementation process, grounded in the theoretical concepts that influenced the development of the NIATx-TI framework. This description will enhance our understanding of how the NIATx-TI framework works, yielding insights that can be applied to future dissemination of mobile e-health applications.

Any research protocol today needs to consider the impact that COVID-19 may have on patients, staff, and SUD organizations. Therefore, the interviews will also ask how staff and leaders are responding to the virus and what perceived effect the virus is having on RISE-Iowa adoption and use. Many treatment agencies have had to replace face-to-face treatment with virtual counseling. While RISE-Iowa does not include a virtual counseling service, some of the enrolled treatment agencies have started to use RISE-Iowa to supplement weekly virtual counseling services. For example, virtual sessions can include homework assignments that require the use of RISE-Iowa between sessions.

Ethics

The study received external approval from Independent Review Board Services (IntegReview IRB).

Trial status

Currently, the trial’s recruitment phase is complete, much of the baseline data has been collected, and the implementation period has started.

Discussion

More than 20 million adults aged 18 or older struggle with a SUD each year in the USA. Yet only around a tenth seek SUD treatment [4, 5, 80, 81], and even fewer achieve long-term sobriety. E-health applications in healthcare have proven effective at sustaining treatment and reaching patients whom traditional treatment would otherwise have missed. However, e-health adoption rates in healthcare have been historically poor. In SUD care, using a treatment and recovery mobile app in conjunction with outpatient SUD treatment is highly innovative and has become an increasingly popular option for SUD treatment providers amid the COVID-19 pandemic. Additionally, thousands of studies and hundreds of theories exist on implementing innovations [28, 82,83,84], and the number of evaluations of e-health applications is expanding rapidly. Yet, beyond those mentioned above, we could not find any RCTs that empirically examined the relative effectiveness of competing implementation strategies. This study builds on the work described above and will, to our knowledge, be the first to compare an evidence-based implementation strategy (NIATx-TI) with a widely used implementation strategy (product training) in a randomized trial. We therefore implement an e-health technology in SUD outpatient therapy to assess the impact of NIATx-TI on overcoming barriers to mobile e-health adoption and on patient outcomes.

Availability of data and materials

Not applicable

Abbreviations

A-CHESS:

Addiction Comprehensive Health Enhancement Support System

AIC:

Akaike Information Criterion

BIC:

Bayesian Information Criterion

CHESS:

Comprehensive Health Enhancement Support System

ICC:

Intraclass correlation

IDPH:

Iowa Department of Public Health

IntegReview IRB:

Independent Review Board Services

IRHIT:

Iowa Rural Health Information Technology Initiative

NIATx-TI:

Network for the Improvement of Addiction Treatment–Technology Implementation

N-SSATS:

National Survey of Substance Abuse Treatment Services

OCM:

Organizational Change Management

PDSA:

Plan Do Study Act

RISE-Iowa:

Recovering Iowans Supporting Each other

SDT:

Self-Determination Theory

SUD:

Substance Use Disorder

TEDS:

Treatment Episode Data Set

References

  1. Stahre M, Roeber J, Kanny D, Brewer RD, Zhang X. Contribution of excessive alcohol consumption to deaths and years of potential life lost in the United States. Prev Chronic Dis. 2014;11:E109.


  2. National Drug Intelligence Center (NDIC). The economic impact of illicit drug use on American society. Washington, DC: United States Department of Justice; 2011. Available from: http://www.justice.gov/archive/ndic/pubs44/44731/44731p.pdf.


  3. Hedegaard H, Miniño AM, Warner M. Drug overdose deaths in the United States, 1999–2017. NCHS data brief no. 329. National Center for Health Statistics: Hyattsville, MD; 2018. Available from: https://www.cdc.gov/nchs/data/databriefs/db329-h.pdf.


  4. Council of Economic Advisers (CEA). The underestimated cost of the opioid crisis. 2017. Available from: https://www.whitehouse.gov/briefings-statements/cea-report-underestimated-cost-opioid-crisis/.

  5. Center for Behavioral Health Statistics and Quality. Behavioral health trends in the United States: results from the 2014 National Survey on Drug Use and Health. HHS Publication No. SMA 15-4927, NSDUH Series H-50. Rockville, MD: Substance Abuse and Mental Health Services Administration; 2015. Available from: https://www.samhsa.gov/data/sites/default/files/NSDUH-FRR1-2014/NSDUH-FRR1-2014.pdf.

  6. Miller WR, Walters ST, Bennett ME. How effective is alcoholism treatment in the United States? J Stud Alcohol. 2001;62(2):211–20.


  7. Budney AJ, Marsch LA, Bickel WK. Computerized therapies: towards an addiction treatment technology test. In: El-Guebaly N, Carra G, Galanter M, editors. Textbook of addiction treatment: international perspectives. Berlin: Springer-Verlag; 2015. p. 987–1006.


  8. Pew Research Center. Smartphone ownership is growing rapidly around the world, but not always equally. 2019, Feb 5. Available from: https://www.pewresearch.org/global/2019/02/05/smartphone-ownership-is-growing-rapidly-around-the-world-but-not-always-equally/.

  9. Lindhiem O, Bennett CB, Rosen D, Silk J. Mobile technology boosts the effectiveness of psychotherapy and behavioral interventions: a meta-analysis. Behav Modif. 2015;39(6):785–804.


  10. Ramsey A, Gerke D, Proctor E. Implementation of technology-based alcohol interventions in primary care: a systematic review. Addiction Health Services Researchers Conference; Marina Del Ray, CA; 2015.

  11. Firth J, Torous J, Nicholas J, Carney R, Rosenbaum S, Sarris J. Can smartphone mental health interventions reduce symptoms of anxiety? A meta-analysis of randomized controlled trials. J Affect Disord. 2017;218:15–22.


  12. Elbert NJ, van Os-Medendorp H, van Renselaar W, Ekeland AG, Hakkaart-van Roijen L, Raat H, et al. Effectiveness and cost-effectiveness of ehealth interventions in somatic diseases: a systematic review of systematic reviews and meta-analyses. J Med Internet Res. 2014;16(4):e110.


  13. Ho C, Severn M. E-therapy interventions for the treatments of substance use disorders and other addictions: a review of clinical effectiveness [internet]. Canadian Agency for Drugs and Technology in Health: Ottawa, Canada; 2018.


  14. Kaner EFS, Beyer FR, Garnett C, Crane D, Brown J, Muirhead C, et al. Personalised digital interventions for reducing hazardous and harmful alcohol consumption in community-dwelling populations. Cochrane Database Syst Rev. 2017;(9). Available from: https://doi.org/10.1002/14651858.CD011479.pub2.

  15. Miller EA. Solving the disjuncture between research and practice: telehealth trends in the 21st century. Health Policy (New York). 2007;82(2):133–41.


  16. Zanaboni P, Lettieri E. Institutionalizing telemedicine applications: the challenge of legitimizing decision-making. J Med Internet Res. 2011;13(3).

  17. Hebert MA, Korabek B, Scott RE. Moving research into practice: a decision framework for integrating home telehealth into chronic illness care. Int J Med Inform. 2006;75(12):786–94.


  18. Hester RK, Delaney HD, Campbell W, Handmaker N. A web application for moderation training: initial results of a randomized clinical trial. J Subst Abus Treat. 2009;37(3):266–76.


  19. Squires DD, Hester RK. Using technical innovations in clinical practice: the Drinker’s check-up software program. J Clin Psychol. 2004;60(2):159–69.


  20. Marsch LA, Guarino H, Acosta M, Aponte-Melendez Y, Cleland C, Grabinski M, et al. Web-based behavioral treatment for substance use disorders as a partial replacement of standard methadone maintenance treatment. J Subst Abus Treat. 2014;46(1):43–51.


  21. Carroll K, Ball S, Martino S, Nich C, Babuscio T, Nuro K, et al. Computer-assisted delivery of cognitive-behavioral therapy for addiction: a randomized trial of CBT4CBT. Am J Psychiatry. 2008;165(7):881–8.


  22. Gustafson DH, McTavish FM, Chih MY, Atwood AK, Johnson RA, Boyle MG, et al. A smartphone application to support recovery from alcoholism: a randomized clinical trial. JAMA Psychiatry. 2014;71(5):566–72.


  23. Molfenter T, Capoccia VA, Boyle MG, Sherbeck CK. The readiness of addiction treatment agencies for health care reform. Subst Abuse Treat Prev Policy. 2012;7(1):1–8.


  24. Patterson Silver Wolf DA. A COVID-19 level overreaction is needed for substance use disorder treatment: the future is mobile. Los Angeles, CA: SAGE Publications; 2020.


  25. Ornell F, Moura HF, Scherer JN, Pechansky F, Kessler FHP, von Diemen L. The COVID-19 pandemic and its impact on substance use: implications for prevention and treatment. Psychiatry Res. 2020;289:113096.


  26. Iyengar K, Upadhyaya GK, Vaishya R, Jain V. COVID-19 and applications of smartphone technology in the current pandemic. Diabetes Metab Syndr. 2020;14(5):733–7.


  27. Venkatesh V, Davis FD, Morris MG. Dead or alive? The development, trajectory and future of technology adoption research. J Assoc Info Sys. 2007;8(4):267.


  28. Oliveira T, Martins MF, editors. Information technology adoption models at firm level: review of literature. European Conference on Information Management and Evaluation. Academic Conferences International Limited; 2010.

  29. Holden RJ, Karsh B-T. The technology acceptance model: its past and its future in health care. J Biomed Inform. 2010;43(1):159–72.


  30. Novak LL, Holden RJ, Anders SH, Hong JY, Karsh B-T. Using a sociotechnical framework to understand adaptations in health IT implementation. Int J Med Inform. 2013;82(12):e331–e44.


  31. Khoja S, Scott RE, Casebeer AL, Mohsin M, Ishaq AFM, Gilani S. E-health readiness assessment tools for healthcare institutions in developing countries. Telemed J E Health. 2007;13(4):425–32.


  32. Brooks E, Turvey C, Augusterfer EF. Provider barriers to telemental health: obstacles overcome, obstacles remaining. Telemed J E Health. 2013;19(6):433–7.


  33. Jeyaraj A, Rottman JW, Lacity MC. A review of the predictors, linkages, and biases in IT innovation adoption research. J Info Tech. 2006;21(1):1–23.


  34. van Gemert-Pijnen JE, Nijland N, van Limburg M, Ossebaard HC, Kelders SM, Eysenbach G, et al. A holistic framework to improve the uptake and impact of eHealth technologies. J Med Internet Res. 2011;13(4):e111.


  35. Kukafka R, Johnson SB, Linfante A, Allegrante JP. Grounding a new information technology implementation framework in behavioral science: a systematic analysis of the literature on IT use. J Biomed Inform. 2003;36(3):218–27.


  36. Hutzschenreuter T, Kleindienst I. Strategy-process research: what have we learned and what is still to be explored. J Manage. 2006;32(5):673–720.


  37. Porter ME. How competitive forces shape strategy. Harv Bus Rev. 1979;137.

  38. Mintzberg H, Waters JA. Of strategies, deliberate and emergent. Strateg Manage J. 1985;6(3):257–72.


  39. Finger S, Dixon JR. A review of research in mechanical engineering design. Part I: descriptive, prescriptive, and computer-based models of design processes. Res Eng Des. 1989;1(1):51–67.


  40. Tsang EW. Organizational learning and the learning organization: a dichotomy between descriptive and prescriptive research. Hum Relat. 1997;50(1):73–89.


  41. Bryson JM. Strategic planning for public and nonprofit organizations: a guide to strengthening and sustaining organizational achievement. Hoboken, NJ: John Wiley & Sons; 2011.


  42. Molfenter TD, Boyle MG, Holloway D, Zwick J. Trends in telemedicine use in addiction treatment. Addict Sci Clin Pract. 2015;10(1):14.


  43. Ford JH 2nd, Alagoz E, Dinauer S, Johnson KA, Pe-Romashko K, Gustafson DH. Successful organizational strategies to sustain use of A-CHESS: a mobile intervention for individuals with alcohol use disorders. J Med Internet Res. 2015;17(8):e201.


  44. Johnson K, Richards S, Chih MY, Moon TJ, Curtis H, Gustafson DH. A pilot test of a mobile app for drug court participants. Subst Abuse. 2016;10:1–7.


  45. Dennis ML, Scott CK, Funk RR, Nicholson L. A pilot study to examine the feasibility and potential effectiveness of using smartphones to provide recovery support for adolescents. Subst Abus. 2015;36(4):486–92.


  46. Gustafson DH, Sainfort F, Eichler M, Adams L, Bisognano M, Steudel H. Developing and testing a model to predict outcomes of organizational change. Health Serv Res. 2003;38(2):751–76.


  47. Quanbeck A, Gustafson DH, Marsch LA, Chih M-Y, Kornfield R, McTavish F, et al. Implementing a mobile health system to integrate the treatment of addiction into primary care: a hybrid implementation-effectiveness study. J Med Internet Res. 2018;20(1):e37.


  48. Muroff J, Robinson W, Chassler D, López LM, Lundgren L, Guauque C, et al. An outcome study of the CASA-CHESS smartphone relapse prevention tool for Latinx Spanish-speakers with substance use disorders. Substance Use Misuse. 2019;54(9):1438–49.


  49. Scott CK, Dennis ML, Gustafson DH. Using ecological momentary assessments to predict relapse after adult substance use treatment. Addict Behav. 2018;82:72–8.


  50. Johnston DC, Mathews WD, Maus A, Gustafson DH. Using smartphones to improve treatment retention among impoverished substance-using Appalachian women: a naturalistic study. Subst Abuse. 2019;13:1178221819861377.


  51. McKay JR, Gustafson DH, Ivey M, McTavish F, Lynch KG, Pe-Romashko K, et al. Efficacy of telephone and automated smartphone remote continuing care for alcohol use disorder. Manuscript in preparation.

  52. Gustafson DH, Hawkins RP, Boberg EW, McTavish F, Owens B, Wise M, et al. CHESS: 10 years of research and development in consumer health informatics for broad populations, including the underserved. Int J Med Inform. 2002;65(3):169–77.


  53. Gustafson DH, McTavish FM, Stengle W, Ballard D, Hawkins R, Shaw BR, et al. Use and impact of eHealth system by low-income women with breast cancer. J Health Commun. 2005;10(S1):195–218.


  54. Gustafson DH, Hawkins R, McTavish F, Pingree S, Chen WC, Volrathongchai K, et al. Internet-based interactive support for cancer patients: are integrated systems better? J Commun. 2008;58(2):238–57.


  55. Gustafson DH, Hawkins R, Boberg E, Pingree S, Serlin RE, Graziano F, et al. Impact of a patient-centered, computer-based health information/support system. Am J Prev Med. 1999;16(1):1–9.


  56. Patten CA, Croghan IT, Meis TM, Decker PA, Pingree S, Colligan RC, et al. Randomized clinical trial of an internet-based versus brief office intervention for adolescent smoking cessation. Patient Educ Couns. 2006;64(1-3):249–58.


  57. Japuntich SJ, Zehner ME, Smith SS, Jorenby DE, Valdez JA, Fiore MC, et al. Smoking cessation via the internet: a randomized clinical trial of an internet intervention as adjuvant treatment in a smoking cessation intervention. Nicotine Tob Res. 2006;8(Suppl 1):S59–67.


  58. Gustafson DH, McTavish FM, Schubert CJ, Johnson RA. The effect of a computer-based intervention on adult children of alcoholics. J Addict Med. 2012;6(1):24–8.


  59. Ryan RM, Deci EL. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am Psychol. 2000;55(1):68–78.


  60. Marlatt GA, George WH. Relapse prevention: introduction and overview of the model. Br J Addict. 1984;79(3):261–73.


  61. Mathews D. Evaluation consultant. Evaluation report: combatting addiction with technology for pregnant Appalachian women using smartphones. Hazard, KY: Kentucky River Community Care, Inc.; 2014.


  62. Chih MY, Patton T, McTavish FM, Isham AJ, Judkins-Fisher CL, Atwood AK, et al. Predictive modeling of addiction lapses in a mobile health application. J Subst Abus Treat. 2014;46(1):29–35.


  63. Zandieh SO, Yoon-Flannery K, Kuperman GJ, Langsam DJ, Hyman D, Kaushal R. Challenges to EHR implementation in electronic- versus paper-based office practices. J Gen Intern Med. 2008;23(6):755–61.


  64. Terry AL, Thorpe CF, Giles G, Brown JB, Harris SB, Reid GJ, et al. Implementing electronic health records: key factors in primary care. Can Fam Physician. 2008;54(5):730–6.


  65. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7.


  66. Klein KJ, Conn AB, Sorra JS. Implementing computerized technology: an organizational analysis. J Appl Psychol. 2001;86(5):811–24.


  67. Rosenthal R, Rubin DB. A simple, general purpose display of magnitude of experimental effect. J Educ Psychol. 1982;74(2):166.


  68. Hsieh FY, Lavori PW, Cohen HJ, Feussner JR. An overview of variance inflation factors for sample-size calculation. Eval Health Prof. 2003;26(3):239–57.


  69. Simpson DD, Joe GW, Broome KM, Hiller ML, Knight K, Rowan-Szal GA. Program diversity and treatment retention rates in the drug abuse treatment outcome study (DATOS). Psychol Addict Behav. 1997;11(4):279.


  70. Hubbard RL, Craddock SG, Anderson J. Overview of 5-year followup outcomes in the drug abuse treatment outcome studies (DATOS). J Subst Abus Treat. 2003;25(3):125–34.


  71. Knight DK, Broome KM, Simpson DD, Flynn PM. Program structure and counselor-client contact in outpatient substance abuse treatment. Health Serv Res. 2008;43(2):616–34.


  72. Substance Abuse and Mental Health Services Administration (SAMHSA). The N-SSATS report: trends in the use of methadone and buprenorphine at substance abuse treatment facilities: 2003 to 2011. Rockville, MD: Center for Behavioral Health Statistics and Quality; 2013.

  73. Substance Abuse and Mental Health Services Administration, Center for Behavioral Health Statistics and Quality. Treatment episode data set (TEDS): 2002-2012. In: National admissions to substance abuse treatment services. BHSIS series S-71, HHS publication no. (SMA) 14-4850. Rockville, MD: Substance Abuse Mental Health Services Administration; 2014.


  74. Kim JS, Anderson CJ, Keller B. Multilevel analysis of assessment data. In: Rutkowski L, von Davier M, Rutkowski D, editors. Handbook of international large-scale assessment: background, technical issues, and methods of data analysis. Boca Raton, FL: CRC Press; 2013. p. 389–424.


  75. Singer JD, Willett JB. Applied longitudinal data analysis: modeling change and event occurrence. New York, NY: Oxford University Press; 2003.


  76. MacKinnon DP. Introduction to statistical mediation analysis. New York, NY: Lawrence Erlbaum Associates; 2008.


  77. Imai K, Keele L, Tingley D. A general approach to causal mediation analysis. Psychol Methods. 2010;15(4):309–34.


  78. Tingley D, Yamamoto T, Hirose K, Keele L, Imai K. Mediation: R package for causal mediation analysis. J Stat Softw. 2014;59(5):1–39.


  79. Damschroder LJ, Lowery JC. Evaluation of a large-scale weight management program using the consolidated framework for implementation research (CFIR). Implement Sci. 2013;8:51.


  80. Lipari RN, Park-Lee E, Van Horn S. America’s need for and receipt of substance use treatment in 2015. The CBHSQ report. Rockville, MD: Substance Abuse and Mental Health Services Administration; 2016.


  81. Lipari RN, Van Horn SL. Trends in substance use disorders among adults aged 18 or older. The CBHSQ report. Substance Abuse and Mental Health Services Administration: Rockville, MD; 2017.


  82. Rogers EM. Diffusion of innovations. New York: Simon and Schuster; 2010.


  83. Damschroder LJ, Hagedorn HJ. A guiding framework and approach for implementation research in substance use disorders treatment. Psychol Addict Behav. 2011;25(2):194.


  84. Van de Ven AH. Problem solving, planning, and innovation. Part I. test of the program planning model. Hum Relat. 1980;33(10):711–40.



Acknowledgements

We acknowledge Maureen Fitzgerald for her editing services and Judy Ganch for her assistance with the references for this manuscript.

Funding

Research reported in this publication was supported by the National Institute on Drug Abuse (NIDA) of the National Institutes of Health (NIH) under Award Number R01DA044159. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Author information


Contributions

VW is the main preparer of this manuscript. She developed the training materials and protocols for both the intervention and the control, defined the RE-AIM measures, and selected the survey tools. TM and DG are the two principal investigators. They secured funding, developed the initial project idea and framework, and provided input into the project implementation and protocol described here. JH is the project manager; she helped facilitate communications between all organizations involved, in addition to creating and facilitating training materials and protocols, and she obtained IntegReview IRB approval. RG helped develop the training protocols. DGjr and his team created RISE-Iowa and helped with on-line support for the app. JSK and OC developed the statistical power and data analysis portions of this paper. EP delivered data and information on how IDPH’s database is constructed, which was vital to constructing the study aims. PP and AT developed the analysis for the OCM. The authors read and approved the final manuscript.

Corresponding author

Correspondence to Veronica M. White.

Ethics declarations

Ethics approval and consent to participate

The study received external approval from Independent Review Board Services (IntegReview IRB).

Consent for publication

Not applicable

Competing interests

Todd Molfenter is a faculty member at CHESS. In addition to his academic affiliation, Dr. Molfenter has a less than 0.1% ownership interest in CHESS Health, the organization responsible for making the A-CHESS addiction recovery app commercially available to the public. Dr. Molfenter has worked extensively with his institution to manage any conflicts of interest. An external advisory board approved all survey instruments applied, and the individuals who will conduct the data collection and interpretation for this study will have no affiliation with CHESS Health. Also, parts of the NIATx organizational change model used in part of this trial were developed by the Center for Health Enhancement System Studies (CHESS) at the University of Wisconsin–Madison, where Dr. Molfenter is a faculty member. Dr. Molfenter is also affiliated with the NIATx Foundation, the organization responsible for making the NIATx organizational change model available to the public. For this scenario, Dr. Molfenter also has an institutionally approved plan to manage potential conflicts of interest. The individuals who will conduct the data collection and interpretation for this study manuscript will have no affiliation with the NIATx Foundation.

David Gustafson is a part-owner of CHESS Health, devoted to marketing information technologies to agencies that deliver addiction treatment. He is also on the board of directors of the not-for-profit NIATx Foundation, as well as a small consulting company doing business as David H. Gustafson and Associates. These relationships do not carry with them any restrictions on publication, and any associated intellectual property will be disclosed and processed according to his institution’s policies.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

White, V.M., Molfenter, T., Gustafson, D.H. et al. NIATx-TI versus typical product training on e-health technology implementation: a clustered randomized controlled trial study protocol. Implementation Sci 15, 94 (2020). https://doi.org/10.1186/s13012-020-01053-4

