Table 2 Methods and key implementation strategies in the substance use depression study

From: A process for developing an implementation intervention: QUERI Series

Study Component

Description

Relevant Literature

Purpose

Test a multi-component implementation strategy vs. passive dissemination of evidence materials and implementation tools.

[1, 34–36]

Design

Randomized, quasi-experimental, hybrid design collecting patient-level clinical outcome data and formative evaluation data.

[9, 37]

Sample/programs

Four intensive outpatient SUD treatment programs in the southern US, matched on program size/structure and on current practices for assessing and treating depression. Two programs randomly selected as intervention sites.

 

Evaluation types

  

   • Diagnostic/developmental (formative evaluation)

Site visits, observations of program operations, key informant interviews with staff, and interviews with veterans with depression in SUD clinics.

[9, 13, 17, 38]

   • Implementation- and progress-focused (formative evaluation)

Tracking of screening rates, fidelity to the screening protocol, consults with program psychiatrists, and antidepressant use. Frequent phone/e-mail contact with participants to document previously unforeseen barriers/problems and to brainstorm solutions. Number of contacts with each site logged.

[9, 38]

   • Interpretive (formative evaluation)

Analysis of all formative evaluation data, including key informant interviews at close of implementation period to document stakeholder experiences.

[9, 37–38]

   • Summative

Quantitative analysis of patient outcomes. Fifty patients with depression from each program surveyed during treatment and at 3 and 6 months post-treatment.

[37–38]

Implementation strategy

  

   • Development Panels

Local development teams, composed of clinicians and administrators from each site plus the PI, considered barrier/facilitator data from the developmental evaluation and the literature on depression management implementation strategies/tools. The panel drafted locally customized clinical care and implementation strategies/tools. Off-site experts were consulted to ensure that the clinical and implementation tools were evidence-based. The panel iteratively redrafted the strategy/tools until both the panel and the experts approved the plans.

[1–7, 13, 39–40]

   • Other implementation interventions considered by Panel

Clinical reminders, audit and feedback, clinical education, marketing, consumer activation, clinical champions, and multi-component vs. single-component interventions.

[1, 3–4, 7, 13, 17, 41–46]

   • Facilitation

Internal facilitators served as local "champions" who gathered implementation-focused evaluation data, presented at staff meetings, and maintained contact with study staff. External facilitation, provided by the study PI, involved problem solving, technical assistance, and creation of educational and clinical support tools.

[9, 17]