
Juvenile Justice—Translational Research on Interventions for Adolescents in the Legal System (JJ-TRIALS): a cluster randomized trial targeting system-wide improvement in substance use services



The purpose of this paper is to describe the Juvenile Justice—Translational Research on Interventions for Adolescents in the Legal System (JJ-TRIALS) study, a cooperative implementation science initiative involving the National Institute on Drug Abuse, six research centers, a coordinating center, and Juvenile Justice Partners representing seven US states. While the pooling of resources across centers enables a robust implementation study design involving 36 juvenile justice agencies and their behavioral health partner agencies, co-producing a study protocol that has potential to advance implementation science, meets the needs of all constituencies (funding agency, researchers, partners, study sites), and can be implemented with fidelity across the cooperative can be challenging. This paper describes (a) the study background and rationale, including the juvenile justice context and best practices for substance use disorders, (b) the selection and use of an implementation science framework to guide study design and inform selection of implementation components, and (c) the specific study design elements, including research questions, implementation interventions, measurement, and analytic plan.


The JJ-TRIALS primary study uses a head-to-head cluster randomized trial with a phased rollout to evaluate the differential effectiveness of two conditions (Core and Enhanced) in 36 sites located in seven states. A Core strategy for promoting change is compared to an Enhanced strategy that incorporates all core strategies plus active facilitation. Target outcomes include improvements in evidence-based screening, assessment, and linkage to substance use treatment.


Contributions to implementation science are discussed as well as challenges associated with designing and deploying a complex, collaborative project.

Trial registration




Substance use is common among adolescent offenders and relates to delinquency, psychopathology, social problems, risky sex and sexually transmitted infections like HIV, and other health problems [1, 2]. An estimated 70 % of arrested juveniles have had prior drug involvement [3] and over one third have substance use disorders [4, 5]. Arrested youth initiate substance use earlier than other adolescents, leading to more problematic substance use and higher recidivism [6–8].

US juvenile courts processed 1,058,500 delinquency cases in 2013, with 31 % of cases adjudicated [9]. Most youth who come into contact with the juvenile justice (JJ) system are supervised in the community [10], and the proportion of youth under community supervision is increasing as states across the country seek alternatives to incarceration/detention [9, 11, 12]. Given the contribution of substance use to recidivism, JJ agencies are uniquely positioned to significantly impact public health through substance use identification and early intervention [13].

Because substance use services are generally provided outside the JJ system [14], cross-system linkage is necessary, but often problematic [15–17]. Even when linkages are in place, some community service providers do not consistently offer evidence-based services [18]. Collaboration requires communication across agencies that have historically existed as silos, with distinct cultures and belief systems about the effectiveness and importance of substance use treatment [19–21]. This context offers an ideal opportunity for implementation science, as communities strive to better meet the needs of youth.

The JJ-TRIALS Cooperative

The Juvenile Justice—Translational Research on Interventions for Adolescents in the Legal System (JJ-TRIALS) is a cooperative research initiative funded by the National Institute on Drug Abuse (NIDA). Six research centers (RCs: Columbia University, Emory University, Mississippi State University, Temple University, Texas Christian University, University of Kentucky) and one coordinating center (CC: Chestnut Health Systems) were funded in July 2013. Each RC recruited one or more JJ Partners to participate in all planning and implementation activities from the outset. The JJ-TRIALS steering committee (SC: composed of principal investigators, JJ Partners, and a NIDA project scientist) was charged by NIDA with developing a study protocol that achieved two goals: (1) improving the delivery of evidence-based practices (EBPs) in community-based JJ settings and (2) advancing implementation science.

Collaboration and cooperation among JJ-TRIALS researchers, partners, and NIDA personnel are critical for study protocol development, refinement, adherence, and implementation. Each of these constituencies provides input on feasibility, utility, and scientific rigor. This approach ensures a study design that meets scientific and partner expectations, while also keeping feasibility in focus. JJ partners provide a comprehensive, real-world understanding of the JJ system and its processes throughout study development, thus assuring a meaningful focus and increasing the study's potential impact.

Developing the study protocol

The Study Design Workgroup focused on five goals during the development of the JJ-TRIALS protocol: (1) conceptualizing how substance use should be addressed through partnerships between JJ and behavioral health (BH) agencies, (2) identifying evidence-based tools for addressing substance use, (3) identifying a conceptual framework to understand the process of implementing changes, (4) using that framework to guide overall study design, and (5) testing two distinct strategies for implementing desired changes. The final study protocol conforms to a hybrid implementation design [22]. It examines organizational-level implementation outcomes and youth outcomes, using a mixed-methods approach [23]. Primary aims are to (1) improve the continuum of substance use services for juvenile offenders under community supervision and (2) test the effectiveness of two implementation strategies for promoting system-wide change.

The guiding evidence-based practices framework

Best practices for substance use treatment involve a logically sequenced continuum ranging from initial screening to placement and retention in appropriate care. The JJ-TRIALS Cooperative sought to specify how screening, assessment, service referral, and treatment services are interconnected in the identification and linkage to care. The design team developed a service cascade framework that captured the receipt of BH services and provided a unifying approach to guide site activities and study outcomes across a diverse set of sites with unique needs and goals.

The JJ-TRIALS Behavioral Health Services Cascade (hereinafter the Cascade) was modeled after the HIV care cascade, a widely used framework for depicting gaps in HIV surveillance and treatment [24–26]. The Cascade provides a data-driven framework for understanding how justice-involved youth move from JJ to community-based BH providers as substance use problems are identified and responses implemented. The Cascade is premised on the idea that the overlap between substance use problems and JJ contact necessitates screening of all youth who come into contact with the justice system [27, 28]. In an ideal system, a positive initial screen would lead to a more in-depth assessment and, if warranted, subsequent linkage to evidence-based care in the community. There are numerous evidence-based screening and assessment instruments [29, 30], various evidence-based treatment and prevention interventions [31], and promising interventions for linking youth to community-based providers [32, 33].

Evidence shows that the service continuum begins to break down at the initial step of screening in most JJ settings. A national survey of juvenile probation agencies revealed that only 47.6 % reported using a standardized tool to screen and/or assess substance use [17]. Furthermore, a typical specialty adolescent substance use treatment program only adopts about half of high-quality substance use care indicators and EBPs [34]. Figure 1 represents hypothetical data for the Cascade as youth transition across service systems, with each column representing the difference between ideal and actual levels of service delivery. Differences between ideal and actual levels represent problems related to identification, transition, and retention in care. Youth with substance use problems can only be engaged in appropriate treatment if their needs are identified.

Fig. 1 Hypothetical retention in the Cascade as youth transition across service systems

Although the Cascade serves as a framework for setting goals around improved evidence-based practice, the study protocol allows sites to choose where on the Cascade they will focus their improvement efforts. This degree of agency-level autonomy recognizes that different EBPs will “fit” better across different agencies (i.e., address the needs of youth, work within constraints of the system). Each agency, informed by data and best practices, sets its own goals for reducing service gaps. The study protocol uses a series of menus of evidence-based screening and assessment tools and treatments to help guide these decisions, but does not dictate that sites focus on a specific point on the Cascade or a particular EBP.

The guiding implementation science framework

The Exploration, Preparation, Implementation, Sustainment (EPIS) framework of Aarons and colleagues guides the design of this study [35]. Consistent with models of quality improvement in healthcare systems [36], EPIS considers the multilevel nature of service systems, the organizations within systems, and client needs during the process of implementing a new intervention. The EPIS model posits four phases of organizational processes during system change. The Exploration Phase involves identification of the problem, appropriate evidence-based solutions, and factors that might impact implementation. Once a proposed solution is identified for adoption, the Preparation Phase begins. This phase involves bringing together stakeholders in a planning process [37], which can be complex, depending on the number of stakeholders and potentially competing priorities and needs [38]. The Implementation Phase begins when initiating change-related activities. Factors affecting implementation include outer context political and funding concerns, inner organizational context issues (e.g., fit with clinician productivity and work demands), and consumer concerns (e.g., applicability of practices for client needs) [39]. When the new practice is routinely used, the Sustainment Phase begins. Sustainment may be facilitated by the degree to which the new services or changes are institutionalized at different levels in the service setting (i.e., system, organizations).

The Cooperative has adapted EPIS to address the complex context within which the JJ-TRIALS study occurs. First, EPIS has typically been applied to the implementation and adoption of one specific EBP [40]. In JJ-TRIALS, sites are asked to select a target goal from the Cascade and implement an EBP that addresses that goal. Thus, each study site could potentially implement a different EBP. Second, while the linear nature of EPIS guides the general design (timing of implementation strategies and measurement), it also implies a dynamic process. In the current study, sites are taught to use data to inform implementation decisions through the application of rapid-cycle testing [41–43]. With each “test,” there are subsequent periods of exploration (e.g., what worked, what went wrong), preparation (e.g., modifications to the original plan), and implementation (e.g., enacting the revised plan). JJ-TRIALS is designed to capture these activities to explore and refine the EPIS model.


Selecting the implementation interventions

Implementation studies typically have focused on a single evidence-based intervention [44–46], a specific set of best practices [47, 48], generic best practices [49], or a single evidence-based instrument [50]. Few studies have focused on outcomes that cross service system sectors of care [44]. Head-to-head organizational comparative effectiveness trials are rare, in part because the resources needed to execute them often exceed those available in a typical National Institutes of Health (NIH) grant. In JJ-TRIALS, several discrete implementation strategies were combined and manualized to address organizational and system barriers [51]. This effort leverages the resources of RCs with the practical guidance of JJ partners to field a multisite, direct comparison of implementation strategies in a relatively large sample of sites.

The JJ-TRIALS protocol compares two novel implementation interventions that combine several implementation strategies with demonstrated efficacy. These strategies include a needs assessment [52], ongoing training and education [37, 53], local change teams with facilitation [54, 55], and data-driven decision-making [56, 57]. The basic implementation approach compares a Core set of intervention strategies to a more Enhanced set that incorporates all core components plus active facilitation. Across both study conditions, data-driven decision-making serves as a common thread.

Data-driven decision-making (DDDM)

According to the JJ-TRIALS partners, most JJ departments are encouraged to use data to inform decisions, yet few JJ agencies are adequately skilled and resourced in doing so. A number of recent JJ initiatives such as the MacArthur Foundation’s Models for Change [58] have emphasized the importance of making data-informed policy choices. Focusing on systematic data collection, synthesis, and interpretation can help agencies to transform the ways they address problems and implement future change. In design discussions, JJ partners questioned whether providing tools and training would be sufficient or whether a guided “mentoring” approach would be needed to enact system-wide change using DDDM.

DDDM is the process by which key stakeholders collect, analyze, and interpret data to guide efforts to refine or reform a range of outcomes and practices [59]. In JJ settings, DDDM has been used to guide system-wide reform to reduce recidivism and system costs while improving related outcomes such as public safety and access to evidence-based services [60–62]. In one instance, DDDM was associated with a 5-year doubling of the proportion of youth accessing EBPs while reducing arrest rates by almost half [58]. This approach has the potential to address unmet substance use treatment needs for JJ-involved youth.

Implementation intervention components

The two sets of implementation intervention strategies tested in JJ-TRIALS are additive (see Table 1 for a description of Core and Enhanced components). The Core condition includes five interventions implemented at all sites during the 6-month baseline period (see timeline below): (1) JJ-TRIALS orientation meetings, (2) needs assessment/system mapping, (3) behavioral health training, (4) site feedback report, and (5) goal achievement training. Following the baseline period, two additional Core components are delivered to all sites: (6) monthly site check-ins and (7) quarterly reports. As part of goal achievement training, sites receive assistance in using their site feedback report to select goals to meet their local needs. Sites are trained on using data to inform decisions (e.g., selecting a goal, applying plan-do-study-act) and enlisting DDDM templates and tools (developed as part of the project) to plan and implement proposed changes. While DDDM principles are expected to facilitate change, organizations may need additional support to apply these principles to their improvement efforts during the implementation phase. The Enhanced condition adds continuing support for the use of DDDM tools by adding research staff facilitation of DDDM over 12 months and formalized local change teams (LCTs) featuring representation from the JJ agency and a local BH provider (with meetings facilitated by research staff). Figure 2 depicts how the selection and timing of specific components were informed by EPIS.

Table 1 Description of Core and Enhanced intervention components
Fig. 2 Selection and timing of Core and Enhanced components

Study design

The design uses a cluster randomized trial with a phased rollout to evaluate the differential effectiveness of the Core and Enhanced conditions in 36 sites (18 matched pairs; see below) in seven states. The design features randomization to one of two conditions, randomization to one of three cohorts (with start times spaced 2 months apart), the inclusion of a baseline period in both experimental conditions, and data collection at regular intervals (enabling time series analyses; see Fig. 3). In addition to comparing the two implementation conditions, it also allows sites to serve as their own controls by using an interrupted time series design with the baseline period as an existing practice control. This design enables three time-series comparisons: (1) Baseline (“activities as usual”) versus Core, (2) Baseline versus Enhanced, and (3) Core versus Enhanced.

Fig. 3 JJ-TRIALS Study Design

Primary research questions include:

  1. Do the Core and/or Enhanced interventions reduce unmet need by increasing Cascade retention related to screening, assessment, treatment initiation, engagement, and continuing care?

  2. Does the addition of the Enhanced intervention components further increase the percentage of youth retained in the Cascade relative to the Core components?

  3. Does the addition of the Enhanced intervention components improve service quality relative to Core sites?

  4. Do staff perceptions of the value of best practices increase over time, and are increases more pronounced in Enhanced sites?

The study also includes exploratory research questions. Examples include: How do sites progress through EPIS phases with and without facilitation? Are Enhanced sites more successful in implementing their chosen action plans, achieving greater improvement in cross-systems interagency collaboration, and experiencing greater reductions in 1-year recidivism rates? Is one condition more cost-effective than the other? And how do inner and outer context measures (e.g., system, organizational, staff) moderate relationships between experimental conditions and service outcomes?


The sample includes 36 sites, with each site composed of one JJ and one or two BH agencies (overall more than 72 participating organizations). Sites were matched into pairs within state systems (based on local population, number of youth referred to JJ, number of staff, and whether EBPs are used). JJ agencies include probation departments (in six states) or drug courts (in one state); BH providers include substance use treatment providers within a county or service region. JJ inclusion criteria were (a) ability to provide youth service records, (b) service to youth under community supervision, (c) access to treatment provider(s) if treatment is not provided directly, (d) a minimum average case flow of 10 youth per month, (e) a minimum of 10 staff per site, and (f) a senior JJ staff member who agreed to serve as site leader/liaison during the study. Study sites are geographically dispersed and were identified by state JJ agencies (and not selected for particular substance use or related BH service needs).

At the beginning of the project, each site forms an Interagency Workgroup composed of 8–10 representatives from JJ and BH agencies. Recommended composition includes representatives of JJ leadership (e.g., Chief Probation Officer), BH leadership (e.g., Program Director), other JJ and BH agency staff, and other key stakeholders likely to be involved in improvement efforts (e.g., Juvenile Court Administrator, JJ Data Manager).

At least 360 staff members from participating JJ and BH agencies are expected to participate in one or more study activities. Information from at least 120 individual youth case records per site is de-identified and extracted from site data files on a quarterly basis throughout the study period (a minimum sample of 4320 de-identified service records). Interagency workgroup participation, staff survey responses, and youth records are nested within sites.

Recruitment and consenting

Partners facilitated identification and recruitment of JJ agencies. RC staff described study involvement and worked with JJ leadership to identify and recruit the BH partner. JJ agency leadership provided signed letters of commitment and, if required by agency policy or state law, court orders authorizing RC access to confidential juvenile case records. Individual staff recruitment occurs immediately after each leadership and line staff orientation meeting. During orientations, all aspects of the research study are explained and informed consent is obtained from participants, consistent with institutional review board (IRB) protocols at each RC.


The design features two stages of randomization: (a) to one of three start times as part of a phased rollout and (b) to the Core or Enhanced condition. The CC was responsible for all randomization procedures. For the first stage, RCs were used as strata, and the six county sites within each were matched into pairs on the number of youth (ages 10–19) in the county based on the 2010 census, the number of youth entering community supervision, the number of staff in community supervision, and whether they used standardized screeners/assessments and evidence-based treatment. Each RC PI reviewed matches and ensured comparability prior to randomization. Within each RC, the three resulting pairs were then randomly assigned to one of three start times using a random number generator in Excel. This procedure was utilized to smooth out the logistical burden of implementation and to control for the influence of other exogenous factors [63, 64].

For the second stage of randomization, one site in each pair was randomly assigned to Core and the other to Enhanced. Given that there were only 18 pairs of sites, “optimal” randomization was used to find the most balanced pattern of assignment across each RC. This approach involved running 10,000 permutations of the possible assignments of sites within each pair to condition. For each of these, multivariate Hotelling’s T² was computed to assess the degree of balance on cohort and condition both within and across all RCs. The final randomization design was selected from a pool of the top 2 % of permutations balancing across all characteristics.
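The permutation step above can be sketched in a few lines. This is an illustrative simplification, not the study's actual code: it substitutes a summed standardized-mean-difference score for the Hotelling's T² statistic the protocol used, and the covariate names are hypothetical.

```python
import random
import statistics

def assign_pairs(pairs, n_draws=10000, top_fraction=0.02, seed=7):
    """Sample candidate within-pair assignments, score covariate balance,
    and draw the final assignment from the best-balanced fraction.

    `pairs` is a list of (site_a, site_b) tuples, each site a dict of
    numeric matching covariates (hypothetical names used below)."""

    def balance(assignment):
        # assignment[i] == 0 puts the first site of pair i in Core,
        # 1 puts the second site there; the other site gets Enhanced.
        total = 0.0
        for cov in pairs[0][0]:
            core = [pair[a][cov] for pair, a in zip(pairs, assignment)]
            enh = [pair[1 - a][cov] for pair, a in zip(pairs, assignment)]
            sd = statistics.pstdev(core + enh) or 1.0
            # absolute standardized mean difference, summed over covariates
            total += abs(statistics.mean(core) - statistics.mean(enh)) / sd
        return total

    rng = random.Random(seed)
    candidates = [tuple(rng.randint(0, 1) for _ in pairs) for _ in range(n_draws)]
    scored = sorted((balance(a), a) for a in candidates)
    pool = scored[: max(1, int(len(scored) * top_fraction))]  # top 2 % by default
    return rng.choice(pool)[1]

# Hypothetical matched pairs with two matching covariates
pairs = [({"youth": 100 + i, "staff": 10}, {"youth": 120 + i, "staff": 12})
         for i in range(6)]
assignment = assign_pairs(pairs, n_draws=500)
```

Restricting the final draw to the best-balanced pool preserves randomness while guarding against a badly imbalanced allocation, which matters when there are only 18 pairs.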

The study is also double-blinded such that neither the RC staff nor any county site staff are aware of their assignment until after both sites in a pair have completed the Core components. Once completed, the condition of both sites is revealed by the CC PI to the RC PI and, subsequently, to sites. This design aspect is ideal in studies with multiple sites that have initial variability and require intensive researcher-driven activities such as training, monitoring, or coaching.


For service record level hypotheses, 2160 bi-weekly observations are expected on service delivery outcome measures (36 sites × 60 bi-weekly periods). For site-level hypotheses, 72 observations are expected (36 sites × 2 data collection points), and for staff level hypotheses, a minimum of 1440 observations are expected, with 720 per condition (average of 10 staff × 36 sites × 4 time points). The effective n for power calculations in repeated measures analysis varies between a lower bound of the number of unique sites (N = 36) and an upper bound of the observations per condition (O = 1440 staff surveys or 2160 bi-weekly youth record periods), as a function of the Intraclass Correlation Coefficient (ICC) associated with the outcome measure (e.g., number of youth entering treatment) over time and the number of repeated measures per site. Assuming that the ICC is low (.2 or less), effect sizes in the small to moderate range (.25 to .35) should be detected with 80 % or more power [65]. Several strategies are employed to further increase power: (a) optimal randomization to evenly distribute the 36 sites as much as possible across start-up time and condition, (b) using standardized measures to reduce measurement error, and (c) modeling site differences as a function of staff and organizational covariates.
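The effective-n logic above follows the standard design-effect adjustment for clustered repeated measures. A minimal sketch of that formula (not the study's actual power software) shows how the effective sample size shrinks from the total observation count toward the number of sites as the ICC rises:

```python
def effective_n(n_clusters, obs_per_cluster, icc):
    """Effective sample size under the standard design effect
    DEFF = 1 + (m - 1) * ICC, for m repeated observations per cluster."""
    deff = 1 + (obs_per_cluster - 1) * icc
    return n_clusters * obs_per_cluster / deff

# 36 sites x 60 bi-weekly record periods:
print(effective_n(36, 60, 0.0))  # 2160.0 (upper bound: all observations independent)
print(effective_n(36, 60, 1.0))  # 36.0   (lower bound: one effective obs per site)
print(effective_n(36, 60, 0.2))  # 168.75 (ICC at the .2 assumption in the text)
```

The two print extremes reproduce the lower and upper bounds stated in the paragraph above.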


A multilevel approach to measurement is necessary for our understanding of change processes within complex service systems [36, 66]. Youth interact with JJ agency staff who work within a larger organization; in turn, the organization operates within a system that includes BH providers, oversight agencies, and funders. The proposed measurement plan assesses information from these levels.

The design employs three data collection periods: baseline (6 months; generally corresponding to EPIS’ Exploration and Preparation phases), experiment (12 months; corresponding to EPIS’ Implementation phase), and post-experiment (6 months; corresponding to EPIS Sustainment phase). Figure 4 includes a timeline depicting all intervention components (top portion) and data collection (bottom portion) for sites in wave 1. During baseline, RCs initiate collection of de-identified youth records data related to the Cascade dating back to October 1, 2014, administer agency surveys, conduct a local needs assessment (systems mapping exercise and group interview with interagency workgroup members), and administer leadership and line staff surveys at participating agencies. Leadership and line staff complete follow-up surveys during months 2 and 12 of the experiment period and again at month 6 of the post-experiment period. A representative from each site reports progress toward site-selected goals (i.e., implementation activities) during a monthly site check-in phone call. In the Enhanced condition, local change team members complete implementation process surveys during the experiment period. The 6-month post-experiment period consists only of data collection, including youth record extraction, agency and staff surveys, group interview (to determine whether sites sustain new practices), and monthly site check-in calls. Data collection components are summarized in Table 2.

Fig. 4 Timeline depicting intervention components (top portion) and data collection (bottom portion) for sites in wave 1

Table 2 Data collection components


The JJ-TRIALS cooperative seeks to manage fidelity by balancing adherence to central elements of the implementation interventions and timely submission of research data with flexibility in addressing diverse site needs. This approach to fidelity aims to address the domains described by Proctor and colleagues [67] with regard to protocol adherence, dose/exposure, and quality. Protocol adherence is fostered by the provision of pre-implementation training activities to key principals (e.g., facilitators) along with the review of critical resources (e.g., detailed instructional manuals, preparation checklists). As implementation ensues, fidelity is further measured by RC-level reporting of the actual date of each study activity relative to its targeted completion date. The Timeline Compliance system tracks key elements of dose, such as the number of attendees at specific trainings [44]. Each implementation intervention has fidelity procedures that provide additional detail regarding adherence, dose, and quality. Procedures include automated reporting (e.g., online BH training sessions), observational ratings (e.g., webinar BH training sessions), facilitator-reported fidelity ratings (e.g., goal achievement training), and participant ratings (e.g., local change team meetings).


Table 3 summarizes the primary hypotheses corresponding to the research questions above. H1 and H2 focus on retention in the Cascade: H1 compares both experimental conditions to their respective baseline period, whereas H2 compares the differential effectiveness of Core versus Enhanced sites. Table 3 shows the working definition and formula for the rates of each step within the Cascade (see Fig. 1), designed to map onto existing and widely used performance metrics systems (Center for Substance Abuse Treatment adolescent treatment branch and National Outcome Monitoring System; the Network for the Improvement of Addiction Treatment (NIATx); the National Quality Forum; the Office of National Coordinator of Healthcare Reform; and Washington Circle Group). The rates shown are proportions of youth receiving the service within each site, divided by the number in the earlier step, with dashed lines highlighting changes in the denominator.

Table 3 JJ-TRIALS research questions and hypotheses
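The step-rate computation described above, where each rate's denominator is the count at the preceding Cascade step, can be illustrated with hypothetical counts. The step names below are illustrative, not the protocol's official Cascade labels:

```python
def cascade_rates(counts):
    """Rate at each Cascade step = count at that step divided by the
    count at the preceding step (the denominator shifts down the Cascade)."""
    steps = list(counts)  # insertion order defines the Cascade sequence
    return {
        step: (counts[step] / counts[prev] if counts[prev] else 0.0)
        for prev, step in zip(steps, steps[1:])
    }

# Hypothetical site-level counts for one reporting period
counts = {
    "referred_to_jj": 200,
    "screened": 150,
    "assessed": 90,
    "initiated_treatment": 45,
    "engaged_in_treatment": 27,
}
rates = cascade_rates(counts)
# e.g., rates["screened"] == 0.75 (150/200) and rates["engaged_in_treatment"] == 0.6 (27/45)
```

Because each denominator is the prior step rather than the original cohort, a rate isolates the drop-off at that specific transition, which is what makes the Cascade useful for targeting improvement goals.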

Latent Growth Curve Modeling (LGCM) will be used to test H1 and H2 using MPLUS [68]. A significant change in the slope between the baseline and experimental time periods (H1) or between Core and Enhanced conditions (H2) would suggest that the intervention affected the growth curve. This analysis will be repeated for each targeted outcome measure in the Cascade. To the extent that there are site differences, data can be analyzed within sites, using non-parametric simplified time series (STS) analysis [69, 70]. MPLUS will also allow examination of time-varying covariates to determine whether early implementation activities have significant effects on later time points.

H3a utilizes bi-weekly intake cohorts and tests whether percentages of youth meeting “timing targets” differ significantly between the 18 Enhanced and the 18 Core sites. Records data include dates to allow examination of time between various points in the Cascade (see Table 4). Trends can be examined over time using simplified time series analysis. H3b and H3c are considered exploratory, using data from agency surveys and needs assessment group interviews (measured twice: baseline and end of experiment; see Table 5). Survey content is derived from the JJ-TRIALS National Survey (developed by Chestnut Health Systems and administered in 2014). Group interviews (recorded and transcribed) generate descriptive detail on the entire Cascade, including system capacities, referral processes, the nature and use of screening instruments, the quality of available services, and features in the inner and outer contexts of agencies likely to influence service delivery.

Table 4 Measures from de-identified records corresponding to the Behavioral Health Service Cascade
Table 5 Service cascade: crosswalk of quantitative and quality measures

H4 examines staff perceptions of the value of services along the Cascade. Table 6 describes domains and sample items. Analyses will focus on change in staff responses cross-sectionally over time, using staff nested within agency. Hierarchical linear modeling (HLM) [71] will serve as the basic analysis paradigm in which Enhanced and Core sites are compared. Growth modeling may be appropriate since measures will be collected approximately every 6 months, and it is expected that the groups will be equivalent at baseline. MPLUS can be used to analyze these data using “known class” as an indicator of implementation condition in a multigroup analysis (e.g., linear growth curve modeling). Time-invariant and time-varying covariates that may differentially affect the growth curves of the two implementation conditions will be examined. Should growth model specification not fit the data, multilevel regression modeling will be used.

Table 6 Staff survey domains and example items

Trial status

Feasibility testing

Feasibility testing was conducted in Spring 2015 in three sites not participating in the main study. Study protocol components tested included staff orientations, BH and goal achievement training content, data collection procedures for the needs assessment and baseline staff surveys, content and format of the site feedback report and DDDM templates, and elements of the Enhanced intervention (facilitation, LCT meetings). Information gleaned from feasibility sites was gathered in a systematic format and shared weekly with the Study Design Workgroup. As modifications to content and presentation formats were made, revised protocols were tested in other feasibility sites. Recommended modifications were reviewed and approved by the Steering Committee in September 2015. The extensive testing of all materials, trainings, and procedures in multiple sites helped ensure that anticipated variability across the 36 main study sites was accounted for and addressed.

Main trial

Thirty-six sites from seven states were recruited between January and December 2014. RCs began working with their six respective sites in the fall of 2014 to obtain de-identified records. In February 2015, the sites corresponding to each RC were paired and randomized to one of three start times. After agency surveys were completed (November 2015), one site from each of the 18 pairs was randomized to the Core (n = 18) or Enhanced (n = 18) study condition. The study began in wave 1 sites in April 2015, with waves 2 and 3 beginning in June and August 2015, respectively.
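The two-stage allocation described above (matched pairs to one of three start waves, then one site per pair to condition) can be sketched in a few lines. This is a hypothetical illustration of the scheme, not the study's actual randomization procedure; the site identifiers are invented placeholders.

```python
# Hypothetical sketch of the JJ-TRIALS allocation scheme: 18 matched site
# pairs assigned to three start waves, then a within-pair coin flip to
# Core vs. Enhanced. Site names are invented placeholders.
import random

random.seed(42)
pairs = [(f"site{2 * i + 1}", f"site{2 * i + 2}") for i in range(18)]

# Phased rollout: 6 pairs begin in each of three waves, 2 months apart.
random.shuffle(pairs)
waves = {w: pairs[6 * w: 6 * (w + 1)] for w in range(3)}

# Within each pair, randomize one site to Enhanced and the other to Core.
assignment = {}
for wave, wave_pairs in waves.items():
    for a, b in wave_pairs:
        enhanced, core = random.sample([a, b], 2)
        assignment[enhanced] = ("Enhanced", wave)
        assignment[core] = ("Core", wave)

n_enhanced = sum(1 for cond, _ in assignment.values() if cond == "Enhanced")
print(n_enhanced, len(assignment))  # prints "18 36"
```

Because the coin flip happens within each matched pair, the design guarantees a balanced 18/18 condition split overall and within every wave.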


Discussion

The JJ-TRIALS protocol, developed through a collaborative partnership among NIDA, researchers, and JJ partners, has the potential to impact the field of implementation science as well as JJ and BH service systems in significant ways.

Implementation science innovations

The engagement of JJ partners as collaborators throughout study design, implementation, and interpretation of results has been key to JJ-TRIALS. Active involvement of JJ partners in decisions is essential in designing a study that is both scientifically sound and grounded in the realities confronting the system. For JJ partners, involvement has created a sense of ownership, enhancing the likelihood that interventions are adopted and sustained.

There is great complexity in interactions between the JJ system and community service providers. The problem-solving orientation inherent in EPIS [35] is valuable in understanding the myriad factors that may affect system change: outer context issues, inner context organizational issues, and consumer concerns. These factors become the leverage points for effectively intervening to promote durable system change. EPIS is also fruitful as a framework for developing implementation strategies. The linear phases provide a platform for content and timing of intervention strategies and measurement, yet the dynamic aspect of EPIS suggests recursive movement through those phases as agencies assess and modify implementation efforts. JJ-TRIALS utilizes these strengths of EPIS and builds on current approaches to measuring process improvement [44].

DDDM is another innovative component that is compatible with the needs of researchers who rely on data for evaluating study activities and JJ partners who rely on data to demonstrate accountability to data-driven goals. Participants are trained in applying data-informed strategies using a blended learning approach [72] to facilitate the use of evidence-based practices in identifying and addressing youths’ service needs. Process mapping [73] helps identify addressable gaps in cross-systems service integration. Moreover, reliance on information already captured in sites’ service record data (both electronic and paper formats) allows tracking of the downstream changes resulting from implementation activities.

Finally, JJ-TRIALS efforts (from both quality improvement and evaluation perspectives) are aimed at the entire Cascade, from identification of need (screening and clinical assessment) through linkage to care and retention in treatment. While the JJ system has made progress over the past two decades in establishing procedures for identifying BH needs [74], far less attention has been paid to implementing sound procedures for addressing those needs [33]. JJ-TRIALS uses a hybrid measurement model [22] that incorporates measurement of these Cascade-related outcomes at multiple levels: systems, agencies, staff, and youth.

Challenges and potential solutions

Several challenges inherent in developing a complex multisite protocol with multiple levels of measures and hypotheses became apparent as the JJ-TRIALS SC prepared to launch this protocol. First, to test H1, and to introduce local site leadership and staff to the basic concepts and components of the study, a baseline period was established in which data on current services and staff/organizational factors could be collected. Engaging sites in orientation and data collection activities while seeking to ensure that sites did not prematurely begin to address gaps in the Cascade presented a practical challenge.

A second challenge relates to the feasibility of implementing the complex protocol, both for the RCs and participating agencies. With six geographically separated sites per RC, simultaneously initiating the study in all sites would have presented a substantial burden that might have resulted in incomplete or poor implementation of study components. Accordingly, the design included a phased rollout (similar to a stepped wedge design) [64, 75], in which one-third of the matched site pairs were randomly assigned to begin the study in each of three waves, 2 months apart.

Another key concern reflects challenges in meeting the needs and expectations of complex, dynamic service systems while maintaining fidelity to the study protocol. Because JJ agencies face a number of competing priorities and resource constraints, RCs must be sensitive to these issues and maintain flexibility in the study timetable to maintain buy-in among stakeholders. Yet, consistent implementation across sites and across RCs is essential for internal validity. Therefore, flexibility was built into the intervention to allow for variability. Extensive fidelity procedures were developed, including pre- and post-implementation checklists for each intervention component, fidelity monitoring of trainings and facilitation, and monthly facilitator learning circle calls. Each emphasizes “fidelity with flexibility”—keeping to the written protocol to the best of the RC’s ability, while being responsive to the specific needs, preferences, and constraints of the site whenever possible.

Data quality has also proven to be a challenge. As anticipated, wide variability exists in the quality of data available to populate the Cascade. Some sites maintain electronic systems and routinely capture most Cascade elements, while others primarily utilize paper records. Even when data are available electronically, validity can be questioned (e.g., missing values could reflect absence of a service or failure to record a service). RCs have worked closely with sites to ensure adequate and appropriate data, including sending research staff to the site to manually extract records or providing assistance to JJ agencies in developing/modifying electronic systems. In this regard, JJ-TRIALS is likely to facilitate improved data collection within participating sites, addressing existing gaps in justice agencies’ ability to track and report youth outcomes [76].


Conclusions

Through a collaborative partnership among researchers, JJ partners, and NIDA, JJ-TRIALS is incorporating several implementation strategies and the EPIS framework to address unmet substance use treatment needs among juveniles under community supervision. Although such a complex implementation study presents challenges, the protocol is expected to provide important insight regarding the efficacy of implementation interventions to improve BH services in a multi-system context, a test of the utility of EPIS for measuring and assessing organizational and systems changes, the use of a new Cascade framework for analyzing youth data on substance use services, and the ability of JJ and BH agencies to use data-driven decision making to achieve system change. Increasing the use of evidence-based practices for identifying, referring, and treating youth with substance use problems will improve both public health and public safety and provide new tools and strategies for JJ agencies and their BH partners to use when addressing other organizational and system improvements.

Ethical approval

IRB approval was granted by all six research institutions and the coordinating center.



Abbreviations

BH: behavioral health

CC: coordinating center

DDDM: data-driven decision-making

EBP: evidence-based practice

EPIS: Exploration, Preparation, Implementation, Sustainment framework

HIV: human immunodeficiency virus

HLM: hierarchical linear modeling

ICC: intraclass correlation coefficient

IRB: institutional review board

JJ: juvenile justice

JJ-TRIALS: Juvenile Justice—Translational Research on Interventions for Adolescents in the Legal System

LGCM: latent growth curve modeling

LCT: local change team

NIATx: Network for the Improvement of Addiction Treatment

NIDA: National Institute on Drug Abuse

NIH: National Institutes of Health

RC: research center

SC: steering committee

STI: sexually transmitted infection

STS: simplified time series


  1. Clark DB. The natural history of adolescent alcohol use disorders. Addiction. 2004;99 Suppl 2:5–22.


  2. Hicks BM, Iacono WG, McGue M. Consequences of an adolescent onset and persistent course of alcohol dependence in men: adolescent risk factors and adult outcomes. Alcohol Clin Exp Res. 2010;34(5):819–33.


  3. Belenko S, Logan TK. Delivering effective treatment to adolescents: improving the juvenile drug court model. J Subst Abuse Treat. 2003;25(3):189–211.


  4. Aarons GA, Brown SA, Hough RL, Garland AE, Wood PA. Prevalence of adolescent substance use disorders across five sectors of care. J Am Acad Child Adolesc Psychiatry. 2001;40(4):419–26.


  5. Wasserman GA, McReynolds LS, Schwalbe CS, Keating JM, Jones SA. Psychiatric disorder, comorbidity, and suicidal behavior in juvenile justice youth. Crim Justice Behav. 2010;37(12):1361–76.


  6. Henggeler SW, Clingempeel WG, Brondino MJ, Pickrel SG. Four-year follow-up of multisystemic therapy with substance-abusing and substance-dependent juvenile offenders. J Am Acad Child Adolesc Psychiatry. 2002;41(7):868–74.


  7. Kandel DB, Davies M. Progression to regular marijuana involvement: phenomenology and risk factors for near-daily use. In: Glantz MD, Pickens RW, editors. Vulnerability to drug abuse. Washington, DC: American Psychological Association; 1992. p. 211–53.


  8. Kandel DB, Yamaguchi K. Stages of drug involvement in the U.S. population. In: Kandel DB, editor. Stages and pathways of drug involvement: Examining the gateway hypothesis. New York, NY: Cambridge University Press; 2002. p. 65–89.


  9. Sickmund M, Sladky A, Kang W. Easy access to juvenile court statistics: 1985-2013. 2015. Accessed 11 Apr 2016.

  10. Puzzanchera C, Adams B. Juvenile arrests 2009 (Juvenile Offenders and Victims: National Report Series Bulletin). Washington, DC: Office of Juvenile Justice and Delinquency Prevention; 2011. Accessed 11 Apr 2016.


  11. Fabelo T, Arrigona N, Thompson MD, Clemens A, Marchbanks III MP. Closer to home: an analysis of the State and local impact of the Texas Juvenile Justice Reforms. Austin, TX: The Council of State Governments Justice Center, Public Policy Research Institute; 2015. Accessed 11 Apr 2016.


  12. Office of Juvenile Justice and Delinquency Prevention. Easy access to juvenile court statistics: 1985-2013. 2015. Accessed 11 Apr 2016.


  13. Chandler RK, Fletcher BW, Volkow ND. Treating drug abuse and addiction in the criminal justice system: improving public health and safety. J Am Med Assoc. 2009;301(2):183. doi:10.1001/jama.2008.976.


  14. Steadman HJ. Boundary spanners: a key component for the effective interactions of the justice and mental health systems. Law Hum Behav. 1992;16(1):75–87.


  15. Belenko S, Sprott JB, Peterson C. Drug and alcohol involvement among minority and female juvenile offenders: treatment and policy issues. Crim Justice Policy Rev. 2004;15(1):3–36.


  16. De Leon G, Jainchill N. Recovery-oriented integrated system for juvenile justice clients. In: Jainchill N, editor. Understanding and treating adolescent substance use and disorders: Assessment, treatment, juvenile justice responses (Chapter 17). Kingston, NJ: Civic Research Institute; 2012.


  17. Young DW, Dembo R, Henderson CE. A national survey of substance abuse treatment for juvenile offenders. J Subst Abuse Treat. 2007;32(3):255–66.


  18. Belenko S, Dembo R. Treating adolescent substance abuse problems in the juvenile drug court. Int J Law Psychiatry. 2003;26(1):87–110.


  19. Henderson CE, Taxman FS. Competing values among criminal justice administrators: the importance of substance abuse treatment. Drug Alcohol Depend. 2009;103 Suppl 1:S7–S16.


  20. Roman JK, Butts JA, Roman CG. Evaluating systems change in a juvenile justice reform initiative. Child Youth Serv Rev. 2011;33(Supplement 1):S41–53. Accessed 11 Apr 2016.


  21. Taxman FS, Henderson CE, Belenko S. Organizational context, systems change, and adopting treatment delivery systems in the criminal justice system. Drug Alcohol Depend. 2009;103 Suppl 1:S1–6.


  22. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26. doi:10.1097/MLR.0b013e3182408812.


  23. Palinkas L, Aarons GA, Horwitz SM, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Adm Policy Ment Health Ment Health Serv Res. 2011;38(1):44–53. doi:10.1007/s10488-010-0314-z.


  24. Gardner EM, McLees MP, Steiner JF, del Rio C, Burman WJ. The spectrum of engagement in HIV care and its relevance to test-and-treat strategies for prevention of HIV infection. Clin Infect Dis. 2011;52(6):793–800.


  25. Mugavero MJ, Amico KR, Horn T, Thompson MA. The state of engagement in HIV care in the United States: from cascade to continuum to control. Clin Infect Dis. 2013;57(8):1164–71.


  26. White House Office of National AIDS Policy. National HIV/AIDS strategy for the United States. Washington, DC: White House Office of National AIDS Policy; 2010. Accessed 11 Apr 2016.


  27. Binard J, Pritchard M, editors. Model policies for juvenile justice and substance abuse treatment: a report by reclaiming futures. Princeton, NJ: Robert Wood Johnson Foundation, NCJ 224679; 2008.

  28. Office of Juvenile Justice and Delinquency Prevention. Screening and assessing mental health and substance use disorders among youth in the Juvenile Justice System: a resource guide for practitioners. Washington, DC: U.S. Department of Justice, Office of Justice Programs, Office of Juvenile Justice and Delinquency Prevention; 2004. Accessed 11 Apr 2016.


  29. Dennis ML, White MK, Titus JC, Unsicker JI. Global Appraisal of Individual Needs (GAIN): administration guide for the GAIN and related measures (Version 5.4). Bloomington, IL: Chestnut Health Systems; 2008. Accessed 11 Apr 2016.


  30. National Institute on Drug Abuse. Chart of evidence-based screening tools for adults and adolescents. 2015. Accessed 11 Apr 2016.


  31. National Institute on Drug Abuse. Principles of adolescent substance use disorder treatment: a research-based guide (NIH Publication No. 14-7953). Bethesda, MD: National Institute on Drug Abuse; 2014. Accessed 11 Apr 2016.


  32. Wasserman GA, McReynolds LS, Whited AL, Keating JM, Musabegovic H, Huo Y. Juvenile probation officers' mental health decision making. Adm Policy Ment Health Ment Health Serv Res. 2008;35(5):410–22.


  33. Wasserman GA, McReynolds LS, Musabegovic H, Whited AL, Keating JM, Huo Y. Evaluating Project Connect: improving juvenile probationers’ mental health and substance use service access. Adm Policy Ment Health Ment Health Serv Res. 2009;36(6):393–405.


  34. Knudsen HK. Adolescent-only substance abuse treatment: availability and adoption of components of quality. J Subst Abuse Treat. 2009;36(2):195–204.


  35. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health Ment Health Serv Res. 2011;38(1):4–23.


  36. Shortell SM. Increasing value: a research agenda for addressing the managerial and organizational challenges facing health care delivery in the United States. Med Care Res Rev. 2004;61(3 Suppl):12S–30S.


  37. Aarons GA, Green AE, Palinkas LA, Self-Brown SR, Whitaker DJ, Lutzker JR et al. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implementation Sci. 2012;7(32). Accessed 11 Apr 2016.

  38. Aarons GA, Fettes DL, Hurlburt MS, Palinkas LA, Gunderson L, Willging CE, et al. Collaboration, negotiation, and coalescence for interagency-collaborative teams to scale-up evidence-based practice. J Clin Child Adolesc Psychol. 2014;43(6):915–28.


  39. Aarons GA, Wells RS, Zagursky K, Fettes DL, Palinkas LA. Implementing evidence-based practice in community mental health agencies: a multiple stakeholder analysis. Am J Public Health. 2009;99(11):2087–95.


  40. Edwards A, Lutzker JR. Iterations of the SafeCare model: an evidence-based child maltreatment prevention program. Behav Modif. 2008;32(5):736–56.


  41. Johnson P, Raterink G. Implementation of a diabetes clinic-in-a-clinic project in a family practice setting: using the plan, do, study, act model. J Clin Nurs. 2006;18(14):2096–103.


  42. Taylor MJ, McNicholas C, Nicolay C, Darzi A, Bell D, Reed JE. Systematic review of the application of the plan-do-study-act method to improve quality in healthcare. BMJ Qual Safety. 2013. doi:10.1136/bmjqs-2013-001862. Accessed 11 Apr 2016.

  43. Wilkinson A, Lynn J. A common sense approach to improving advance care planning using the "Plan-Do-Study-Act" cycle. BMJ Supportive and Palliative Care. 2011;1(85). doi: 10.1136/bmjspcare-2011-000053.64.

  44. Chamberlain P, Brown CH, Saldana L. Observational measure of implementation progress in community based settings: the Stages of Implementation Completion (SIC). Implementation Sci. 2011;6:116. doi:10.1186/1748-5908-6-116.


  45. Chamberlain P, Roberts R, Jones H, Marsenich L, Sosna T, Price JM. Three collaborative models for scaling up evidence-based practices. Adm Policy Ment Health Ment Health Serv Res. 2012;39(4):278–90. doi:10.1007/s10488-011-0349-9.


  46. Glisson C, Schoenwald SK, Hemmelgarn A, Green P, Dukes D, Armstrong KS, et al. Randomized trial of MST and ARC in a two-level evidence based treatment implementation strategy. J Consult Clin Psychol. 2010;78(4):537–50.


  47. Atkins MS, Frazier SL, Leathers SJ, Graczyk PA, Talbott E, Jakobsons L, et al. Teacher key opinion leaders and mental health consultation in low-income urban schools. J Consult Clin Psychol. 2008;76(5):905–8. doi:10.1037/a0013036.


  48. Brooks JM, Titler MG, Ardery G, Herr K. The effect of evidence-based acute pain management practices on inpatient costs. Health Serv Res. 2008;44(1):245–63. doi:10.1111/j.1475-6773.2008.00912.


  49. Gustafson DH, Quanbeck AR, Robinson JM, Ford II JH, Pulvermacher A, French MT, et al. Which elements of improvement collaboratives are most effective? A cluster-randomized trial. Addiction. 2013;108(6):1145–57.


  50. Barwick MA, Peters J, Boydell K. Getting to uptake: do communities of practice support the implementation of evidence-based practice? J Can Acad Child Adolesc Psychiatry. 2009;18(1):16–29.


  51. Kloek CJ, Bossen D, Veenhof C, van Dongen JM, Dekker J, de Bakker DH. Effectiveness and cost-effectiveness of a blended exercise intervention for patients with hip and/or knee osteoarthritis: study protocol of a randomized controlled trial. BMC Musculoskelet Disord. 2014;15(269). doi:10.1186/1471-2474-15-269. Accessed 11 Apr 2016.

  52. Brown J. Training needs assessment: a must for developing an effective training. Public Pers Manage. 2002;31(4):569–78.


  53. Lyon AR, Stirman SW, Kerns SEU, Bruns EJ. Developing the mental health workforce: review and application of training approaches from multiple disciplines. Adm Policy Ment Health Ment Health Serv Res. 2011;38(4):238–53. doi:10.1007/s10488-010-0331-y.


  54. Belenko S, Visher CA, Copenhaver M, Hiller M, Melnick G, O'Connell DB et al. A cluster randomized trial of utilizing a local change team approach to improve the delivery of HIV services in correctional settings: study protocol. Health Justice. 2013;1(8). Accessed 11 Apr 2016.

  55. Hoffman KA, Green CA, Ford II JH, Wisdom JP, Gustafson DH, McCarty D. Improving quality of care in substance abuse treatment using five key process improvement principles. J Behav Health Serv Res. 2012;39(3):234–44.


  56. Orwin RG, Edwards JM, Buchanan RM, Flewelling RL, Landy AL. Data-driven decision making in the prevention of substance-related harm: results from the Strategic Prevention Framework State Incentive Grant Program. Contemp Drug Probl. 2012;39(1):73–106. doi:10.1177/009145091203900105.


  57. Young D, Moline K, Farrell J, Bierie J. Best implementation practices: disseminating new assessment technologies in a juvenile justice agency. Crime Delinquency. 2006;52(1):135–58.


  58. Childs K, Frick P. Innovation brief: university partnerships as a strategy for promoting data-driven decision making in juvenile justice. 2013. Accessed 11 Apr 2016.


  59. Marsh JA, Pane JF, Hamilton LS. Making sense of data-driven decision making in education: evidence from recent RAND Research (RAND Corporation Occasional Paper Series). Santa Monica, CA: RAND Corporation; 2006. Accessed 11 Apr 2016.


  60. Chayt B. Juvenile justice and mental health: a collaborative approach. 2012. Accessed 11 Apr 2016.


  61. Dwyer AM, Neusteter SR, Lachman P. Data-driven decisionmaking for strategic justice reinvestment. 2012. Accessed 11 Apr 2016.

  62. DART: Data Analytic Recidivism Tool. Accessed 11 Apr 2016.

  63. Brown CA, Lilford RJ. The stepped wedge trial design: a systematic review. BMC Med Res Methodol. 2006;6(54). doi:10.1186/1471-2288-6-54.

  64. Landsverk J, Brown HC, Chamberlain P, Palinkas L, Ogihara M, Czaja S. Design and analysis in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: Translating science to practice. Oxford, NY: Oxford University Press; 2012. p. 225–60.


  65. Dennis ML, Lennox RD, Foss MA. Practical power analysis for substance abuse health services research. In: Bryant KJ, Windle M, West SG, editors. The science of prevention: Methodological advances from alcohol and substance abuse research. Washington, DC: American Psychological Association; 1997. p. 367–405.


  66. Flynn PM, Knight DK, Godley MD, Knudsen HK, guest editors. Organizational dynamics within substance abuse treatment [special issue]. J Subst Abuse Treat. 2012;42:109–230.

  67. Proctor E, Silmer H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38(2):65–76. Accessed 12 Apr 2016.


  68. Muthén LK, Muthén BO. Mplus user’s guide. 5th ed. Los Angeles, CA: Muthén & Muthén; 1998-2012.

  69. Dennis ML, Ingram PW, Burks ME, Rachal JV. Effectiveness of streamlined admissions to methadone treatment: a simplified time-series analysis. J Psychoactive Drugs. 1994;26(2):207–16.


  70. Tryon WW. A simplified time-series analysis for evaluating treatment interventions. J Appl Behav Anal. 1982;15(3):423–9.


  71. Raudenbush SW, Bryk AS, Congdon RT. HLM 6: hierarchical linear and nonlinear modeling [computer software]. Skokie, IL: Scientific Software International, Inc.; 2004.


  72. Cucciare MA, Weingardt KR, Villafranca S. Using blended learning to implement evidence-based psychotherapies. Clin Psychol Sci Pract. 2008;15(4):299–307. Accessed 12 Apr 2016.

  73. Madison D. Process mapping, process improvement and process management: a practical guide to enhancing work and information flow. Chico, CA: Paton Press; 2005.


  74. Wachter A. Juvenile Justice Geography, Policy, Practice & Statistics StateScan: mental health screening in juvenile justice services. Pittsburgh, PA: National Center for Juvenile Justice. 2015. Accessed 12 Apr 2016.

  75. Dreischulte T, Grant A, Donnan P, McCowan C, Davey P, Petrie D, et al. A cluster randomised stepped wedge trial to evaluate the effectiveness of a multifaceted information technology-based intervention in reducing high-risk prescribing of non-steroidal anti-inflammatory drugs and antiplatelets in primary medical care: the DQIP study protocol. Implementation Sci. 2012;7:24. doi:10.1186/1748-5908-7-24.


  76. Council of State Governments Justice Center. Reducing recidivism and improving other outcomes for young adults in the juvenile and adult criminal justice systems. New York: The Council of State Governments Justice Center; 2015. Accessed 12 Apr 2016.


  77. Venkatesh V, Morris MG, Davis GB, Davis FD. User acceptance of information technology: toward a unified view. MIS Q. 2003;27(3):425–78.


  78. Welsh WN, Knudsen HK, Knight K, Ducharme L, Pankow J, Urbine T, et al. Effects of an organizational linkage intervention on inter-organizational service coordination between probation/parole agencies and community treatment providers. Adm Policy Ment Health Ment Health Serv Res. 2015. doi:10.1007/s10488-014-0623-8.


  79. Patterson MG, West WA, Shackleton VJ, Dawson JF, Lawthorn R, Maitlis S, et al. Validating the organizational climate measure: links to managerial practices, productivity and innovation. J Organ Behav. 2005;26(4):379–408. doi:10.1002/job.312.


  80. Rhoades L, Eisenberger R, Armeli S. Affective commitment to the organization: the contribution of perceived organizational support. J Appl Psychol. 2001;86(5):825–36.


  81. Broome KM, Knight DK, Edwards JR, Flynn PM. Leadership, burnout, and job satisfaction in outpatient drug-free treatment programs. J Subst Abuse Treat. 2009;37:160–70.




Acknowledgements

The authors would like to thank the following members of the JJ-TRIALS Cooperative for their assistance and participation in protocol development activities: Gene Brody, Margaret Cawood, Redonna Chandler, Richard Dembo, Patti Donohue, Lori Ducharme, Kelly Hammersley, Veronica Koontz, James Maccarone, Chris Scott, Faye Taxman, Greg Aarons, Connie Baird-Thomas, Diana Bowser, C. Hendricks Brown, Kate Elkington, Barbara Estrada, Leah Hamilton, Phil Harris, Matthew Hiller, Aaron Hogue, Ingrid Johnson, Kathryn McCollister, Cori Miles, Kate Moritz, Jon Morgenstern, Alexis Nager, Elise Ozbardakci, Jennifer Pankow, Jessica Sales, Michele Staton-Tindall, Anne Spaulding, Doris Weiland, Wayne Welsh, Jennifer Wood, and Marsha Zibalese-Crawford.

This study was funded under the JJ-TRIALS cooperative agreement, funded at the National Institute on Drug Abuse (NIDA) by the National Institutes of Health (NIH). The authors gratefully acknowledge the collaborative contributions of NIDA and support from the following grant awards: Chestnut Health Systems (U01DA036221); Columbia University (U01DA036226); Emory University (U01DA036233); Mississippi State University (U01DA036176); Temple University (U01DA036225); Texas Christian University (U01DA036224); and University of Kentucky (U01DA036158). NIDA Science Officer on this project is Tisha Wiley. The contents of this publication are solely the responsibility of the authors and do not necessarily represent the official views of the NIDA, NIH, or the participating universities or JJ systems.

Author information




Corresponding author

Correspondence to Danica K. Knight.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

SB and DK co-chaired the Study Design Workgroup. All authors participated in the conceptual design of the study and contributed one to two sections of text. DK drafted the manuscript. NA, TW, SB, JB, and GW reviewed and revised the manuscript. All authors read and approved the manuscript.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Knight, D.K., Belenko, S., Wiley, T. et al. Juvenile Justice—Translational Research on Interventions for Adolescents in the Legal System (JJ-TRIALS): a cluster randomized trial targeting system-wide improvement in substance use services. Implementation Sci 11, 57 (2016).
