

  • Study protocol
  • Open Access
  • Open Peer Review

A cluster randomized trial to evaluate external support for the implementation of positive behavioral interventions and supports by school personnel

Implementation Science 2014, 9:12

  • Received: 3 August 2013
  • Accepted: 3 January 2014



Background

Urban schools lag behind non-urban schools in attending to the behavioral health needs of their students. This is especially evident in the level of use of evidence-based interventions with school children. Increased use of evidence-based interventions in urban schools would help reduce mental health services disparities in low-income communities. School-wide positive behavioral interventions and supports (SWPBIS) is a service delivery framework that can be used to deliver universal preventive interventions and evidence-based behavioral health treatments, such as group cognitive behavioral therapy. In this article, we describe our ongoing research on creating internal capacity for program implementation. We also examine the cost-effectiveness and resulting school climate when two different levels of external support are provided to personnel as they implement a two-tier SWPBIS program.


Methods/Design

The study follows six K–8 schools in the School District of Philadelphia randomly assigned to consultation support or consultation-plus-coaching support. Participants are approximately 48 leadership team members, 180 school staff, and 3,900 students in Tier 1, and 12 counselors and 306 child participants in Tier 2. Children who meet inclusion criteria for Tier 2 will participate in group cognitive behavioral therapy for externalizing or anxiety disorders. The study has three phases: baseline/training, implementation, and sustainability. We will measure implementation outcomes, service outcomes, child outcomes, and cost.


Discussion

Findings from this study will provide evidence on the appropriateness of school-wide prevention and treatment service delivery models for addressing services disparities in schools. The effectiveness and cost-effectiveness analyses of the two levels of training and consultation should help urban school districts and policymakers plan and deploy cost-effective strategies for implementing evidence-based interventions for some of the most common behavioral health problems in school children.

Trial registration: ClinicalTrials.gov identifier NCT01941069


Keywords

  • Implementation
  • Sustainability
  • Fidelity
  • Mental health services disparities
  • Behavioral health
  • Urban schools
  • School-wide positive behavioral interventions and supports


Services disparities

Several epidemiologic studies have shown that only one in five children with emotional and behavioral disorders receives mental health services [1–3]. Low-income and ethnically diverse children lag well behind their middle-class, Caucasian counterparts in rates of service utilization [3, 4]. Access barriers (such as lack of specialized services in low-income communities, high cost, and poor service quality) and stigma have been found to affect service utilization by ethnically diverse children [5–9]. Service delivery strategies that address access barriers and minimize the effects of stigma are likely to reduce service disparities [10].

Schools’ role in reducing mental health disparities

Services provided in school settings are ideal for identifying and supporting children at risk for mental disorders [11] and for advancing the goal of reducing and eliminating services disparities, because they are available to all children. Services are offered in convenient, close-to-home locations, are often provided at little or no cost to families, and can be delivered while the child is attending school [12–14]. School-based services reduce the stigma associated with seeking mental health services [15] and also afford the opportunity to serve children who are at risk for mental disorders [16–18]. As such, schools can play a significant role in addressing mental health services disparities in low-income urban communities [19, 20].

Externalizing and anxiety disorders

Aggressive, defiant, disruptive, and antisocial behaviors, such as those seen in children with, or at risk for, externalizing behavior disorders (i.e., Oppositional Defiant Disorder (ODD) and Conduct Disorder (CD)), have a lifetime prevalence of approximately 10% [21, 22] and are highly prevalent in school settings [23–25]. These disorders have been found to lead to academic underachievement, grade retention, school suspension and expulsion, and later problems with the law [26–28]. Early onset of aggressive and antisocial behavior in elementary school has been found to be related to a persistent and chronic trajectory of antisocial behavior into middle childhood and adulthood [29, 30].

Anxiety disorders (i.e., Generalized Anxiety Disorder (GAD), Social Phobia (SP), and Separation Anxiety Disorder (SAD)) affect up to 13% of the child population [31, 32]. Anxious children are much more likely than non-anxious children to have problems with social and peer relations [33, 34], academic achievement [35], school refusal [36, 37], and future socio-emotional adjustment [38, 39]. Children with GAD, SAD, and SP share the same underlying construct of anxiety [40, 41], evidence a very high comorbidity rate, and have been reported to respond similarly to treatment regardless of which disorder is principal [42, 43]. Anxiety disorders are highly prevalent among inner-city school children [44–46]. Urban children are especially at risk for anxiety because of the deleterious effects of living in unsafe and deprived neighborhoods [47].

School-wide positive behavioral interventions and supports (SWPBIS)

SWPBIS is an integrated service delivery framework that targets changes in school climate by creating improved systems and procedures [48, 49]. The practices and systems of SWPBIS are organized along a three-tiered continuum of prevention with a behavioral theoretical orientation and the empirical foundation of applied behavior analysis (ABA). Primary prevention strategies focus on preventing new cases of problem behaviors by using school-wide (Tier 1, universal) strategies, such as school-wide discipline, classroom behavior management, and effective instructional practices. Emphasis is placed on teaching all students key behavioral expectations and routines and on creating a proactive means of communication for students and school staff. Randomized clinical trials have shown that schools employing SWPBIS demonstrate better school climate (e.g., fewer student disciplinary problems), higher levels of perceived school safety as reported by students and staff, greater reductions in the number of office discipline referrals (ODRs), and improved reading scores compared to schools that did not use SWPBIS and/or did not implement it with fidelity [48, 50–54]. Some SWPBIS programs also offer targeted group-based support for at-risk children (Tier 2, secondary prevention) and individualized support for more severe cases (Tier 3, tertiary prevention). In the present study, children who exhibit need for more targeted support will be offered participation in group cognitive behavioral therapy (GCBT) interventions for externalizing behavior problems or anxiety.

GCBT for externalizing and anxiety problems

The Coping Power Program (CPP) [55], a GCBT intervention, has been found to be effective at reducing aggressive behavior, covert delinquent behavior, and substance abuse among aggressive boys [56]. Studies using a briefer version of CPP (Anger Coping) also reported significant reductions in aggressive behavior at post-intervention among targeted aggressive boys, compared to untreated aggressive boys and normal controls [57, 58]. Another GCBT intervention, Friends for Life (FRIENDS) [59], has proven effective for the prevention and treatment of GAD, SAD, and SP [60–63]. For example, in a randomized trial with children diagnosed with an anxiety disorder, 76% of children assigned to FRIENDS were diagnosis-free at the end of the 10-week trial compared to 6% of children assigned to a wait-list condition [63].

Use of school personnel for implementation of evidence-based interventions (EBIs)

A number of recent studies have shown that school personnel can be successfully trained in the development and implementation of SWPBIS [48, 50, 64] and of GCBT for externalizing and anxiety disorders [65–68]. Also, SWPBIS has been successfully implemented in a number of inner-city schools around the country [69, 70]. Unfortunately, school personnel and school counselors often lack adequate training in the implementation of SWPBIS and other EBIs. In addition, schools in urban settings often do not have adequate funding to contract mental health professionals to provide child services. The present study directly addresses these important barriers by enhancing school capacity to deploy EBIs.

Theoretical framework

We developed and pilot-tested the program in two K–8 schools that have the same ethnic and socio-economic breakdown as the schools participating in this study. The project was funded by the Centers for Disease Control and Prevention (CDC; RFA-CD-08-001; Elimination of health disparities through translation research; R18). The adaptation and testing of the interventions used in the study were organized around the first three steps of the Public Health Model (i.e., defining the problem; identifying causes/risk factors; developing and testing the intervention) [71–73] (see Figure 1). Specifically, we collected needs assessment data, assessed the symptom profiles and mental health service utilization of typical students in the participating schools, investigated risk and protective factors, conducted a review of EBIs for ethnically diverse children, selected and adapted interventions based on the characteristics of the student population and the specific constraints of under-resourced schools, and pilot-tested SWPBIS and EBIs for externalizing behavior problems and anxiety (Eiraldi et al., Development and implementation of a SWPBIS program with mental health supports in urban schools, submitted) [74]. The present study will extend this work by addressing the fourth step of the Public Health Model (implementing the intervention/measuring effectiveness) [71]. Outcome measures for the present study were selected based on the implementation framework developed by Proctor and colleagues [75, 76], including the assessment of implementation outcomes, service outcomes, and child outcomes (see Figure 1).
Figure 1

Public health model for project development and implementation. Adapted from Kutash, K., Duchnowski, A. J., & Lynn, N. (2006). School-based mental health: An empirical guide for decision-makers. Tampa, FL: University of South Florida, The Louis de la Parte Florida Mental Health Institute, Department of Child & Family Studies, Research and Training Center for Children’s Mental Health.

Study aims

The specific aims of the study are:
  1. To determine whether program content and process fidelity for Tier 1 (for all students) and Tier 2 (for at-risk and high-risk students) differ between schools receiving a higher level of support (Consultation plus Coaching, C + C) and schools receiving a lower level of support (Consultation, C) for program implementers.

  2. To determine whether school climate, office discipline referrals, and participants’ diagnostic status, symptom and impairment severity, coping skills, and academic productivity differ between schools receiving different levels of support (C or C + C).

  3. To assess changes in mental health disparities brought about by Tier 1 (perception of school climate, student suspension rates) and Tier 2 (proportion of students with unmet need), respectively.

  4. To determine the incremental cost-effectiveness of SWPBIS C versus SWPBIS C + C.


Study fit with the funding opportunity

The study is aligned with the funding opportunity PAR-11-104, Reducing health disparities among minority and underserved children (R01), in that it evaluates the comparative effectiveness of two levels of support for school personnel for the implementation of a continuum of mental health prevention strategies for low-income, ethnically diverse children in non-traditional service settings.


Six K–8 schools in the School District of Philadelphia were randomly assigned to either Consultation or Consultation plus Coaching (see Figure 2). A stratified random assignment list of schools to condition was generated using a computerized random assignment program [77]. Schools were stratified into group one if their baseline school climate (CREST) scores were below the median, or into group two if their scores were at or above the median. The units of analysis for Tier 1 are all students in the K–8 schools (650 per school), leadership team members (eight per school), and other school staff (30 per school). The units of analysis for Tier 2 are the children meeting criteria for inclusion in one of the GCBT groups (a total of 367 students will be enrolled, assuming 17% dropout; approximately 51 eligible students per school), the school counselors conducting the GCBT groups (two per school), and the groups themselves (14 per school; see Table 1).
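As an illustration, the stratified assignment procedure described above can be sketched in a few lines of Python. This is a hypothetical sketch, not the randomization program cited in [77]; the school labels and CREST scores are invented, and the alternating start between strata is simply one way to keep the overall allocation balanced at three schools per condition.

```python
import random
from statistics import median

def stratified_assignment(crest_scores, seed=None):
    """Randomly assign schools to 'C' or 'C+C', stratifying on the
    median baseline CREST (school climate) score."""
    rng = random.Random(seed)
    med = median(crest_scores.values())
    low = [s for s, v in crest_scores.items() if v < med]    # stratum 1
    high = [s for s, v in crest_scores.items() if v >= med]  # stratum 2
    assignment = {}
    for i, stratum in enumerate((low, high)):
        rng.shuffle(stratum)
        # Alternate conditions within each stratum; flipping the starting
        # condition across strata balances the overall allocation.
        for j, school in enumerate(stratum):
            assignment[school] = ("C", "C+C")[(i + j) % 2]
    return assignment

# Hypothetical baseline CREST scores for six schools
scores = {"A": 42, "B": 55, "C": 38, "D": 61, "E": 47, "F": 50}
groups = stratified_assignment(scores, seed=1)
```

With six schools, this yields three per condition while preserving the below-median/at-or-above-median stratification.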
Figure 2

Study design and timeline.

Table 1

Participants by Tier and support level

                            Consultation   Consultation plus coaching
  Tier 1
    Leadership team members 24             24
    School staff            90             90
    Students                ~1,950         ~1,950
  Tier 2
    Counselors              6              6
    Children                ~153           ~153
    Intervention groups     42             42

Note: cell values are derived from the per-school counts reported in the text (three schools per condition).
Tier 1 research activities will include 48 teachers, administrators, and parents on the leadership teams; 180 school staff for the assessment of school climate; and approximately 3,900 students. Participants in Tier 2 will be 12 counselors and approximately 306 evaluable children who have, or are at risk for, externalizing or anxiety disorders. School personnel will provide all services. Data pertaining to each school will be treated as nested within school cluster and will be modeled as such in the statistical analyses.

Inclusion criteria

All children will participate in Tier 1. Children in grades 4–8 who meet screening and diagnostic criteria will be included in Tier 2. The screening criterion is a score ≥1 SD above the mean on the Conduct Problems or Emotional Symptoms scales of the Strengths and Difficulties Questionnaire (SDQ), a widely used screening instrument [78], completed by a teacher. Children meeting the screening criterion will undergo a diagnostic evaluation. Children meeting criteria for a primary diagnosis of ODD, CD, GAD, SAD, or SP at the Intermediate or Positive level (based on a structured parent interview [79] and a symptom severity scale [80] completed by an independent rater) will be eligible for participation. Children with comorbid (secondary) conditions will be included.

Exclusion criteria

There are no exclusion criteria for Tier 1. Children will be excluded from Tier 2 if they have a Special Education classification of Intellectual Disability, are not able to communicate in English, do not have (and are not at risk for) a principal diagnosis of ODD, CD, SAD, GAD, or SP, have an accumulated absenteeism record of ≥33% for the current year, or have a history of psychotic or autism spectrum disorders.

Linkage between Tier 1 and Tier 2

The leadership teams and the counselors will manage the SWPBIS program. They will be in charge of identifying, eliminating, or consolidating existing school-wide interventions that could overlap or interfere with the implementation of SWPBIS. Existing behavioral health services for individual students (e.g., wrap-around services) will be maintained because they will not interfere with SWPBIS. The leadership teams in each school will develop all interventions for Tier 1 while receiving C or C + C. A subcommittee within the leadership team, composed of a counselor and an administrator, will be in charge of identifying and referring children for participation in Tier 2. Following methods used in other studies [81], each teacher will be asked to rank the children in their classroom from least to greatest difficulty with externalizing behavior or anxiety and then to refer the top three children with externalizing behavior problems and the top three children with excessive anxiety for possible participation in Tier 2. Decisions as to the appropriateness of referrals to Tier 2 will be based on data gathered via teacher ratings on the SDQ [78] and a web-based software system [82] for collecting and summarizing office discipline referrals.


The SWPBIS project will be implemented in six K–8 public schools in North Philadelphia, PA, a low-income area with the second-highest level of food insecurity in the country [83]. The ethnic makeup of the student population is approximately 60% Latino, 30% African American, and 10% other minority groups.

Training and consultation to leadership teams and counselors

The training model assumes that participating school personnel will need initial training during the first year of the project and briefer retraining in each subsequent year. Members of the research team will conduct three days of formal initial training, and one day of retraining each subsequent year, with members of the leadership team (LT) and counselors. The training will be delivered using procedures employed in dissemination and implementation studies in nontraditional settings (i.e., an initial workshop and subsequent ongoing consultation) [68, 84] and other strategies found to be effective in a recent meta-analysis (i.e., active learning strategies) [85].

External coaches

The support conditions (C, C + C) were modeled after procedures developed by John Lochman and colleagues for the training of school counselors in a dissemination study of the Coping Power Program [68] and after recommendations from a recent meta-analysis [85]. Consultation will be provided by ‘coaches’, who will be advanced trainees (e.g., interns, fellows) in applied psychology. After initial training, which will be the same for C and C + C, the Tier 1 C + C coaches will conduct one-hour biweekly on-site consultations with members of the LT to help them develop and implement universal interventions throughout the school, as delineated in the SWPBIS Team Training Manual [86]. Progress toward the completion of each step in the SWPBIS Team Training Manual will be assessed monthly via the Team Implementation Checklist (TIC) and PBIS Action Plan [87], completed by members of the LT and reviewed by the Tier 1 coach. In addition, the Tier 1 coaches will be available to the head of the LT for scheduled one-on-one performance feedback and for as-needed one-on-one consultation (via telephone or email) about implementation barriers (Individualized Problem Solving). The Tier 1 C coaches will supply the LT with an online guide containing steps for the development and implementation of universal interventions. The Tier 1 C coaches will also hold one-hour biweekly phone conferences with the head of the LT to discuss progress in completing the steps needed for developing and implementing interventions, as measured via the TIC and PBIS Action Plan.

After the initial training related to Tier 2 interventions, which will be the same for C and C + C counselors, the Tier 2 C + C coaches will conduct 14 two-hour weekly on-site consultation sessions with counselors to debrief previous CPP and FRIENDS sessions, observe child group sessions, prepare for upcoming sessions, and problem-solve barriers and challenges in the implementation of the protocols. Counselors will receive video-recorded samples of effective implementation of the main components of the treatments (e.g., exposure for anxiety). Tier 2 C + C coaches will also provide individual performance feedback to the counselors after each session and will be available to counselors as needed (via telephone or email) for one-on-one consultation about implementation barriers (Individualized Problem Solving). Tier 2 C coaches will conduct two-hour weekly consultation sessions using the same group content and format, and they will provide counselors with video-recorded samples of effective implementation of the main components of the treatments. However, they will not observe group sessions or offer individualized feedback or support to counselors.

School recruitment

Upon receiving funding, a presentation of the project was conducted with a preselected group of school administrators from schools in the regional area of interest. Presentations were conducted in person and via GoToMeeting. Those who expressed interest in taking part in the project were asked to respond to a brief request for proposals (RFP). The six schools that applied met the minimal inclusion criteria, so the search was discontinued at this stage. The inclusion criteria were: grade level (any elementary or middle school combination); students’ socio-economic status (≥90% of students eligible for free/subsidized lunch); racial/ethnic diversity (majority of students of minority status); and school-wide initiatives (absence of current mental health prevention initiatives). A more in-depth description of the project was provided during staff meetings, and a vote was taken. At least 80% of school staff in attendance were required to vote affirmatively for the school to join the project; this is a standard cut-off in SWPBIS [88]. Subsequently, the investigators trained members of the leadership teams. As of this writing, the leadership teams are developing interventions for Tier 1 while receiving C or C + C.


We will employ two measures to help determine the readiness of school personnel for implementing Tier 1 and Tier 2 interventions. Members of the leadership teams will complete a 23-item checklist [87] to assess completion of activities related to critical features of the SWPBIS model, including regular meetings of the LT and the use of discipline data (e.g., event, date and time, student, and in- and out-of-school suspensions) for program planning and for monitoring the schools’ progress in implementing their SWPBIS Action Plan [48]. Members of the leadership teams will complete the instrument monthly; the External Coach will also use it to guide members of the LT in developing and implementing Tier 1 interventions in their school.

For Tier 2, following the initial workshop, trainers will assess whether counselors have learned the theoretical underpinnings of CBT and of the treatment of anxiety and depression in children. We will use a questionnaire [89] developed for the assessment of knowledge of EBIs in the treatment of youth psychopathology. Counselors will complete the questionnaire immediately after the initial and final workshops. Counselors scoring below the 80% cutoff will receive further individual training in the areas in which they scored low.

Screening, demographics, service utilization

Counselors will ask teachers to complete the SDQ, a mental health screening questionnaire [78, 90, 91], for children who accumulate ≥3 ODRs or who are referred by a subcommittee of the LT charged with monitoring at-risk children. Children who score ≥1 SD [92] on the Conduct Problems or Emotional Symptoms scales will be invited to participate in a thorough diagnostic assessment.
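The two-stage screening flow above (≥3 ODRs or an LT referral triggers the SDQ; an SDQ scale score ≥1 SD above the mean triggers the diagnostic assessment) can be summarized as a simple filter. This sketch is illustrative only; the function names are ours, and the normative mean/SD values are hypothetical rather than published SDQ norms.

```python
def flag_for_sdq(odr_count, lt_referred):
    """Stage 1: a teacher completes the SDQ if the child has accumulated
    >= 3 office discipline referrals or was referred by the LT subcommittee."""
    return odr_count >= 3 or lt_referred

def invite_to_diagnostic_assessment(scale_score, norm_mean, norm_sd):
    """Stage 2: invite the child to a diagnostic assessment if the Conduct
    Problems or Emotional Symptoms scale score is >= 1 SD above the mean."""
    return scale_score >= norm_mean + norm_sd

# Hypothetical values for illustration only
stage1 = flag_for_sdq(odr_count=3, lt_referred=False)
stage2 = invite_to_diagnostic_assessment(6.0, norm_mean=3.0, norm_sd=2.5)
```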

Implementation outcome measures


Delivering content as originally intended by the treatment developers (content fidelity) leads to better outcomes [93]. Delivering content while actively engaging clients in therapy and doing so with a sense of competence (process fidelity) leads to better adherence and outcomes [94–97]. A group of 20 randomly selected teachers, other school personnel, and students will be interviewed at the end of each project year in order to assess content fidelity for Tier 1 [98]. Data sources are direct observations, review of school policies, and interviews with school staff and students [99]. A cut-off score of 80% on the measure will be used to determine successful implementation of SWPBIS [98].

A checklist reflecting each activity component of the session agenda or outline for CPP and FRIENDS will be used to assess content fidelity for Tier 2. All treatment sessions will be video recorded. Independent coders (ICs) will use a yes/no response scale to indicate whether a counselor covered a particular component. Two ICs will complete the checklist after observing video-recorded sessions. Differences within 5% between the two ICs will be counted as agreement. We will report the average score of the two ICs.

Our process fidelity checklist is based on a 12-item measure developed by John Lochman and colleagues (e.g., ‘Counselor’s tone is warm and positive’). We will use the total score (Overall Process Fidelity) [68]. Twenty percent of sessions one through four, and 30% of sessions five through 14, will be coded by two ICs. Differences within one point between the two ICs will be counted as agreement. We will report the average score of the two ICs.

Process evaluation

We will conduct focus groups with members of the leadership teams and other school personnel at the end of each project year in order to assess perceptions of the appropriateness of C and C + C level of support for the implementation of SWPBIS.

Children’s functioning

We will collect data on children’s academic performance, absenteeism, and office discipline referrals (ODRs) throughout the five years of the project. Absenteeism data will be collected from each school’s daily attendance records. Academic performance will be taken from the state’s mandatory testing in reading and math for all students. ODRs will be collected using a web-based system for monitoring ODRs to assist in intervention planning and evaluation [100]. These data will be used to determine inclusion in Tier 2 and to measure outcomes. We will also record in- and out-of-school suspensions per 100 students per year and the percentage of students receiving suspensions. These indices are reliable and valid for measuring intervention effects [82].

School climate

Perception of school climate will be assessed via a questionnaire [101] completed by 30 school staff per school (equal proportions of teachers, support staff, and administrators) chosen at random in year one. The same school staff in each school will subsequently complete the survey at the end of each project year. The questionnaire has four factors: Skill Instruction, Support for Staff, Staff Respect for Students, and Safety. We will use all four factors in the analyses.

Mental health disparities

We will measure changes in mental health disparities as a result of Tier 1 interventions by comparing scores on school climate and school suspension rates between year one (baseline) and each subsequent year across all schools. Changes in mental health services disparities as a result of Tier 2 interventions will be assessed via a semi-structured service utilization interview [102, 103] administered to the parents of children in Tier 2 across all schools between the baseline and intervention phases.

Child outcome measures for at risk/high risk children

Children’s diagnostic status, mental health symptoms, coping skills, and functional impairment will be assessed at pre- and post-participation in CPP and FRIENDS. All of the parent measures are available in Spanish. Diagnostic status will be assessed via a computerized structured parent interview [79, 104]. This instrument reports three levels of diagnostic severity for each disorder: Positive, Intermediate (at risk), or Negative. Upon completion of the structured interview, the independent evaluator (IE) will assign interference scores to each positive diagnosis in order to determine primary and secondary diagnoses [80], based on a seven-point scale, with lower scores indicating less severity. Changes in coping skills will be assessed via a 34-item self-report questionnaire [105] that measures coping strategies (seeking social support, self-reliance/problem-solving, distancing, internalizing, and externalizing). The total score for this measure will be used in outcome analyses.

Changes in mental health symptoms (e.g., aggression, conduct problems, anxiety, somatic symptoms) will be measured via parent [106], child [107, 108] and teacher [106] multi-axial rating scales. Teachers will also provide changes in children’s grades and rate of academic productivity [109]. After all parent, child and teacher data are collected at pre- and post-, the IEs will complete a measure of functional impairment [110].

Independent evaluator (IE) training and reliability

IEs will be advanced doctoral students in applied psychology who have been trained to a reliable standard on the use of diagnostic instruments by M.K. and J.M. following recommended guidelines through joint interviews, live observation, and discussion during the pre-study start-up meetings. IE reliability will be checked quarterly. Kappa coefficients will be computed, and a minimum of 0.85 will be required for each measure.
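For the quarterly reliability checks, Cohen’s kappa corrects observed rater agreement for agreement expected by chance. A minimal pure-Python computation (with invented rating data, not study data) is:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical ratings on the same cases."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # Observed proportion of exact agreements
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal category frequencies
    p1, p2 = Counter(rater1), Counter(rater2)
    categories = set(rater1) | set(rater2)
    expected = sum((p1[c] / n) * (p2[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Illustrative diagnostic calls from two independent evaluators
r1 = ["ODD", "GAD", "ODD", "SAD", "GAD", "ODD", "SAD", "GAD"]
r2 = ["ODD", "GAD", "ODD", "SAD", "ODD", "ODD", "SAD", "GAD"]
kappa = cohens_kappa(r1, r2)
```

Against the study’s criterion, a computed kappa would need to reach at least 0.85 for each measure.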


Primary prevention (universal interventions)

The use of SWPBIS in this study differs from other studies in that it focuses primarily on student mental health (i.e., emotional and behavioral functioning). The study will have two tiers: Tier 1 for universal interventions and Tier 2 for children who meet diagnostic criteria for, or are at risk for, an externalizing or an anxiety disorder. The focus of primary prevention strategies will be to prevent new cases of mild externalizing or anxiety problems by using school-wide (universal) strategies. A key intervention will be the development of a leadership team, whose primary responsibility will be to develop, implement, and formatively evaluate Tier 1 interventions. In the process, a set of school-wide expectations will be developed, explicitly taught to all students, and promoted through a school-wide motivation system. Specific rules and positive and negative consequences for student behavior in hallways, the cafeteria, and playgrounds will be developed, taught to students, and implemented and monitored by school personnel. We expect that Tier 1 interventions will lead to improved school climate and perceived school safety, which in turn will likely contribute to a decrease in the number of children who develop externalizing behavior problems and anxiety problems.

Secondary prevention (at-risk/high risk students)

The Coping Power Program (CPP) is a cognitive-behavioral, multi-component group intervention for elementary and middle school students at risk for externalizing behavior disorders. In addition to anger management, CPP includes units on goal setting, emotional awareness, relaxation training, social skills training, problem solving, and handling peer pressure. The original CPP offers eight sessions in the first year of intervention and 25 sessions in the second year. We adapted the protocol based on qualitative (focus group) and quantitative (acceptability) data from parents, children, and teachers, reducing CPP from the original 34-session program to a 14-session format so that counselors in busy, under-resourced urban schools could feasibly run CPP groups. Care was taken to maintain the key components (active ingredients) of the treatment, even though less time is dedicated to covering each section. A previous shorter version of CPP (the Anger Coping Program) has been found to be effective with groups of aggressive boys [58, 111].

Friends for Life (FRIENDS) is a group CBT intervention based on a theoretical model in which cognitive, physiological, and behavioral processes are seen to interact in the development and perpetuation of excessive anxiety. The original FRIENDS protocol consists of 10 weekly 60-minute sessions and two booster sessions. In the present study, we have incorporated the booster sessions into the regular protocol and added two more sessions in order to fit the content into the typical class period (40 minutes), for a total of 14 sessions. The protocol has a parent component consisting of two group sessions that focus on strategies to help parents cope with their own anxiety, reinforcement strategies and contingency management for children, and brief training in problem solving and communication skills.

Data analytic plan

The statistical analyses for each of the study aims are described below. In addressing aim one, mean fidelity and its 95% confidence interval for Tier 1 will be calculated using 20 randomly selected participants from each of the six schools during years three, four, and five (120 total per year). The 95% confidence intervals will be presented by condition (C and C + C) for all schools combined and for each school. Content and process fidelity data for Tier 2 will be obtained by scoring 14 video-recorded sessions for each treatment group. For aim two, perceptions of school climate will be assessed beginning in year one in a randomly selected sample of teachers, support staff, and administrators from each participating school, with similar proportions of teachers, support staff, and administrators within each school. The questionnaire [101], a 31-item, 5-point Likert-type checklist with four factors, will be completed at the end of years one, two, three, four, and five; each factor's total score will be analyzed separately. School personnel no longer present in the school during subsequent years will be replaced.
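
As a concrete illustration of the aim-one calculation, the following sketch computes a mean fidelity score and its t-based 95% confidence interval for one school-year sample of 20 participants; the scores are simulated placeholders on an assumed 0-100 scale, not study data.

```python
import numpy as np
from scipy import stats

def mean_ci(scores, confidence=0.95):
    """Sample mean and t-based confidence interval for a set of fidelity scores."""
    scores = np.asarray(scores, dtype=float)
    m = scores.mean()
    half_width = stats.sem(scores) * stats.t.ppf((1 + confidence) / 2, len(scores) - 1)
    return m, (m - half_width, m + half_width)

# Simulated fidelity scores (hypothetical 0-100 scale) for one school's
# 20 randomly selected participants in a given year.
rng = np.random.default_rng(0)
scores = rng.normal(loc=80, scale=10, size=20)
m, (low, high) = mean_ci(scores)
```

The same calculation would be repeated per school and per condition to produce the combined and school-level intervals described above.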

The general analytical approach for testing our hypotheses will be the Laird and Ware mixed-effects model [112] and the generalized estimating equation (GEE) approach [113, 114]. Mixed-effects models and GEE are regression-based statistical approaches for analyzing correlated data collected repeatedly from the same subject. The study design will allow us to examine the between-subjects effect related to the two levels of support, a within-subjects effect corresponding to the five time points, and a type-of-support (C, C + C) by time interaction effect. These analyses will be conducted using SAS Proc Mixed, which fits mixed-effects models, and SAS Proc Genmod, which fits GEE [115]. Both approaches allow estimation of fixed-effect parameters such as time and the time-by-treatment interaction. A significant interaction term indicates that change in school climate and/or reduction in ODRs over time differs significantly between the two levels of support. If the interaction term is not statistically significant, changes over time in the two primary outcomes will be presented by the time effect only, which represents both levels of support. An advantage of these methods is that all information available from each student is used, including variables with missing observations. In the proposed analyses, we will treat students measured within the same class as correlated and nested within classes. Nested analyses will be specified in Proc Mixed and Proc Genmod; as such, the intraclass ('intracluster') correlation (ICC) will be part of the overall variance-covariance matrix.

Changes from pre- to post-implementation in diagnostic status, symptom severity, impairment level, coping skills, and academic productivity across all schools, regardless of level of support, will be analyzed using McNemar tests. For each of the listed primary outcomes, a series of 2X2 tables will be constructed to test changes from pre to post in the pertinent categories. The alpha level will be adjusted to 0.025 based on the Bonferroni criterion for multiple comparisons. We will also use McNemar tests to test the hypothesis that the pre-to-post improvement in diagnostic status, symptom severity, impairment level, coping, and academic productivity obtained during Implementation will be maintained during Sustainability among schools receiving C + C.
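
A minimal sketch of one such pre/post McNemar test, using statsmodels; the 2X2 cell counts are invented for illustration, and the off-diagonal (discordant) cells are what the test evaluates.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Rows index pre status (positive, negative); columns index post status.
# The off-diagonal cells (55 improved, 15 worsened) are the discordant pairs.
table = np.array([[40, 55],
                  [15, 190]])
result = mcnemar(table, exact=False, correction=True)  # chi-square variant
```

With the Bonferroni-adjusted alpha of 0.025, this invented table would count as a significant pre-to-post change.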

We would also like to determine whether pre-to-post improvements in diagnostic status, symptom severity, impairment level, and academic productivity differ by support condition during the Implementation phase. To simplify the statistical analysis, each student will be categorized as either improved or not improved based on change from pre to post. A score of one will be assigned if a student's diagnostic status, for example, changes from positive to intermediate, intermediate to negative, or positive to negative; otherwise a score of zero will be assigned. Similar coding will be used for symptom severity, coping, impairment level, and academic productivity. We will use chi-squared tests to compare the proportion improved (yes/no) for each of the primary outcomes between the C and C + C types of support. To examine changes in the rate of ODRs between baseline and follow-up during each school year, the number of students with at least one ODR will be counted at baseline and at follow-up, and a 95% confidence interval for the change will be constructed for each of the five years.
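
The improved/not-improved comparison between support conditions amounts to a chi-squared test on a 2X2 table, which can be sketched as follows; the counts are hypothetical.

```python
from scipy.stats import chi2_contingency

# Hypothetical improved / not-improved counts by support condition.
#                 improved  not improved
table = [[90, 60],   # C + C
         [70, 80]]   # C
chi2, p, dof, expected = chi2_contingency(table)
```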

In addressing aim three, school climate scores and the proportion of out-of-school suspensions (number of suspensions divided by number of students) served at the school during the previous academic year (i.e., before implementation of Tier 1) will be compared to school climate scores (using two independent-sample t-tests) and school suspensions (using chi-squared tests) for each subsequent project year. For Tier 2, the proportion of children at baseline who are found to have unmet need for mental health services (e.g., children found to have an externalizing or anxiety disorder at the intermediate or clinical level via the DISC-IV who have not received mental health services as measured via the SACA) will be compared to the proportion of children with unmet need for mental health services in subsequent years (using chi-squared tests). In order to obtain a measure of unmet need at baseline, we will interview parents to assess the presence of disorders and service utilization one year prior to the assessment and at the time of the assessment. For each subsequent cohort of children, we will interview parents about presence of disorder and service utilization only at the time of the assessment. The proportion of unmet need in each subsequent year will then be compared to the baseline proportion using chi-squared tests.

Aim four: cost analysis

We will evaluate the effectiveness of SWPBIS with C + C, as compared to SWPBIS with C, in increasing procedural and process fidelity, reducing ODRs, improving school climate, and improving children's grades, diagnostic status, symptom severity, and impairment. An incremental cost-effectiveness ratio [116] will be constructed for each intervention (SWPBIS with C + C for Tier 1, SWPBIS with C for Tier 1, SWPBIS with C + C for Tier 2, SWPBIS with C for Tier 2). The denominator of the cost-effectiveness ratio is the difference in effectiveness between the intervention and control groups (accounting for baseline levels) on designated measures of fidelity and student outcomes (e.g., student grades). The numerator is the difference in mean costs between the intervention and control groups (accounting for baseline levels). All costs associated with the program will be set at $0 at baseline. Costs for interventions will include: initial development of the leadership team; initial training of the coaches (Tier 1) and clinical supervisors (Tier 2); initial training of the leadership team and school counselors; subsequent supervision of coaches and clinical supervisors; conducting groups with at-risk children; and activities to maintain the SWPBIS program, including day-to-day implementation, ongoing training, data collection, and money for student incentives. Within each component, two main types of costs will be calculated: the cost of physical materials used for training and interventions, and costs associated with time spent by trainers and school personnel [117].
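
A minimal sketch of the incremental cost-effectiveness ratio (ICER) computation described above; all dollar amounts and effect values are hypothetical placeholders, with the SET fidelity score used as the example effectiveness measure.

```python
def icer(cost_intervention, cost_control, effect_intervention, effect_control):
    """Incremental cost per one-unit gain in the chosen effectiveness measure."""
    delta_cost = cost_intervention - cost_control
    delta_effect = effect_intervention - effect_control
    if delta_effect == 0:
        raise ValueError("No incremental effect; the ICER is undefined.")
    return delta_cost / delta_effect

# Hypothetical example: C + C costs $12,000 more per school than C and yields
# a 4-point greater gain in SET fidelity score.
ratio = icer(cost_intervention=52_000, cost_control=40_000,
             effect_intervention=88, effect_control=84)
# ratio -> 3000.0 dollars per additional SET point
```

Ordering such ratios from lowest to highest, as described above, identifies the most cost-effective intervention for a given outcome.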

As an example, the cost-effectiveness ratio for fidelity will reveal how much it costs (or saves) to increase fidelity in the study population by one point. Confidence intervals will be constructed to determine whether the difference between two ratios indicates a significant difference in cost-effectiveness for the target population. The ratios can then be ordered from lowest to highest to show which intervention is the most cost-effective for increasing the SET score, giving administrators the information they need to determine whether gains in fidelity are sufficient to justify the marginal costs. In a similar manner, cost-effectiveness ratios can be constructed for other targeted outcomes (e.g., measures of school climate, child symptomatology, impairment, coping skills, symptom severity, and functioning) for comparisons in which the results indicate significant improvement in the intervention group as compared to the control condition.

Examination of focus group results

As indicated, focus groups will be conducted with LTs to identify their perceptions of SWPBIS implementation in their schools. Sessions will be transcribed and analyzed qualitatively, based on grounded theory. Two RAs will read the transcripts independently and record all major themes on index cards. The RAs will present the data to the research team, which will then resolve any discrepancies between the RAs' findings. Next, the RAs will examine the transcripts once again and identify the frequency of comments pertaining to each theme. The frequencies recorded by the two RAs will be compared to check inter-rater agreement, using standard frequency tables to compare the relative frequency of endorsements for each theme. Agreement between the two RAs will also be checked by calculating the kappa coefficient.
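
The kappa check can be sketched directly from its definition (observed agreement corrected for chance agreement); the theme codes assigned to transcript segments below are fabricated for the example.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(ratings_a)
    assert n == len(ratings_b) and n > 0
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    # Chance agreement: product of each rater's marginal category proportions.
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(freq_a) | set(freq_b))
    return (observed - expected) / (1 - expected)

# Fabricated theme codes assigned by the two research assistants.
ra1 = ["buy-in", "time", "buy-in", "resources", "time", "buy-in"]
ra2 = ["buy-in", "time", "resources", "resources", "time", "buy-in"]
kappa = cohens_kappa(ra1, ra2)
# kappa -> 0.75
```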

Sample size justification and statistical power

Sample size estimation was based on the primary outcome, the participants' diagnostic status (aim two). The statistical analysis will be McNemar tests for two correlated proportions (pre/post) between two groups. With a sample size of 300 students, the test achieves 92% power to detect an odds ratio of three using a two-sided McNemar test with a significance level (α) of 0.01. The odds ratio is equivalent to a difference between two paired proportions of 0.25, which occurs, for example, when the proportion of students diagnosed with a Positive anxiety/externalizing disorder at pre who improve to Intermediate at post (cell 1,2 in the 2X2 McNemar table) is 0.425, and the proportion of students without a Positive anxiety/externalizing disorder who become either Intermediate or Positive at post (cell 2,1) is 0.175. The proportion of discordant pairs is therefore 0.60. We adjusted the α level to 0.01 to account for the multiple 2X2 McNemar tests. We need to recruit six schools, grades four to eight, to be able to enroll 300 evaluable students who meet inclusion criteria. We assumed that the observed changes in students' diagnostic status within schools are independent, based on an evaluation of the FRIENDS program in which the school level accounted for less than 5% of the total variance across dependent measures.
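
As a rough cross-check on this calculation, the simulation below estimates the power of an asymptotic McNemar test under the discordant-pair proportions stated above (p12 = 0.425, p21 = 0.175, n = 300, α = 0.01); the exact figure depends on the test variant (exact vs. asymptotic, continuity correction), so it need not match the protocol's 92% figure exactly.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

def simulated_power(n=300, p12=0.425, p21=0.175, alpha=0.01,
                    n_sims=1000, seed=0):
    """Monte Carlo power of an asymptotic McNemar test for paired proportions."""
    rng = np.random.default_rng(seed)
    p_same = 1 - p12 - p21
    rejections = 0
    for _ in range(n_sims):
        # Each of the n pairs is discordant (1->2), discordant (2->1),
        # or concordant; only the discordant counts enter the test.
        d12, d21, _ = rng.multinomial(n, [p12, p21, p_same])
        if mcnemar([[0, d12], [d21, 0]], exact=False).pvalue < alpha:
            rejections += 1
    return rejections / n_sims

power = simulated_power()
```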

Discussion

The study is innovative in a number of ways. This is the first study utilizing SWPBIS as a foundation for addressing mental health disparities in urban schools and one of the first studies embedding EBIs for both externalizing and anxiety disorders within a SWPBIS framework. With regard to implementation science, this is the first study assessing the effectiveness of two randomized levels of support for school personnel implementing EBIs. The study uses rigorous methods for the measurement of intervention content and process fidelity within a two-tier mental health program. We will also conduct a cost analysis and measure the effects of Tier 1 and Tier 2 interventions on academic productivity, grades, attendance, disciplinary referrals, and suspensions, which are rarely assessed in this type of dissemination and implementation study [12].

Challenges and limitations

Implementing SWPBIS with mental health supports in under-resourced urban schools presents a number of challenges. In our pilot study, in which we developed and piloted the interventions, we faced repeated leadership changes at the district level, staff turnover at the school level, and limited parent participation [74]. However, we also found strong and consistent support among second-level administrators (e.g., regional superintendents) at the District level, and committed leadership teams that were able to recruit new members who functioned effectively. With the lessons learned from our pilot study, we are better able to support the schools in successfully implementing SWPBIS under either level of support. We have redoubled our efforts to help schools increase parental collaboration and support, which has led to more parent participation. We have also designed the implementation strategy to fully utilize existing policy and infrastructure (e.g., implementation of District-wide practices to promote students' success; utilizing school counselors and school facilities for Tier 2 interventions) to limit strain on resources so that the proposed program will be sustainable.

A limitation is that we are conducting the study in six schools within a single school district (though it is one of the largest and most diverse school districts in the country), which limits generalizability to schools in other parts of the country. Including a relatively small number of schools also limits our ability to examine the clustering effects of classes and schools. In this study, the unit of randomization is the school but the unit of analysis (aim two) is the student. A fully clustered analytic approach would have been appropriate if a much larger number of schools were included in the study. However, we anticipate that treatment responses among students within a class will be only weakly correlated, so the intraclass correlation coefficient within clusters will be small and its effect on the overall variance between schools minimal. Nevertheless, appropriate cluster analyses will be conducted using nested models; the NESTED procedure of SAS [115] is an example of such an analysis. We expect that our findings will inform implementation research in schools and procedures to be incorporated into larger-scale trials in urban schools in the future.

Conclusion

This study has the potential to show the amount of resources (training and consultation) needed to implement EBIs with high levels of fidelity in under-resourced urban schools. It will also begin to show the type of resources needed to sustain this type of program over the longer term. The cost and cost-effectiveness analyses will enable urban school districts and policy makers to determine the cost of successfully deploying SWPBIS with integrated mental health supports in each additional school and throughout the district, and how much additional support and cost would be needed to improve implementation fidelity and school and child outcomes.

Trial status: identifier NCT01941069. The study was approved by the Committee for the Protection of Human Subjects of The Children's Hospital of Philadelphia and the Devereux Foundation (IRB 12–009477) and by the Office of Research and Evaluation (ORE) Research Review Committee of the School District of Philadelphia (IRB 2012-07-099).


Abbreviations

ABA: Applied behavior analysis
CBT: Cognitive behavioral therapy
CD: Conduct disorder
C: Consultation
C + C: Consultation plus coaching
CPP: Coping Power Program
EBI: Evidence-based intervention
FRIENDS: Friends for Life
GAD: Generalized anxiety disorder
ODD: Oppositional defiant disorder
SAD: Separation anxiety disorder
SP: Social phobia
SWPBIS: School-wide positive behavioral interventions and supports



The study is being funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), 1R01HD073430-01A1 to Eiraldi, R. (PI).

Authors’ Affiliations

Department of Child and Adolescent Psychiatry and Behavioral Sciences, The Children’s Hospital of Philadelphia, 3440 Market St, Philadelphia, PA 19104-3306, USA
Division of Developmental and Behavioral Pediatrics, The Children’s Hospital of Philadelphia, 3550 Market St, Philadelphia, PA 19104-3303, USA
Department of Pediatrics, Perelman School of Medicine, University of Pennsylvania, 3401 Civic Center Blvd, Philadelphia, PA 19104-4319, USA
Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, 3535 Market St, Philadelphia, PA 19104-3371, USA
Devereux Center for Effective Schools, The Devereux Foundation, 2012 Renaissance Blvd, King of Prussia, PA 19406, USA
Department of Educational Psychology, Center for Behavioral Education and Research, University of Connecticut, 249 Glenbrook Road Unit 2064, Storrs, CT 06269-2064, USA

References

  1. Goodman SH: Representativeness of clinical samples of youths with mental disorders: a preliminary population-based study. J Abnorm Psychol. 1997, 106 (1): 3-14.PubMedGoogle Scholar
  2. Burns BJ: Children’s mental health service use across service sectors. Health Aff (Millwood). 1995, 14 (3): 147-59. 10.1377/hlthaff.14.3.147.Google Scholar
  3. Kataoka SH, Zhang L, Wells KB: Unmet need for mental health care among U.S. children: Variation by ethnicity and insurance status. Am J Psychiatry. 2002, 159: 1548-1555. 10.1176/appi.ajp.159.9.1548.PubMedGoogle Scholar
  4. Pumariega AJ, Rogers K, Rothe E: Culturally competent systems of care for children’s mental health: advances and challenges. Community Ment Health J. 2005, 41 (5): 539-55. 10.1007/s10597-005-6360-4.PubMedGoogle Scholar
  5. Eiraldi RB: Service Utilization among ethnic minority children with ADHD: a model of help-seeking behavior. Adm Policy Ment Health. 2006, 33 (5): 607-22. 10.1007/s10488-006-0063-1.PubMedGoogle Scholar
  6. Verhulst FC, van der Ende J: Factors associated with child mental health service use in the community. J Am Acad Child Adolesc Psychiatry. 1997, 36 (7): 901-9. 10.1097/00004583-199707000-00011.PubMedGoogle Scholar
  7. Woodward AM, Dwinell AD, Arons BS: Barriers to mental health care for Hispanic Americans: A literature review and discussion. J Ment Health Adm. 1992, 19: 224-236. 10.1007/BF02518988.PubMedGoogle Scholar
  8. Owens PL: Barriers to children’s mental health services. J Am Acad Child Adolesc Psychiatry. 2002, 41 (6): 731-8. 10.1097/00004583-200206000-00013.PubMedGoogle Scholar
  9. Flisher AJ: Correlates of unmet need for mental health services by children and adolescents. Psychol Med. 1997, 27 (5): 1145-54. 10.1017/S0033291797005412.PubMedGoogle Scholar
  10. Eiraldi R, Mazzuca LB: Factors affecting utilization of mental health services by low-income Latino children: A model of parental help-seeking behavior. Child Poverty in America Today: Health and Medical Care. Edited by: Maume AADJ. 2007, Westport, CT: Praeger, 74-92.Google Scholar
  11. Sink CA, Igelman CN: Anxiety disorders. Educator’s Guide to Mental Health Issues in the Classroom. Edited by: Kline FM, Silver LB. 2004, Baltimore, MD: Paul H. BrookesGoogle Scholar
  12. Rones M, Hoagwood K: School-based mental health services: a research review. Clin Child Fam Psychol Rev. 2000, 3 (4): 223-41. 10.1023/A:1026425104386.PubMedGoogle Scholar
  13. Taras HL: School-based mental health services. Pediatrics. 2004, 113 (6): 1839-45.PubMedGoogle Scholar
  14. McKay MM, McCadam K, Gonzales J: Addressing barriers to mental health services for inner-city children and their caretaker. Community Ment Health J. 1996, 32 (4): 353-361. 10.1007/BF02249453.PubMedGoogle Scholar
  15. Nabors LA, Reynolds MW: Overcoming challenges in outcome evaluations of school mental health programs. J Sch Health. 2000, 70: 206-209. 10.1111/j.1746-1561.2000.tb06474.x.PubMedGoogle Scholar
  16. Weist MD: Challenges and opportunities in expanded school mental health. Clin Psychol Rev. 1999, 19: 131-135. 10.1016/S0272-7358(98)00068-3.PubMedGoogle Scholar
  17. Greenberg MT, Domitrovich C, Bumbarger B: The prevention of mental disorders in school-aged children: Current state of the field. Prevention and Treatment. Prev & Treat. 2001, 4: 1-Google Scholar
  18. Hintze JM: Interventions for Fears and Anxiety Problems. Interventions for Academic and Behavior Problems II: Preventive and Remedial Approaches. Edited by: Shinn MR, Walker HM, Stoner G. 2002, Bethesda, MD: NASP Publications, 939-959.Google Scholar
  19. Cappella E: Enhancing schools’ capacity to support children in poverty: an ecological model of school-based mental health services. Adm Policy Ment Health. 2008, 35 (5): 395-409. 10.1007/s10488-008-0182-y.PubMedPubMed CentralGoogle Scholar
  20. Atkins MS: School-based mental health services for children living in high poverty urban communities. Adm Policy Ment Health. 2006, 33 (2): 146-59. 10.1007/s10488-006-0031-9.PubMedGoogle Scholar
  21. Nock MK: Lifetime prevalence, correlates, and persistence of oppositional defiant disorder: results from the National Comorbidity Survey Replication. J Child Psychol Psychiatry. 2007, 48 (7): 703-13. 10.1111/j.1469-7610.2007.01733.x.PubMedGoogle Scholar
  22. Nock MK: Prevalence, subtypes, and correlates of DSM-IV conduct disorder in the National Comorbidity Survey Replication. Psychol Med. 2006, 36 (5): 699-710. 10.1017/S0033291706007082.PubMedPubMed CentralGoogle Scholar
  23. Yoshikawa H, Knitzer J: Lessons from the field: Head Start mental health strategies to meet changing needs. 1997, New York: National Center for Children in PovertyGoogle Scholar
  24. Reid J: Prevention of conduct disorder before and after school entry: Relating interventions to developmental findings. Dev Psychopathol. 1993, 5: 243-262. 10.1017/S0954579400004375.Google Scholar
  25. Furlong MJ, Morrison GM, Jimerson SR: Externalizing behaviors of aggression and violence and the school context. Handbook of research in emotional and behavioral disorders. Edited by: Rutherford RB Jr, Quinn MM, Mathur SR. 2007, New York: Guilford, 243-261.Google Scholar
  26. Ma L: Academic competence for adolescents who bully and who are bullied: Findings from the 4-H study of positive youth development. J Early Adolesc. 2009, 29: 862-897. 10.1177/0272431609332667.Google Scholar
  27. Frick PJ: Conduct disorders and severe antisocial behavior. 1998, New York: PlenumGoogle Scholar
  28. Miller-Johnson S: Relationship between childhood peer rejection and aggression and adolescent delinquency severity and type among African American youth. J Emotional and Behav Disord. 1999, 7: 137-146. 10.1177/106342669900700302.Google Scholar
  29. Moffitt TE, Caspi A: Childhood predictors differentiate life-course persistent and adolescence-limited pathways among males and females. Dev Psychopathol. 2001, 13: 355-375. 10.1017/S0954579401002097.PubMedGoogle Scholar
  30. Patterson GR, Yoeger K: Developmental models for delinquent behavior. Mental disorder and crime. Edited by: Hodgins S. 1993, Thousand Oaks, CA: SageGoogle Scholar
  31. Canino G: The DSM-IV rates of child and adolescent disorders in Puerto Rico: prevalence, correlates, service use, and the effects of impairment. Arch Gen Psychiatry. 2004, 61 (1): 85-93. 10.1001/archpsyc.61.1.85.PubMedGoogle Scholar
  32. Costello EJ: Prevalence and development of psychiatric disorders in childhood and adolescence. Arch Gen Psychiatry. 2003, 60 (8): 837-44. 10.1001/archpsyc.60.8.837.PubMedGoogle Scholar
  33. Greco L, Morris T: Factors influencing the link between social anxiety and peer acceptance: Contributions of social skills and close friendships during middle childhood. Behav Ther. 2005, 36: 197-205. 10.1016/S0005-7894(05)80068-1.Google Scholar
  34. Strauss CC: The association between social withdrawal and internalizing problems of children. J Abnorm Child Psychol. 1986, 14 (4): 525-35. 10.1007/BF01260521.PubMedGoogle Scholar
  35. Van Ameringen M, Mancini C, Farvolden P: The impact of anxiety disorders on educational achievement. J Anxiety Disord. 2003, 17: 561-571. 10.1016/S0887-6185(02)00228-1.Google Scholar
  36. Kearney CA, Bensaheb A: Assessing Anxiety Disorders in Children and Adolescents. The clinical assessment of children and adolescents: a practitioner’s handbook. Edited by: Smith S, Handler L. 2007, Mahwah, NJ: Lawrence Erlbaum Associates Publishers, 467-483.Google Scholar
  37. Kearney CA, Albano AM: The functional profiles of school refusal behavior diagnostic aspects. Behav Modif. 2004, 28 (1): 147-61. 10.1177/0145445503259263.PubMedGoogle Scholar
  38. Beidel DC: Social phobia and overanxious disorder in school-age children. J Am Acad Child Adolesc Psychiatry. 1991, 30 (4): 545-52. 10.1097/00004583-199107000-00003.PubMedGoogle Scholar
  39. Feehan M, McGee R, Williams SM: Mental health disorders from age 15 to age 18 years. J Am Acad Child Adolesc Psychiatry. 1993, 32 (6): 1118-26. 10.1097/00004583-199311000-00003.PubMedGoogle Scholar
  40. Bell-Dolan D, Brazeal TJ: Separation anxiety disorder, overanxious disorder, and school refusal. Child Adolesc Psychiatr Clin N Am. 1993, 2: 563-580.Google Scholar
  41. Pine DS, Grun J: Child psychopharmacology, Volume 17. Anxiety disorders. Edited by: Walsh TB. 1998, Washington, DC: American Psychiatric PressGoogle Scholar
  42. Gould RA: Cognitive-behavioral and pharmacological treatment for social phobia: A meta-analysis. Clin Psychol: Sci Pract. 1997, 4: 291-306. 10.1111/j.1468-2850.1997.tb00123.x.Google Scholar
  43. Gould RA: Cognitive-behavioral and pharmacological treatment of generalized anxiety disorder: A preliminary meta-analysis. Behav Ther. 1997, 28: 285-305. 10.1016/S0005-7894(97)80048-2.Google Scholar
  44. Pina AA, Silverman WK: Clinical phenomenology, somatic symptoms, and distress in Hispanic/Latino and European American youths with anxiety disorders. J Clin Child Adolesc Psychol. 2004, 33 (2): 227-36. 10.1207/s15374424jccp3302_3.PubMedGoogle Scholar
  45. Costello EJ: The Great Smoky Mountains Study of Youth. Goals, design, methods, and the prevalence of DSM-III-R disorders. Arch Gen Psychiatry. 1996, 53 (12): 1129-36. 10.1001/archpsyc.1996.01830120067012.PubMedGoogle Scholar
  46. Angold A: Psychiatric disorder, impairment, and service use in rural African American and white youth. Arch Gen Psychiatry. 2002, 59 (10): 893-901. 10.1001/archpsyc.59.10.893.PubMedGoogle Scholar
  47. Fitzpatrick KM, Boldizar JP: The prevalence and consequences of exposure to violence among African-American youth. J Am Acad Child Adolesc Psychiatry. 1993, 32 (2): 424-30. 10.1097/00004583-199303000-00026.PubMedGoogle Scholar
  48. Barrett SB, Bradshaw CP, Lewis-Palmer T: Maryland statewide PBIS initiative: systems, evaluation, and next steps. J Posit Behav Interv. 2008, 10 (2): 105-114. 10.1177/1098300707312541.Google Scholar
  49. Lewis TJ, Sugai G: Effective behavior support: a systems approach to proactive schoolwide management. Focus on Except Child. 1999, 31 (6): 1-Google Scholar
  50. Horner RH: A randomized, wait-list controlled effectiveness trial assessing school-wide positive behavior support in elementary schools. J Posit Behav Interv. 2009, 11 (3): 133-144. 10.1177/1098300709332067.Google Scholar
  51. Bradshaw CP: Altering school climate through school-wide Positive Behavioral Interventions and Supports: findings from a group-randomized effectiveness trial. Prev Sci. 2009, 10 (2): 100-15. 10.1007/s11121-008-0114-9.PubMedGoogle Scholar
  52. Bradshaw CP, Waasdorp TE, Leaf PJ: Effects of school-wide positive behavioral interventions and supports on child behavior problems. Pediatrics. 2012, 130 (5): e1136-45. 10.1542/peds.2012-0243.PubMedPubMed CentralGoogle Scholar
  53. Bradshaw CP, Mitchell MM, Leaf PJ: Examining the effects of schoolwide positive behavioral interventions and supports on student outcomes. J Posit Behav Interv. 2010, 12 (3): 133-148. 10.1177/1098300709334798.Google Scholar
  54. Waasdorp TE, Bradshaw CP, Leaf PJ: The impact of schoolwide positive behavioral interventions and supports on bullying and peer rejection: a randomized controlled effectiveness trial. Arch Pediatr Adolesc Med. 2012, 166 (2): 149-56. 10.1001/archpediatrics.2011.755.PubMedGoogle Scholar
  55. Lochman J, Wells K, Lenhart L: Coping Power Child Group Program: Facilitator Guide. 2008, New York: Oxford University PressGoogle Scholar
  56. Lochman JE, Wells KC: The coping power program for preadolescent aggressive boys and their parents: outcome effects at the 1-year follow-up. J Consult Clin Psychol. 2004, 72 (4): 571-8.PubMedGoogle Scholar
  57. Lochman JE: Effects of different treatment lengths in cognitive behavioral interventions with aggressive boys. Child Psychiatry Hum Dev. 1985, 16: 45-56. 10.1007/BF00707769.PubMedGoogle Scholar
  58. Lochman J, Curry J: Effects of social problem solving training and self-instruction training with aggressive boys. J Clin Child Psychol. 1986, 15: 159-164. 10.1207/s15374424jccp1502_8.Google Scholar
  59. Barrett P: Friends for Life: Group leaders’ manual for children. 2008, Sydney, Australia: Pathways Health and Research Centre, AustraliaGoogle Scholar
  60. Briesch AM, Hagermoser Sanetti LM, Briesch JM: Reducing the prevalence of anxiety in children and adolescents: an evaluation of the evidence base for the FRIENDS for life program. Sch Ment Health. 2010, 2: 155-165. 10.1007/s12310-010-9042-5.Google Scholar
  61. Pahl KM, Barrett PM: Interventions for anxiety disorders in children using group cognitive-behavioral therapy with family involvement. Evidence-Based Psychotherapies for Children and Adolescents. Edited by: Weisz JR, Kazdin AE. 2010, New York: The Guilford Press, 61-79.Google Scholar
  62. Barrett P: Cognitive-behavioral treatment of anxiety disorders in children: long-term (6-year) follow-up. J Consult Clin Psychol. 2001, 69 (1): 135-41.PubMedGoogle Scholar
  63. Shortt AL, Barrett PM, Fox TL: Evaluating the FRIENDS program: A cognitive-behavioral group treatment for anxious children and their parents. J Clin Child Psychol. 2001, 30 (4): 525-35. 10.1207/S15374424JCCP3004_09.
  64. Sugai G, Horner RH, Gresham F: Behaviorally effective school environments. Interventions for academic and behavior problems 2: Preventive and remedial approaches. Edited by: Shinn MR, Walker HM, Stoner G. 2002, Bethesda, MD: National Association of School Psychologists, 315-350.
  65. Ginsburg GS, Drake KL: School-based treatment for anxious African-American adolescents: a controlled pilot study. J Am Acad Child Adolesc Psychiatry. 2002, 41 (7): 768-75. 10.1097/00004583-200207000-00007.
  66. Ginsburg GS: Transporting CBT for childhood anxiety disorders into inner-city school-based mental health clinics. Cogn Behav Pract. 2008, 15: 148-158. 10.1016/j.cbpra.2007.07.001.
  67. Stein BD: A mental health intervention for schoolchildren exposed to violence: a randomized controlled trial. JAMA. 2003, 290 (5): 603-11. 10.1001/jama.290.5.603.
  68. Lochman JE: Dissemination of the Coping Power program: importance of intensity of counselor training. J Consult Clin Psychol. 2009, 77 (3): 397-409.
  69. Warren J: Urban applications of school-wide positive behavior support: Critical issues and lessons learned. J Posit Behav Interv. 2003, 5 (2): 80-91. 10.1177/10983007030050020301.
  70. Scott TM: A schoolwide example of positive behavioral support. J Posit Behav Interv. 2001, 3 (2): 88-94. 10.1177/109830070100300205.
  71. Kutash K, Duchnowski AJ, Lynn N: School-based mental health: An empirical guide for decision-makers. 2006, Tampa, FL: University of South Florida, The Louis de la Parte Florida Mental Health Institute, Department of Child & Family Studies, Research and Training Center for Children’s Mental Health.
  72. Strein W, Hoagwood K, Cohn A: School psychology: A public health perspective I. Prevention, populations, and systems change. J Sch Psychol. 2003, 41: 23-38. 10.1016/S0022-4405(02)00142-5.
  73. Weist MD: Fulfilling the promise of school-based mental health: moving toward a Public Mental Health Promotion approach. J Abnorm Child Psychol. 2005, 33 (6): 735-41. 10.1007/s10802-005-7651-5.
  74. Proctor EK: Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009, 36 (1): 24-34. 10.1007/s10488-008-0197-4.
  75. Proctor EK: Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011, 38: 65-76. 10.1007/s10488-010-0319-7.
  76. Elashoff JD: nQuery Advisor® 6.0. 1995-2005, Saugus, MA: Statistical Solutions.
  77. Goodman R: Using the Strengths and Difficulties Questionnaire (SDQ) to screen for child psychiatric disorders in a community sample. Br J Psychiatry. 2000, 177: 534-539. 10.1192/bjp.177.6.534.
  78. Shaffer D: NIMH Diagnostic Interview Schedule for Children Version IV (NIMH DISC-IV): description, differences from previous versions, and reliability of some common diagnoses. J Am Acad Child Adolesc Psychiatry. 2000, 39 (1): 28-38. 10.1097/00004583-200001000-00014.
  79. Guy W: Clinical Global Impression scale (CGI). ECDEU Assessment Manual for Psychopharmacology, Revised. 1976, Rockville, MD: U.S. Department of Health, Education, and Welfare, 217-222.
  80. Walker HM, Severson HH: Systematic Screening for Behavior Disorders (SSBD). 2nd edition. 1992, Longmont, CO: Sopris West.
  81. Irvin LK: Validity of office discipline referral measures as indices of school-wide behavioral status and effects of school-wide behavioral interventions. J Posit Behav Interv. 2004, 6: 131-147. 10.1177/10983007040060030201.
  82. Food Research and Action Center: Food Hardship in America - 2010 Data for the Nation, States, 100 MSAs, and Every Congressional District. 2010, Washington, DC: Food Research and Action Center.
  83. Martin JL: Moving evidence-based practices into everyday clinical care settings: Addressing challenges associated with pathways to treatment, child characteristics, and structure of treatment. Emotional Behav Disord in Youth. 2007, 7: 5-21.
  84. Beidas RS, Kendall PC: Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clin Psychol (New York). 2010, 17 (1): 1-30.
  85. Sugai G: School-wide Positive Behavior Support Team Training Manual. 2005, Eugene, OR: University of Oregon College of Education.
  86. Sugai G: School-wide Positive Behavior Support Team Training Manual. 2005, Eugene, OR: University of Oregon College of Education.
  87. Horner RH: Schoolwide positive behavior support. Individualized supports for students with problem behaviors: Designing positive behavior plans. Edited by: Bambara LM, Kern L. 2005, New York: Guilford, 359-390.
  88. Stumpf RE, Higa-McMillan CK, Chorpita BF: Implementation of evidence-based services for youth: assessing provider knowledge. Behav Modif. 2009, 33 (1): 48-65.
  89. Goodman R, Scott S: Comparing the strengths and difficulties questionnaire and the child behavior checklist: Is small beautiful?. J Abnorm Child Psychol. 1999, 27 (1): 17-24. 10.1023/A:1022658222914.
  90. Goodman R: Psychometric properties of the strengths and difficulties questionnaire. J Am Acad Child Adolesc Psychiatry. 2001, 40 (11): 1337-45. 10.1097/00004583-200111000-00015.
  91. Elliot D, Mihalic S: Issues in disseminating and replicating effective prevention programs. Prev Sci. 2004, 5: 47-53.
  92. Forgatch MS, Patterson GR, DeGarmo DS: Evaluating fidelity: predictive validity for a measure of competent adherence to the Oregon model of parent management training. Behav Ther. 2005, 36 (1): 3-13. 10.1016/S0005-7894(05)80049-8.
  93. Cloitre M: Therapeutic alliance, negative mood regulation, and treatment outcome in child abuse-related posttraumatic stress disorder. J Consult Clin Psychol. 2004, 72 (3): 411-6.
  94. Hogue A: Early therapeutic alliance and treatment outcome in individual and family therapy for adolescent behavior problems. J Consult Clin Psychol. 2006, 74 (1): 121-9.
  95. Huey SJ: Multisystemic therapy effects on attempted suicide by youths presenting psychiatric emergencies. J Am Acad Child Adolesc Psychiatry. 2004, 43 (2): 183-90. 10.1097/00004583-200402000-00014.
  96. Horner RH: The School-Wide Evaluation Tool (SET): A research instrument for assessing school-wide positive behavior support. J Posit Behav Interv. 2004, 6: 3-12. 10.1177/10983007040060010201.
  97. Scott T, Barrett S: Using staff and student time engaged in disciplinary procedures to evaluate the impact of school-wide PBS. J Posit Behav Interv. 2004, 6: 21-27. 10.1177/10983007040060010401.
  98. Ervin RA: Merging research and practice agendas to address reading and behavior school-wide. Sch Psychol Rev. 2006, 35 (2): 198-223.
  99. McCausland SG, Hales LW, Reinhardtsen JM: CREST: Technical Report. 1997, Vancouver, WA: Educational Service District 112.
  100. Horwitz SM: Reliability of the services assessment for children and adolescents. Psychiatr Serv. 2001, 52 (8): 1088-94. 10.1176/ps.
  101. Canino G: Methodological challenges in assessing children’s mental health services utilization. Ment Health Serv Res. 2002, 4 (2): 97-107. 10.1023/A:1015252217154.
  102. Shaffer D: The NIMH Diagnostic Interview Schedule for Children Version 2.3 (DISC-2.3): description, acceptability, prevalence rates, and performance in the MECA Study. Methods for the Epidemiology of Child and Adolescent Mental Disorders Study. J Am Acad Child Adolesc Psychiatry. 1996, 35 (7): 865-77. 10.1097/00004583-199607000-00012.
  103. Causey D, Dubow E: Development of a self-report coping measure for elementary school children. J Clin Child Psychol. 1992, 21 (1): 47-59. 10.1207/s15374424jccp2101_8.
  104. Achenbach T, Rescorla L: Manual for the ASEBA School-Age Forms & Profiles: An Integrated System of Multi-informant Assessment. 2001, Burlington, VT: University of Vermont, Research Center for Children, Youth & Families.
  105. March JS: Manual for the Multidimensional Anxiety Scale for Children, 2nd Edition (MASC 2). 2012, North Tonawanda, NY: Multi-Health Systems, Inc.
  106. Kovacs M: CDI 2: Children’s Depression Inventory, 2nd Edition. 2011, North Tonawanda, NY: Multi-Health Systems, Inc.
  107. DuPaul GJ, Rapport M, Perriello L: Teacher ratings of academic skills: the development of the academic performance rating scale. Sch Psychol Rev. 1991, 20 (2): 284-300.
  108. Shaffer D: A children’s global assessment scale (CGAS). Arch Gen Psychiatry. 1983, 40 (11): 1228-31. 10.1001/archpsyc.1983.01790100074010.
  109. Lochman JE: Cognitive-behavioral intervention with aggressive boys: three-year follow-up and preventive effects. J Consult Clin Psychol. 1992, 60 (3): 426-32.
  110. Laird NM, Ware JH: Random-effects models for longitudinal data. Biometrics. 1982, 38 (4): 963-974. 10.2307/2529876.
  111. Liang KY, Zeger SL: Longitudinal data analysis using generalized linear models. Biometrika. 1986, 73: 13-22. 10.1093/biomet/73.1.13.
  112. Zeger SL, Liang KY: Longitudinal data analysis for discrete and continuous outcomes. Biometrics. 1986, 42 (1): 121-130. 10.2307/2531248.
  113. SAS Institute Inc: SAS/STAT® 9.2 User’s Guide. 2002-2008, Cary, NC: SAS Institute Inc.
  114. Detsky AS: A clinician’s guide to cost-effectiveness analysis. Ann Intern Med. 1990, 113 (2): 147-154. 10.7326/0003-4819-113-2-147.
  115. Blonigen BA: Application of economic analysis to school-wide positive behavioral support (SWPBS) programs. J Posit Behav Interv. 2008, 10 (1): 5-19. 10.1177/1098300707311366.
  116. Glaser BG, Strauss AL: The discovery of grounded theory: Strategies for qualitative research. 1967, Chicago: Aldine.
  117. Barrett PM: Long-term outcomes of an Australian universal prevention trial of anxiety and depression symptoms in children and youth: an evaluation of the FRIENDS program. J Clin Child Adolesc Psychol. 2006, 35 (3): 403-11. 10.1207/s15374424jccp3503_5.


© Eiraldi et al.; licensee BioMed Central Ltd. 2014

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.