
Confirmatory factor analysis of the Evidence-Based Practice Attitudes Scale with school-based behavioral health consultants

Abstract

Background

The Evidence-Based Practice Attitude Scale (EBPAS) is a widely used tool, but it has not been adapted and validated for use in schools, the most common setting where youth access behavioral health services. This study examined the factor structure, psychometric properties, and criterion-related validity of the school-adapted EBPAS in a sample of school-based behavioral health consultants.

Method

A research team composed of experts in the implementation of evidence-based practices in schools, together with the developer of the original measure, adapted the EBPAS for the school setting. The adapted instrument was administered to a representative sample (n = 196) of school-based behavioral health consultants to assess its reliability and structural validity via a series of confirmatory factor analyses.

Results

The original EBPAS factor structure was confirmed, with the final model supporting four first-order factors that load onto a second-order factor capturing general attitudes toward evidence-based practice. Correlations among the subscales indicated both unique and shared variance. Correlations between EBPAS scores and consultant variables demonstrated differential criterion-related validity, with the total score and the Requirements and Openness subscales showing the strongest correlations.

Conclusions

The adapted EBPAS performed well when administered to behavioral health consultants operating in the education sector, supporting the relevance of assessing attitudes in school settings. Potential directions for future research and applications of the EBPAS in schools and other service sectors are discussed.


Research has identified numerous determinants that either enable or obstruct the successful implementation of evidence-based practices (EBP), including outer context, inner context, and innovation-specific factors [1,2,3]. Notwithstanding the influence of these factors, successful implementation rests with the decisions and behaviors of those who are most closely connected to the adoption and delivery of EBP, such as designated providers and embedded consultants within the service setting. Indeed, mounting evidence suggests that individual-level factors play a central role in predicting implementation outcomes [4, 5]. One individual-level factor, attitudes toward evidence-based practice, has garnered significant attention across service sectors as an important determinant that is linked to successful implementation [6,7,8].

Evidence-Based Practice Attitude Scale

According to Crano and Prislin [9], attitudes reflect evaluative judgments based on the integration of specific behavioral beliefs that impact a person’s motivation, ambivalence, and resistance to perform a specific action. When focused specifically on the implementation of EBP, those attitudes reflect a person’s favorable or unfavorable evaluative judgments about the adoption and delivery of EBP. Aarons [6] was the first to design a measure dedicated to capturing attitudes related to the implementation of EBP. Through work in community-based mental health settings, the Evidence-Based Practice Attitude Scale (EBPAS) was developed to include four subscales that capture distinct yet interrelated constructs: (1) willingness to adopt EBPs given their intuitive appeal; (2) willingness to adopt new practices if required; (3) general openness toward new or innovative practices; and (4) perceived divergence of usual practice from academically developed or research-based practices [4]. Since the original study [4], the EBPAS has been used extensively in research across different implementation contexts and providers [10,11,12]. One of the most comprehensive validation studies to date involved administering the EBPAS to over 1000 mental health providers from 100 different community-based organizations across 26 states in the USA [8]. Results supported the scale’s second-order factor structure (i.e., four subscales) and demonstrated adequate reliability of the subscales and total scale.

Gaps in EBPAS research

There is a need for studies that can determine the extent to which existing implementation findings and products, such as the EBPAS, are reliable and valid in novel contexts. Cross-validation in other settings is key to advancing the multi-disciplinary field of implementation science by determining whether specific constructs and instruments are context-dependent or context-independent. Despite the existing research on the EBPAS and its potential to inform both implementation science and practice, psychometric findings regarding its internal factor structure and criterion validity have not been replicated in schools.

The educational sector offers unique opportunities to promote youth behavioral health given that over 70% of youth who receive behavioral health services in the USA do so in schools [13, 14]. Behavioral health services delivered in schools often are neither evidence-based nor delivered with sufficient fidelity, resulting in a significant waste of resources (e.g., funds invested in the research) and a missed opportunity to promote public health outcomes [15,16,17]. Most schools have a diverse set of personnel who can support the delivery of behavioral health services [16]. Among them are school-based behavioral health consultants, who frequently operate as implementation intermediaries tasked with supporting the delivery of EBP across multiple levels of care ranging from prevention to treatment [18]. Because their intermediary role positions them as gatekeepers during behavioral health implementation efforts, the attitudes of these personnel may be critical to EBP implementation success.

The research on the EBPAS has focused exclusively on providers, albeit across multiple contexts [10,11,12], with limited to no research examining whether the construct validity of the measure holds for other implementation stakeholders, such as consultants or intermediaries. Further, it is likely that a consultant’s attitudes toward evidence-based practice would be associated with consultant-relevant variables linked to provider-level implementation outcomes. However, research offers no empirical guidance on whether a consultant’s attitudes predict other variables relevant to implementation, such as a consultant’s embeddedness (i.e., activity, visibility, and collaboration) [19, 20], use of implementation strategies to promote EBP adoption and delivery [21, 22], and self-efficacy (i.e., belief in one’s ability to promote provider behavior change) [23].

Purpose of the present study

The purpose of this paper was to extend the research on the EBPAS by examining the construct validity of an adapted version of the EBPAS through a series of confirmatory factor analyses with a sample of school-based behavioral health consultants. A secondary aim was to examine whether the school-adapted EBPAS predicted three variables related to behavioral health consultants’ EBP implementation activities, including their embeddedness, use of implementation strategies, and self-efficacy.

Method

Sample

The study sample included members of a statewide educational and behavioral health organization on the West Coast of the USA committed to the delivery of EBPs for students who exhibit mental and behavioral health concerns. The majority of organization members were in positions that support the delivery of behavioral health EBPs. Of the survey responses received, 196 participants (89%) completed at least 80% of questions in the section pertaining to consultation and were thus included in analyses. Complete demographic information for participants is shown in Table 1.

Table 1 Demographics of survey respondents (n = 196)

Procedures

This study was reviewed and determined to be exempt by the Human Subjects Institutional Review Board. Approval was obtained from the participating statewide organization’s leadership. Data were collected via an online survey distributed through a series of e-mails to organization members. Prior to constructing and administering the survey, school-based implementation experts adapted the EBPAS items for the educational context in collaboration with the developer of the original measure. Adaptations consisted of changing item wording to ensure construct equivalence for the target respondents (i.e., school-based practitioners) while preserving the integrity of the original items/constructs to ensure appropriateness to the school context [24]. Thus, all items were maintained, with changes made only to item wording, such as replacing the word “supervisor” with “school administrator,” “clinician” with “school personnel,” and “agency” with “school.” One additional item was added to the EBPAS to capture whether the respondent would adopt an EBP if it were required by the school district.

In the fall, members were sent an e-mail recruiting them to participate in an online survey study. The current study was part of a larger project examining school-based behavioral health consultants’ perceptions of the implementation of school-based EBPs and employed best practices in designing a web-based survey (e.g., visual ease, clear instructions, survey reminders) [25]. For this analysis, only items from the EBPAS and criterion-related ratings of implementation strategies, embeddedness, and self-efficacy were included.

Measures

Evidence-Based Practice Attitudes Scale (EBPAS)

The original EBPAS was developed to assess the degree to which providers possess favorable attitudes toward the adoption and delivery of EBPs [10,11,12]. The original scale includes a total of 15 items capturing four subscales: Appeal, Requirements, Openness, and Divergence. The Requirements subscale includes three items, while the other three subscales include four items each. An additional item was included under the Requirements subscale to capture attitudes toward EBP if “required by school district.” The Divergence subscale was reverse-scored so that higher scores corresponded to more favorable attitudes, consistent with the other subscales. Respondents rated each item on a 5-point scale ranging from Not at All to a Very Great Extent. The EBPAS has demonstrated adequate internal consistency reliability as well as convergent and discriminant validity in relation to other scales [8].
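To make the scoring procedure concrete, the sketch below shows one way these scoring rules could be implemented in Python. The item-to-subscale mapping and column names are illustrative placeholders, and the use of subscale means (with the total as the mean of the four subscales) is an assumption; the instrument’s actual item assignments and scoring instructions should be consulted.

```python
import pandas as pd

# Hypothetical item-to-subscale mapping for the school-adapted EBPAS;
# column names are illustrative, not the instrument's actual labels.
SUBSCALES = {
    "requirements": ["req_1", "req_2", "req_3", "req_4"],  # includes the added district item
    "appeal": ["app_1", "app_2", "app_3", "app_4"],
    "openness": ["open_1", "open_2", "open_3", "open_4"],
    "divergence": ["div_1", "div_2", "div_3", "div_4"],
}

def score_ebpas(responses: pd.DataFrame, lo: int = 1, hi: int = 5) -> pd.DataFrame:
    """Reverse-score Divergence, then return subscale means and a total score."""
    df = responses.copy()
    for col in SUBSCALES["divergence"]:
        df[col] = (lo + hi) - df[col]  # e.g., 1 -> 5 and 5 -> 1 on a 1-5 scale
    scores = pd.DataFrame({name: df[items].mean(axis=1)
                           for name, items in SUBSCALES.items()})
    scores["total"] = scores.mean(axis=1)  # assumption: total = mean of subscales
    return scores
```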

Criterion-related variables

Items assessing three criterion-related variables relevant to consultation were included: consultant embeddedness, use of implementation strategies, and consultant self-efficacy.

Consultant embeddedness

Items were adapted from the Expanded School Mental Health Collaboration Instrument [19] to capture consultant embeddedness (i.e., degree of visibility, presence, and collaboration) in a given school. In particular, 13 items from the Outreach and Approach subscale capturing clinician embeddedness were adapted. Scale items are rated on a 4-point scale (Strongly Disagree to Strongly Agree) and summed to create a total score. The Outreach and Approach scale has demonstrated acceptable reliability and validity [19]. In this study, the scale also demonstrated acceptable internal consistency (α = .89).
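As a point of reference for the internal consistency values reported here and elsewhere in the paper, coefficient alpha can be computed directly from an item-response matrix. A minimal sketch, assuming complete responses on a respondents-by-items array:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for an (n_respondents, n_items) matrix of complete responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
```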

Use of implementation strategies

Fifteen items assessed whether respondents used a subset of implementation strategies, selected from an existing compilation [26] for their relevance to the consultant role, as a self-report measure of consultant implementation-oriented behavior (see Additional file 1 for the list of strategies). Respondents answered Yes or No regarding their use of each of the 15 strategies, with the dependent variable (DV) being the total number of strategies used to support provider implementation.

Consultant self-efficacy

Four items drawn from the Generalized Self-Efficacy Scale [27] were adapted and included in the survey to assess consultant self-efficacy, defined here as beliefs in one’s ability to produce desired effects when supporting teachers to adopt and deliver EBPs. Example items included “I am able to increase the fidelity with which a teacher implements the intended intervention as planned” and “I feel confident in ensuring that the intervention is appropriate and fits well with teachers’ classroom environment.” The items were rated on a 5-point scale ranging from Not at All to Very Great Extent. In this study, the scale demonstrated acceptable internal consistency (α = .82).

Data analytic approach

The data analytic procedure involved examining the construct validity of the school-adapted EBPAS via a series of confirmatory factor analyses (CFA) using weighted least squares means and variances (WLSMV) estimation with delta parameterization for the ordered-categorical scale items, as implemented in Mplus [28]. The fit of each model was evaluated across several indices (e.g., the chi-square statistic, comparative fit index [CFI], Tucker-Lewis index [TLI], and root mean square error of approximation [RMSEA]), with CFI and TLI values greater than .95 and RMSEA values less than or equal to .05 indicative of good model fit [29,30,31,32,33]. Items with standardized factor loadings (β) less than .55 were deemed poorly performing and flagged for further examination. The measurement model from the original EBPAS was tested first, followed by subsequent modifications based on model modification indices and theoretical justification. Finally, evidence supporting construct validity was examined via correlational analyses testing associations between EBPAS scores and criterion-related variables.
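For reference, one common formulation of these indices is given below, where $\chi^2_M$ and $df_M$ refer to the hypothesized model, $\chi^2_B$ and $df_B$ to the baseline (independence) model, and $N$ is the sample size (some software uses $N$ in place of $N-1$ in the RMSEA):

$$
\mathrm{RMSEA} = \sqrt{\frac{\max(\chi^2_M - df_M,\,0)}{df_M\,(N-1)}}, \qquad
\mathrm{CFI} = 1 - \frac{\max(\chi^2_M - df_M,\,0)}{\max(\chi^2_B - df_B,\,0)}, \qquad
\mathrm{TLI} = \frac{\chi^2_B/df_B - \chi^2_M/df_M}{\chi^2_B/df_B - 1}.
$$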

Results

Summary statistics

Summary statistics for the EBPAS scale and subscale items, in the form of means, standard deviations, and estimates of subscale internal consistency (coefficient alphas), are presented in Table 2. Descriptive statistics indicated that the Openness subscale had the highest mean and smallest standard deviation, while the Requirements subscale had the lowest mean and the greatest dispersion. Statistics and graphed response distributions for each of the measures and subscales were examined to assess skewness, kurtosis, and normality. Inspection of these data indicated that all subscales had relatively normally distributed data, with slight negative skewness for the Requirements, Openness, and Appeal subscales and slight positive skewness for the Divergence subscale. With regard to reliability, the subscales showed strong internal consistency (i.e., α > .80), with the exception of the Divergence subscale (α = .63).

Table 2 Summary statistics for the four EBPAS subscales

Confirmatory factor analyses

The construct validity of the school-adapted EBPAS was assessed with two separate CFA models. The first model examined the four theorized sub-constructs without a higher-order factor capturing a total attitude score. The second model was a hierarchical CFA with items loading on the four theorized first-order factors that, in turn, loaded on a second-order factor capturing overall attitudes. The hierarchical model fit the data slightly better than the first model. Results of both models are included in Additional file 2, but only the structural model (Fig. 1) and results for the hierarchical CFA are reported here. Fit statistics for the second model were χ2 (df = 100, n = 189) = 240.13, p < .001, CFI = .989, TLI = .987, RMSEA = .086 (90% confidence interval = .072 to .100). All standardized item factor loadings were significant (p < .05; βs > .480) across all subscales. Moreover, the first-order factor loadings onto the second-order factor were all significant (p < .05); three factors had large standardized loadings (βs > .525; Requirements, Appeal, and Openness) and one had a moderate loading (β > .338; Divergence). Separate CFAs were performed for each subscale to examine whether the overall model masked poor fit of the individual subscales. Results from these models indicated adequate fit for each subscale (e.g., CFI > .984, TLI > .952), with all factor loadings significant and above .480.
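For readers who want to reproduce the model structure outside of Mplus, the sketch below shows how the hierarchical specification could be written with the Python semopy package and illustrative placeholder item names. Note that this is an assumption-laden approximation: semopy’s default maximum likelihood estimation treats items as continuous, unlike the WLSMV estimator used in the actual analyses, so fit values would not match exactly.

```python
import pandas as pd
import semopy

# Lavaan-style description of the hierarchical model: four first-order
# factors loading on a general second-order attitudes factor.
# Item names are illustrative placeholders, not the instrument's labels.
MODEL_DESC = """
Requirements =~ req_1 + req_2 + req_3 + req_4
Appeal =~ app_1 + app_2 + app_3 + app_4
Openness =~ open_1 + open_2 + open_3 + open_4
Divergence =~ div_1 + div_2 + div_3 + div_4
Attitudes =~ Requirements + Appeal + Openness + Divergence
"""

def fit_hierarchical_cfa(item_data: pd.DataFrame) -> pd.DataFrame:
    model = semopy.Model(MODEL_DESC)
    model.fit(item_data)             # default ML estimation, not WLSMV
    return semopy.calc_stats(model)  # fit statistics, incl. chi-square, CFI, TLI, RMSEA
```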

Fig. 1 Results of confirmatory factor analysis with first- and second-order factors

A correlation matrix depicting the associations between the EBPAS total score and the four first-order factors is shown in Table 3. All of the interfactor correlations among the subscales and the total score were significant. The strongest correlations were noted between the EBPAS total score and the four subscales. Of the subscales, Appeal had the strongest correlations with the other subscales, while Divergence had the weakest.

Table 3 Interfactor correlations across EBPAS scores

Criterion-related validity

Bivariate correlational analyses between the EBPAS scores and the three consultation-related variables were performed to examine evidence of criterion-related validity (see Table 4). The EBPAS total score had significant, positive correlations with two of the three criterion variables: consultant use of implementation strategies and embeddedness in a given school. Results at the subscale level indicated that the Openness subscale was significantly and positively associated with all three criterion variables, with the strongest association found for reported use of implementation strategies. Openness also was the only EBPAS subscale to significantly predict consultant self-efficacy, indicating that those who were more open to EBP also had higher self-efficacy. The only other subscale found to significantly correlate with the criterion-related variables was Requirements, which had positive correlations with two of the three variables: use of implementation strategies and consultant embeddedness.
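As an illustration, this criterion-related analysis reduces to a Pearson correlation matrix over the score columns. A minimal sketch with hypothetical column names:

```python
import pandas as pd

# Illustrative column names for EBPAS scores and the three criterion variables.
SCORE_COLS = ["ebpas_total", "requirements", "appeal", "openness", "divergence"]
CRITERION_COLS = ["embeddedness", "n_strategies", "self_efficacy"]

def criterion_correlations(df: pd.DataFrame) -> pd.DataFrame:
    """Pearson correlations between EBPAS scores and the criterion variables."""
    corr = df[SCORE_COLS + CRITERION_COLS].corr(method="pearson")
    return corr.loc[SCORE_COLS, CRITERION_COLS].round(2)
```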

Table 4 Correlations between EBPAS subscales and consultation variables

Discussion

The purpose of this study was to adapt and confirm the underlying factor structure and technical adequacy of the EBPAS when administered in the educational sector to school-based behavioral health consultants. Findings from the confirmatory analyses were consistent with the factor structure and psychometric properties found in the original study in a sample of public sector mental health providers [6]. Results supported a model with four first-order factors (Openness, Requirements, Appeal, and Divergence) loading onto a higher-order factor reflecting general evidence-based practice attitudes. Coefficient alphas demonstrated strong internal consistency, with the exception of the Divergence subscale, which, consistent with the original EBPAS validation study [6], fell slightly below the conventional threshold of .70 [34]. Correlations among the subscales indicated both unique and shared variance, with Appeal demonstrating the strongest correlations with all other subscales and the total score. It is possible that an attitudinal category like Appeal may serve to influence other types of attitudes (e.g., openness to adopt and implement), because providers or intermediaries for whom EBPs have no appeal are unlikely to be open to adopting and implementing EBPs. Lastly, correlations between EBPAS scores and consultant-relevant variables provided evidence supporting differential criterion-related validity across subscales, with small to moderate correlations revealed only for the total score and the Openness and Requirements subscales. The following section discusses the implications of the findings for future research examining attitudes toward EBP.

Implications for efforts to measure and address evidence-based practice attitudes

The adapted version of the EBPAS performed well when administered to behavioral health consultants operating in schools, supporting the relevance of assessing attitudes in school settings. In general, educational professionals who function in consultative roles tend to endorse more supportive beliefs than teachers regarding the incorporation of EBPs into routine school-based service delivery [35, 36]. Moreover, the mean scores obtained from this study’s sample were higher across all subscales when compared to previous studies using the EBPAS with providers [6, 8, 37, 38]. Unlike consultants, frontline providers who are responsible for the delivery of an EBP may have different attitudes about taking on new practices because adoption requires them to change their professional routines and behavior, potentially resulting in (a) less EBP appeal, (b) less openness, and (c) more negative reactions to EBP requirements than personnel in consultative roles. Future research should explore attitude alignment among different professional roles (e.g., providers, administrators/supervisors, intermediaries) and whether discrepancies predict implementation outcomes. Moreover, given that attitudes reflecting appeal were most strongly related to the EBPAS total score, the intuitive appeal of an EBP may be a particularly central component of overall attitudes toward evidence-based practice.

In light of the confirmatory evidence, the EBPAS could be applied in the education sector as a measure to examine the impact of efforts to alter educational professionals’ attitudes with the goal of creating greater commitment to undertake EBP implementation among providers and consultants. If employed at the beginning of an EBP adoption process, the EBPAS could help inform efforts to prepare an organization for initial implementation, as favorable attitudes among professionals are a component of organizational readiness for change [39, 40]. Implementation strategies informed by the attitude change literature could be particularly helpful for promoting more favorable attitudes among implementation practitioners [41]. There are efforts underway in the educational sector to develop and test pre-implementation strategies targeting providers’ attitudes among other putative mechanisms of behavior change [35, 42]; however, there are no known efforts targeting attitudes among consultants or other personnel supporting EBP implementation.

The correlational analyses suggested that attitudes were associated with consultant embeddedness (i.e., visibility and connections to others in the service setting) and use of implementation strategies. These findings suggest that attitudes may be associated with consultant behavior, which, in turn, has the potential to impact providers’ EBP implementation [43]. Most consultation models assume that consultants have favorable attitudes. This may not be the case universally, as indicated by the variability among the respondents in this study. If consultant attitudes are unfavorable, they may be less likely to put in the effort required to influence implementation outcomes (e.g., collaborating on EBP implementation and using implementation strategies).

Limitations/directions for future research

Further study will be needed to examine the temporal reliability of the EBPAS and provide a more extensive assessment of validity, as this study examined only internal consistency and a limited set of potential criterion-related variables. Given the variability among educational systems across the globe [44], the generalizability of the current findings and the school-adapted EBPAS beyond US schools is unclear and should be examined. Furthermore, this study did not link EBPAS scores to actual implementation outcomes (such as adoption, fidelity, and reach) or to subsequent behavioral health outcomes. Behavioral health consultants tend to sit at the center of school-based behavioral health implementation efforts, yet they represent only one of several professional roles in a school that might be involved in implementation efforts [16, 45]. Data gathered from multiple informants and across multiple roles are likely to yield important insights into the importance of attitudes for EBP implementation effectiveness.

Conclusions

This study expanded extant EBPAS research by adapting and validating the instrument for use in the educational sector with behavioral health consultants. This research extends the external validity of the EBPAS not only to a novel service setting (i.e., schools) but also to a different group of stakeholders involved in the implementation process (i.e., consultants). Despite this study’s confirmatory findings, there remain several avenues for future research exploring applications and adaptations of the measure. Differential criterion-related validity estimates bring into question the Divergence subscale, which may not serve as a valid sub-construct of attitudes when used with consultants. Moreover, research that examines the application of the EBPAS to inform and evaluate the impact of implementation strategies that target professionals’ attitudes as a key mechanism of implementation outcomes should be prioritized.

Abbreviations

CFA: Confirmatory factor analysis

CFI: Comparative fit index

DV: Dependent variable

EBP: Evidence-based practice

EBPAS: Evidence-Based Practice Attitude Scale

RMSEA: Root mean square error of approximation

TLI: Tucker-Lewis index

WLSMV: Weighted least squares means and variances

References

1. Glisson C, Schoenwald SK, Kelleher K, Landsverk J, Hoagwood KE, Mayberg S, Green P; Research Network on Youth Mental Health. Therapist turnover and new program sustainability in mental health clinics as a function of organizational culture, climate, and service structure. Adm Policy Ment Health. 2008;35:124–33.

2. Glisson C, Schoenwald SK. The ARC organizational and community intervention strategy for implementing evidence-based children’s mental health treatments. Ment Health Serv Res. 2005;7:243–59.

3. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82:581–629.

4. Aarons GA. Measuring provider attitudes toward evidence-based practice: consideration of organizational context and individual differences. Child Adolesc Psychiatr Clin N Am. 2005;14:255–71.

5. Patterson DA, Maguin E, Dulmus CN, Nisbet BC. Individual worker-level attitudes toward empirically supported treatments. Res Soc Work Pract. 2013;23:95–9.

6. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004;6:61–74.

7. Aarons GA, McDonald EJ, Sheehan AK, Walrath-Greene CM. Confirmatory factor analysis of the Evidence-Based Practice Attitude Scale (EBPAS) in a geographically diverse sample of community mental health providers. Adm Policy Ment Health. 2007;34:465.

8. Aarons GA, Glisson C, Hoagwood K, Kelleher K, Landsverk J, Cafri G. Psychometric properties and United States norms of the Evidence-Based Practice Attitude Scale (EBPAS). Psychol Assess. 2010;22:356–65.

9. Crano WD, Prislin R. Attitudes and persuasion. Annu Rev Psychol. 2006;57:345–74.

10. Aarons GA. Transformational and transactional leadership: association with attitudes toward evidence-based practice. Psychiatr Serv. 2006;57:1162–9.

11. Borntrager CF, Chorpita BF, Higa-McMillan C, Weisz JR. Provider attitudes toward evidence-based practices: are the concerns with the evidence or with the manuals? Psychiatr Serv. 2009;60:677–81.

12. Rye M, Torres EM, Friborg O, Skre I, Aarons GA. The Evidence-Based Practice Attitude Scale-36 (EBPAS-36): a brief and pragmatic measure of attitudes to evidence-based practice validated in US and Norwegian samples. Implement Sci. 2017;12:44.

13. Burns BJ, Costello EJ, Angold A, Tweed D, Stangl D, Farmer EM, Erkanli A. Children’s mental health service use across service sectors. Health Aff. 1995;14:147–59.

14. Farmer EM, Burns BJ, Phillips SD, Angold A, Costello EJ. Pathways into and through mental health services for children and adolescents. Psychiatr Serv. 2003;54:60–6.

15. Gottfredson DC, Gottfredson GD. Quality of school-based prevention programs: results from a national survey. J Res Crime Delinq. 2002;39:3–35.

16. Owens JS, Lyon AR, Brandt NE, Warner CM, Nadeem E, Spiel C, Wagner M. Implementation science in school mental health: key constructs in a developing research agenda. Sch Ment Health. 2014;6:99–111.

17. Rones M, Hoagwood K. School-based mental health services: a research review. Clin Child Fam Psychol Rev. 2000;3:223–41.

18. Bruns EJ, Duong MT, Lyon AR, Pullmann MD, Cook CR, Cheney D, et al. Fostering SMART partnerships to develop an effective continuum of behavioral health services and supports in schools. Am J Orthopsychiatry. 2016;86:156–70. https://doi.org/10.1037/ort0000083.

19. Mellin EA, Taylor L, Weist MD. The Expanded School Mental Health Collaboration Instrument [school version]: development and initial psychometrics. Sch Ment Health. 2014;6:151–62. https://doi.org/10.1007/s12310-013-9112-6.

20. Provan KG, Milward HB. A preliminary theory of interorganizational network effectiveness: a comparative study of four community mental health systems. Adm Sci Q. 1995;40:1–33.

21. Carlson CI, Tombari ML. Multilevel school consultation training: preliminary program evaluation. Prof Sch Psychol. 1986;1:89.

22. Montano DE, Kasprzyk D. Theory of reasoned action, theory of planned behavior, and the integrated behavioral model. In: Glanz K, Rimer BK, Viswanath K, editors. Health behavior and health education: theory, research, and practice; 2015. p. 67–96.

23. Guiney MC, Zibulsky J. Competent consultation: developing self-efficacy for process and problem aspects of consultation. J Educ Psychol Consult. 2017;27:52–71.

24. Hambleton RK, Merenda P, Spielberger C. Adapting educational and psychological tests for cross-cultural assessment. Hillsdale: Erlbaum; 2005.

25. Rea LM, Parker RA. Designing and conducting survey research. San Francisco: Jossey-Bass; 2014.

26. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Proctor EK, Kirchner JE. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.

27. Schwarzer R, Jerusalem M. Generalized Self-Efficacy Scale. In: Weinman J, Wright S, Johnston M, editors. Measures in health psychology: a user’s portfolio. Causal and control beliefs. Windsor: NFER-Nelson; 1995. p. 35–7.

28. Muthén LK, Muthén BO. Mplus version 8.0 [statistical software]. Los Angeles: Muthén & Muthén; 1998–2017.

29. Bentler PM. Comparative fit indexes in structural models. Psychol Bull. 1990;107:238.

30. Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct Equ Model. 1999;6:1–55.

31. Tucker LR, Lewis C. A reliability coefficient for maximum likelihood factor analysis. Psychometrika. 1973;38:1–10.

32. Browne MW, Cudeck R. Alternative ways of assessing model fit. Sage Focus Ed. 1993;154:136.

33. Tabachnick BG, Fidell LS. Using multivariate statistics. Boston: Allyn & Bacon/Pearson Education; 2007.

34. McMillan JH, Schumacher S. Research in education: a conceptual introduction. 5th ed. New York: Addison Wesley Longman; 2001.

35. Cook CR, Lyon AR, Kubergovic D, Wright DB, Zhang Y. A supportive beliefs intervention to facilitate the implementation of evidence-based practices within a multi-tiered system of supports. Sch Ment Health. 2015;7:49–60.

36. Reinke WM, Stormont M, Herman KC, Puri R, Goel N. Supporting children’s mental health in schools: teacher perceptions of needs, roles, and barriers. Sch Psychol Q. 2011;26:1.

37. Lau A, Barnett M, Stadnick N, Saifan D, Regan J, Wiltsey Stirman S, Brookman-Frazee L. Therapist report of adaptations to delivery of evidence-based practices within a system-driven reform of publicly funded children’s mental health services. J Consult Clin Psychol. 2017;85:664.

38. Pemberton JR, Conners-Burrow NA, Sigel BA, Sievers CM, Stokes LD, Kramer TL. Factors associated with clinician participation in TF-CBT post-workshop training components. Adm Policy Ment Health. 2015;25:1.

39. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23.

40. Weiner BJ. A theory of organizational readiness for change. Implement Sci. 2009;4:67.

41. Ajzen I, Fishbein M. The influence of attitudes on behavior. In: Albarracín D, Johnson BT, Zanna MP, editors. The handbook of attitudes. New Jersey: Erlbaum; 2005. p. 173–221.

42. Hicks TB, Shahidullah JD, Carlson JS, Palejwala MH. Nationally Certified School Psychologists’ use and reported barriers to using evidence-based interventions in schools: the influence of graduate program training and education. Sch Psychol Q. 2014;29:469–87.

43. Solomon BG, Klein SA, Politylo BC. The effect of performance feedback on teachers’ treatment integrity: a meta-analysis of the single-case literature. Sch Psychol Rev. 2012;41:160.

44. Blömeke S, Delaney S. Assessment of teacher knowledge across countries: a review of the state of research. In: Blömeke S, Hsieh FJ, Kaiser G, Schmidt WH, editors. International perspectives on teacher knowledge, beliefs and opportunities to learn. Dordrecht: Springer; 2014. p. 542–85.

45. Forman SG, Olin SS, Hoagwood KE, Crowe M, Saka N. Evidence-based interventions in schools: developers’ views of implementation barriers and facilitators. Sch Ment Health. 2009;1:26–36.


Funding

This publication was supported in part by funding from the University of Washington Center for Child and Family Wellbeing. Additional funding was provided by grants K08MH095939 (Lyon) and K01MH100199 (Locke) awarded by the National Institute of Mental Health, as well as R305A160114 (Lyon and Cook) awarded by the Institute of Education Sciences. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health or the Institute of Education Sciences.

Availability of data and materials

Please contact the lead author for more information.

Author information


Contributions

CRC and ARL are project Co-PIs, collaborated on study design and coordination, and co-lead the research team. CRC, ARL, JL, ME, and GAA developed the revised versions of the measurement instruments. ARL developed the initial manuscript outline. CD cleaned and prepared the data for analysis in collaboration with CRC. CD conducted the majority of the data analyses in consultation with EB and provided input on the write-up of the results. CRC, ARL, JL, EB, CD, ME, GAA, and ML all drafted sections of the manuscript and/or participated in reviewing and approving the final version.

Corresponding author

Correspondence to Clayton R. Cook.

Ethics declarations

Ethics approval and consent to participate

This project was submitted to the last author’s Institutional Review Board (IRB), which determined the project to be exempt from review. Regardless, all participants were clearly informed about the purpose of the project and the planned use of the resulting data.

Consent for publication

Not applicable.

Competing interests

GAA is an Associate Editor of Implementation Science; however, another editor will make all decisions on this paper. All other authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

Consultative-relevant Implementation Strategies. (DOCX 13 kb)

Additional file 2:

Supplemental File_CFA models. (DOCX 118 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


Cite this article

Cook, C.R., Davis, C., Brown, E.C. et al. Confirmatory factor analysis of the Evidence-Based Practice Attitudes Scale with school-based behavioral health consultants. Implementation Sci 13, 116 (2018). https://doi.org/10.1186/s13012-018-0804-z
