Broadening measures of success: results of a behavioral health translational research training program

Abstract

Background

While some research training programs have considered the importance of mentoring in inspiring professionals to engage in translational research, most evaluations emphasize outcomes specific to academic productivity as primary measures of training program success. The impact of such training or mentoring programs on stakeholders and local community organizations engaged in translational research efforts has received little attention. The purpose of this evaluation is to explore outcomes other than traditional academic productivity in a translational research graduate certificate program designed to pair graduate students and behavioral health professionals in collaborative service-learning projects.

Methods

Semi-structured qualitative interviews were conducted with scholars, community mentors, and academic mentors in a translational research training program to identify programmatic impacts. Interviews were transcribed and coded by the research team to identify salient themes related to programmatic outcomes.

Results

Results are framed using the Translational Research Impact Scale (TRIS), which is organized into three overarching domains of potential impact: (1) research-related impacts, (2) translational impacts, and (3) societal impacts. This evaluation demonstrates the program’s impact in all three domains of the TRIS evaluation framework. Graduate certificate participants (scholars) reported that gaining experience in applied behavioral health settings added useful skills and expertise to their present careers and increased their interest in pursuing translational research. Scholars also described benefits resulting from networks gained through participation in the program, including valuable ties between the university and community behavioral health organizations.

Conclusions

This evaluation of the outcomes of a graduate certificate program providing training in translational research highlights the need for more community-oriented and practice-based measures of success. Encouraging practitioner involvement in translational research is vital to translate knowledge into practice and to enable practice-based needs to inform research and policy. A more flexible approach to measuring programmatic success in research training programs can help bridge the knowledge translation gap.

Background

Translational research aims to improve individual and population health outcomes by bringing evidence-based knowledge into clinical and public health practice [1, 2]. Health service researchers and clinicians are increasingly expected to not only produce high-quality research but also facilitate sustained implementation of knowledge in practice [3]. Medical schools and research institutions face increasing pressure to improve researchers’ and clinicians’ competencies in both knowledge production and knowledge translation [4]. Evaluation of the success of training programs is therefore a priority.

Training in translational research has thus far demonstrated positive results. Targeting graduate students for mentoring has been shown to influence their career development trajectories, and long-term mentoring supports mentees in continuing to develop their academic research careers [5]. Many evaluations, however, emphasize academic productivity (in the form of publications and grants) as a primary measure of success [6,7,8]. While the skills of the research team or their record of interdisciplinary collaboration have served as additional outcome measures in a few programs [9, 10], there is an increasing need to establish relevant competencies of trainees for evaluating success at all levels of the knowledge translation process. Although several frameworks for assessing the success of translational research efforts exist, no consensus has been reached on how best to evaluate these programs [11,12,13,14].

In this paper, the authors present an evaluation of a National Institute on Drug Abuse (NIDA)-funded translational research training program, the Institute for Translational Research Education in Adolescent Drug Abuse (“the Institute”), aimed at engaging graduate-level scholars and behavioral health practitioners in substance abuse translational research. The Institute engages new researchers in combining substance abuse research with evidence-based approaches to implementation. Such a training program has potentially substantial public health impacts given the sizable gap between the demand for implementation of high-quality evidence-based programs and the number of researchers focused on this area [15, 16].

Glasgow et al. [17] called for new dissemination and implementation research-practice collaborations that include those community members and local practitioners who will be involved in implementation. This involvement is key to embedding health care advances into existing workflows and achieving the sustainability that often eludes implementation science efforts [17]. Accordingly, Glasgow et al. [17] expanded the definition of translational research to a process of five overlapping and interrelated stages: 1) T0 is the identification of a health/behavioral health issue; 2) T1 is the development of interventions for the health/behavioral health issue; 3) T2 is the identification of the interventions’ impact on the health/behavioral health issue; 4) T3 is the evaluation of the implementation of these interventions in practice; and 5) T4 is the evaluation of the effectiveness of the intervention across several applied settings. The Institute’s focus is primarily on T3 translational research, which includes “investigations designed to increase uptake and implementation of evidence-based recommendations into practice” ([17], p. 1276).

One of the most challenging aspects of implementing evidence into practice is that implementation stakeholders (i.e., providers, organizations) rarely have the skills to interpret and implement new research into practice [18]. In this paper, the authors describe part of the Institute’s success in terms of building capacity for implementing evidence-based practices by training current and future practitioners and researchers to understand translational research and implementation within community settings. We argue that traditional measures of success focusing on academic productivity (grants and publications) fail to capture the breadth of impact of a translational research institute intended to train community practitioners, future health professionals, and researchers in translational research.

Institute overview

The central aim of the Institute is to develop innovative, applied research skills among behavioral health researchers and practitioners through a collaborative, service-learning-focused approach. Institute scholars complete service-learning translational research and implementation science studies in collaboration with academic and community mentors and community behavioral health organization partners as part of the Graduate Certificate in Translational Research in Adolescent Behavioral Health Program. Implementation is not only relevant for academically based researchers but also has pragmatic relevance for those who practice in the field. The intention of including both master’s and doctoral students in the Institute is to expose a broader range of professionals to the implications of translational research and implementation science for working agencies.

Service-learning studies are defined as research carried out as a service to the program. Scholars were expected to approach their projects based on what community mentors described as the greatest need in their organization. The graduate certificate provided additional training for scholars, in most cases outside of the requirements for their degree program. The program was designed as a graduate certificate available both to enrolled graduate students and to community members seeking additional certification and/or knowledge for their current jobs. Additionally, making the program an official graduate certificate of the university made it sustainable beyond grant funding.

Interested applicants applied to the Institute by submitting a personal statement, resume, two letters of recommendation, and official transcripts. Members of the Executive Committee then reviewed the applicants collectively and made final decisions regarding admission to the program. The Institute Executive Committee included the multiple principal investigators for the Institute and the project director who were responsible for the Institute’s design, implementation, and evaluation.

Two groups of mentors (academic and community) contributed to the delivery of the curriculum. The academic mentors provided support in research design and analysis, while the community mentors provided hands-on experience and real-world interpretations. Community agencies interested in participating in the Institute met with members of the Executive Committee to determine fit for the program, including committing to the full length of the scholar project and providing a community mentor at their site. Academic mentors were selected at the university based on areas of expertise applicable to the proposed research projects.

The graduate certificate program (15 credits) requires four continuous semesters, a total of 18 months, to complete. Scholars complete online coursework on adolescent behavioral health, translational research methods, and community-based participatory research, while also completing their in-person service-learning research project. During the first semester, scholars attend the Annual Research and Policy Conference on Child, Adolescent and Young Adult Behavioral Health where they participate in a networking event with community mentors and academic mentors. Scholars then rank preferences for projects, and the Executive Committee reviews scholar preferences and mentor expertise and places the scholars and mentors into their respective research teams. During the last semester of the program, graduating scholars also attend the conference (cited above) to present findings from their applied research studies. For additional information about the Institute, see the website, http://www.usf.edu/cbcs/cfs/itre/index.aspx [19, 20].

As part of the triadic mentoring relationship, scholars have the opportunity to gain experience in improving community readiness, uptake, and successful implementation of new programs and practices. Academic mentors provide experience in methodologies of data collection, research and evaluation methods, community consultation, and problem solving, while community partners provide the practical application possibilities and pitfalls of accomplishing such changes.

Measures for gauging the success of the program beyond academic productivity include real-world implementation and community outcomes. These measures are considered a step toward a more comprehensive understanding of the impact of translational research training on both scholars and community behavioral health organizations.

The purpose of this evaluation is to explore outcomes of the Institute beyond academic productivity in order to highlight areas of impact and suggest potential next steps in the field of translational research. The Translational Research Impact Scale (TRIS), developed by Dembe et al. [14], provides a framework for evaluating long-term and community-oriented outcomes of the Institute. The TRIS distinguishes three domains of potential impact: (1) research-related impacts, (2) translational impacts, and (3) societal impacts. Expansion to translational and societal impacts lends a broader perspective to evaluation, moving beyond academic productivity to consider implementation and impact as key outcomes of translational research training. For a full list of potential impact indicators in each sub-domain of the TRIS, refer to Dembe et al. [14].

Methods

Participants and data collection

Scholars

The Institute engages in ongoing process and outcomes evaluation in its efforts to inform practitioners and future researchers about the field of translational research and implementation science. The current study reports evaluation measures completed to date with the first two cohorts of scholars enrolled in the graduate certificate program. Evaluation is mainly focused on cohort #1 scholars, for whom information about career trajectory and longer-term impact has already been collected. Cohort #1 included one Master of Social Work student, six Master of Public Health students, four employees of community agencies, two doctoral students in public health, and one doctoral student in criminology. Insights from cohort #2 scholars (similar in background), as well as academic and community mentors, are also included.

Cohort #1 scholars (completed Institute participation in May 2014) were interviewed 1 year after completion of the Institute graduate certificate program. Questions focused on long-term outcomes of participation in the Institute, including current career, current interests and engagement in translational research, dissemination and research activities, and other themes of interest. Cohort #2 scholars (completed Institute participation in May 2015) were interviewed about their experiences in the Institute and plans for future research, career aspirations, and beliefs about how the Institute influenced their careers and professional aspirations overall.

Community mentors

For each cohort, Institute scholars were matched with a community partner organization focused on behavioral health in order to complete the translational research and implementation science study. For cohort #1, five community behavioral health organizations agreed to host a team of scholars and provide an on-site community mentor to work directly with scholars on the conceptualization and implementation of the translational research study. The on-site community mentor provided oversight on the nature and scope of the project, while also assisting the scholars with ongoing data collection and analysis activities. For cohort #2, all but one of the original five agencies continued their involvement with the Institute.

Academic mentors

In addition to being paired with a community mentor, scholars were also paired with academic mentors, forming a triadic mentoring relationship. Academic mentors were tasked with providing consultation and oversight of research methodology including qualitative and quantitative approaches for data collection and analysis. In particular, academic mentors provided oversight on methodology in order to ensure the service-learning translational research studies best matched the needs of the behavioral health organizations. Academic mentors were paired with teams of scholars whose proposed research studies matched at least one of their areas of research expertise.

Data analysis

The purpose of this evaluation was to assess the longer-term impacts of the Institute on scholars, as well as impacts from the perspectives of academic mentors and community partners. This evaluation was approved by the University of South Florida Institutional Review Board, and participants provided verbal consent to participate. Semi-structured qualitative interviews were audio recorded and transcribed verbatim. A member of the Executive Committee trained the interviewers. Interviewers conducted practice interviews with Institute staff and met to discuss issues that arose in order to align strategies prior to conducting the interviews.

Sample size was limited by the total number of people participating in the Institute program. In contrast to quantitative sampling, in which large numbers of participants are required to achieve statistical significance, relatively small numbers of participants, as few as twelve in some instances, are typically needed in qualitative research to achieve what is referred to as “thematic saturation” [21,22,23,24]. Thematic saturation is achieved when additional interviews no longer uncover unique themes in participants’ comments. Because of the narrow scope of the questions explored in this evaluation, we believe that the total number of interviews (n = 33) was sufficient for the purposes of this evaluation.

All interview transcripts were coded manually using both inductive and deductive approaches [23,24,25,26] for themes related to outcomes and impacts of interest for the Institute evaluation. Pre-determined areas of interest used to identify themes included: (1) career aspirations, (2) current career, (3) dissemination activities, (4) education activities/aspirations, (5) grant-writing activities/aspirations, and (6) influence of the Institute on scholars’ lives overall. The importance of networks gained through Institute participation was added as an emergent theme of central interest to scholars and community mentors. Two coders met multiple times to discuss coding and establish reliability. This reliability procedure was intended to critically reflect on disagreements and improve analytical consistency, not to measure percentage of agreement [27, 28]. Qualitative analysis of codes was performed by an independent analyst, not affiliated with the Institute, with expertise in qualitative analysis methods and theoretical analysis [29,30,31]. Involving a third party in the analysis helped to reduce bias and bring a fresh perspective to the assessment of Institute outcomes.
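For readers who organize coded excerpts computationally, the workflow described above, a pre-determined (deductive) codebook, emergent codes, and a thematic-saturation stopping rule, can be pictured with a minimal, hypothetical sketch. The authors coded manually; the code names and interview data below are illustrative assumptions, not the Institute’s instruments or results.

from collections import Counter

# Pre-determined (deductive) codes corresponding to the areas of interest above.
PREDETERMINED_CODES = {
    "career_aspirations", "current_career", "dissemination_activities",
    "education_activities", "grant_writing_activities", "institute_influence",
}

# Illustrative coded excerpts: (interview_id, code, quotation). Invented data.
coded_excerpts = [
    ("scholar_01", "current_career", "The Institute helped me obtain my current position."),
    ("scholar_01", "networks", "I still rely on contacts I made through my mentors."),
    ("mentor_03", "networks", "Our agency now has an ongoing tie to the university."),
    ("scholar_07", "institute_influence", "It changed how I think about research in practice."),
]

def code_counts(excerpts):
    """Tally how often each code was applied across all interviews."""
    return Counter(code for _, code, _ in excerpts)

def emergent_codes(excerpts):
    """Codes applied during coding that were not in the pre-determined codebook
    (e.g., 'networks' emerged as a central theme in this evaluation)."""
    return {code for _, code, _ in excerpts} - PREDETERMINED_CODES

def saturation_reached(interview_order, excerpts, window=3):
    """Crude saturation check: no new codes appeared in the last `window` interviews."""
    seen, last_new = set(), 0
    for i, interview in enumerate(interview_order, start=1):
        codes_here = {code for iid, code, _ in excerpts if iid == interview}
        if codes_here - seen:
            last_new = i
        seen |= codes_here
    return len(interview_order) - last_new >= window

if __name__ == "__main__":
    print(code_counts(coded_excerpts))
    print("Emergent codes:", emergent_codes(coded_excerpts))
    print("Saturation reached:", saturation_reached(["scholar_01", "mentor_03", "scholar_07"], coded_excerpts))

In this evaluation, saturation was judged by the coders rather than computed, but the same logic, that successive interviews stop producing new themes, underlies the rule of thumb cited above.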

After initial coding and analysis, the Translational Research Impact Scale (TRIS) [14] was used to frame the potential impact of translational research efforts of the Institute. This evaluation demonstrates the Institute’s impact in all three domains of the TRIS evaluation framework (see Dembe et al. [14]).

Results

Institute scholars were recruited from health- and social-services-related graduate academic disciplines, programs, and fields, such as criminal justice, education, nursing, psychology, public health, behavioral health, rehabilitation and mental health counseling, and social work. Of the fifteen scholars in cohort #1, twelve agreed to be interviewed for this evaluation. Of the twelve scholars in cohort #2, seven agreed to be interviewed for this evaluation.

Community mentors were full-time employees at local or regional child and adolescent behavioral health service agencies. We use the term behavioral health service agencies to refer to organizations that provided mental health, substance use, and/or educational programs, in this case specific to children and adolescents. Three of the agencies provided direct behavioral health services. One agency disseminated and evaluated an evidence-based curriculum focused on preventing substance abuse. The final agency was the social work services division of a local county school system. All five community mentors representing the participating community behavioral health service agencies participated in the 2014 evaluation. Three of the four community mentors participated in the 2015 evaluation.

Of the academic mentors, two were psychologists with backgrounds working with community and school-based mental health intervention programs. One academic mentor had a background in mental health policy and social/health psychology and extensive experience working with community behavioral health agencies on evaluation projects. Another academic mentor was a sociologist with expertise in behavioral health statistics. All six of the academic mentors involved with the cohort #1 scholars participated in interviews during the 2014 evaluation. Three of the four academic mentors involved with the cohort #2 scholars participated in interviews during the 2015 evaluation.

Using the domains of the TRIS framework [14] (research-related impacts, translational impacts, and societal impacts) as a guide, the authors describe the range of ways participation in the Institute impacted both scholars and community mentors. Overall, scholars and community mentors reported increased interest in translational research, implementation science, and substance abuse research as a result of participating in a research project as part of the Institute service-learning component, as well as development of valuable professional networks and ongoing collaborative partnerships. The impacts in each of the TRIS domains are described below, with quotes from scholars, community mentors, and academic mentors illustrating impacts in specific areas of each domain.

Research-related impacts

Domain 1, research-related impacts, is the most extensive domain in the TRIS. Research-related impacts include areas such as (1) identification of research needs, (2) development of innovative methods, (3) improvement of research conduct and/or management strategies, (4) improvements in analysis and synthesis of results, (5) increases in the number of grant submissions and publications among translational researchers, (6) new, marketable discoveries, and (7) successes in research dissemination to appropriate audiences. Table 1 contains illustrative quotes from participants to show research-related impacts resulting from the Institute’s applied research projects.

Table 1 Illustrative quotations to demonstrate areas of research-related impacts

A range of new research networks and collaborations were formed as a result of collaborative mentoring and community-based participatory research studies conducted as part of the Institute graduate certificate program. Community mentors, representing five different community behavioral health organizations, collaborated with Institute scholars on community-based applied research studies. Although some individuals had prior relationships with the Institute’s faculty, community mentors and academic mentors reported improved collaborative relationships as a result of the Institute service-learning component [32].

The triadic research mentoring relationship, including scholars, academic mentors, and community mentors, reportedly influenced scholars’ success and helped them develop and refine competencies in public health and translational research [19]. One year after participation in the Institute, scholars reported that networks gained through the Institute were instrumental in obtaining their present positions, and both scholars and community mentors reported that those networks strengthened their commitment to pursuing community-engaged initiatives.

To date, four cohorts of scholars have created and implemented applied research studies in collaboration with community mentors. The service-learning studies that were completed have positively impacted the operations of participating community organizations [20]. The translational research studies completed by Institute scholars in collaboration with community mentors ranged from exploring the integration of behavioral health and primary care services (traditionally delivered separately in the United States) to identifying factors and mechanisms for adapting evidence-based practices in schools. Scholars and community mentors reported heightened awareness of the need for practical application of evidence-based research.

In the context of translational research, individual (one-on-one) mentoring can be difficult to sustain and is not conducive to collaborative training. The Institute developed a collaborative mentoring approach to address this challenge, providing broader training to scholars while offering beneficial service-learning research to community behavioral health organizations [19]. Emphasis on the “real-world” or actual community agency experience provided by the Institute was a key theme in scholars’ descriptions of the overall impact of the Institute on their lives and career aspirations. Scholars reported that, as a result of participation in the Institute, they had a better appreciation for the importance of research in applied community settings, as well as for the need for evidence-based understanding prior to implementation and intervention.

Including community mentors and practice-oriented scholars helped diffuse understanding of the techniques involved in conducting and implementing translational research beyond the university. Scholars reported giving presentations to local policymakers and community organizations. Those scholars working for health organizations also reported bringing techniques learned from participating in the Institute into their work and educating their colleagues on the value of evaluation.

Five scholars from cohort #1 reported presenting study results at national conferences. Several manuscripts are in preparation, authored by scholars and mentors from cohorts #1 and #2. Thus far, one manuscript has been published by an Institute scholar team [33]. Scholars cited the short timeline of participation in the Institute, the need for more incentives to publish, and inexperience in academic writing as barriers to manuscript preparation, submission, and subsequent publication.

One scholar noted that over-emphasis on academic productivity can frustrate practice-oriented scholars and de-emphasize outcomes that are of importance to community organizations, such as programmatic development or sharing results with other providers.

Sometimes we get really wrapped up in formal research, in publishing and getting our names on things. That kind of left a bitter taste about that, not specifically from the Institute just like in general through my master’s program. I think maybe that’s why I’ve been not as involved or like eager to go to conferences or eager to be in the American Public Health Association or anything, just because I feel like we end up getting kind of really lost in that. I’m much more of like a one-on-one kind of person, like hands-on on the ground type work. (Cohort #1 Scholar)

To balance the different outcomes valued by academia versus community organizations, one key outcome for all projects included in the Institute was a presentation at a national conference held in Tampa, FL (Annual Research and Policy Conference on Child, Adolescent, and Young Adult Behavioral Health). Disseminating project results at this conference created an opportunity to speak to both academic- and community-based audiences.

Translational impacts

Domain 2, translational impacts, includes potential impact in areas such as (1) incorporation of discoveries from bench science into human or animal studies, (2) incorporation of clinical trial results into clinical or practical guidelines, (3) recruitment and preparation of new translational researchers, and (4) improvement in evidence-based health care service and delivery, patient outcomes, and positive health behaviors, as a result of research translation. Table 2 provides illustrative quotes from Institute participants regarding translational impacts.

Table 2 Illustrative quotations to demonstrate translational impacts

The Institute’s inclusion of graduate students and working professionals from a range of health- and social-services-focused disciplines resulted in a diverse cohort of professionals trained in translational research methodology, drawn not only from the university but also from community agencies. Whereas transdisciplinary research across university departments is important in furthering research and translation, members of community-based organizations involved in implementation also benefit from training in translational research and implementation science methods. As earlier Institute-focused publications describe, the development of the Institute, including a community agency network and a translational research training program, has contributed toward the goal of increasing the number of people working in health care who have been trained to understand the importance of evidence translation and research [19, 20, 32].

Societal impacts

Domain 3, societal impacts, includes potential impact areas such as (1) strengthening and refining of health-related policies and procedures, (2) improvements in community health, empowerment, and economic conditions to reduce health disparities, and (3) development of new community-based participatory programs and partnerships geared toward effective and meaningful implementation. For illustrative quotes related to societal impacts resulting from Institute projects, see Table 3.

Table 3 Illustrative quotations to demonstrate impacts related to local community and society

Service-learning applied research studies completed as part of the Institute’s graduate certificate and training program were collaboratively designed and implemented by scholars, community mentors, and academic mentors. The emphasis on mutual benefit was designed to ensure the sustainability of community-based behavioral health programs and partnerships. While most scholars did not continue to collaborate with the community mentors after graduating from the program, the community mentors continued partnerships with the university through ongoing service-learning projects with new cohorts of Institute scholars. Rather than attempting to create new behavioral health programs, the Institute’s training program was designed to train new researchers while making meaningful contributions to the function and applicability of existing behavioral health programs. Scholars have presented study outcomes and insights to local policymakers and community behavioral health organizations. Such presentations contribute to participatory decision-making and policy development. Community mentors described impacts on their organizations in terms of understanding the value of applied research and collaboration with the university, as well as recognizing the need for adaptation and innovation in their ongoing operations.

Examples of outcomes from service-learning translational research projects completed as part of the Institute’s graduate certificate program include the following: (1) development of a training for providers focused on appropriate mechanisms to adapt a school-based evidence-based practice curriculum; (2) implementation of a professional development training program focused on mental health counseling; and (3) identification of facilitators and barriers to integrating primary care and behavioral health services.

The Institute’s emphasis on community-university partnerships with a community-based participatory research foundation was designed to broadly impact child and adolescent behavioral health issues [32]. Community behavioral health organizations involved in the Institute reported benefits as a result of collaboration with scholars and improved methods for serving their targeted at-risk populations.

At the time of the evaluation, cohort #1 scholars were employed in a range of behavioral health organizations and university departments. Current careers included the following: technical assistants on health-focused grant projects; reporting and analytics for a health care organization; senior qualitative analyst for a health care organization; administrator for a substance-abuse-focused organization; curriculum development and implementation specialist for a substance abuse outreach organization; manager for a children’s special needs organization; and research fellow for a federal healthcare organization.

As noted above, one of the most challenging aspects of implementing evidence into practice is that implementation stakeholders (i.e., providers, organizations) rarely have the skills to interpret and implement new research into practice [18]. Part of the Institute’s societal impact is in providing training, particularly in T3, to people employed in various community organizations.

Discussion

Implementing a multi-component, applied service-learning research experience, in addition to classroom- or curriculum-based translational research education, holds the potential to expand the public health and behavioral health workforce in needed areas [34, 35]. To monitor the success of such programs, however, knowledge about translational research and implementation science must be considered along with evidence of productivity [36]. Evaluation of translational research education and training must therefore include measures of success beyond conventional academic metrics of peer-reviewed publications. In this paper, several areas of impact beyond academic productivity are highlighted. Dimensions such as community engagement, network building, and practical application of evidence are areas of potential impact in translational research and implementation science; however, such impacts have not been emphasized or measured in translational research education and training [13].

Fortunately, there are resources available to encourage translation of research into practice. The National Collaborating Centre for Methods and Tools [37] has developed a series of resources for practitioners, including education and self-assessments, to encourage their uptake and use of evidence-based practices. Approaches to evaluating the success of translational research education training require sufficient flexibility to accommodate individuals and individual institutions, while still retaining enough rigor to demonstrate that a program is meeting established objectives and competencies [2]. Another potential outcome of translational research education could be utilizing evidence to inform policy making. The SUpporting POlicy Relevant Reviews and Trials (SUPPORT) Project provides a series of resources targeting policy-makers to encourage their use of evidence-informed decision-making in policy development [38].

Training and mentoring new researchers in implementation science and translational research are increasingly emphasized in health equity research, as evidenced by the many current programs for training both students and faculty members [1, 6, 8, 10, 39,40,41]. To facilitate innovation in clinical and health services translational research, researchers from various disciplines must be willing to engage in collaboration beyond the defined margins of their discipline [9]. Although most training is focused on early-career research faculty, Wooten et al. [4] suggest that engaging graduate-level scholars in translational research may be the key to inspiring a new cohort of investigators in this field. A critical hurdle in translating knowledge to practice is the gap that exists between academic publication goals and the policy-oriented interests of public health and behavioral health directors and service agency administrators. There is a need to establish effective iterative loops in translation between researchers and those involved in implementation in order to address shared goals [15].

Limitations

This paper is limited by the self-reported nature of the qualitative data collected. Qualitative investigation is well-suited, however, to providing in-depth information about participants’ experiences. For the purpose of this evaluation, qualitative interviews were intended to elicit information about stakeholder experiences in the Institute. Although this paper does not provide a comprehensive evaluation of the Institute, these findings about Institute scholar experiences and the need for alternative measures of success will likely be of interest to similar current and future educational training programs in translational research [8, 39,40,41]. Findings related to the relevance of translational research training to members of community behavioral health organizations also point to the importance of training in these skills beyond academia.

Conclusions

Over the course of this evaluation, the authors identified several drawbacks in emphasizing academic measures of productivity rather than community outcomes in evaluating translational research training programs. Common measures, such as numbers of grant applications and awards, are less revealing of success when applied to practice-oriented graduate and professional degree programs and community organizations. In this paper, the use of the TRIS domains to identify areas of potential impact suggests ways of broadening measures of success in the context of a translational research educational training program designed for graduate students, community partners, and academic mentors. This triadic, applied approach to translational research training has potential to broaden the reach of translational research in several ways. After participating in training programs, researchers are better prepared to work in interdisciplinary environments and to successfully implement evidence-based programs in applied community practice settings. Further, training not only graduate students but also members of stakeholder organizations can facilitate improved communication and understanding and lead to stronger community/university partnerships.

Abbreviations

NIDA:

National Institute on Drug Abuse

TRIS:

Translational Research Impact Scale

References

  1. Morrato EH, Rabin B, Proctor J, Cicutto LC, Battaglia CT, Lambert-Kerzner A, et al. Bringing it home: expanding the local reach of dissemination and implementation training via a university-based workshop. Implement Sci. 2015;10(1):1–12. doi:10.1186/s13012-015-0281-6.

  2. Rubio DM, Schoenbaum EE, Lee LS, Schteingart DE, Marantz PR, Anderson KE, et al. Defining translational research: implications for training. Acad Med. 2010;85(3):470–5. doi:10.1097/ACM.0b013e3181ccd618.

  3. Lal S, Urquhart R, Cornelissen E, Newman K, Van Eerd D, Powell BJ, et al. Trainees’ self-reported challenges in knowledge translation, research and practice. Worldviews Evid-Based Nurs. 2015;12(6):348–54. doi:10.1111/wvn.12118.

  4. Wooten KC, Dann SM, Finnerty CC, Kotarba JA. Translational science project team managers: qualitative insights and implications from current and previous postdoctoral experiences. Postdoc J. 2014;2(7):37–49.

  5. Waitzkin H, Yager J, Parker T, Duran B. Mentoring partnerships for minority faculty and graduate students in mental health services research. Acad Psychiatry. 2006;30(3):205–17. doi:10.1176/appi.ap.30.3.205.

  6. Viets VL, Baca C, Verney SP, Venner K, Parker T, Wallerstein N. Reducing health disparities through a culturally centered mentorship program for minority faculty: the Southwest Addictions Research Group (SARG) experience. Acad Med. 2009;84(8):1118–26. doi:10.1097/ACM.0b013e3181ad1cb1.

  7. Wooten KC, Calhoun WJ, Bhavnani S, Rose RM, Ameredes B, Brasier AR. Evolution of multidisciplinary translational teams (MTTs): insights for accelerating translational innovations. Clin Transl Sci. 2015;8(5):542–52. doi:10.1111/cts.12266.

  8. Proctor EK, Landsverk J, Baumann AA, Mittman BS, Aarons GA, Brownson RC, et al. The implementation research institute: training mental health implementation researchers in the United States. Implement Sci. 2013;8(1):1–12. doi:10.1186/1748-5908-8-105.

  9. Begg MD, Crumley G, Fair AM, Martina CA, McCormack WT, Merchant C, et al. Approaches to preparing young scholars for careers in interdisciplinary team science. J Investig Med. 2014;62(1):14–25. doi:10.2310/JIM.0000000000000021.

  10. Straus SE, Brouwers M, Johnson D, Lavis JN, Légaré F, Majumdar SR, et al. Core competencies in the science and practice of knowledge translation: description of a Canadian strategic training initiative. Implement Sci. 2011;6(1):127.

  11. Pozen R, Kline H. Defining success for translational research organizations. Sci Transl Med. 2011;3(94):94cm20. doi:10.1126/scitranslmed.3002085.

  12. Rajan A, Sullivan R, Bakker S, Van Harten WH. Critical appraisal of translational research models for suitability in performance assessment of cancer centers. Oncologist. 2012;17(12):e48–57.

  13. Grether M, Eickelberg O, Mall MA, Rabe KF, Welte T, Seeger W. New metrics for translational research. Lancet Respir Med. 2014;2(8):e13–4.

  14. Dembe AE, Lynch MS, Gugiu PC, Jackson RD. The translational research impact scale development, construct validity, and reliability testing. Eval Health Prof. 2014;37(1):50–70.

  15. Ginexi EM, Hilton TF. What’s next for translation research? Eval Health Prof. 2006;29(3):334–47. doi:10.1177/0163278706290409.

  16. Truncali A, Kalet AL, Gillespie C, More F, Naegle M, Lee JD, et al. Engaging health professional students in substance abuse (SA) research: development and early evaluation of the SARET program. J Addict Med. 2012;6(3):196–204. doi:10.1097/ADM.0b013e31825f77db.

  17. Glasgow RE, Vogt TM, Boles SM, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health. 2012;102(7):1274–81.

  18. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7(1):50. doi:10.1186/1748-5908-7-50.

  19. Young B-R, Williamson HJ, Burton DL, Massey OT, Levin BL, Baldwin JA. Challenges and benefits in designing and implementing a team-based research mentorship experience in translational research. Pedagogy Health Promot. 2015;1(4):233–46. doi:10.1177/2373379915600174.

  20. Burton DL, Levin BL, Massey T, Baldwin J, Williamson H. Innovative Graduate Research Education for Advancement of Implementation Science in Adolescent Behavioral Health. J Behav Health Serv Res. 2016;43(2):172–86.

  21. Malterud K. Qualitative research: standards, challenges, and guidelines. Lancet. 2001;358. doi:10.1016/s0140-6736(01)05627-6.

  22. Glaser BG, Strauss AL. The discovery of grounded theory: Strategies for qualitative research. New York, NY: Routledge; 2017 [1999].

  23. Strauss A, Corbin J. Basics of qualitative research: Techniques and procedures for developing grounded theory. Thousand Oaks, CA: Sage Publications; 1998.

  24. Strauss AL. Qualitative analysis for social scientists. New York, NY: Cambridge University Press; 1987.

  25. Strauss A, Corbin J. Basics of Qualitative Research. Grounded Theory Procedures and Techniques. Newbury Park, California: SAGE Publications; 1990.

  26. Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Serv Res. 2007;42(4):1758–72. doi:10.1111/j.1475-6773.2006.00684.x.

  27. Armstrong D, Gosling A, Weinman J, Marteau T. The place of inter-rater reliability in qualitative research: an empirical study. Sociology. 1997;31(3):597–606.

  28. Pope C, Ziebland S, Mays N. Analysing qualitative data. BMJ. 2000;320(7227):114–6.

  29. Eaves ER, Nichter M, Ritenbaugh C, Sutherland E, Dworkin SF. Works of illness and the challenges of social risk and the specter of pain in the lived experience of TMD. Med Anthropol Q. 2015;29(2):157–77.

  30. Eaves ER, Sherman KJ, Ritenbaugh C, Hsu C, Nichter M, Turner JA, et al. A qualitative study of changes in expectations over time among patients with chronic low back pain seeking four CAM therapies. BMC Complement Altern Med. 2015;15(1):1–10. doi:10.1186/s12906-015-0531-9.

  31. Hsu C, Sherman K, Eaves ER, Turner J, Cherkin DC, Cromp D. New perspectives on patient expectations of treatment outcomes: results from qualitative interviews with patients seeking complementary and alternative medicine treatments for chronic low back pain. BMC Complement Altern Med. 2014;14. doi:10.1186/1472-6882-14-276.

  32. Williamson HJ, Young B-R, Murray N, Burton DL, Lubotsky Levin B, Massey OT, et al. Community-University Partnerships for Research and Practice: Application of an Interactive and Contextual Model of Collaboration. J High Educ Outreach Engagement. 2016;20(2):55–84.

  33. Castillo HL, Rivers T, Randall C, Gaughan K, Ojanen T, Burton D. Placing Evidence-Based Interventions at the Fingertips of School Social Workers. J Behav Health Serv Res. 2016;43(3):474–83.

  34. Sopher CJ, Adamson BJS, Andrasik MP, Flood DM, Wakefield SF, Stoff DM, et al. Enhancing diversity in the public health research workforce: the research and mentorship program for future HIV vaccine scientists. Am J Public Health. 2014;105(4):823–30. doi:10.2105/AJPH.2014.302076.

  35. Adamson BJS, Fuchs JD, Sopher CJ, Flood DM, Johnson RP, Haynes BF, et al. A new model for catalyzing translational science: the early stage investigator mentored research scholar program in HIV vaccines. Clin Transl Sci. 2015;8(2):166–8. doi:10.1111/cts.12249.

  36. Kerner JF. Knowledge translation versus knowledge integration: a “funder’s” perspective. J Contin Educ Health Prof. 2006;26(1):72–80.

  37. National Collaborating Centre for Methods and Tools. Is Research Working for You? Tool. Hamilton: McMaster University; 2009. (Updated 16 February, 2017) Retrieved from http://www.nccmt.ca/resources/search/35.

  38. Lavis J, Oxman A, Lewin S, Fretheim A. SUPPORT tools for evidence-informed health policymaking (STP). Introduction. Health Res Policy Syst. 2009;7 Suppl 1:I1–10.

  39. Johnson MO, Subak LL, Brown JS, Lee KA, Feldman MD. An innovative program to train health sciences researchers to be effective clinical and translational-research mentors. Acad Med. 2010;85(3):484.

  40. Feldman MD, Steinauer JE, Khalili M, Huang L, Kahn JS, Lee KA, et al. A mentor development program for clinical translational science faculty leads to sustained, improved confidence in mentoring skills. Clin Transl Sci. 2012;5(4):362–7. doi:10.1111/j.1752-8062.2012.00419.x.

  41. Urquhart R, Cornelissen E, Lal S, Colquhoun H, Klein G, Richmond S, et al. A community of practice for knowledge translation trainees: an innovative approach for learning and collaboration. J Contin Educ Health Prof. 2013;33(4):274–81.

Acknowledgements

The research reported in this article was supported by the National Institutes of Health National Institute on Drug Abuse (NIDA R25DA031103). The authors wish to thank the Institute scholars, mentors, and community organizations for their participation and willingness to share their experiences with the study team.

Funding

The research reported in this article was supported by the National Institute on Drug Abuse of the National Institutes of Health (R25DA031103). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Availability of data and materials

The datasets generated during and/or analyzed during the current study are not publicly available due to patient confidentiality considerations. Aggregate data are available from the corresponding author on reasonable request.

Author information

Authors and Affiliations

Authors

Contributions

JB was MPI of the study and was involved in the study design and implementation, as well as in drafting and revising the manuscript. HW was involved in the coding and analysis, as well as in drafting and revising the manuscript. EE was involved in the analysis and in drafting and revising the manuscript. DB was the project director and was involved in the study design and implementation, as well as in revising the manuscript. BL was MPI and was involved in the study design and implementation, as well as in revising the manuscript. OM was MPI and was involved in the implementation and evaluation of the study and in revising the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Julie A. Baldwin.

Ethics declarations

Ethics approval and consent to participate

This evaluation was approved by the University of South Florida Institutional Review Board and participants provided verbal consent to participate.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Baldwin, J.A., Williamson, H.J., Eaves, E.R. et al. Broadening measures of success: results of a behavioral health translational research training program. Implementation Sci 12, 92 (2017). https://doi.org/10.1186/s13012-017-0621-9

