Translating research into practice in Leeds and Bradford (TRiPLaB): a protocol for a programme of research
© Hanbury et al; licensee BioMed Central Ltd. 2010
Received: 18 March 2010
Accepted: 21 May 2010
Published: 21 May 2010
The National Institute for Health Research (NIHR) has funded nine Collaborations for Leadership in Applied Health Research and Care (CLAHRCs). Each CLAHRC is a partnership between higher education institutions (HEIs) and the NHS in nine UK regional health economies. The CLAHRC for Leeds, York, and Bradford comprises two 'research themes' and three 'implementation themes.' One of these implementation themes is Translating Research into Practice in Leeds and Bradford (TRiPLaB). TRiPLaB aims to develop, implement, and evaluate methods for inducing and sustaining the uptake of research knowledge into practice in order to improve the quality of health services for the people of Leeds and Bradford.
TRiPLaB is built around a three-stage, sequential approach applied in separate, longitudinal case studies conducted with collaborating NHS organisations. TRiPLaB will select robust innovations to implement, conduct a theory-informed exploration of the local context using a variety of data collection and analytic methods, and synthesise the information collected to identify the key factors influencing the uptake and adoption of the targeted innovations. This synthesis will inform the development of tailored, multifaceted interventions designed to increase the translation of research findings into practice. Mixed research methods, including time series analysis, quasi-experimental comparison, and qualitative process evaluation, will be used to evaluate the impact of the implementation strategies deployed.
TRiPLaB is a theory-informed, systematic, mixed methods approach to developing and evaluating tailored implementation strategies aimed at increasing the translation of research-based findings into practice in one UK health economy. Through active collaboration with its local NHS, TRiPLaB aims to improve the quality of health services for the people of Leeds and Bradford and to contribute to research knowledge regarding the interaction between context and adoption behaviour in health services.
In response to the recommendation of the Chief Medical Officer's Clinical Effectiveness Group that the NHS should better utilise higher education to support initiatives to enhance the effectiveness and efficiency of clinical care, the National Institute for Health Research (NIHR) announced a strategy of increasing partnerships between higher education and the NHS in local health economies. One means of developing these partnerships is Collaborations for Leadership in Applied Health Research and Care (CLAHRCs). The NIHR has funded nine CLAHRCs, each with an emphasis on research that makes an impact locally and with a strong, disciplined, and strategic approach to implementing that research. The NIHR CLAHRC for Leeds, York and Bradford (LYBRA) comprises two 'research' programmes (Improving Vascular Prevention in Cardiac and Stroke Care (IMPROVE-PC), Improving the Quantity and Quality of Life in People with Addictions) and three 'implementation' programmes (Outcome Driven Stroke Care, Maternal and Child Health, and the focus of this protocol, Translating Research into Practice in Leeds and Bradford, or TRiPLaB).
The aim of TRiPLaB is to develop, implement, and evaluate methods of inducing and sustaining the uptake of research into practice in order to improve the quality of health services for the people of Leeds and Bradford. Research implementation is a complex process, highly dependent on context and on interactions between multiple, interconnected factors at the level of individuals, groups, organisations, and wider health systems [2–6]. Despite this complexity, or perhaps because of it, implementation research has often focused on individual behaviour change without paying attention to the characteristics of health technologies, the processes by which health technologies are adopted and sustained, or a workable understanding of the particular context in which implementation occurs.
Successive reviews of the evidence for successful translation of research findings into healthcare practice reveal that a range of implementation strategies can be successful. However, why strategies work in some circumstances but not others remains unclear [3, 6, 8].
Using theory to guide the exploration of the local context for implementation can help. First, relevant theories enable the tailoring of strategies to the most significant barriers to translating research into practice in a given context. Second, theories enable researchers to build on existing knowledge and increase the transferability of findings to settings and contexts other than the immediate research environment.
TRiPLaB will use theory to guide its exploration of context in our collaborating healthcare organisations. This exploration will in turn inform the development of tailored implementation strategies for innovation delivery. The synthesis of research findings by Greenhalgh et al. on the dissemination and implementation of research-based innovations provides the theoretical framework for TRiPLaB. Their analysis proposes that successful innovation adoption requires analysis of the characteristics of the innovation itself, the perceptions of those individuals tasked with adopting the innovation, and the wider organisational cultures in place in the setting for adoption. Drawing on diffusion of innovations theory, Greenhalgh et al. also identify channels of communication, or social networks, between practitioners as important influences on whether, and how quickly, an innovation is adopted. In adopting this theoretical framework, TRiPLaB will explore the relative influence of these often overlooked but important elements at individual, team, and organisational levels in our NHS partners [2–6]. This theory-informed exploration will form our 'diagnostic analysis' of the local context in each of the NHS healthcare organisations that make up our case study series.
Ethical approval for this study was given by York Research Ethics Committee (REC 10/H1311/1).
The Develop, IMplement, Evaluate (DIME) approach of TRiPLaB
TRiPLaB is a multisite, longitudinal, mixed methods case study. Currently, we are working with NHS Bradford and Airedale (an NHS commissioning and community provider organisation) to translate research-based findings into practice in the areas of maternal mental health and stroke care, and with Leeds Partnership Foundation Trust (a provider of mental health services) to enhance the implementation of recent and relevant NICE guidance.
Phase one is a development phase in which the innovation that is the focus of each case study will be selected and its key characteristics mapped. Theory-informed factors hypothesised as influential in health professionals' adoption of the selected innovation into routine practice are explored and mapped.
Phase two is an implementation phase in which tailored behaviour-change interventions are developed, piloted, and delivered using personnel from TRiPLaB and its partner organisations.
Phase three is an evaluation phase in which changes in structure, process, and outcome are described and evaluated. We will examine change both within and, towards the end of the programme, between case studies.
TRiPLaB will use the resources of the Centre for Reviews and Dissemination (CRD) to increase the accessibility of research evidence to decision makers (particularly commissioners) in the NHS. Primarily, we will do this by using tailored briefings relating research evidence to specific decision problems and contexts in Bradford and Leeds. These 'evidence briefings' will be based on existing sources of synthesised and quality-assessed evidence - for example, CRD's databases of systematic reviews (DARE) and economic evaluations (NHS EED). We will develop and implement methods for producing and disseminating evidence briefings and evaluate their perceived usefulness, costs, and use by decision makers.
Development phase (phase one)
Selecting the innovation
At the start of each case study, the specific innovation to be targeted will be selected. The selection will be based on the results of: a qualitative stakeholder consultation designed to identify key topics; a conjoint analysis survey of commissioners and practitioners designed to explore those characteristics of innovations likely to influence individuals' prioritisation of them; and a mapping exercise exploring how each of the stakeholder short-listed topics 'scores' against the characteristics measured in the conjoint analysis survey (for example, local capacity and expertise for implementation, cost/impact on local budgets). We will also consider pragmatic issues, such as the presence or absence of routine data sources to aid the measurement of innovation adoption.
Stakeholder consultations will focus on identifying key topics in the relevant clinical area. For example, in NHS Bradford and Airedale, stakeholder consultation in the area of child and maternal health care with a range of commissioners and practitioners revealed the importance of maternal mental health as a focus for activity.
The conjoint survey will reveal the characteristics that influence an individual's decision to prioritise one innovation over another. The factors that make up the conjoint profiles to be evaluated will include characteristics such as the strength of the supporting evidence base behind an innovation and its economic costs. By analysing the characteristics of potential innovations in this way, we will be able to 'plug in' future innovations and inform the organisation's understanding of the likelihood of successful implementation. This has the obvious advantage of not having to ask the healthcare workforce or consumers to rank or rate innovations on multiple occasions. The conjoint analysis also reduces the likelihood of the TRiPLaB team targeting respondents (for example, as change agents) who may not favour the innovation eventually selected.
The mapping exercise will score short-listed innovations against the characteristics measured in the conjoint analysis survey. For example, the strength of the supporting evidence base for each of the short-listed innovations from the stakeholder consultation will be explored through reference to published systematic reviews. The outcome of this process will be summarised in a matrix. Finally, the pragmatic factors to be considered will include whether suitable process of care and health outcome measures are available through routinely collected data to evaluate the impact of the implementation strategies, or whether tailor-made, repeatable audits have to be established.
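To make the mapping exercise concrete, the sketch below shows one way a weighted scoring matrix could combine short-listed innovations with conjoint-derived importance weights. All innovations, characteristics, weights, and scores here are hypothetical illustrations, not data from the programme.

```python
# Hypothetical conjoint-derived importance weights for three characteristics
weights = {"evidence_strength": 0.4, "cost_impact": 0.35, "local_capacity": 0.25}

# Hypothetical scores (0-10) for each short-listed innovation
scores = {
    "innovation_A": {"evidence_strength": 8, "cost_impact": 5, "local_capacity": 7},
    "innovation_B": {"evidence_strength": 6, "cost_impact": 9, "local_capacity": 4},
}

def weighted_score(innovation):
    """Sum each characteristic score multiplied by its conjoint-derived weight."""
    return sum(weights[c] * s for c, s in scores[innovation].items())

# The summary matrix: one weighted score per short-listed innovation
matrix = {name: round(weighted_score(name), 2) for name in scores}
best = max(matrix, key=matrix.get)
```

In practice the pragmatic factors (for example, the availability of routine data) would be weighed alongside the conjoint characteristics before a final selection is made.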
The selected innovation for each case study will be one that has been identified as a key topic from the stakeholder consultation, that scores highly on those characteristics identified from the conjoint analysis survey as influential in commissioners' and practitioners' prioritisation of innovations, and that can be monitored through tailor-made audits or, preferably, via routinely collected data. In sum, the combination of stakeholder consultation, the conjoint survey of practitioner and commissioner preferences, and the mapping exercise will enable us to select a robust but feasible innovation to target in each case site.
Exploring the local context
Following selection of the innovation to be targeted, we will undertake a diagnostic analysis in which we will administer a second survey in each case site to measure health professionals' attitudes towards the innovation, health care team innovation culture (using the Team Climate Inventory), and the social networks/communication channels between health professionals with regard to the innovation. A series of semi-structured interviews will also be conducted with a sample of the health professionals to further explore perceived barriers to implementation, and to gain a richer understanding of the influence of health care teams and social networks in the uptake and adoption of new innovations into practice.
Quantitative survey data will be synthesised using multilevel modelling (MLM) approaches to identify the hierarchical level most likely to be responsive to the implementation strategies developed. For example, if the MLM identifies healthcare team culture to be particularly influential, a multifaceted intervention specifically targeting a team's culture towards innovation might (a priori) be more successful than an intervention targeting only individual attitudes towards the innovation. This focus is deliberate given the current dearth of implementation research examining the influence of factors at different hierarchical levels in the health care system, and recommendations for further research in this area. The qualitative data collected from the semi-structured interviews will be analysed using thematic analysis and combined with the outcome of the MLM to gain a richer understanding of the local context and to help tailor the implementation strategies.
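The intuition behind the planned multilevel analysis can be sketched as a simple variance-partitioning calculation: if attitude scores vary far more between healthcare teams than within them, team-level factors such as climate are a promising intervention target. The data and team structure below are hypothetical, and a real analysis would use a dedicated multilevel modelling package rather than this one-way ANOVA estimate of the intraclass correlation.

```python
from statistics import mean

def icc_oneway(groups):
    """One-way ANOVA estimate of the intraclass correlation, ICC(1)."""
    k = len(groups)                      # number of teams
    n = len(groups[0])                   # respondents per team (balanced design)
    grand = mean(x for g in groups for x in g)
    # Between-team and within-team mean squares
    msb = n * sum((mean(g) - grand) ** 2 for g in groups) / (k - 1)
    msw = sum((x - mean(g)) ** 2 for g in groups for x in g) / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)

# Hypothetical attitude scores (1-10) for three healthcare teams
teams = [[7, 8, 7, 9], [3, 4, 3, 2], [5, 6, 5, 6]]
icc = icc_oneway(teams)   # close to 1 when teams differ far more than individuals
```

A high ICC, as in this invented example, would suggest that interventions targeting the team level could have more leverage than those targeting individual attitudes alone.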
Implementation phase (phase two)
Development of the intervention will be systematic, specifying intervention objectives, developing specific implementation strategies to satisfy these objectives, and piloting strategies to assess their likely impact and test how they will be received by the health professionals. Piloting and modelling prior to rolling out implementation strategies and behaviour-change interventions is a necessary preliminary stage. The objectives and design of the intervention will be determined by the outcome of the development phase, in particular the results of the planned multilevel modelling. The selection and design of the actual intervention components will be informed by existing systematic, and other, reviews of the relevant literature.
Having decided on the innovation in phase one and possible implementation strategies in phase two, we will make the final choice of implementation approach with reference to the idea of 'policy' cost effectiveness. Summary data on the innovation from phase one (net cost per patient and likely health gain per patient), on the implementation strategies under consideration (net cost of planned implementation and likely change in adoption/adherence), and on local scale factors (for example, the number of NHS organisational units involved and the number of patients targeted) will be combined to arrive at a policy cost effectiveness figure for each option. The combination with the highest cost effectiveness will be the option pursued.
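As an illustration of how these summary data might be combined, the sketch below computes a simple policy cost-effectiveness figure for two hypothetical implementation strategies. The function and all figures are assumptions for illustration, not the programme's actual model.

```python
def policy_cost_effectiveness(net_cost_per_patient, health_gain_per_patient,
                              implementation_cost, adoption_change, n_patients):
    """Cost per unit of health gain at the policy level (illustrative).

    adoption_change is the absolute increase in the proportion of eligible
    patients receiving the innovation attributable to the strategy.
    """
    patients_affected = n_patients * adoption_change
    total_cost = implementation_cost + patients_affected * net_cost_per_patient
    total_gain = patients_affected * health_gain_per_patient
    return total_cost / total_gain   # e.g. cost per QALY gained

# Two hypothetical strategies for the same innovation and population
strategy_a = policy_cost_effectiveness(200, 0.05, 50_000, 0.10, 10_000)
strategy_b = policy_cost_effectiveness(200, 0.05, 20_000, 0.03, 10_000)
```

Note that the cheaper strategy is not automatically preferred: in this invented example, strategy A costs more to deliver but achieves a larger change in adoption, giving a lower cost per unit of health gain.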
The failure to describe interventions adequately in the context of research, and the commensurate reduction in others' ability to use successful programmes (or, conversely, to avoid repeating the mistakes of unsuccessful ones), is common in healthcare research. For each of the case studies in the TRiPLaB programme we will describe: the intervention and its component parts in sufficient detail that others could reproduce it; why the specific intervention was chosen; and a fidelity measure of how well the intervention was delivered. For example, if we undertake educational outreach or training as a component of an intervention, we will detail how many sessions each unit of analysis received, and when and where the training took place.
Evaluation phase (phase three)
Following the recommendation to conduct exploratory trials prior to embarking on more definitive randomised controlled trials, TRiPLaB will employ three different methods to evaluate the impact of the tailored implementation strategies delivered in each case study. The findings from these evaluations will inform later randomised controlled trials, should these prove worthwhile. The three methods to be used are: interrupted time series analysis of either tailor-made audit data or routinely collected data to estimate the impact of the intervention upon suitable process of care and outcome measures; comparison of pre- and post-intervention scores on survey-gathered measures of individual attitudes, team culture, and the changing nature, composition, and size of social networks; and a qualitative process evaluation of why the intervention worked (or did not work).
Alongside these three primary evaluation methods we will also collect cost data on the resources used in the delivery of implementation approaches. The micro costs associated with each strategy will be estimated alongside the extent of behavioural change achieved to arrive at summary estimates of implementation cost effectiveness for each of the case studies in the programme.
Interrupted time series analysis
Interrupted time series designs compare multiple 'before and after' (the introduction of a change strategy) measures to detect whether an intervention has had an effect over and above any underlying trend in the data. Time series analysis is an established technique for evaluating the effectiveness of health care interventions. In the case studies, routinely collected process measures (health professionals' adoption of the innovation) and health outcome measures (dependent on the innovation selected) will provide the multiple time points necessary to perform a time series analysis. Time, possible seasonal trends, and the upward trend that commonly follows the introduction of a new innovation will be modelled in the analysis. This will be the primary outcome measure for each case study.
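The logic of such an analysis can be illustrated with a minimal segmented-regression sketch: fit separate linear trends before and after the intervention point and compare level and slope. The monthly adoption rates below are invented, and a full analysis would also model seasonality and autocorrelation, which this sketch omits.

```python
def linfit(xs, ys):
    """Ordinary least-squares slope and intercept for one segment."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

rates = [40, 41, 42, 43, 44, 45,   # hypothetical pre-intervention months
         55, 57, 59, 61, 63, 65]   # hypothetical post-intervention months
t = list(range(len(rates)))
cut = 6                            # month the intervention was introduced

pre_slope, pre_int = linfit(t[:cut], rates[:cut])
post_slope, post_int = linfit(t[cut:], rates[cut:])

# Level change at the intervention point, over and above the pre-trend
expected = pre_slope * cut + pre_int        # pre-trend projected to month 6
observed = post_slope * cut + post_int      # post-segment fitted value at month 6
level_change = observed - expected
slope_change = post_slope - pre_slope
```

A non-zero level change or slope change, judged against the variability in the series, is what distinguishes an intervention effect from the underlying trend.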
Comparison of pre- and post-intervention scores
The interrupted time series analysis will estimate the impact of the intervention upon process of care and health outcome measures; however, a comparison of pre- and post-intervention scores is also necessary to estimate whether the intervention successfully changed the factors (for example, individual attitudes, social networks, and team culture) in the underlying theoretical framework that it was designed to target (based on the data synthesis through multilevel modelling in TRiPLaB's development phase in each site). This 'mediational analysis' is critical when evaluating the theory used to develop change interventions, as it will inform our understanding of why an intervention either works or fails to work in the ways we intended.
Qualitative process evaluation
Qualitative interviews with health professionals receiving the intervention will enable an exploration of their perceptions of what worked and what did not work in the intervention, providing insight into the 'black box' of intervention effectiveness. In combination with the measure of fidelity taken during the implementation phase, these qualitative interviews comprise a process evaluation of the intervention, addressing recommendations to monitor intervention delivery and receipt by participants [11, 20]. The data will be analysed using a framework approach: familiarisation with the data, identification of a thematic framework, indexing, charting, and finally, mapping and interpretation with reference to the overall aim of TRiPLaB as well as the themes revealed by the data.
TRiPLaB is a theory-informed, systematic, mixed methods approach to developing and evaluating tailored implementation strategies aimed at increasing the translation of research findings into clinical and service practice. TRiPLaB aims to play a part in improving the quality of health services for the people of Leeds and Bradford. By working alongside multiple healthcare organisations in a series of longitudinal case studies, the TRiPLaB programme will develop a richer understanding of key issues influencing the adoption of innovations in the NHS and the promotion of quality improvement in routine practice.
This article presents independent research funded by the National Institute for Health Research (NIHR) through the Leeds York Bradford Collaboration for Leadership in Applied Health Research and Care. The views expressed in this publication are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health.
- Tooke JC: Report of the High Level Group on Clinical Effectiveness. A report to Sir Liam Donaldson, Chief Medical Officer. 2007, London: Department of Health
- Lomas J: Retailing research: increasing the role of evidence in clinical services for childbirth. Milbank Quarterly. 1993, 71: 439-475. 10.2307/3350410
- Centre for Reviews and Dissemination: Getting evidence into practice. Effective Health Care. 1999, 5: 1
- Ferlie EB, Shortell SM: Improving the quality of health care in the United Kingdom and the United States: a framework for change. Milbank Quarterly. 2001, 79: 281-315. 10.1111/1468-0009.00206
- Greenhalgh T, Robert G, MacFarlane F, Bate P, Kyriakidou O: Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Quarterly. 2004, 82: 581-629. 10.1111/j.0887-378X.2004.00325.x
- Grimshaw JM, Thomas RE, MacLennan G, Fraser C, Ramsay CR, Vale L, Whitty P, Eccles MP, Matowe L, Shirran L, Wensing M, Dijkstra R, Donaldson C: Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technology Assessment. 2004, 8 (6): iii-iv, 1-72
- Grol R, Grimshaw J: From best evidence to best practice: effective implementation of change in patients' care. Lancet. 2003, 362: 1225-1230. 10.1016/S0140-6736(03)14546-1
- Oxman AD, Thomson MA, Davis DA, Haynes RB: No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ. 1995, 153: 1423-1431
- Michie S, Abraham C: Interventions to change health behaviours: evidence based or evidence inspired? Psychology and Health. 2004, 19: 29-49. 10.1080/0887044031000141199
- Rogers EM: Diffusion of Innovations. 2003, New York, NY; London: Free Press, 5th edition
- Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Pettigrew M: Developing and evaluating complex interventions: new guidance. 2008, London: Medical Research Council
- Anderson NR, West MA: Measuring climate for work group innovation: development and validation of the team climate inventory. Journal of Organizational Behaviour. 1998, 19: 235-258. 10.1002/(SICI)1099-1379(199805)19:3<235::AID-JOB837>3.0.CO;2-C
- Baker R, Camosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, Robertson N: Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews. 2010, 3: CD005470. 10.1002/14651858.CD005470
- Mason J, Freemantle N, Nazareth I, Eccles M, Haines A, Drummond M: When is it cost effective to change the behaviour of health professionals? JAMA. 2001, 286: 2988-2992. 10.1001/jama.286.23.2988
- Glasziou P, Heneghan C: A spotters guide to study designs. Evidence Based Medicine. 2009, 14: 37-38. 10.1136/ebm.14.2.37-a
- Tan SS, Rutten FF, van Ineveld BM, Redekop WK, Hakkaart-van Roijen L: Comparing methodologies for the cost estimation of hospital services. European Journal of Health Economics. 2009, 10: 39-45. 10.1007/s10198-008-0101-x
- Cook TD, Campbell DT: Quasi-Experimentation: Design and Analysis Issues for Field Settings. 1979, Chicago: Rand McNally
- Ramsey CR, Matowe L, Grilli R, Grimshaw JM, Thomas RE: Interrupted time series designs in health technology assessment: lessons from two systematic reviews of behaviour change strategies. International Journal of Technology Assessment in Health Care. 2003, 19: 613-623
- Hulscher MEJL, Laurant MGH, Grol RPTM: Process evaluation on quality improvement interventions. Quality and Safety in Health Care. 2003, 12: 40-46. 10.1136/qhc.12.1.40
- Hardeman W, Michie S, Fanshawe T, Prevost AT, McLoughlin K, Kinmouth AL: Fidelity and delivery of a physical activity intervention: predictors and consequences. Psychology and Health. 2008, 23: 11-24. 10.1080/08870440701615948
- Pope C, Ziebland S, Mays N: Qualitative research in health care: analysing qualitative data. British Medical Journal. 2000, 320: 114-120. 10.1136/bmj.320.7227.114
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.