- Short report
The development and application of audit criteria for assessing knowledge exchange plans in health research grant applications
Implementation Science volume 9, Article number: 93 (2014)
Research funders expect evidence of end user engagement and impact plans in research proposals. Drawing upon existing frameworks, we developed audit criteria to help researchers and their institutions assess the knowledge exchange plans of health research proposals.
Criteria clustered around five themes: problem definition; involvement of research users; public and patient engagement; dissemination and implementation; and planning, management and evaluation of knowledge exchange. We applied these to a sample of grant applications from one research institution in the United Kingdom to demonstrate feasibility.
Our criteria may be useful as a tool for researcher self-assessment and for research institutions to assess the quality of knowledge exchange plans and identify areas for systematic improvement.
The recognised gap between healthcare research and practice has led research funders, amongst other initiatives, to introduce explicit expectations that applications detail the expected impact of research and demonstrate how it will be achieved [–]. Similarly, the UK’s Research Excellence Framework now includes a retrospective evaluation of the economic, societal and cultural ‘impact’ of university-based research as well as its scientific quality []. A number of resources exist to help researchers think about how best to increase the impact of their research and write knowledge exchange plans [–]. Noting that Tetroe et al. identified up to 29 terms used for ‘knowledge transfer’ [], we use the term ‘knowledge exchange’ to describe the multidirectional, dynamic and iterative nature of translating research-based knowledge into policy and practice []. Knowledge exchange encompasses a number of activities, such as dissemination (i.e., sharing of research findings), collaboration and consultancy, which ideally result in a range of impacts.
Resources are available to help reviewers assess knowledge exchange plans []. Despite the presence of such guidance and funders’ insistence that research proposals include explicit knowledge exchange and impact plans, a general standard for assessing those plans is not readily available. Funders provide guidelines to reviewers, which vary significantly across funders and funding programmes. Some of these guidelines are freely available whilst others are only made available to reviewers directly when asked to review a proposal. It can therefore be difficult for researchers to know whether reviewers are likely to judge their knowledge exchange plans as suitable. Furthermore, it is rare for researchers to receive feedback on this aspect of their proposal. This is unsurprising since these plans currently form a relatively limited part of the assessment process. Indeed, if researchers have followed recent advice to embed knowledge exchange principles and mechanisms throughout the entire lifecycle of a research project [,], it may be relatively difficult for reviewers to directly comment on this aspect of a proposal. This situation offers little scope for researchers to learn about how knowledge exchange can be better incorporated into the research process. There is a risk that researchers will come to see knowledge exchange and impact plans as a ‘tick-box’ exercise rather than considering how these could genuinely improve the entire research project.
Having been involved in advising a number of colleagues in our own institution on how to enhance knowledge exchange plans, we began to consider how to change researcher behaviour. We therefore took an approach based upon the principles of audit and feedback []. We developed criteria for assessing knowledge exchange plans within research proposals that could be applied to a sample of grant proposals and fed back to researchers. We demonstrate the feasibility of criteria development and assessment before discussing their potential to help researchers to improve their knowledge exchange plans.
We aimed to develop assessment criteria that drew upon existing conceptual frameworks, were underpinned by a sound rationale, and could potentially be measured from a review of written grant proposals. We extracted candidate knowledge exchange principles and recommendations from a review and synthesis of knowledge exchange frameworks, supplemented by existing guidance issued by UK research councils [-,]. Following iterative development, including feedback from academic colleagues, we established a set of 19 criteria for assessing knowledge exchange plans grouped under five thematic headings (Table 1). The five themes cover: problem definition; involvement of research users; public and patient engagement; dissemination and implementation; and planning, management and evaluation of knowledge exchange.
Application of the assessment criteria
We applied the criteria in an audit of applied health research proposals submitted from our own institution. We designed each criterion so that it could be rated as ‘met’ (scoring ‘1’) or ‘not met’ (scoring ‘0’) from reviewing grant proposals. We also anticipated that judgements upon whether or not each criterion was met would depend upon an assessment of the entire proposal as opposed to only ‘dissemination plans’ or equivalent. We took this approach because we expected evidence of knowledge exchange to be embedded throughout proposals (e.g., ‘problem definition’ in introductory sections). Three project team members (AIR, AR and RF) piloted the criteria by independently assessing three proposals, comparing assessments, and then clarifying criteria where necessary.
We screened the titles of grant proposals recorded by the Faculty of Medicine and Health, University of Leeds, which were submitted between May 2011 and May 2012. We selected 102 with a likely focus on applied health research. We included pending, successful and unsuccessful proposals because we sought a representative range of applications. We subsequently identified 25 full proposals led by academics in our institution. The majority of these (20) were submitted to various National Institute for Health Research (NIHR) programmes, three to UK research councils, and two to other funders. We obtained permission from all lead applicants to review their grant applications in full. One project team member (AR) then applied the criteria to each application.
We calculated mean scores for each criterion and also for each theme across the 25 proposals (Figure 1). Proposals scored highest in problem definition (0.87 out of a maximum of 1), followed by public and patient engagement (0.68), dissemination and implementation (0.63), and involvement of users (0.57), and lowest in planning, management and evaluation of knowledge exchange activities (0.18).
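The binary rating and mean-score calculation described above can be sketched in code. The criteria names and ratings below are hypothetical, illustrative stand-ins, not the actual audit data: each proposal is rated ‘met’ (1) or ‘not met’ (0) against each criterion, and a theme score is the mean of its criteria across all rated proposals.

```python
# Minimal sketch of the audit scoring, assuming hypothetical criteria and
# ratings. Criteria are grouped into themes; each proposal receives a 0/1
# rating per criterion; theme scores average those ratings across proposals.

themes = {
    "problem definition": [
        "problem stated", "significance stated", "identification described",
    ],
    "planning, management and evaluation": [
        "timing stated", "uptake monitoring described",
    ],
}

# Hypothetical ratings: one dict per proposal, criterion -> 0 or 1.
ratings = [
    {"problem stated": 1, "significance stated": 1,
     "identification described": 0, "timing stated": 0,
     "uptake monitoring described": 0},
    {"problem stated": 1, "significance stated": 0,
     "identification described": 1, "timing stated": 1,
     "uptake monitoring described": 0},
]

def theme_means(ratings, themes):
    """Mean score per theme across all rated proposals."""
    return {
        theme: sum(r[c] for r in ratings for c in criteria)
               / (len(ratings) * len(criteria))
        for theme, criteria in themes.items()
    }

for theme, score in theme_means(ratings, themes).items():
    print(f"{theme}: {score:.2f}")
```

The same averaging applied per criterion (rather than per theme) yields the proportion of proposals meeting each individual criterion.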
Amongst individual criteria, the three most frequently met were: ‘problem addressed by this proposal and its significance to the health service or health is stated’ (24 of 25 proposals); ‘specific users of research are identified’ (24 proposals); and ‘statement about how the problem has been identified’ (23 proposals). The three least frequently met criteria were: ‘ways in which the uptake of research findings can be monitored’ (no proposals); ‘timing and order of knowledge exchange activities is stated’ (3 proposals); and ‘applicants’ previous experience of undertaking knowledge exchange activities is described’ (5 proposals). A mean of 11.2 criteria out of a maximum of 19 were met across the proposals (range 5.8 to 16.3). Table 1 also presents part-anonymised text extracts from proposals that allowed a criterion to be judged as met (with the addition of a fictional example for one criterion met by no applications).
It is feasible to develop and apply audit criteria for assessing knowledge exchange plans within research proposals. We suggest they can be used by individual researchers and teams for self-assessment, or by grant-seeking institutions to identify common strengths and weaknesses and hence guide staff development. Our modest analysis of one institution suggests some key challenges that others are likely to face, especially around identifying resources and methods to monitor the longer term impact of research.
Developing meaningful and feasible criteria posed three main challenges. First, we aimed to develop criteria that would offer researchers enough detail to guide improvement of knowledge exchange plans whilst avoiding over-specification. We found it helpful to organise the emerging criteria into five themes that both followed the flow of a proposal and strongly related to the knowledge exchange process [,]. The themes helped to contextualise the criteria and safeguarded against missing aspects of the knowledge exchange process. Second, there is a risk that rather than encouraging a longitudinal view of the knowledge exchange process, our audit criteria may promote a tokenistic ‘box-ticking’ approach by applicants [,], especially if their institutions use measurement as a feature of performance management []. Any audit instrument is prone to the same misuse and degrees of self-deception. Nevertheless, in principle, developing and stating a plan for knowledge exchange is more likely to result in action than not making a plan []. Third, we were aware of the need to capture knowledge exchange plans aimed at a range of different research ‘users’. The Canadian Institutes of Health Research (CIHR), for instance, explains that ‘A knowledge user can be, but is not limited to, a practitioner, a policy maker, an educator, a decision maker, a health care administrator, a community leader or an individual in a health charity, patient group, private sector organization or media outlet’ []. The UK NIHR states that ‘the term user refers to patients, their carers and family members, as well as to members of the public and representatives from patient and charitable organisations’ []. We therefore distinguished between immediate users of research findings (e.g., clinicians, commissioners) and longer-term beneficiaries (e.g., patients).
Applying the tool to research proposals also posed a number of challenges. First, funders have adopted different concepts of knowledge exchange and impact, and use different terminology []. We suggest that our criteria are sufficiently generic to be transferable beyond the funding applications we assessed from one UK institution. Second, proposal forms differ substantially across different funders and programmes, making it necessary for assessors to read entire proposals to capture the full extent of knowledge exchange plans. Third, some criteria within the planning, management and evaluation of knowledge exchange theme scored poorly; e.g., none of the 25 proposals included a statement about the monitoring of the uptake of research findings. This may reflect both the absence of explicit guidance by funders and limited experience and skills amongst researchers. Fourth, researchers and institutions will inevitably raise the question of whether stronger knowledge exchange plans actually enhance the chances of grant success. We did not examine associations with success, partly because of the small number of applications reviewed but mainly because this was not the key aim. Whilst demonstrating stronger knowledge exchange may have variable impacts upon the likelihood of success, we suggest that the fundamental issue concerns how to maximise the chances of relevant impact during and following research projects.
In summary, research funders and institutions are increasingly interested in demonstrating impact. Researchers are therefore expected to present clear knowledge exchange plans, ideally embedded throughout the whole research cycle. We suggest that our criteria are useful for researcher self-assessment of individual applications and as an audit tool for research institutions to identify areas for improvement. Given the limited, exploratory nature of this work, we welcome further suggestions and debate around how to enhance the validity and relevance of such audit criteria.
National Institute for Health Research
National Institute for Health and Care Excellence
National Health Service
Tetroe JM, Graham ID, Foy R, Robinson N, Eccles MP, Wensing M, Durieux P, Legare F, Nielson CP, Adily A, Ward JE, Porter C, Shea B, Grimshaw JM: Health research funding agencies’ support and promotion of knowledge translation: an international study. Milbank Q. 2008, 86: 125-155. 10.1111/j.1468-0009.2007.00515.x.
Goering P, Ross S, Jacobson N, Butterill D: Developing a guide to support the knowledge translation component of the grant application process. Evid Policy J Res Debate Pract. 2010, 6: 91-102. 10.1332/174426410X483024.
Holmes B, Scarrow G, Schellenberg M: Translating evidence into practice: the role of health research funders. Implement Sci. 2012, 7: 39-10.1186/1748-5908-7-39.
Smith S, Ward V, House A: “Impact” in the proposals for the UK’s research excellence framework: shifting the boundaries of academic autonomy. Res Policy. 2011, 40: 1369-1379. 10.1016/j.respol.2011.05.026.
Ward V, Smith S, Foy R, House A, Hamer S: Planning for knowledge translation: a researcher’s guide. Evid Policy. 2010, 6: 527-541. 10.1332/174426410X535882.
Briefing notes for researchers: involving the public in NHS, public health and social care research. [http://www.invo.org.uk/wp-content/uploads/2012/04/INVOLVEBriefingNotesApr2012.pdf]
RCUK impact requirements. [http://www.rcuk.ac.uk/RCUK-prod/assets/documents/impacts/RCUKImpactFAQ.pdf]
Typology of research impact. [http://www.rcuk.ac.uk/RCUK-prod/assets/documents/impacts/TypologyofResearchImpacts.pdf]
Ward V, Smith S, House A, Hamer S: Exploring knowledge exchange: a useful framework for practice and policy. Soc Sci Med. 2012, 74: 297-304. 10.1016/j.socscimed.2011.09.021.
Foy R, Eccles MP: Audit and Feedback Interventions. Knowledge Translation in Healthcare: Moving From Evidence To Practice. Second edition. Edited by: Straus SE, Tetroe J, Graham ID. 2013, John Wiley & Sons, Ltd, Oxford, 183. 10.1002/9781118413555.ch16.
Ward V, House A, Hamer S: Developing a framework for transferring knowledge into action: a thematic analysis of the literature. J Health Serv Res Policy. 2009, 14: 156-164. 10.1258/jhsrp.2009.008120.
Solberg L, Mosser G, McDonald S: The three faces of performance measurement: improvement, accountability, and research. J Qual Improv. 1997, 23: 135-147.
Gollwitzer PM: Implementation intentions: strong effects of simple plans. Am Psychol. 1999, 54: 493-503. 10.1037/0003-066X.54.7.493.
Guide to knowledge translation planning at CIHR: integrated and end-of-grant approaches. [http://www.cihr-irsc.gc.ca/e/documents/kt_lm_ktplan-en.pdf]
Patient and public involvement - getting going as a research reviewer. [http://www.ccf.nihr.ac.uk/PPI/Documents/PPI%20A5%20booklet%20low%20res%20FINAL.pdf]
We would like to thank all authors of research proposals who gave their permission to include their proposal in this audit; the Faculty research support team for the retrieval of proposals; Clare Skinner for her support in the permissions process and our colleagues Paul Baxter and Maureen Twiddy for their feedback during the criteria development stage.
The authors declare that they have no competing interests.
All authors contributed to the criteria development. AIR, AR and RF piloted the criteria, and AIR coordinated author permissions and retrieved and assessed all applications using the criteria. AIR, VW and RF drafted the manuscript and table, and AIR produced the figure. RF conceived of the study. All authors read and approved the final manuscript.
Cite this article
Ruppertsberg, A.I., Ward, V., Ridout, A. et al. The development and application of audit criteria for assessing knowledge exchange plans in health research grant applications. Implementation Sci 9, 93 (2014). https://doi.org/10.1186/s13012-014-0093-0
- Knowledge exchange
- Impact plans
- Research proposals