
Table 8 Where should intervention designers and evaluators direct their efforts and resources?

From: Explaining variable effects of an adaptable implementation package to promote evidence-based practice in primary care: a longitudinal process evaluation

Stage: Selecting indicators

Lessons:
- Consider fit with professional values, patient benefit and practice goals so that there is a clear understanding that something needs to be done differently and that improvement is possible
- Consider the workload of reviewing patients close to targets (e.g. the impact of stringent targets on patient preferences and rapport) and how this fits with achievement
- Ensure outcome measures are sensitive to efforts to improve achievement, so that teams can learn from working to achieve change
- Limit the number of indicators and specify clear corrective actions or behaviours that will have an impact on achievement
- Make visible individual contributions towards changing team-based behaviours, and enable individuals to be accountable to themselves and their team

Stage: When developing intervention components

Audit and feedback:
- Identify a named lead to coordinate the overall plan and individual actions
- Facilitate reach to those who are able to act to improve performance, and suggest that feedback is made visible in the practice and at practice meetings
- Make the relevance to non-clinicians clear
- Focus on feedback for learning in addition to feedback on performance (i.e. what could be done differently, as well as the gap between actual and desired performance, to support underachievers)
- Frame behaviour to showcase the benefit of additional or modified ways of working (e.g. reducing unwanted actions such as risky prescribing, or unwanted outcomes such as strokes, rather than increasing desired behaviours such as prescription of anticoagulation)
- Action plans that suggest specific and feasible actions could minimise cognitive load and overcome habitual patterns of working
- Consider the reporting timeframe in relation to the work to be undertaken: estimate the time required for actions on action plans and time feedback accordingly
- Repeated negative feedback may be dispiriting and decrease ownership

Educational outreach:
- Provide time to review audit feedback and conduct patient-identifiable searches before meeting face-to-face to explore barriers further and set goals
- Enrol all potentially relevant staff (e.g. administrative, managerial and clinical) as early as possible to create a sense of ownership and maximise time for improvement
- Create an open discussion of problems, how individuals work and ways to overcome challenges
- Ensure that the facilitator is seen as credible

Reminders:
- Patient-identifiable searches may reduce burden and enable practices to develop continuous feedback loops to track and maintain improvements
- Ensure that searches and computerised prompts can be easily adapted to focus on practice targets for achievement
- Computerised prompts may be applicable to both clinical and administrative staff involved in repeat prescribing

Stage: When delivering interventions

Lessons:
- Establish commitment and rapport, and mobilise resources (e.g. time commitment, access to identifiable audit data), prior to intervention delivery to increase awareness of the intervention package
- Identify a practice lead who can empower participation and manage competing priorities
- Establish a team including management, clinicians and administrators to reinforce collective action
- Encourage rapid action on intermediate processes and outcomes to make progress visible and increase internal motivation to improve continuously
- Consider opportunities for social exchange of success stories about what others are doing

Stage: When evaluating implementation components

Lessons:
- Enable interactive communication between intervention developers and practices to support tailoring and adaptation of interventions to context
- Pilot test delivery, receipt and engagement as informed by NPT and TDF constructs before evaluating at scale