
Table 5 Overview of SPIRIT’s final essential elements: their scoring, how they were monitored and which of the intervention components they applied to

From: Figuring out fidelity: a worked example of the methods used to identify, critique and revise the essential elements of a contextualised intervention in health policy agencies

| Final essential elements^a | Final scoring of essential element | Activity that provided data for scoring | Intervention components to which essential elements apply (Audit & feedback / Leaders forums / Symposia / Research exchanges) |
| --- | --- | --- | --- |
| 1. Provider had expertise and credentials in the topic/field appropriate to the session | Yes / No | Review of publicly available biographical information and, for no. 1, participant feedback form item | |
| 2. Provider had experience in presenting to policy/program developers | Yes / No | | |
| **Engagement and facilitation: the methods used to deliver the presentation and encourage participation** | | | |
| 3. Non-didactic presentation strategies were used | Extensive / Moderate / Limited / Not at all | Direct observation of session delivery | |
| 4. Content was delivered in an engaging manner | Yes / No | Participant feedback form item | |
| 5. The provider encouraged participants to contribute to the session (ask questions, make comments, provide examples, participate in discussion) | Extensive / Moderate / Limited / Not at all | Direct observation of session delivery | |
| 6. The provider encouraged participants to discuss how information/learning from the session might be applied in their setting | Extensive / Moderate / Limited / Not at all | Direct observation of session delivery | |
| 7. Provider showed respect for participants’ contributions and work | Extensive / Moderate / Limited / Not at all | Direct observation of session delivery | |
| 8. Provider demonstrated sensitivity to the ‘real world’ of the agency’s policy/program work | Extensive / Moderate / Limited / Not at all | Direct observation and participant feedback form item | |
| **Content: key topics, messages, activities and resources** | | | |
| 9. Core content outlined in the session plan was delivered | Aggregated rating across all items specified in session plan: Wholly / Mostly / About half / Limited / Not at all | Direct observation and multiple participant feedback form items | |
| 10. The session content was relevant to the agency’s work | Yes / No | Participant feedback form item | |
| 11. Where specified in the session plan, provider identified or provided resources that supported or extended learning from the session | Yes / Partially / No / N/A (not specified in plan) | Direct observation of session delivery | |
| 12. The value of using research/evaluation in agency work was conveyed | Yes / No | Participant feedback form item | |
| 13. Synthesised data from measures was provided and discussed | Yes / No | Direct observation of session delivery | |
| 14. Opportunities to improve use of research were identified | Extensive / Moderate / Limited / Not at all | Direct observation of session delivery | |
| **Participation: characteristics of attendees’ interaction and contribution to the session** | | | |
| 15. Targeted agency staff attended | Numbers and roles of all attendees; approximate proportion of those targeted | Direct observation and review of data from session ‘sign-in sheet’ | |
| 16. A leader (e.g. CEO, member of executive) introduced the session or contributed to it positively in other ways | Yes / No | Direct observation of session delivery | |
| 17. Participants contributed to the session (asked questions, made comments, participated in discussion) | All / ~¾ / ~½ / ~¼ / Few / None | Direct observation of session delivery | |
| 18. Participant contributions included knowledge/examples from their own experience | Extensive / Moderate / Limited / Not at all | Direct observation of session delivery | |
| 19. Discussion included how information/learning from the session might be applied in their setting | Extensive / Moderate / Limited / Not at all | Direct observation of session delivery | |
| 20. Participants identified one or more agency research-related areas that could benefit from improvement | Yes / No | Direct observation of session delivery | |
^a Essential elements are one type of fidelity criteria. Other fidelity measures concerning frequency, duration, coverage, etc., plus participants’ perspectives, were collected for each session but are not shown in this table.
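For readers who want to tabulate these ratings systematically, the minimal Python sketch below shows one way the rating scales and per-session records in Table 5 could be represented, and how a proportion of delivered content items might be mapped onto the five-point scale used for element 9. It is not part of the original paper: the class names and, in particular, the numeric cut-points are illustrative assumptions, not thresholds specified by the authors.

```python
from dataclasses import dataclass
from enum import Enum


class Extent(Enum):
    """Four-point scale used by several essential elements (e.g. nos. 3, 5-8, 14)."""
    EXTENSIVE = "Extensive"
    MODERATE = "Moderate"
    LIMITED = "Limited"
    NOT_AT_ALL = "Not at all"


class Delivery(Enum):
    """Aggregated rating scale for element 9 (core content delivered)."""
    WHOLLY = "Wholly"
    MOSTLY = "Mostly"
    ABOUT_HALF = "About half"
    LIMITED = "Limited"
    NOT_AT_ALL = "Not at all"


@dataclass
class ElementScore:
    """One essential-element rating for one intervention session."""
    element_no: int   # 1-20, as numbered in Table 5
    rating: str       # e.g. "Yes", Extent.MODERATE.value, Delivery.MOSTLY.value
    data_source: str  # e.g. "Direct observation of session delivery"


def aggregate_core_content(items_delivered: int, items_planned: int) -> Delivery:
    """Map the share of planned content items that were observed to be delivered
    onto the five-point scale for element 9. Cut-points are assumptions made
    for this sketch only."""
    share = items_delivered / items_planned
    if share >= 0.95:
        return Delivery.WHOLLY
    if share >= 0.75:
        return Delivery.MOSTLY
    if share >= 0.4:
        return Delivery.ABOUT_HALF
    if share > 0:
        return Delivery.LIMITED
    return Delivery.NOT_AT_ALL


# Example: 7 of 9 planned content items observed -> "Mostly"
score = ElementScore(
    element_no=9,
    rating=aggregate_core_content(7, 9).value,
    data_source="Direct observation and multiple participant feedback form items",
)
print(score)
```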