Table 3 How we answered the three questions for assessing essential elements during the intervention period

From: Figuring out fidelity: a worked example of the methods used to identify, critique and revise the essential elements of a contextualised intervention in health policy agencies

Questions used to critique essential elements:

1. When implemented in these contexts, does this provisional/likely essential element realise the change principle(s) that informed its development?
2. Is this essential element critical for achieving the session goals? Does anything else appear to be?
3. Does this essential element function across all subcomponents and all six trial intervention settings?

| Data sources | Data examples | Data analysis / use |
| --- | --- | --- |
| Implementation checklist completed during the delivery of each session | Codes showing whether (or to what extent) each essential element was delivered as intended | Collation of codes by session and by agency |
| Fieldnotes made during observation of each session | Description of how the essential elements appeared to work or not (e.g. how participants reacted), how they were delivered, any adaptations that took place, and any factors that appeared to affect how the intervention was delivered or how people engaged with and responded to it | Data were coded thematically using the constant comparative method. In each session we examined the alignment between (1) what was delivered (including any modifications), (2) any observed process effects, and (3) the change principles that informed what was intended, and compared this across all agencies |
| Participant feedback forms collected at the end of each session | How participants assessed delivery against quality criteria such as content relevance, provider credibility and learning outcomes; and their advice for improvements | Descriptive analysis of quantitative data (frequencies, averages and comparisons) |
| Transcripts of semi-structured interviews with purposively sampled participants from two phases of interviewing: early in the intervention period and after it | Participant perceptions of the strategies used to effect change: the extent to which they worked, and how modifying factors such as work practices, organisational goals and beliefs about research shaped process effects | Managed using Framework Analysis. Data were synthesised into categories identified both inductively from early interviews and a priori, based on intervention outcomes and a review of the research utilisation literature |
| Fieldnotes documenting informal conversations with participants following sessions | As above, but ad hoc and generally very brief | Data were collated in running memos and, where appropriate, coded thematically using the constant comparative method |
| Memos documenting conversations with intervention implementers and providers | Implementers' views on discrepancies between what was intended and what was delivered; providers' accounts of why they 'went off script' | As above |
| Memos documenting consultations with the intervention designers | How the designers envisaged the change principles manifesting in intervention sessions | As above |
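The quantitative steps in the right-hand column (collating checklist fidelity codes by session and agency; computing frequencies and averages) lend themselves to a short script. Below is a minimal sketch in Python/pandas; the column names (`agency`, `session`, `element`, `delivered`) and the 0–2 fidelity coding are assumptions for illustration, not the authors' actual data structure or analysis.

```python
# Hypothetical sketch of collating implementation-checklist codes
# by session and agency (illustrative only; names and coding assumed).
import pandas as pd

# Each row: one essential element observed in one session at one agency.
# 'delivered' codes the fidelity judgement: 2 = as intended,
# 1 = partially / adapted, 0 = not delivered.
checklist = pd.DataFrame({
    "agency":    ["A", "A", "B", "B", "C", "C"],
    "session":   [1, 2, 1, 2, 1, 2],
    "element":   ["e1", "e1", "e1", "e1", "e2", "e2"],
    "delivered": [2, 1, 2, 2, 0, 1],
})

# Collate codes by session and by agency
# (rows: agency, columns: session, values: mean fidelity code).
by_agency_session = checklist.pivot_table(
    index="agency", columns="session",
    values="delivered", aggfunc="mean",
)
print(by_agency_session)

# Descriptive summary per essential element across all agencies:
# how often it was coded and its average fidelity score.
summary = checklist.groupby("element")["delivered"].agg(["count", "mean"])
print(summary)
```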