Table 3 Precision and rater agreement

From: Identifying quality improvement intervention publications - A comparison of electronic search strategies

Search strategy: 'quality' AND 'improv*' AND 'intervention*' (PubMed, selected journals)

Total yield: 183 publications

| Publications | Precision, n (% relevant publications), N = 183 | Total inter-rater agreement on relevance | Kappa (95% CI) |
|---|---|---|---|
| Rated as relevant for quality improvement by at least 1 rater | 122 (67%) | --- | --- |
| Rated as relevant for quality improvement by both raters | 99 (54%) | 87% | 0.74 (0.64, 0.84) |
| Rated as reporting on effects of a quality improvement intervention by at least 1 rater | 74 (40%) | --- | --- |
| Rated as reporting on effects of a quality improvement intervention by both raters | 50 (27%) | 90% | 0.77 (0.67, 0.87) |
| QI publications rated MRC definitive study by at least 1 rater | 35 (19%) | --- | --- |
| QI publications rated MRC definitive study by both raters | 25 (14%) | 92% | 0.78 (0.65, 0.91) |

\* denotes truncation; CI: confidence interval
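The agreement statistics above follow the standard 2×2 setup for two raters: observed agreement is the share of publications both raters classified the same way, and Cohen's kappa discounts the agreement expected by chance. A minimal sketch of that computation is below, using counts reconstructable from the first two table rows (99 rated relevant by both, 122 by at least one, so 23 by exactly one out of N = 183); how those 23 split between the raters is not reported, so the 12/11 split here is an illustrative assumption.

```python
def cohens_kappa(a: int, b: int, c: int, d: int) -> tuple[float, float]:
    """Observed agreement and Cohen's kappa for a 2x2 table.

    a = both raters say relevant, b = only rater 1 does,
    c = only rater 2 does,      d = neither does.
    """
    n = a + b + c + d
    p_o = (a + d) / n                          # observed agreement
    p_yes = ((a + b) / n) * ((a + c) / n)      # chance agreement on "relevant"
    p_no = ((c + d) / n) * ((b + d) / n)       # chance agreement on "not relevant"
    p_e = p_yes + p_no                         # total chance agreement
    return p_o, (p_o - p_e) / (1 - p_e)

# Counts from the "relevant for quality improvement" rows (N = 183);
# the 12/11 split of the 23 single-rater calls is assumed.
p_o, kappa = cohens_kappa(a=99, b=12, c=11, d=61)
print(f"observed agreement = {p_o:.0%}, kappa = {kappa:.2f}")
# → observed agreement = 87%, kappa = 0.74 (consistent with the table row)
```

Under this assumed split, both the 87% agreement and the kappa of 0.74 in the table are reproduced, which illustrates why the table reports both figures: raw agreement alone overstates reliability when the marginal "relevant" rates are high.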