Table 3 Delphi survey to confirm desirable GItool features

From: A framework of the desirable features of guideline implementation tools (GItools): Delphi survey and assessment of GItools

| GItool feature | Round #1, n (%) | Round #2, n (%) |
|---|---|---|
| 1. Tool objectives are stated | 28 (90.3) | --- |
| 2. Target users of tool are identified | 27 (87.1) | --- |
| 3. Methods used to develop the tool are clearly described | 21 (67.7) | --- |
| 4. Instructions are provided on how to use the tool | 28 (90.3) | --- |
| 5. Conflicts of interest of those involved in tool development are disclosed | 19 (61.3) | 19 (63.3) |
| 6. Target users informed tool content and format (survey, interview, focus group, committee) | 22 (71.0) | --- |
| 7. Experts in tool content, and in instrument development and design, were involved in tool development | 19 (61.3) | 13 (43.3) |
| 8. A comprehensive literature review was undertaken to inform and assemble tool content | 23 (74.2) | --- |
| 9. Sources are cited for evidence upon which tool content is based | 23 (74.2) | --- |
| 10. Quantity and quality of evidence upon which tool content is based is described | 20 (64.5) | --- |
| 11. The tool was pilot-tested with users and refined based on their feedback prior to implementation | 22 (71.0) | --- |
| 12. Pilot-testing was rigorous (appropriate sampling, methods) | 17 (54.8) | 15 (50.0) |
| 13. A description is included of how the tool was evaluated | 20 (64.5) | --- |
| 14. Tool effectiveness was assessed by full-scale evaluation of impact on clinicians and/or patients | 15 (48.4) | 8 (26.7) |
| 15. Full-scale evaluation was rigorous (appropriate sampling, methods) | 15 (48.4) | 12 (40.0) |
| 16. User feedback about tool use and impact is prospectively collected | 15 (48.4) | 20 (66.7) |
| 17. The type of tool (domain, subdomain) is specified | --- | 16 (53.3) |
| 18. The theoretical basis or rationale for the tool is described | --- | 11 (36.7) |
| 19. Electronic versions are available for computer or mobile device application | --- | 13 (43.3) |
| 20. Experts in the context/setting in which the tool will be used were involved in development | --- | 18 (60.0) |
| 21. Details of the context/setting in which the tool was developed/will be used are described | --- | 20 (66.7) |
| 22. Success factors/learning based on tool use or evaluation are described | --- | 14 (46.7) |
| 23. Impact was assessed with rapid-cycle testing (i.e., PDSA, as distinct from full-scale research) | --- | 7 (23.3) |
| 24. The meaning of full-scale evaluation results is interpreted based on an implementation threshold | --- | 6 (20.0) |