Designing clinical practice feedback reports: three steps illustrated in Veterans Health Affairs long-term care facilities and programs

Abstract

Background

User-centered design (UCD) methods are well-established techniques for creating useful artifacts, but few studies illustrate their application to clinical feedback reports. When used as an implementation strategy, the content of feedback reports depends on a foundational audit process involving performance measures and data, but these important relationships have not been adequately described. Better guidance on UCD methods for designing feedback reports is needed. Our objective is to describe a feedback report design method for refining the content of prototype reports.

Methods

We propose a three-step feedback report design method (refinement of measures, data, and display). The three steps follow dependencies such that refinement of measures can require changes to data, which in turn may require changes to the display. We believe this method can be used effectively with a broad range of UCD techniques.

Results

We illustrate the three-step method as used in implementation of goals of care conversations in long-term care settings in the U.S. Veterans Health Administration. Using iterative usability testing, feedback report content evolved over cycles of the three steps. Following the steps in the proposed method through 12 iterations with 13 participants, we improved the usability of the feedback reports.

Conclusions

UCD methods can improve feedback report content through an iterative process. When designing feedback reports, refining measures, data, and display may enable report designers to improve the user centeredness of feedback reports.

Background

Feedback interventions are widely used [1], but evidence suggests we have limited knowledge of how and when these interventions positively influence clinical practice [2]. Theory and expert consensus support the idea that report design may affect the influence of feedback interventions on clinical practice [3,4,5,6]. Best practice guidance about designing feedback reports and dashboards recommends testing them with users (i.e., the people who receive reports and use them to change practice) [1, 3, 7,8,9,10,11]. When used in audit and feedback as an implementation strategy [12], the content of feedback reports depends on a foundational audit process involving performance measures and data, but these important relationships have not been adequately described.

User-centered design (UCD) refers to various methods for developing and testing human-created products (i.e., artifacts). These methods can enable feedback report designers to recognize defects and improvement opportunities in both the form of the report (i.e., how information is displayed) and its content (i.e., what information is communicated). For example, UCD activities improved the design of feedback reports for home healthcare professionals regarding report colors (form) and regional performance comparisons (content) [13]. By repeatedly testing an artifact, a designer can further refine the design of a prototype until no significant problems are identified [14].

UCD methods have been applied to feedback reports in home healthcare [13] and primary care settings [8, 15]. Colquhoun et al. [13] incorporated paper prototyping, interviews, focus groups, cognitive interviewing, and think-aloud methods across two design phases to optimize an audit and feedback intervention for home healthcare providers. Brown et al. incorporated interviews, video-based content analysis, eye-tracking analysis, and questionnaires to improve the usability of an electronic audit and feedback system [8, 15]. These studies demonstrate the potential contribution of UCD methods to feedback report design and highlight a range of methods that might improve the influence of feedback on clinical practice.

Our objective is to propose a method for user-centered design of feedback reports that recognizes the relationships between refinements to audit (i.e., measurement) processes and feedback display. To illustrate the method, we describe a feedback report design process for a national initiative to implement goals of care conversations (GoCCs) for veterans in long-term care facilities and programs within the Veterans Health Administration [16, 17]. The purpose of using feedback reports is to influence healthcare professionals and teams to adopt new practices and to identify opportunities for performance improvement.

Methods

We describe three key steps in a UCD process for feedback reports. The UCD process begins with understanding the user, proceeds through the three refinement steps for a prototype report’s measures, data, and display, and is followed by observation of the use of the refined prototype (Fig. 1). Measures, also called metrics, key performance indicators, or quality measures, are standardized, generally numeric assessments of the structure, processes, or outcomes of care [18, 19]. For each measure, data items are specified that originate from various sources, such as manual chart abstraction, administrative and billing systems, or an electronic health record system. Measures and data are organized in a feedback report containing design elements [4], including content and form elements, which we refer to as the report’s display.

Fig. 1 Three refinement steps in a user-centered design process [14] for feedback reports: refine measures, data, and display
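
The relationships among measures, data items, and display described above can be made concrete with a small data model. The following sketch is purely illustrative: the class and field names are our own and do not correspond to any VA system or to the reporting software described later in this article.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DataItem:
    name: str    # e.g., "admission_date"
    source: str  # e.g., "chart abstraction", "billing system", "electronic health record"

@dataclass
class Measure:
    name: str                  # e.g., "GoCC documented within 7 days of admission"
    numerator_criteria: str    # who counts toward the numerator
    denominator_criteria: str  # inclusion/exclusion criteria for the denominator
    data_items: List[DataItem] = field(default_factory=list)

@dataclass
class Display:
    form: str                         # e.g., "bar chart", "table", "text"
    comparator: Optional[str] = None  # e.g., benchmark or goal; None if no comparator is shown
    time_interval: str = "quarter"    # granularity of the performance history

@dataclass
class FeedbackReport:
    measures: List[Measure]  # the audit content of the report
    display: Display         # how that content is presented to users
```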

Understanding the user

Designing a useful and appropriate feedback report requires the designer to understand the purpose of the report, the people who will use it, and their context. Users are the intended recipients of the reports, including both frontline healthcare professionals and quality improvement managers. The role of a user is different from that of other project stakeholders, such as policymakers or administrators who may direct a program and decide to initiate a feedback intervention, but who do not receive feedback about their clinical practice.

Understanding how users are influenced by a report, especially regarding emotional responses, is essential for refining a prototype [20]. Suggested design methods for understanding the user are described in Additional file 1.

Develop/refine prototype

Prototype feedback reports are sketches, drafts, or models of reports that typically contain artificial data and provide enough information for a user to understand the proposed form and content of the report. Prototypes can be co-created with users or developed without user involvement prior to testing. Refinement of prototypes can be based on user feedback and suggestions during co-creation or based on requirements that explicitly outline constraints for the report. For example, requirements can be expressed as statements such as “reports must be printable in black and white” or in the form of “user stories” that link a specific user role with a report characteristic and the purpose of the specific requirement (e.g., “As a [user role], I want [report characteristic] so that [purpose]”) [21, 22].

  1. Refine measures: Refining measures is a process of changing performance calculations. Performance calculations are sums, averages, or rates, in addition to other less common calculations such as distributions, standard deviations, or more complex functions [23]. Changes to these calculations can include adding or removing a measure. One common change is to the inclusion or exclusion criteria for a measure’s denominator (i.e., risk adjustment) to more accurately identify the patient population whose care can be improved. Refinement of a measure may require collection of additional reliability and validity evidence. (A minimal code sketch illustrating how a measure refinement can cascade to data and display follows this list.)

  2. Refine data: Refinement of data is a change to the data items and sources used in the calculation of performance. Data refinement may be required directly by a refined understanding of the user or by refinement of a measure (Fig. 1). Changes to measures may result in a need to add or remove data items or sources. Data quality requirements may also lead to the refinement of data sources. System and team-level changes, such as new personnel, EMR software updates, and clinical workflow redesign, can affect data quality, requiring the refinement of data. Variation in practice across clinics or facilities may also require refinement of data.

  3. Refine display: Refinement of a display is a change to the report’s content or form elements. Examples of display content include comparators (e.g., benchmarks, goals), time intervals shown in a performance history, and framing that relates performance to anticipated gains or losses. Form elements include charts, tables, and text. The appropriate display for a report can be impacted by refined measures and data, or directly by a refined understanding of the user (Fig. 1). For example, the addition of a measure requires changes to the report’s form and content to display new information. Refinements from understanding the user may relate to user preferences or ability to read a type of chart [24] or user expectations for meaningful comparators [25].
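
To make these three steps and their dependencies concrete, the sketch below (with invented field names and records, not the project’s code or data) computes a simple rate measure and shows how refining the denominator’s inclusion criteria is simultaneously a measure change and a data change, because it requires an additional data item; the display would then need to explain the narrower denominator to report users.

```python
from typing import Callable, Dict, List

Record = Dict[str, bool]  # one record per patient; field names are illustrative

records: List[Record] = [
    {"newly_admitted": True,  "gocc_documented": True,  "eligible_for_gocc": True},
    {"newly_admitted": True,  "gocc_documented": False, "eligible_for_gocc": True},
    {"newly_admitted": True,  "gocc_documented": False, "eligible_for_gocc": False},
    {"newly_admitted": False, "gocc_documented": False, "eligible_for_gocc": True},
]

def rate(rows: List[Record],
         in_denominator: Callable[[Record], bool],
         in_numerator: Callable[[Record], bool]) -> float:
    """Percentage of denominator records that also satisfy the numerator criteria."""
    denominator = [r for r in rows if in_denominator(r)]
    if not denominator:
        return 0.0
    numerator = [r for r in denominator if in_numerator(r)]
    return 100.0 * len(numerator) / len(denominator)

# Initial measure: all newly admitted patients form the denominator.
initial = rate(records,
               in_denominator=lambda r: r["newly_admitted"],
               in_numerator=lambda r: r["gocc_documented"])

# Refined measure: exclusion criteria narrow the denominator. The refinement
# requires a new data item ("eligible_for_gocc"), and the display must then
# communicate the narrower population to users.
refined = rate(records,
               in_denominator=lambda r: r["newly_admitted"] and r["eligible_for_gocc"],
               in_numerator=lambda r: r["gocc_documented"])

print(f"Initial measure: {initial:.0f}%   Refined measure: {refined:.0f}%")
```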

In early iterations of the prototype, “low-fidelity” [26] sketches can be created with contrasting features to yield insights into the usefulness of feedback report elements, including visual displays and text. To create sketches and generate ideas for reports, a collaborative design approach can be used by teams to produce and critique many ideas within a half-day design session [27]. A recommended activity for design of prototypes is to identify and specify the cognitive tasks involved in using the report [28, 29]. For feedback report prototypes, cognitive tasks include comparison of values between current performance and a comparator, and perception of a trend that shows a change in practice over time. Sub-optimal visualization of the information can mislead users or increase cognitive burden [30, 31]. For example, requiring users to do arithmetic or to process a large set of practice measures increases cognitive burden, which can result in users ignoring the report.

Observe user

Observing users as they attempt to understand and interpret a report can reveal its defects. Usability testing is a form of user observation that involves preparing tasks and classifying the types of errors that occur as a result of using a prototype [32]. During observation, a facilitator may use an interview guide with questions about users’ comprehension and interpretation of visualizations or to assess appropriateness and acceptability. Suggested usability testing methods for observing the user are included in Additional file 1.

Exiting the UCD cycle

Exiting the UCD cycle can happen when users have demonstrated that the feedback report is usable and that all of the identified user requirements are satisfied, including those relating to users’ perceptions and emotional responses to the report. Reaching this point enables the design team to confidently proceed to develop reporting tools or to deliver a design specification for report development.

Results

We illustrate the proposed method through its application in a UCD study to design feedback reports as part of a broader, large-scale audit and feedback intervention study [16]. The intervention was funded by the VA Quality Enhancement Research Initiative program in support of the Life-Sustaining Treatment Decisions Initiative, a United States Veterans Health Administration initiative led by the National Center for Ethics in Health Care to promote high-quality goals of care conversations (GoCCs) and documentation of patients’ preferences for life-sustaining treatment [17, 33].

One key element of the Life-Sustaining Treatment Decisions Initiative is the use of a nationally standardized progress note and order set for documenting veterans’ goals of care and life-sustaining treatment decisions in the VA’s electronic health record system. The progress note and order set may be written in any Veterans Health Administration care setting (e.g., outpatient, inpatient, nursing home) and are viewable and durable across settings. Both the notes and orders are accessible via the VA’s Corporate Data Warehouse. Additional data relevant to report generation, such as a veteran’s admission status and the location of the GoCC, are also available in the data warehouse, creating a set of historical data elements that are available for practice measurement.

Setting

This work was done in five long-term care Veterans Health Administration sites in the Western and Midwestern regions of the USA. Four of the sites were participating in the Life-Sustaining Treatment Decisions Initiative as demonstration sites that had implemented the initiative’s progress note and order set. One site was added for the convenience of in-person meetings with the project team, where a key individual involved as a user was also a member of the research team. Long-term care settings and services at sites included both community living centers, which are VA-owned nursing homes, and home-based primary care programs to which eligible veterans were admitted for care provided in their homes by visiting healthcare professionals. All five sites had a home-based primary care program, while only four of the sites had a community living center facility (Table 1).

Table 1 Participating site facility and program characteristics

Participants

Participants were healthcare professional managers and administrators. One or more participants at each site was a designated “site champion” who had been identified by the VA National Center for Ethics in Health Care to lead and coordinate implementation efforts for the Life-Sustaining Treatment Decisions Initiative. Site champions from the 5 facilities recommended a total of 13 individuals, including themselves, to participate in the design process. The number of participants from each site ranged from 1 to 4. This interdisciplinary group of participants included 2 nurses, 1 nurse practitioner, 3 physicians, 2 quality improvement/compliance professionals, and 5 social workers.

Application of the method: the UCD cycle applied to feedback report design

We conducted 12 iterations of the UCD cycle over approximately 18 months. We describe the method through its application over the 12 cycles that led to the design and development of feedback reports for community living centers and home-based primary care programs.

Understand user—initial cycle

We visited all 5 sites to meet with participants. During the site visits, we interviewed participants, toured facilities, and met with other staff involved in implementing the Life-Sustaining Treatment Decisions Initiative to understand the context, professional roles, veterans’ care processes and environments, and the activities involved in conducting GoCCs and using the progress note and order template. We took field notes during interviews and reviewed them in group discussions about the context of participants’ work, including when GoCCs occur and where routine practice feedback is delivered, and to identify potential opportunities to measure documentation of GoCCs that could be summarized in feedback reports. With permission, we photographed bulletin boards on which practice feedback reports were routinely posted to capture characteristics of reports that healthcare professionals were accustomed to receiving (Fig. 2).

Fig. 2 Example of a bulletin board that is used to post feedback reports

After each site visit, we discussed our observations in team meetings. We reviewed our notes and had monthly calls with the larger study team in which we discussed requirements and their implications for the practice measures, data, and visual displays in a report. We compared and contrasted characteristics of site contexts and participants to identify generalizable traits of users across sites and to identify contextual differences that reports would need to accommodate. The key differences that reports were adapted to accommodate were the intent to disseminate feedback reports and interest in facility-level comparison, which varied across sites (Additional file 2). These requirements also included channels for delivery of feedback (e.g., use of email, bulletin board posting). This initial step occurred over 7 months, during which we used meeting discussion notes to identify and develop preliminary measures, data, and displays.

Understand user—subsequent cycles

After completing the site visits, we conducted 12 iterations of the UCD cycle. Each iteration involved the usability testing of the prototype with multiple users (Table 2) and resulted in a refined understanding of users and changes to the prototype reports. We used anecdotes from our observations to support several types of refinements in our understanding, including recognizing false assumptions and contrasting user needs and preferences (Table 3 and Additional file 2, Example 1). The design team referenced these anecdotes, described in our discussion notes, when considering changes to measures, data, and displays.

Table 2 Example usability testing and interview guide for prototype report testing
Table 3 Example refinement of prototype measures, data, and display about documenting goals of care conversations for VA providers in long-term care settings

Develop/refine prototype

As a team, we processed a wide range of issues and considerations, including participants’ work, beliefs, priorities, and perceptions; barriers to anticipated change processes; data quality issues; technical limitations; and stakeholder priorities. Refinements varied, with some implicating minor changes to the display and others implicating all three prototype refinement steps (Additional file 2, Example 2). In some cases, there was a need to resolve conflicting requirements expressed by participants. For example, some sites requested regional comparison data while others requested to have no comparators in the report, requiring us to identify the most appropriate solution for all stakeholders, which was a report that did not use comparators, such as benchmarks or organizational performance goals.

Refine measures

Based on the requirements we identified in our initial site visits, we created ratio-based practice measures for GoCC documentation. The initial measures used counts of veterans newly admitted to community living centers or home-based primary care for a denominator. For a numerator, we used the count of those newly admitted and having GoCC documentation, yielding a percentage of newly admitted veterans who had a completed template within 7 days of admission. As a result of continued testing, we modified the measures. For example, participants expressed concerns that although measurement of GoCC timeliness was appropriate, it was a lower priority than increasing the reach of documentation at any time in the patient population. In response to these concerns, we created a new measure addressing the historical reach of GoCC documentation (Table 3). We further refined measures by broadening the time windows for timeliness-focused measures. This change enabled the reports to show conversations documented at any time prior to admission and up to a full 30 days after admission, providing more information about conversation timing in community living centers. Similar changes were introduced for home-based primary care program reports. In later cycles of report testing with refined measures, participants confirmed that the measures were appropriate and raised no further significant concerns (Additional file 2, Example 3).
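
As a worked illustration of these measures, the sketch below computes the original 7-day timeliness measure, the broadened 30-day window, and the reach measure (documented at any time) from a small list of hypothetical admission records. The field names and example dates are ours, not the project’s actual measure specification or data.

```python
from datetime import date
from typing import List, NamedTuple, Optional

class Admission(NamedTuple):
    admitted: date
    gocc_documented: Optional[date]  # None if no GoCC documentation exists

admissions: List[Admission] = [
    Admission(date(2019, 4, 1), date(2019, 4, 3)),    # documented within 7 days
    Admission(date(2019, 4, 5), date(2019, 4, 25)),   # documented within 30 days
    Admission(date(2019, 4, 10), date(2019, 3, 20)),  # documented before admission
    Admission(date(2019, 4, 12), None),               # never documented
]

def pct(numerator: int, denominator: int) -> float:
    return 100.0 * numerator / denominator if denominator else 0.0

# Original timeliness measure: documented within 7 days of admission.
timely_7d = sum(1 for a in admissions
                if a.gocc_documented and 0 <= (a.gocc_documented - a.admitted).days <= 7)

# Broadened timeliness window: documented up to 30 days after admission.
timely_30d = sum(1 for a in admissions
                 if a.gocc_documented and 0 <= (a.gocc_documented - a.admitted).days <= 30)

# Reach measure: documented at any time, including before admission.
ever = sum(1 for a in admissions if a.gocc_documented is not None)

print(f"Documented within 7 days:  {pct(timely_7d, len(admissions)):.0f}%")
print(f"Documented within 30 days: {pct(timely_30d, len(admissions)):.0f}%")
print(f"Ever documented:           {pct(ever, len(admissions)):.0f}%")
```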

Refine data

Refinement of measures required refinement of data items from the Corporate Data Warehouse. This process involved confirmation of data items and their appropriateness for use in new measures. For example, to support the refined measure of GoCC reach in community living centers, we needed to differentiate patients who were more likely to be seriously ill from those who were not (Table 3). We identified the admitting service (short stay or long stay) as a data item that adequately served as a proxy for severity of illness that participants could accept. Veterans admitted to long stay services in community living centers were recognized as likely seriously ill and at higher priority for having a GoCC than those admitted to short stay services, such as for physical rehabilitation (Additional file 2, Example 2).
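
A minimal sketch of this data refinement, assuming a hypothetical admitting_service field: the same documentation rate is computed separately for long-stay and short-stay admissions so that reports can focus attention on the higher-priority population. The values are invented for illustration.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# (admitting_service, gocc_documented) pairs; values are illustrative only.
admissions: List[Tuple[str, bool]] = [
    ("long stay", True), ("long stay", False), ("long stay", True),
    ("short stay", False), ("short stay", True),
]

counts: Dict[str, List[int]] = defaultdict(lambda: [0, 0])  # [documented, total]
for service, documented in admissions:
    counts[service][1] += 1
    if documented:
        counts[service][0] += 1

for service, (documented, total) in counts.items():
    print(f"{service}: {100.0 * documented / total:.0f}% of admissions with a documented GoCC")
```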

Refine display

We created prototype charts and drafted report text to display practice data. We used graphical displays of practice data as the focus of the report, which could be used in electronic or paper form and delivered via handout, email attachment, or bulletin board posting. In the initial cycle of the design process, we created charts in a spreadsheet and document editor as low-fidelity prototypes (Fig. 3). By the end of the 12th cycle, the report design featured two charts that separately emphasize the reach and timeliness of documentation (Figs. 4 and 5).

Fig. 3 Prototype displays, version 1, that summed only conversations documented within 7 days. CLC, community living center

Fig. 4 Prototype displays, version 12, that summed conversations ever documented (top) and the timeliness of documentation (bottom). CLC, community living center; LST, life-sustaining treatment

Fig. 5 Chart from feedback report in June 2019 with refined measures addressing admitting service (short stay vs. long-term care). CLC, community living center; LST, life-sustaining treatment
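
For readers who want a concrete starting point for prototyping such displays, the following sketch draws a simple bar chart of the reach measure by quarter using matplotlib. The values, labels, and file name are invented for illustration; the project’s charts (Figs. 3, 4, and 5) were produced with its own tooling.

```python
import matplotlib.pyplot as plt

# Hypothetical quarterly values for the reach measure (percent of admissions
# with a documented goals of care conversation); not actual project data.
quarters = ["2018 Q3", "2018 Q4", "2019 Q1", "2019 Q2"]
reach_pct = [42, 55, 63, 71]

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(quarters, reach_pct)
ax.set_ylim(0, 100)
ax.set_ylabel("Admissions with documented GoCC (%)")
ax.set_title("CLC: reach of GoCC documentation (illustrative data)")

# Label each bar so users do not have to estimate values from the axis.
for i, value in enumerate(reach_pct):
    ax.text(i, value + 2, f"{value}%", ha="center")

fig.tight_layout()
fig.savefig("gocc_reach_prototype.png", dpi=150)
```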

Observe user

We conducted usability testing in 14 sessions (6 in-person meetings and 8 phone interviews) with 11 of the 13 participants. Interviews occurred over 7 months. During interviews, we asked participants to think aloud when viewing reports and gave them tasks in the form of questions that required them to interpret the data in the report (Table 3). We used a qualitative approach to usability testing [32], simply capturing key observations in notes. We opted to use phone interviews because participants worked at several locations and because our usability tasks were primarily based on the interpretation of non-interactive reports.

Usability testing generated insights in each of the three design steps. For example, some participants raised data quality issues, such as the accuracy of the denominator, which led to further refinement of data. Presentation format issues were raised, such as expressing a preference for seeing data presented both as counts and as percentages so that providers could better assess data accuracy. Participants also expressed a preference for bright colors to attract attention when reports were posted on a bulletin board. These and other observations were captured in notes for each call and reviewed in group discussions to interpret and refine our understanding of the participants in subsequent cycles.

Exiting the cycle

After 12 iterations, we transitioned from a prototyping phase to a software development phase and began producing reports for the 4 demonstration sites (Additional file 3). We also sought additional feedback that would allow for minor improvements to the report design. At the conclusion of the UCD phase, we began sending reports to new VA facilities that did not participate in the design process, as part of the larger project. Reports were generated each quarter and sent to site champions via email for further internal distribution. As of July 2019, quarterly reports had been delivered to more than 23 facilities or programs. In the 2 years following report implementation, we routinely requested site feedback about the usefulness of the reports, but have not conducted further usability testing. Multiple responses have led to ongoing minor changes and additions to the report, such as shifting the report timing from quarterly to monthly and adding patient lists to complement the report data. We plan to provide ongoing maintenance support and have maintained software for the reporting tool in a public code repository (https://github.com/Display-Lab/goals-of-care).

Discussion

We have described a proposed method for the UCD of feedback reports that follows refinement steps for the reports’ measures, data, and displays. We used the proposed method to identify feedback report requirements and to iteratively change the report design to increase its usability. Using this method, we identified divergent project stakeholder perspectives and arrived at a final design, including decisions about both its content and form, that was acceptable to stakeholders and usable for participants.

Best practice guidance recommends the use of UCD to develop reports, including testing the recommended best practices themselves [3]. Studies of UCD applied to feedback report development are limited, but have demonstrated the use of iterative prototyping and user observation [8, 13]. We build on these studies to contribute a method that incorporates key feedback report components and their dependencies.

We propose a method that may serve as a general process for UCD of feedback reports, to implement and test best practice recommendations along with other insights gained from testing reports with users. The method specifies steps within a general UCD cycle [14] (understand user, develop/refine prototype, and observe user) and operationalizes the cycle to include 3 steps during prototype refinement: measures, data, and display. The transition from the understand user step to the develop/refine prototype step involves decisions for 3 alternate pathways (Fig. 1). The 3 steps have dependencies, such that display depends on data, which depends on measures. Changes to measures therefore are likely to necessitate subsequent refinement of data and displays.

We have illustrated this method with a feedback report design process in a large VA quality improvement project. We used the UCD cycle to iteratively improve the design of reports, based on the observed use of and reactions to feedback report prototypes by healthcare professionals. Improvements were made by identifying and reducing errors associated with the report design, both for interpretation of the data and in the appropriateness of the report for a given context, such as using peer-based comparators.

Across the 12 iterations of the UCD cycle, new requirements shifted from primarily addressing measures towards addressing displays. In earlier report tests, participants frequently raised concerns about measures that were fundamental to the subsequent report design steps. Later in the testing of reports, measurement issues were less common, and refinement focused on improving the text and charts in the report.

The proposed method enabled us to identify requirements that differed across participants and across the VA. In some cases, preferences for reports were contradictory, preventing us from designing a single optimal report for all facilities and programs. The key differences were in the use of comparators in reports and in the expressed intention to disseminate the report among staff. Based on these observations, we anticipate that optimal feedback may need to be adaptable to recipient differences, allowing for a choice of alternate measures and corresponding displays.

The method we describe has implications for best practice in feedback interventions. We call attention to the maturation of performance measures as an important factor in the success of feedback interventions. For example, measure revisions resulted directly from report testing with long-term care providers, who indicated that the timeliness of GoCC and treatment preference documentation was a lower priority than reach (i.e., the spread of documentation practice in the veteran population). Given the unique contextual factors that appear to affect the appropriateness of measures, we anticipate that such adaptation of measures is a potentially important initial step in any feedback intervention.

An additional implication is that feedback reporting projects should consider allowing additional time to refine reports with users to ensure that reports will be useful and appropriate for influencing practice. Finally, broader adoption of UCD in feedback report design will require better description of feedback report content and form, to enable improved evidence generation across research networks, such as the Audit and Feedback MetaLab [34]. We recommend that work to better describe and define the components of feedback reports and their relationships be prioritized to enable better learning from feedback interventions that have been designed for specific user populations.

Limitations

The proposed method emerged as we progressed through the UCD cycles applied to our project; therefore, it may not generalize to other design projects. For example, the method we describe may be most relevant to the development of novel measures, or to their application to a new population, as more refinement and changes resulting from measure development will trigger more or different data needs and report development. In an intervention using standardized measures that cannot change over the duration of the intervention, such as Healthcare Effectiveness Data and Information Set measures [35], the proposed measure refinement step may be less relevant. However, standardized measures may nevertheless be optimized for policymakers’ and organizational leaders’ information needs, which differ from the information needs of healthcare professionals who receive feedback about their practice, especially at the facility or organizational level. As such, a key lesson is that standardized measures may require alignment with frontline providers’ information needs in order to be useful for improving the quality of care.

In applying UCD to the development of the feedback reports, our identification of divergent stakeholder perspectives resulted in design choices that reflected the trade-offs we encountered in terms of technical feasibility, available resources, stakeholder interests, best practice guidance, and evidence about audit and feedback. We anticipate that UCD methods hold potential to increase the effectiveness of reports, but recognize that this is an area in which evidence is lacking, and hence a potentially important area for future research.

We did not quantitatively assess the impact of using the proposed method. Further evaluation is needed to understand the impact of the proposed method on feedback report usability and uptake.

We did not assess the cost of using the method, in particular, the use of a design team with implementation science experts. The method’s value could perhaps be realized at low cost by a single individual, rather than involving a design team.

In our application of the proposed method, phone-based usability testing reduced the information we could perceive from participants’ facial expressions and body language that might have informed emotional aspects of participants’ comments. Usability testing in an in situ or naturalistic work environment for healthcare professionals is likely to support better fidelity to the cognitive processes of perception and comprehension resulting from interaction with reports [28, 36]. Phone-based testing was perhaps more feasible because the prototypes we tested were single-page, non-interactive PDF documents that did not require navigation of interface controls or a sequence of actions to perform, which reduced participants’ task complexity.

We did not capture participant demographic data or assess participant diversity. We experienced challenges in arranging face-to-face interactions with extremely busy clinicians and coordinating schedules in this multi-site, large-scale project.

We believe that specializing UCD methods for feedback reports will make it more feasible for implementation researchers to use them. We describe steps for operationalizing the “develop/refine prototype” step of the UCD cycle for feedback report design. Future research could additionally explore the operationalization of the other steps of the UCD cycle, which we have illustrated with examples.

Conclusions

UCD methods can improve the usability of feedback reports through an iterative cycle of understanding users, developing prototypes, and observing interactions. When designing feedback report prototypes, using the key steps of refining measures, data, and displays, and planning to take time for this process may enable report designers to better translate users’ needs into report design changes. In a national-scale initiative to design clinical practice feedback reports for long-term care settings in the Veterans Health Administration, this method yielded important design changes and insights. These types of systematic approaches may improve the ability of feedback interventions to influence the provision of high-quality care for patients and their families everywhere.

Availability of data and materials

Because this work is being done as quality improvement, data will only be available from the authors on request and after approval by the authorizing officials. Publicly developed materials for this work are available in a GitHub repository: https://doi.org/10.5281/zenodo.1300783.

Abbreviations

GoCC:

Goals of Care Conversation

UCD:

User-centered design

VA:

Veterans Affairs

References

  1. Confidential Physician Feedback Reports: Designing for Optimal Impact on Performance [Internet]. 2016 [cited 2016 May 16]. Available from: http://www.ahrq.gov/professionals/clinicians-providers/resources/confidreportguide/index.html

  2. Ivers NM, Sales A, Colquhoun H, Michie S, Foy R, Francis JJ, Grimshaw JM. No more “business as usual” with audit and feedback interventions: towards an agenda for a reinvigorated intervention. Implement Sci. 2014;9(1):14.

  3. Brehaut JC, Colquhoun HL, Eva KW, Carroll K, Sales A, Michie S, Ivers N, Grimshaw JM. Practice Feedback Interventions: 15 Suggestions for Optimizing Effectiveness. Ann Intern Med. 2016;164(6):435–41.

  4. Colquhoun H, Michie S, Sales A, Ivers N, Grimshaw JM, Carroll K, Chalifoux M, Eva K, Brehaut J. Reporting and design elements of audit and feedback interventions: a secondary review. BMJ Qual Saf. 2017;26(1):54–60. https://doi.org/10.1136/bmjqs-2015-005004.

  5. Kluger AN, DeNisi A. The effects of feedback interventions on performance: a historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychol Bull. 1996;119(2):254–84.

  6. Ilgen DR, Fisher CD, Taylor MS. Consequences of individual feedback on behavior in organizations. J Appl Psychol. 1979;64(4):349–71.

  7. Shaller D, Kanouse D. Private Performance Feedback Reporting for Physicians [Internet]. Rockville, MD: Agency for Healthcare Research and Quality; 2012 Nov [cited 2018 Oct 4]. Available from: /professionals/clinicians-providers/resources/privfeedbackgdrpt/index.html

  8. Brown B, Balatsoukas P, Williams R, Sperrin M, Buchan I. Multi-method laboratory user evaluation of an actionable clinical performance information system: implications for usability and patient safety. J Biomed Inform. 2018;77:62–80.

  9. Part Two: Design of Physician Feedback Reporting Systems | Agency for Healthcare Research & Quality [Internet]. Rockville, MD: Agency for Healthcare Research and Quality; 2016 [cited 2018 Oct 4]. Available from: https://www.ahrq.gov/professionals/clinicians-providers/resources/confidreportguide/system-design.html

  10. Tip 7. Test the Report With Your Audience [Internet]. [cited 2018 Dec 13]. Available from: https://www.ahrq.gov/talkingquality/resources/writing/tip7.html

  11. Testing Your Report By Getting Feedback From Users [Internet]. [cited 2018 Dec 19]. Available from: https://www.ahrq.gov/talkingquality/resources/design/testing.html

  12. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Proctor EK, Kirchner JE. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21.

  13. Colquhoun HL, Sattler D, Chan C, Walji T, Palumbo R, Chalmers I, Jokhio I, Ivers NM. Applying user-centered design to develop an audit and feedback intervention for the home care sector. Home Health Care Manag Pract. 2017;29(3):148–60.

  14. Witteman HO, Dansokho SC, Colquhoun H, Coulter A, Dugas M, Fagerlin A, Giguere AM, Glouberman S, Haslett L, Hoffman A, Ivers N, Légaré F, Légaré J, Levin C, Lopez K, Montori VM, Provencher T, Renaud J-S, Sparling K, Stacey D, Vaisson G, Volk RJ, Witteman W. User-centered design and the development of patient decision aids: protocol for a systematic review. Syst Rev. 2015;4:11.

  15. Brown B, Balatsoukas P, Williams R, Sperrin M, Buchan I. Interface design recommendations for computerised clinical audit and feedback: Hybrid usability evidence from a research-led system. Int J Med Inf. 2016;94:191–206.

  16. Sales AE, Ersek M, Intrator OK, Levy C, Carpenter JG, Hogikyan R, Kales HC, Landis-Lewis Z, Olsan T, Miller SC, Montagnini M, Periyakoil VS, Reder S. Implementing goals of care conversations with veterans in VA long-term care setting: a mixed methods protocol. Implement Sci. 2016;11:132.

  17. Foglia MB, Lowery J, Sharpe VA, Tompkins P, Fox E. A comprehensive approach to eliciting, documenting, and honoring patient wishes for care near the end of life: the Veterans Health Administration’s Life-Sustaining Treatment Decisions Initiative. Jt Comm J Qual Patient Saf. 2019 Jan 1;45(1):47–56.

  18. McGlynn EA, Asch SM. Developing a clinical performance measure. Am J Prev Med. 1998;14(3, Supplement 1):14–21.

  19. Campbell SM, Braspenning J, Hutchinson A, Marshall MN. Research methods used in developing and applying quality indicators in primary care. BMJ. 2003;326(7393):816–9.

  20. Roberts JP, Fisher TR, Trowbridge MJ, Bent C. A design thinking framework for healthcare management and innovation. Healthcare. 2016;4(1):11–4.

  21. Cohn M. User Stories Applied: For Agile Software Development. 1st ed. Boston: Addison-Wesley Professional; 2004. p. 304.

  22. Panicker V, Lee D, Wetmore M, Rampton J, Smith R, Moniz M, Landis-Lewis Z. Designing tailored displays for clinical practice feedback: developing requirements with user stories. Stud Health Technol Inform. 2019;264:1308–12. https://doi.org/10.3233/SHTI190438.

  23. Few S. Information dashboard design: displaying data for at-a-glance monitoring. Second ed. Burlingame, CA: Analytics Press; 2013. 260 p.

  24. Hegarty M. Advances in Cognitive Science and Information Visualization. In: Score Reporting Research and Applications [Internet]. Routledge; 2018 [cited 2019 Apr 30]. Available from: https://www.taylorfrancis.com/

  25. Eden AR, Hansen E, Hagen MD, Peterson LE. Physician perceptions of performance feedback in a quality improvement activity. Am J Med Qual. 2018;33(3):283–90.

  26. Buxton B. Sketching User Experiences: Getting the Design Right and the Right Design. 1st ed. San Francisco, CA: Morgan Kaufmann; 2007. 448 p.

  27. Gothelf J, Seiden J. Lean UX: Designing Great Products with Agile Teams. 2nd ed. Sebastopol, CA: O’Reilly Media; 2016. p. 208.

  28. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform. 2004;37(1):56–76.

  29. Munzner T. Visualization Analysis and Design: CRC Press; 2014. p. 422.

  30. Zhang J. A representational analysis of relational information displays. Int J Hum Comput Stud. 1996;45(1):59–74.

  31. Vessey I. Cognitive fit: a theory-based analysis of the graphs versus tables literature. Decis Sci. 1991;22(2):219–40.

  32. Krug S. Rocket surgery made easy: the do-it-yourself guide to finding and fixing usability problems. 1st ed. Berkeley, CA: New Riders; 2009. p. 168.

  33. Fried TR, Tinetti ME, Iannone L, O’Leary JR, Towle V, Van Ness PH. Health outcome prioritization as a tool for decision making among older persons with multiple chronic conditions. Arch Intern Med. 2011;171(20):1854–6.

  34. Grimshaw JM, Ivers N, Linklater S, Foy R, Francis JJ, Gude WT, Hysong SJ. Audit and Feedback MetaLab. Reinvigorating stagnant science: implementation laboratories and a meta-laboratory to efficiently advance the science of audit and feedback. BMJ Qual Saf. 2019;28(5):416–23.

  35. HEDIS Measures and Technical Resources [Internet]. NCQA. [cited 2018 Dec 19]. Available from: https://www.ncqa.org/hedis/measures/

  36. Borycki E, Kushniruk A, Nohr C, Takeda H, Kuwata S, Carvalho C, Bainbridge M, Kannry J. Usability methods for ensuring health information technology safety: evidence-based approaches. Contribution of the IMIA Working Group Health Informatics for Patient Safety. Yearb Med Inform. 2013;8:20–7.

Acknowledgements

All views expressed are those of the authors and do not reflect the views of the Department of Veterans Affairs. The authors thank Orna Intrator for guidance in the use of VA’s Corporate Data Warehouse. The authors also thank the VA Information Resource Center (VIReC) for an opportunity to present and receive feedback on this work.

Funding

See “Ethics approval and consent to participate”

Author information

Authors and Affiliations

Authors

Contributions

ZLL and AS conceived the study. ZLL, CL, ME, and AS contributed to the study design. All authors contributed to data analysis. ZLL, WJS, JH, and CG conducted the design work. ZLL drafted the manuscript. All authors read, edited, and approved the final manuscript.

Corresponding author

Correspondence to Zach Landis-Lewis.

Ethics declarations

Ethics approval and consent to participate

This program was funded through the VA Quality Enhancement Research Initiative (QUERI) program. VA QUERI funds are operational funds in the Veterans Health Administration, and much of the work funded through QUERI is conducted as quality improvement. Veterans Health Administration has specific guidance for approval of quality improvement work, outlined in Handbook 1058.05. We have provided documentation that we have met requirements under this guidance, and this documentation is available from the authors on request.

Consent for publication

Not applicable

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Example design techniques. Additional examples of design techniques that can be used for understanding and observing users of feedback reports.

Additional file 2.

Examples from long term care. Additional examples of the application of the method in long term care facilities and the design team roles and responsibilities.

Additional file 3.

Software development. Software development process used following the application of the proposed method.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Landis-Lewis, Z., Kononowech, J., Scott, W.J. et al. Designing clinical practice feedback reports: three steps illustrated in Veterans Health Affairs long-term care facilities and programs. Implementation Sci 15, 7 (2020). https://doi.org/10.1186/s13012-019-0950-y
