
Assessing fidelity to evidence-based quality improvement as an implementation strategy for patient-centered medical home transformation in the Veterans Health Administration


Background

Effective implementation strategies might facilitate patient-centered medical home (PCMH) uptake and spread by targeting barriers to change. Evidence-based quality improvement (EBQI) is a multi-faceted implementation strategy based on a clinical-researcher partnership. It promotes organizational change by fostering innovation and spreading successful innovations. Previous studies demonstrated that EBQI accelerated PCMH adoption within Veterans Health Administration primary care practices compared with standard PCMH implementation. Research to date, however, has not documented fidelity to the EBQI implementation strategy, limiting the usefulness of prior findings. This paper develops measures of and assesses clinical participants’ fidelity to three core EBQI elements for PCMH (EBQI-PCMH), explores the relationship between fidelity and successful QI project completion and spread (the outcome of EBQI-PCMH), and assesses the role of the clinical-researcher partnership in achieving EBQI-PCMH fidelity.


Methods

Nine primary care practice sites and seven across-site, topic-focused workgroups participated (2010–2014). Core EBQI elements included leadership-frontlines priority-setting for QI; ongoing access to technical expertise, coaching, and mentoring in QI methods (through a QI collaborative); and data/evidence use to inform QI. We used explicit criteria to measure and assess EBQI-PCMH fidelity across clinical participants. We mapped fidelity to evaluation data on implementation and spread of successful QI projects/products. To assess the role of the clinical-researcher partnership in EBQI-PCMH, we analyzed 73 key stakeholder interviews using thematic analysis.


Results

Seven of 9 sites and 3 of 7 workgroups achieved high or medium fidelity to leadership-frontlines priority-setting. Fidelity was mixed for ongoing technical expertise and data/evidence use. Longer duration in EBQI-PCMH and higher fidelity to priority-setting and ongoing technical expertise appeared correlated with successful QI project completion and spread. According to key stakeholders, partnership with researchers, bi-directional communication between leaders and QI teams, and project management/data support were critical to achieving EBQI-PCMH fidelity.


Conclusions

This study advances implementation theory and research by developing measures for, and assessing fidelity to, core EBQI elements in relation to the completion and spread of QI innovation projects and tools for addressing PCMH challenges. These results help close the gap between EBQI elements, their intended outcome, and the finding that EBQI-PCMH accelerated adoption of PCMH.



Background

The patient-centered medical home (PCMH) model is widely endorsed by professional societies and has been shown to improve quality of care and patient, provider, and staff satisfaction, while reducing costs [1,2,3,4]. Transforming primary care toward the PCMH model, however, requires cultural, technical, and clinical innovation [5]. Implementation strategies are methods to promote adoption, implementation, and sustainability of a new practice, and might facilitate PCMH adoption and spread by targeting barriers to change at multiple levels (e.g., external context, within the organization, among professionals, and in the intervention) [6].

Implementation researchers are developing a knowledge base about which implementation strategies are effective for promoting uptake of evidence-based interventions within different contexts, and have argued that assessing fidelity to implementation strategies (i.e., the extent to which intended implementation strategies are actually used) is critical [7, 8]. Some studies describe or evaluate PCMH implementation [9,10,11,12], but few document or assess fidelity to PCMH implementation strategies, and fewer still assess the relationship between the strategies and their intended outcomes [13,14,15]. Most do not include sufficient detail to support replication of the strategy, and few address use of implementation strategies to promote PCMH in large, integrated healthcare systems like the Veterans Health Administration (VHA) [16].

In 2010, VHA embarked on nationwide PCMH transformation called Patient Aligned Care Teams (PACT) [17, 18]. In parallel with VHA’s implementation effort, we introduced evidence-based quality improvement (EBQI) as a multi-faceted implementation strategy to improve PCMH implementation in one VHA region [19, 20]. Our ongoing impact evaluation has shown that EBQI-PCMH sites, in comparison to control sites, experienced accelerated achievement of PCMH goals, including decreases in ambulatory care visits, increases in non-face-to-face visits, lower primary care provider burnout, and larger improvements in patient-provider communication [21,22,23].

The primary outcome of EBQI-PCMH is development and spread of locally initiated primary care QI innovation projects directed at achieving adherence to the PCMH model (Fig. 1). These projects reflect three main inputs. First, they are proposed by frontline clinicians and staff and selected by regional and executive leaders through a priority-setting process [19]. Second, QI project development and completion are supported by an ongoing collaborative that features technical support by health services researchers. Third, development of the QI projects is informed not only by PCMH literature, but by targeted evidence specific to each QI project from a responsive evidence review [24] and current local data [20]. The QI innovation projects are the short-term outcomes of EBQI-PCMH, and are the hypothesized link between EBQI-PCMH and longer-term PCMH outcomes. The investigation reported here tests the link between fidelity to three core EBQI elements, or inputs (leadership-frontlines priority-setting; ongoing access to technical expertise, coaching, and mentoring in QI methods through a QI collaborative; and data/evidence use to inform QI), and the EBQI outcome of completed, spreadable QI innovation projects for supporting PCMH in VHA.

Fig. 1 EBQI-PCMH promotes organizational change through implementation and spread of practice-level systematic quality improvement

In addition to assessing the EBQI outcome of successful QI innovation project completion and spread, this paper addresses the critical role of partnership between healthcare clinical leaders and health services researchers [25, 26], which provided the substrate for EBQI-PCMH. Our fidelity measures focus on the extent to which clinical participants engaged in EBQI activities, but do not describe or assess the role of health services researchers. Understanding how health services researchers supported and engaged clinical participants, what clinical participants found valuable, and what the health services researchers could have done better is critical to understanding variations in EBQI-PCMH fidelity. To understand the role of health services researchers in the clinical-researcher partnership and in achieving EBQI-PCMH fidelity, we analyzed key stakeholder qualitative interview data.

Using data abstracted from project records and key stakeholder interviews, the objectives of this study are the following: (1) to develop measures and assess variations in fidelity for three core EBQI-PCMH implementation strategy elements, (2) to explore the relationship between EBQI-PCMH fidelity and its primary outcome (implementation and spread of locally developed and initiated QI innovation projects), and (3) to assess the role of health services researchers in order to understand variations in EBQI-PCMH fidelity.

VHA PCMH implementation and the EBQI-PCMH implementation strategy

VHA’s standard national PCMH implementation strategies included a national mandate describing the PCMH elements to be implemented (e.g., PCMH core principles, clinic restructuring, a new staffing model, and guidance on new roles and responsibilities), increased funding for primary care staffing, regional learning collaboratives and training centers, new performance measures, and an online toolkit [18, 27,28,29]. A few studies during early implementation described local efforts to implement PCMH and implementation barriers and facilitators [17, 30,31,32], and a national evaluation developed a measure to rank clinics on PCMH model fidelity [33]. These studies show considerable variation in VHA’s PCMH implementation, but only one assessed fidelity to an implementation strategy (i.e., a learning collaborative) and its association with performance [16].

EBQI-PCMH was developed by the Veterans Assessment and Improvement Laboratory (VAIL) and funded by the VHA Office of Primary Care. It was based on EBQI interventions developed by Rubenstein and colleagues [34,35,36]; EBQI has been effective for improving uptake of evidence-based clinical practices such as collaborative care for depression [34, 37], supported employment [38], and cultural competency of healthcare staff regarding gender sensitivity [39], as well as for VHA’s PCMH implementation [21,22,23]. Rubenstein et al. have previously described the theoretical basis underlying EBQI-PCMH [19]. The intervention’s key features were derived from literature on PCMH implementation challenges and theories of organizational change [40,41,42,43], clinical quality improvement [44,45,46], complex adaptive systems [5], and diffusion of innovation [47, 48]. Rubenstein et al.’s logic model illustrated hypothesized relationships between EBQI-PCMH intervention features (e.g., organizational structures and clinical-researcher partnership activities) and short-, intermediate-, and long-term outcomes. Local implementation and spread of successful QI innovation projects is a short-term outcome of EBQI-PCMH, hypothesized to lead to intermediate and long-term outcomes such as improved achievement of PCMH goals and metrics, workforce job satisfaction, and improved patient experiences [19].

In previously published work, we evaluated EBQI-PCMH organizational structures [20]. Here we derive the three core elements of the EBQI-PCMH implementation strategy from the clinical-researcher partnership activities conceptualized by Rubenstein et al., which include the following: (1) regional consensus-based priority setting that includes leaders and frontlines; (2) communication, collaboration, and coaching; and (3) use of evidence and data (including formative feedback) to inform locally initiated innovation [19]. To arrive at the three core elements, we mapped these activities to VAIL project administrative records for empirical substantiation. Table 1 provides a more detailed description of the core elements, which include the following: (1) a leadership and frontline (i.e., top-down, bottom-up) priority-setting process for focusing QI efforts; (2) ongoing technical expertise and coaching/mentoring in QI methods by health services researchers, delivered through a QI collaborative; and (3) the use of data and evidence to inform QI efforts with project management support provided by internal coordinators.

Table 1 Description of EBQI-PCMH core elements and criteria for assessing EBQI-PCMH fidelity


Methods

Setting and participants

Using a modified stepped-wedge design [49] with three phases [19, 20], EBQI-PCMH was introduced in five local healthcare systems in the Southern California/Nevada region from April 2010 to September 2014. Participants included multi-level, interdisciplinary leaders, primary care providers and staff, other clinicians (e.g., social workers, pharmacists, behavioral health), health services researchers, VAIL project staff, and patient representatives. In phase 1 (April 2010–May 2011), one primary care practice at each of three local healthcare systems began participating in EBQI-PCMH activities. Three additional primary care practices from the same 3 healthcare systems began participating in phase 2 (January 2012). In phase 3 (September 2013–January 2014), we added one primary care practice from one of the original three healthcare systems and two primary care practices from two new healthcare systems. By the end of phase 3, five local healthcare systems (nine primary care practices) and seven across-site, topic-focused workgroups composed of VHA and non-VHA subject matter experts had participated in EBQI-PCMH. Participating primary care practices included four large medical center-based clinics (16,000 or more unique primary care patients) and five medium-large community-based outpatient clinics (8000–20,000 unique primary care patients). The initiative also included a Steering Committee composed of regional and local healthcare system executive leaders.

Data sources and measures

We systematically reviewed project administrative records and abstracted data to construct fidelity measures for the EBQI elements described in Table 1, as well as a measure of the EBQI short-term outcome: implementation and spread of successful, locally developed and initiated QI projects. To ensure accurate data abstraction, a primary reviewer reviewed the administrative documents, abstracted information, and entered it into a database with pre-specified fields. A secondary reviewer reviewed the same documents and checked them against the database to verify the accuracy of the entered information. Errors or disagreements were discussed with the lead author, who made final decisions about the values assigned. Supplemental Table 1 contains a complete list of variables derived from administrative records.

We mapped data extracted from administrative records to activities corresponding to each of the three EBQI core elements and to the outcome, and created site/workgroup-level measures (Table 1). Based on the distribution of these measures, we developed and applied specific criteria to determine high, medium, and low/no fidelity. We chose cut-points based on how the values clustered, taking into account high outliers. For example, for the sum of QI projects proposed and approved, sites A and B were clear outliers with 21 and 19 respectively, while sites C and D were the next highest with 8. We set the cut-point for high at 8 to conservatively include the high outliers and the next highest achieving sites. (See Supplemental Table 2 for data for each site and workgroup.)

Of note, we did not have complete administrative records for across-site, topic-focused workgroup participation. These groups did not routinely participate in bi-weekly collaborative calls. Most did organize and hold their own regularly occurring meetings supported by VAIL staff, but we were unable to locate minutes or other records from these meetings. We were also missing 5 months of meeting minutes for the bi-weekly collaborative calls. Supplemental Table 2 therefore reports that VAIL organized 87 bi-weekly conference calls; including these 5 months, the true total was approximately 100.

To understand the researchers’ roles in variations in fidelity, we analyzed 73 qualitative interviews from regional (7), healthcare system (22), and primary care practice (21) leaders; other site participants, including Veteran patient representatives (8); VAIL team members (12); and internal coordinators (8). Interviews were conducted in September 2013–February 2014 and July 2014–January 2015, with 51/60 (85%) and 22/27 (81%) of invited stakeholders agreeing to participate, respectively. Ten key stakeholders were interviewed in both waves and 53 were interviewed only once. Interviews were semi-structured, most were conducted in person, and each lasted approximately 60 minutes. Audio recordings were professionally transcribed, reviewed by the research team, and edited for accuracy. Interviews contained questions about EBQI-PCMH features, including VAIL Steering Committee experiences, internal coordinators, support and resources received from the VAIL project team, and collaborative learning sessions.


Analysis

To assess fidelity for the EBQI-PCMH core elements and for implementation and spread of locally initiated QI projects, we applied the criteria described in Table 1 for high, medium, or low/no fidelity to the measures.

To analyze the interview data, the lead author developed an initial list of a priori codes pertaining to core elements of the EBQI-PCMH implementation strategy. She and a second qualitative analyst used the code list to independently code the same five interviews; discrepancies in applying codes were discussed and definitions of each code were refined to create a codebook. Using this codebook and Atlas.ti, a trained qualitative analyst coded all interviews, and a second qualitative analyst reviewed all coding. We then generated Atlas.ti reports for the codes for VAIL Steering Committee, collaborative learning sessions, VAIL team support and resources, quality improvement projects, and internal coordinator, and used matrix analysis to identify common themes related to each EBQI-PCMH core element. Specifically, the lead author did the following: (1) abstracted data from the reports and entered quotes/paraphrased quotes from each interview into fields in an Excel spreadsheet corresponding to the core elements, and (2) summarized experiences described across multiple interviewees into themes. A second qualitative analyst reviewed the data in the spreadsheet to confirm the themes. Disagreements about coding, application of codes, and derivation of the themes were rare, and were resolved through discussion and consensus [50].


Results

Data for fidelity measures and the outcome appear in Tables 2, 3, and Supplemental Table 2. Duration of participation in EBQI-PCMH (Supplemental Table 2) was similar across sites within each implementation phase, except for sites C and G, which lagged slightly behind. Duration of participation could not be calculated for 5 across-site workgroups, because either they had no approved QI projects or we had incomplete administrative data for those groups.

Table 2 EBQI-PCMH fidelity for participating primary care practice sites
Table 3 EBQI-PCMH fidelity for across-site, topic-focused workgroups

Leadership and frontline priority-setting process for primary care QI

Site/workgroup fidelity

As shown in Tables 2 and 3, 7 of 9 sites and 3 of 7 across-site workgroups achieved high or medium fidelity. Sites and workgroups submitted 72 QI project proposals over 4 years (Supplemental Table 2). Twenty-four regional and healthcare system leaders served as members of the VAIL Steering Committee and reviewed and rated proposals across four rounds of priority-setting. They approved 26 projects to receive VAIL support. Sites with longer duration in the project (i.e., phase 1 and 2 sites) had higher fidelity for this element.

Health services researchers’ role

VAIL health services researchers and staff organized the priority-setting process, including arranging meeting logistics, assisting sites and workgroups with preparing QI project proposals, and developing rating forms and a process for reviewing and approving projects. Engaging both leaders and frontlines in priority setting facilitated bi-directional communication and built consensus around which projects to prioritize. Key stakeholders at all sites noted that regional and executive-level leaders signaled organizational priorities through their approval of QI projects. As one site participant described, “It was good for me to see how engaged our leadership is… It helped me see the [leaders] as more of a support role instead of a role where people are like no, you can’t do this, trying to figure out how we can make things happen.” Inclusion of frontline providers and staff in priority-setting ensured that leaders learned about PCMH implementation challenges and successes. As one regional leader explained, “I’m really getting the ear from the Steering Committee members [about] what is going on at facilities and the forward movement…being able to work together and share opinions on what areas are important and what areas aren’t has been really helpful and a really powerful way to move forward.”

Feedback from a few key stakeholders indicated that some may not have fully understood the purpose of the priority-setting process and/or it was not sufficiently transparent: “So VAIL and the Steering Committee seem to have…got a little bit politicized over time…and degenerated significantly, in my opinion” (site participant). A key stakeholder at one of the less-engaged sites thought that allowing sites (rather than the Steering Committee) to determine which projects they implemented would foster more “ownership” and incentivize participation.

QI learning collaborative for QI teams

Site/workgroup fidelity

Learning collaborative activities included participation in bi-weekly phone calls and in-person learning sessions. As shown in Tables 2 and 3, only 6 of 9 sites (and no across-site workgroups) achieved high or medium fidelity for bi-weekly phone calls. Participation in learning sessions was generally high: all but one site and one workgroup attended at least one, and most attended all seven (Supplemental Table 2). The number of participants per site or workgroup ranged from 1 to 28. In terms of leadership participation, site primary care practice leaders participated regularly in calls, but regional and executive healthcare system leaders participated only in in-person learning sessions (15–47 participants per learning session, 33–50% of all attendees). Four Veteran patient representatives also participated in learning sessions.

Health services researchers’ role

The VAIL team organized and participated in bi-weekly calls and twice-yearly in person learning sessions as part of the QI collaborative for sites/workgroups with approved QI projects. Health services researchers mentored and coached the QI teams in QI methods, and moderated discussion between participants. Key stakeholders reported that collaborative activities were helpful for fostering innovation, contributed to a sense of community, and provided reassurance that everyone was experiencing the same challenges. As one site participant described, “Every single last person that’s come [back] from a collaborative meeting has said, ‘Gosh, I wish we had more time to talk to people from the sites.’ Not chitchat or gossiping, it’s about actual quality improvement, we want to ask specific questions and get specific answers and that takes a certain forum to be able to do that.”

The QI collaborative activities set the expectation for cross-site sharing and innovation spread, promoting regional development of a QI culture: “If VAIL went away, somebody would have to do some of the things VAIL is doing because otherwise, it’s going to be, oh, [healthcare system]’s over here doing whatever they’re doing and [healthcare system] won’t share with anybody…having VAIL forced us to come to the sandbox and play together” (site participant). Many key stakeholders reported that QI collaborative activities reinforced local mechanisms for QI oversight and accountability by setting the expectation of regular reporting on calls and at conferences. As a VAIL team member explained, “having that level of leadership engaged in the initiative has really made people more accountable. It really conveys to them the sense that the [regional leadership] is taking [this] very seriously.”

Comments from some key stakeholders suggested that a basic level of “readiness” may be necessary for sites to prioritize collaborative activities. A few key stakeholders, for example, questioned the utility of focusing on QI when more basic needs at their sites had not been addressed: “Sometimes I feel like it’s too grandiose of a plan when basic resources are missing, that you can’t implement things if you don’t have certain pieces like staff.” One key stakeholder was unsure how useful the QI toolkits really were, because they might not work, or might need substantial adaptation, given variation in how PCMH teams were configured at different sites. In addition, a few key stakeholders noted that getting release time for QI team members to attend in-person meetings was difficult.

Technical assistance for using data and evidence for QI

Site/workgroup fidelity

As shown in Tables 2 and 3, fidelity for this element was mixed, and did not appear to be associated with QI implementation and spread or with duration of participation in EBQI-PCMH. Except for site G, sites and across-site workgroups with low fidelity for use of data also achieved low fidelity for bi-weekly calls, suggesting they may not have been well-connected with VAIL resources and supports.

Health services researchers’ role

VAIL health services researchers supported use of evidence and data by providing rapid evidence reviews, formative feedback on sites’ performance, and help with obtaining data and evaluating QI project impact. Key stakeholders reported that the VAIL team and internal site coordinators were helpful for obtaining data and creating measures to track their QI projects and for general PCMH improvement efforts. The expectation that QI teams would collect and report data on feasibility and effectiveness facilitated forward momentum, because “measurement keeps people honest in a way, or it gives a reality check. Because people can say things are happening, but if it’s not documented and measured, it really didn’t happen” (site participant). The internal coordinators described their role as number crunchers, problem solvers, liaisons with leadership and the VAIL project, and quality improvement coaches/teachers for PCMH teams, and key stakeholders at all sites considered them essential. Key stakeholders recalled that internal coordinators organized meetings, provided support for QI projects, and provided data and reports for improving PCMH performance. Commenting on reports provided by the internal coordinator, one site participant explained, “It’s really frustrating when you get a number that doesn’t seem to reflect reality…you get an A-plus if you look at third next available, but then you get an F when you’re looking at the Compass and you’re thinking how is this possible?… If they took away our coordinator I would drown.”

For a few sites, stakeholders reported that internal coordinators played a critical role as liaisons with the VAIL team, receiving assistance from VAIL for data and measures on their site’s behalf and reporting on the site’s activities. Coordinators participated in QI collaborative activities, and were frequently the only representatives for sites B, D, and E on bi-weekly calls. A few sites temporarily or permanently lost the support of their internal coordinators. Key stakeholders at those sites reported difficulty obtaining accurate and reliable data for QI, lapses in or discontinuation of meetings, and losing their connections with the VAIL team. After losing their coordinator, one site B participant lamented, “That individual essentially held the [site-level QI] group together. Once you bring them together, you have your best chance of coming up with ideas and then enforcing them. And once that’s gone, we don’t even meet. So how can we come up with a project let alone continuing and reporting to make sure it does well? It’s all gone.”

Some key stakeholders were not aware of resources or support provided by VAIL, and suggested that an organizational chart of the VAIL project would be helpful. Participants at one site, comparing themselves to another site, felt they could have been more successful with more direct access to the VAIL team: “I don’t know where VAIL fits in without having that research person there or [internal] coordinator who really understands how to frame it.”

EBQI-PCMH outcome: implementation and spread of successful locally developed and initiated QI projects

As of the end of the study period (October 2014), sites and across-site workgroups had completed 21 projects, with 16 resulting in tools/toolkits for spread (Supplemental Table 2). As shown in Tables 2 and 3, phase 1 sites completed the most projects and toolkits. Two phase 3 sites and four workgroups did not complete any projects or toolkits. Generally, sites with a longer duration were more successful at implementing and spreading their QI projects. Workgroups were less successful; only 3 completed projects and/or toolkits. Sites and workgroups with the highest levels of participation in priority-setting and collaborative learning sessions appear to have had more success.


Discussion

This study advances implementation theory and research by developing measures for and assessing fidelity to the EBQI implementation strategy as applied to PCMH. Translating a successful intervention into routine practice requires use of effective implementation strategies that are appropriate to the context, but development of a knowledge base about which implementation strategies “work” and under what conditions has been hampered by inconsistent use of terminology, lack of operational definitions, and poorly defined fidelity measures [51, 52]. Implementation researchers highlight the importance of defining, describing, and evaluating fidelity to implementation strategies in order to bridge the evidence-to-practice gap. Previously published impact analyses of EBQI-PCMH demonstrated its overall effectiveness for accelerating achievement of key PCMH goals within VHA [21,22,23]. In previous papers [19, 20], we described the theoretical origins and presented a logic model hypothesizing that key features of EBQI-PCMH facilitate achievement of PCMH goals through the implementation and spread of effective QI projects. Our findings here highlight three core elements of EBQI-PCMH as “active ingredients” that, in combination, facilitated QI efforts to adapt PCMH to local contexts and spread innovation across sites, helping EBQI-PCMH sites better achieve PCMH goals.

Results from this study have important theoretical and practical implications for PCMH implementation. First, we demonstrate how PCMH implementation can be improved by using EBQI-PCMH to target barriers to change at multiple levels. At the organizational and provider/staff levels, EBQI-PCMH engaged multi-level, interdisciplinary leaders and frontline primary care clinicians and staff in priority-setting for primary care. Our fidelity assessment found high to medium fidelity for this core element, with higher fidelity associated with longer participation in the overall initiative. EBQI-PCMH fostered bi-directional communication across hierarchical and disciplinary boundaries and set expectations for collaboration across service lines, reporting, and accountability. By identifying the challenges that different primary care practices may encounter with PCMH implementation at different stages, the priority-setting process also allowed individual practices to tailor PCMH implementation to address contextual factors at their sites. Although one previous study found successful PCMH implementation to be associated with empowering/authorizing leaders to make change [14], ours is, to our knowledge, the first study to document and assess leadership engagement in QI priority-setting as an implementation strategy to improve PCMH implementation.

Second, our results support findings from a previous study of the American Academy of Family Physicians’ National Demonstration Project that showed PCMH adoption was associated with practice facilitation, participation in a learning collaborative, and ongoing consultation [13, 15]. Our study and previous EBQI studies describe ongoing technical expertise and coaching/mentoring in QI methods, provided by health services researchers, as a core element of the strategy [34, 53]. EBQI-PCMH applied this primarily through an organized QI learning collaborative with bi-weekly calls and semi-annual in-person learning sessions, as well as individual consultations with QI teams as needed or requested. Our learning collaborative provided QI teams with opportunities for cross-site sharing and learning, fostering a regional culture of QI, and promoting idea generation and innovation. Regular reporting by QI teams in these forums also reinforced accountability for QI. We posit that this learning and QI culture permeated EBQI-PCMH sites and may have changed their approach to addressing challenges with PACT implementation. Practice leaders were thus empowered with knowledge of QI methods and techniques (as well as of what had been tried at other primary care practices) and may have applied this learning more broadly to improve primary care delivery in their clinics.

Third, as the “learning healthcare system” gains popularity as a means of improving efficiency and quality and reducing costs [54, 55], healthcare systems will need to develop “health data infrastructures” capable of extracting and transmitting evidence and knowledge to inform decision-making [55, 56]. Systematic translation and dissemination of what is learned will also require application of effective implementation strategies [57]. The clinical-researcher partnership that provides the foundation for EBQI-PCMH highlights the role of health services researchers in transformative initiatives such as PCMH implementation. In particular, use of data and evidence to inform QI innovation (core element 3) is foundational to the EBQI-PCMH implementation strategy [37, 53] and was found in other studies to be important for successful PCMH implementation in small practices (e.g., for audit and feedback and for conducting small tests of change) [14]. Our qualitative data suggest that the data-use and project management support provided by the VAIL team and site internal coordinators was considered crucial to successful QI project completion, as well as to general primary care QI efforts. The mixed levels of fidelity for this core element may indicate that our fidelity measures did not fully capture the impact of ongoing exposure to data and measurement expertise, particularly through the internal coordinators. The coordinators may have served as knowledge brokers because they were hired by and worked directly for the sites but were mentored by the VAIL team [58].

Our results also highlight several areas for future research on implementation strategies. In combination with our previous studies showing accelerated PCMH implementation in EBQI-PCMH sites, this study suggests that a multi-faceted implementation strategy, combining leadership-frontline engagement in primary care QI; support, mentoring, and training in QI delivered via a QI collaborative; and technical support for using data/evidence for QI and for project management, can improve implementation. A critical knowledge gap remains, however, in linking implementation strategy fidelity to site-specific outcomes (for example, does fidelity to using data to inform QI translate into better patient experience outcomes?). To address this gap, researchers should focus on developing and validating implementation strategy fidelity measures that can be reliably and systematically collected across a variety of PCMH settings. These fidelity measures could then be tested in hybrid type 3 effectiveness-implementation studies with larger samples (e.g., more sites) [59].
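To make the idea of a systematically collectable fidelity measure concrete, the sketch below shows one way participation data could be turned into ordinal fidelity ratings per core element. This is an illustrative assumption, not the study’s actual instrument: the element names, the participation-proportion inputs, and the high/medium/low thresholds are all hypothetical.

```python
# Illustrative sketch (NOT the study's actual fidelity instrument):
# scoring fidelity to three EBQI core elements from participation data.
# Element names, inputs, and thresholds below are hypothetical.

CORE_ELEMENTS = ("priority_setting", "technical_expertise", "data_evidence_use")

def rate(proportion: float) -> str:
    """Map a participation proportion (0.0-1.0) to an ordinal fidelity
    rating. Cut points are illustrative only."""
    if proportion >= 0.75:
        return "high"
    if proportion >= 0.50:
        return "medium"
    return "low"

def fidelity_profile(participation: dict) -> dict:
    """Return an ordinal rating per core element for one site or workgroup.

    `participation` maps each core element to the proportion of expected
    activities (e.g., priority-setting meetings attended, bi-weekly
    collaborative calls joined) the site actually completed.
    Missing elements default to zero participation.
    """
    return {element: rate(participation.get(element, 0.0))
            for element in CORE_ELEMENTS}

# Example: a site that attended most priority-setting meetings, about half
# of the collaborative calls, and few data-use consultations.
site_a = {"priority_setting": 0.9,
          "technical_expertise": 0.55,
          "data_evidence_use": 0.3}
print(fidelity_profile(site_a))
# {'priority_setting': 'high', 'technical_expertise': 'medium', 'data_evidence_use': 'low'}
```

A validated measure would need defensible cut points and inter-rater checks; the value of even a simple rule like this is that it can be applied identically across sites and time periods.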

Our results have several practical implications for healthcare policy makers, administrators, and leaders seeking to implement or improve uptake of PCMH in primary care. First, sites with longer exposure to EBQI-PCMH had higher fidelity to its core elements and were more successful with QI implementation and spread, suggesting that developing a QI culture may take several years. Second, primary care practices generally had few if any resources to support QI; the role of an internal coordinator dedicated to primary care QI will therefore be important for organizations considering this implementation strategy. Third, in addition to an internal coordinator who can assist with QI efforts, embedded health services researchers [60] can improve PCMH implementation by working with existing personnel (including the internal coordinator) within the site to guide implementation efforts, use data/evidence to inform implementation, and provide mentoring/training in QI methods. Finally, as a large integrated healthcare system with a well-developed electronic health record and performance reporting system, VHA was an ideal setting in which to use EBQI-PCMH. Smaller healthcare systems or practices that cannot easily obtain and analyze data on their care delivery processes and patient outcomes may need to budget more time and resources to support using data to inform QI, and should consider hiring or assigning a dedicated internal coordinator to help with this aspect. Payors could also provide primary care practices with access to data on each practice’s patients and incentivize use of EBQI in other ways. Professional societies could provide opportunities, such as QI learning collaboratives, for cross-site learning and sharing among small practices and/or healthcare systems.


The study had limitations. First, lack of data for across-site workgroup meetings may have affected the validity/reliability of our fidelity assessment of participation in the QI learning collaborative (core element 2) for these groups. Fidelity for workgroups, however, was low across all three EBQI-PCMH elements, suggesting that more could have been done to support their QI project efforts, and/or that cross-site, topic-focused workgroups may not be an effective mechanism for facilitating local primary care QI. Second, administrative records were missing for 5 months of bi-weekly calls; because this affected all sites and workgroups equally, it resulted in uniform underreporting of participation (and of the resources devoted to participating) in core element 2. Third, we did not systematically collect data on how sites were selected by healthcare system leaders, but we heard (informally and in interviews) that leaders selected sites for a variety of reasons, including readiness (for example, one phase 1 site had residents who were required to conduct QI projects), resources and staffing, and a perceived need for extra help with PCMH implementation. Fourth, we have anecdotal reports of sites conducting QI projects that were proposed but not approved by the VAIL Steering Committee; because we did not track these additional QI projects, we may have underreported the number of initiated and completed QI projects. Fifth, we did not interview key stakeholders at sites G, H, and I or at some of the workgroups, so the findings from qualitative interviews may not fully represent the experiences of sites with shorter duration of participation. Finally, EBQI-PCMH required substantial resources, but subsequent applications in VHA Women’s Health and a separate VHA-funded care coordination QI demonstration are testing how EBQI can be accomplished with fewer resources [53, 61].

Several study strengths outweigh these limitations. First, we compiled administrative data from many and varied sources, representing study activities over nearly five years; we are not aware of any other study of PCMH implementation strategy fidelity with this length and breadth of data. Second, the clinical-researcher partnership underlying EBQI distinguishes it from implementation strategies that rely on a top-down approach and highlights the health services researchers’ role in supporting fidelity to EBQI. Third, we supplemented administrative records with qualitative interview data from key stakeholders, permitting exploration of how the researchers engaged leaders and frontline staff with many competing demands in structured, evidence-informed QI, and of how the VAIL team could have better facilitated QI implementation and spread across sites. Finally, this study advances our understanding of which implementation strategies hold promise for successfully transforming primary care to a PCMH model.

Conclusions
This study described three core elements of a multifaceted implementation strategy—evidence-based quality improvement or EBQI—and assessed fidelity to EBQI as used to implement VHA’s PCMH. The findings revealed that multi-level participation in priority-setting, EBQI collaborative learning sessions, and data/evidence use to inform QI are key features of the EBQI implementation strategy that can accelerate achievement of key PCMH goals. Furthermore, successful implementation and spread of local primary care QI was enhanced by the following: (1) systematically linking multi-level, interprofessional leadership to front-line innovators; (2) across-site communication and learning; and (3) availability of project management and data support. While this study has demonstrated that fidelity to implementation strategies can be assessed using a variety of data sources, to advance implementation science, future research should focus on the development of tools to systematically and prospectively measure and assess implementation strategy fidelity [62]. The practical implication of this study is that healthcare system leaders can incorporate key features of EBQI to improve implementation of evidence-based interventions.

Availability of data and materials

The datasets used and analyzed during the current study are available from the corresponding author on reasonable request. Supplemental Table 2 contains the data used for the fidelity measures reported in this paper.



Abbreviations

EBQI: Evidence-based quality improvement

PACT: Patient-Aligned Care Team

PCMH: Patient-centered medical home

QI: Quality improvement

VAIL: Veterans Assessment and Improvement Laboratory

VHA: Veterans Health Administration


References

1. Jabbarpour Y, DeMarchis E, Bazemore A, Grundy P. The impact of primary care practice transformation on cost, quality, and utilization: a systematic review of research published in 2016. Patient-Centered Primary Care Collaborative; 2017.

2. Sinaiko AD, Landrum MB, Meyers DJ, Alidina S, Maeng DD, Friedberg MW, et al. Synthesis of research on patient-centered medical homes brings systematic differences into relief. Health Aff. 2017;36(3):500–8.

3. Starfield B, Shi L. The medical home, access to care, and insurance: a review of evidence. Pediatrics. 2004;113:1493–8.

4. Starfield B, Shi L, Macinko J. Contribution of primary care to health systems and health. Milbank Q. 2005;83:457–502.

5. Crabtree BF, Nutting PA, Miller WL, McDaniel RR, Stange KC, Jaén CR, et al. Primary care practice transformation is hard work: insights from a 15-year developmental program of research. Med Care. 2011;49(Suppl 1).

6. Lau R, Stevenson F, Ong BN, Dziedzic K, Treweek S, Eldridge S, et al. Achieving change in primary care—causes of the evidence to practice gap: systematic reviews of reviews. Implement Sci. 2016;11:40.

7. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76.

8. Slaughter SE, Hill JN, Snelgrove-Clarke E. What is the extent and quality of documentation and reporting of fidelity to implementation strategies: a scoping review. Implement Sci. 2015;10:129.

9. Quinn MT, Gunter KE, Nocon RS, Lewis SE, Vable AM, Tang H, et al. Undergoing transformation to the patient centered medical home in safety net health centers: perspectives from the front lines. Ethn Dis. 2013;23(3):356–62.

10. Rissi JJ, Gelmon S, Saulino E, Merrithew N, Baker R, Hatcher P. Building the foundation for health system transformation: Oregon’s patient-centered primary care home program. J Public Health Manag Pract. 2015;21(1):34–41.

11. Stout S, Weeg S. The practice perspective on transformation: experience and learning from the frontlines. Med Care. 2014;52(11 Suppl 4):S23–5.

12. Wagner EH, Coleman K, Reid RJ, Phillips K, Abrams MK, Sugarman JR. The changes involved in patient-centered medical home transformation. Prim Care. 2012;39(2):241–59.

13. Nutting PA, Crabtree BF, Stewart EE, Miller WL, Palmer RF, Stange KC, et al. Effect of facilitation on practice outcomes in the National Demonstration Project model of the patient-centered medical home. Ann Fam Med. 2010;8(Suppl 1):S33–44.

14. Scholle SH, Asche SE, Morton S, Solberg LI, Tirodkar MA, Jaen CR. Support and strategies for change among small patient-centered medical home practices. Ann Fam Med. 2013;11(Suppl 1):S6–13.

15. Stewart EE, Nutting PA, Crabtree BF, Stange KC, Miller WL, Jaen CR. Implementing the patient-centered medical home: observation and description of the National Demonstration Project. Ann Fam Med. 2010;8(Suppl 1):S21–32, S92.

16. Bidassie B, Davies ML, Stark R, Boushon B. VA experience in implementing patient-centered medical home using a breakthrough series collaborative. J Gen Intern Med. 2014;29(Suppl 2):S563–71.

17. Rosland A, Nelson K, Sun H, Dolan ED, Maynard C, Bryson C, et al. The patient-centered medical home in the Veterans Health Administration. Am J Manag Care. 2013;19(7):e263–72.

18. Werner RM, Canamucio A, Shea JA, True G. The medical home transformation in the Veterans Health Administration: an evaluation of early changes in primary care delivery. Health Serv Res. 2014;49(4):1329–47.

19. Rubenstein LV, Stockdale SE, Sapir N, Altman L, Dresselhaus T, Salem-Schatz S, et al. A patient-centered primary care practice approach using evidence-based quality improvement: rationale, methods, and early assessment of implementation. J Gen Intern Med. 2014;29(Suppl 2):S589–97.

20. Stockdale SE, Zuchowski J, Rubenstein LV, Sapir N, Yano EM, Altman L, et al. Fostering evidence-based quality improvement for patient-centered medical homes: initiating local quality councils to transform primary care. Health Care Manage Rev. 2018;43(2):168–80.

21. Yoon J, Chow A, Rubenstein L. Impact of medical home implementation through evidence-based quality improvement on utilization and costs. Med Care. 2016;54(2):118–25.

22. Meredith LS, Batorsky B, Cefalu M, Darling J, Stockdale S, Yano EM, et al. Long-term impact of the VA’s medical home demonstration on primary care health professional morale. Int J Qual Health Care. 2016; under review.

23. Huynh AK, Lee M, Rose DE, Stockdale SE, Wang M, Rubenstein LV. Evaluation of patient-assessed patient-provider communication in the Veterans Assessment and Improvement Laboratory (VAIL) using a non-randomized stepped wedge design. VA Health Services Research & Development/QUERI National Research Meeting; Crystal City, VA; 2017.

24. Danz MS, Hempel S, Lim YW, Shanman R, Motala A, Stockdale S, et al. Incorporating evidence review into quality improvement: meeting the needs of innovators. BMJ Qual Saf. 2013;22(11):931–9.

25. Bergman AA, Delevan DM, Miake-Lye IM, Rubenstein LV, Ganz DA. Partnering to improve care: the case of the Veterans’ Health Administration’s Quality Enhancement Research Initiative. J Health Serv Res Policy. 2017;1355819617693871.

26. Kilbourne AM, Atkins D. Partner or perish: VA health services and the emerging bi-directional paradigm. J Gen Intern Med. 2014;29(Suppl 4):817–9.

27. Gale RC, Asch SM, Taylor T, Nelson KM, Luck J, Meredith LS, et al. The most used and most helpful facilitators for patient-centered medical home implementation. Implement Sci. 2015;10:52.

28. Hebert PL, Liu CF, Wong ES, Hernandez SE, Batten A, Lo S, et al. Patient-centered medical home initiative produced modest economic results for Veterans Health Administration, 2010–12. Health Aff. 2014;33(6):980–7.

29. Luck J, Bowman C, York L, Midboe A, Taylor T, Gale R, et al. Multimethod evaluation of the VA’s peer-to-peer toolkit for patient-centered medical home implementation. J Gen Intern Med. 2014;29(Suppl 2):S572–8.

30. Core FE. Report on early implementation findings: VISN 23 Patient Aligned Care Team Demonstration Laboratory. Veterans Health Administration; 2011.

31. Klein SE. The Veterans Health Administration: implementing patient-centered medical homes in the nation’s largest integrated delivery system. 2011:1–23.

32. Tuepker A, Kansagara D, Skaperdas E, Nicolaidis C, Joos S, Alperin M, et al. “We’ve not gotten even close to what we want to do”: a qualitative study of early patient-centered medical home implementation. J Gen Intern Med. 2014;29(Suppl 2):S614–22.

33. Nelson KM, Helfrich C, Sun H, Hebert PL, Liu CF, Dolan E, et al. Implementation of the patient-centered medical home in the Veterans Health Administration: associations with patient satisfaction, quality of care, staff burnout, and hospital and emergency department use. JAMA Intern Med. 2014;174(8):1350–8.

34. Fortney J, Enderle M, McDougall S, Clothier J, Otero J, Altman L, et al. Implementation outcomes of evidence-based quality improvement for depression in VA community based outpatient clinics. Implement Sci. 2012;7:30.

35. Parker LE, de Pillis E, Altschuler A, Rubenstein LV, Meredith LS. Balancing participation and expertise: a comparison of locally and centrally managed health care quality improvement within primary care practices. Qual Health Res. 2007;17(9):1268–79.

36. Rubenstein LV, Chaney EF, Ober S, Felker B, Sherman SE, Lanto A, et al. Using evidence-based quality improvement methods for translating depression collaborative care research into practice. Fam Syst Health. 2010;28(2):91–113.

37. Curran GM, Pyne J, Fortney JC, Gifford A, Asch SM, Rimland D, et al. Development and implementation of collaborative care for depression in HIV clinics. AIDS Care. 2011;23(12):1626–36.

38. Hamilton AB, Cohen AN, Glover DL, Whelan F, Chemerinski E, McNagny KP, et al. Implementation of evidence-based employment services in specialty mental health. Health Serv Res. 2013;48(6 Pt 2):2224–44.

39. Fox AB, Hamilton AB, Frayne SM, Wiltsey-Stirman S, Bean-Mayberry B, Carney D, et al. Effectiveness of an evidence-based quality improvement approach to cultural competence training: the Veterans Affairs’ “Caring for Women Veterans” program. J Contin Educ Health Prof. 2016;36(2):96–103.

40. Lukas CV, Holmes SK, Cohen AB, Restuccia J, Cramer IE, Shwartz M, et al. Transformational change in health care systems: an organizational model. Health Care Manage Rev. 2007;32(4):309–20.

41. Nutting PA, Crabtree BF, Miller WL, Stewart EE, Stange KC, Jaen CR. Journey to the patient-centered medical home: a qualitative analysis of the experiences of practices in the National Demonstration Project. Ann Fam Med. 2010;8(Suppl 1):S45–56, S92.

42. Ovretveit J. Improvement leaders: what do they and should they do? A summary of a review of research. Qual Saf Health Care. 2010;19(6):490–2.

43. Ovretveit J. Understanding the conditions for improvement: research to discover which context influences affect improvement success. BMJ Qual Saf. 2011;20(Suppl 1):i18–23.

44. Kaplan HC, Provost LP, Froehle CM, Margolis PA. The Model for Understanding Success in Quality (MUSIQ): building a theory of context in healthcare quality improvement. BMJ Qual Saf. 2012;21(1):13–20.

45. King G, Currie M, Smith L, Servais M, McDougall J. A framework of operating models for interdisciplinary research programs in clinical service organizations. Eval Program Plann. 2008;31(2):160–73.

46. McLaughlin C, Kaluzny A. Continuous quality improvement in health care: theory, implementation, and applications. London: Jones and Bartlett Publishers International; 2004.

47. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629.

48. Rogers E. Diffusion of innovations. New York: The Free Press; 1995.

49. Huynh AK, Lee ML, Farmer MM, Rubenstein LV. Application of a nonrandomized stepped wedge design to evaluate an evidence-based quality improvement intervention: a proof of concept using simulated data on patient-centered medical homes. BMC Med Res Methodol. 2016;16(1):143.

50. Miles MB, Huberman AM. Qualitative data analysis: an expanded sourcebook. Thousand Oaks, CA: Sage; 1994.

51. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.

52. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.

53. Hamilton AB, Brunner J, Cain C, Chuang E, Luger TM, Canelo I, et al. Engaging multilevel stakeholders in an implementation trial of evidence-based quality improvement in VA women’s health primary care. Transl Behav Med. 2017;7(3):478–85.

54. Atkins D, Kilbourne AM, Shulkin D. Moving from discovery to system-wide change: the role of research in a learning health care system: experience from three decades of health systems research in the Veterans Health Administration. Annu Rev Public Health. 2017;38:467–87.

55. Greene SM, Reid RJ, Larson EB. Implementing the learning health system: from concept to action. Ann Intern Med. 2012;157(3):207–10.

56. Smith M, Halvorson G, Kaplan G. What’s needed is a health care system that learns: recommendations from an IOM report. JAMA. 2012;308(16):1637–8.

57. Chambers DA, Feero WG, Khoury MJ. Convergence of implementation science, precision medicine, and the learning health care system: a new model for biomedical research. JAMA. 2016;315(18):1941–2.

58. Dobbins M, Robeson P, Ciliska D, Hanna S, Cameron R, O'Mara L, et al. A description of a knowledge broker role implemented as part of a randomized controlled trial evaluating three knowledge translation strategies. Implement Sci. 2009;4:23.

59. Landes SJ, McBain SA, Curran GM. An introduction to effectiveness-implementation hybrid designs. Psychiatry Res. 2019;280:112513.

60. Vindrola-Padros C, Pape T, Utley M, Fulop NJ. The role of embedded research in quality improvement: a narrative review. BMJ Qual Saf. 2017;26(1):70–80.

61. Ganz DA, Barnard JM, Smith NZY, Miake-Lye IM, Delevan DM, Simon A, et al. Development of a web-based toolkit to support improvement of care coordination in primary care. Transl Behav Med. 2018;8(3):492–502.

62. Huynh AK, Hamilton AB, Farmer MM, Bean-Mayberry B, Stirman SW, Moin T, et al. A pragmatic approach to guide implementation evaluation research: strategy mapping for complex interventions. Front Public Health. 2018;6:134.

Acknowledgements


We would like to acknowledge Jessica Moreau, Jacqueline Fickel, Daniel Enamorado, and Michael McGowan for helping with the qualitative data coding and with extracting data from administrative data sets.


Funding

Funding for this work was provided by the VHA Office of Primary Care Services (XVA 65-018). The views expressed in this article are solely those of the authors and do not represent the views of the VHA or the US government.

Author information

Contributions



SES collected the administrative source documents, participated in the design and collection of the key stakeholder interviews, analyzed the data, and was primarily responsible for conceptualizing and drafting the manuscript. ABH participated in the design and collection of the key stakeholder interviews, provided guidance on qualitative data analysis, and participated in drafting the manuscript. AAB coded qualitative interview data and participated in drafting the manuscript. DER participated in drafting the manuscript. KFG participated in the design and collection of the key stakeholder interviews and participated in drafting the manuscript. TRD provided project oversight, served as a co-Principal Investigator, and contributed significantly to the intellectual content of the manuscript. EMY participated in the overall study (stepped-wedge) design, served as a co-Principal Investigator, and contributed significantly to the intellectual content of the manuscript. LVR developed the evidence-based quality improvement implementation strategy, served as Principal Investigator of the VAIL project, contributed significantly to intellectual content, and participated in drafting the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Susan E. Stockdale.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed and approved by the VHA Greater Los Angeles Healthcare System Institutional Review Board (PCC 2019-010070).

Consent for publication

Not applicable

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

EBQI Core activities, Participation Measures, and Data Sources.

Additional file 2.

Site and Across-site Workgroup Participation in Core EBQI-PACT Components.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article


Cite this article

Stockdale, S.E., Hamilton, A.B., Bergman, A.A. et al. Assessing fidelity to evidence-based quality improvement as an implementation strategy for patient-centered medical home transformation in the Veterans Health Administration. Implementation Sci 15, 18 (2020).
