
Longitudinal assessment of the association between implementation strategy use and the uptake of hepatitis C treatment: Year 2



To increase the uptake of evidence-based treatments for hepatitis C (HCV), the Department of Veterans Affairs (VA) established the Hepatitis Innovation Team (HIT) Collaborative. Teams of providers were tasked with choosing implementation strategies to improve HCV care. The aim of the current evaluation was to assess how site-level implementation strategies were associated with HCV treatment initiation and how the use of implementation strategies and their association with HCV treatment changed over time.


A key HCV provider at each VA site (N = 130) was asked in two consecutive fiscal years (FYs) to complete an online survey examining the use of 73 implementation strategies organized into nine clusters as described by the Expert Recommendations for Implementing Change (ERIC) study. The number of Veterans initiating treatment for HCV, or “treatment starts,” at each site was captured using national data. Providers reported whether the use of each implementation strategy was due to the HIT Collaborative.


Of 130 sites, 80 (62%) responded in Year 1 (FY15) and 105 (81%) responded in Year 2 (FY16). Respondents endorsed a median of 27 (IQR 19–38) strategies in Year 2. The strategies significantly more likely to be chosen in Year 2 included tailoring strategies to deliver HCV care, promoting adaptability, sharing knowledge between sites, and using mass media. The total number of treatment starts was significantly positively correlated with total number of strategies endorsed in both years. In Years 1 and 2, respectively, 28 and 26 strategies were significantly associated with treatment starts; 12 strategies overlapped both years, 16 were unique to Year 1, and 14 were unique to Year 2. Strategies significantly associated with treatment starts shifted between Years 1 and 2. Pre-implementation strategies in the "training/educating," "interactive assistance," and "building stakeholder interrelationships" clusters were more likely to be significantly associated with treatment starts in Year 1, while strategies in the "evaluative and iterative" and "adapting and tailoring" clusters were more likely to be associated with treatment starts in Year 2. Approximately half of all strategies were attributed to the HIT Collaborative.


These results suggest that measuring implementation strategies over time is a useful way to catalog implementation of an evidence-based practice over time and across settings.



Hepatitis C virus (HCV) is a leading cause of liver cancer and liver failure in the USA [1]. In fiscal year 2015 (FY15), new, highly efficacious treatments for HCV became widely available as the evidence-based practice for curing HCV [2]. Prior treatments included injected interferon, which was suboptimal because of side effects, contraindications, and poor efficacy despite year-long treatments. The newer medications included pill-only regimens with minimal side effects, short courses, and high cure rates. As the largest provider for HCV nationally, the Department of Veterans Affairs (VA) sought to spread this innovation rapidly across the country by developing the Hepatitis C Innovation Team (HIT) Collaborative. Funded by VA leadership as a 4-year, national initiative, the HIT Collaborative supported the development of regional teams of providers with the goal of promoting the uptake of evidence-based HCV care throughout the VA. The HIT Collaborative included the components of learning or quality improvement collaboratives [3], such as in-person learning sessions, plan-do-study-act cycles, team calls, email/web support, external support of active data collection, feedback and education by experts and Collaborative leadership, and outreach to local and national leadership.

Together, the availability of new HCV treatments and VA’s implementation efforts to increase their uptake resulted in a dramatic increase in treatment and cure of HCV in VA. While only 10% of Veterans with HCV infection had ever been cured of HCV as of the end of FY14, by the end of FY16, 43% or 84,192 Veterans were cured, representing a fourfold increase [4].

While this rapid, national implementation effort has been a tremendous success for VA, it has also provided the opportunity to study the use of implementation strategies and their association with a measurable clinical outcome over time and on a national scale. We previously reported on the associations between implementation strategies and HCV treatment starts at the site level and the extent to which strategies were related to HIT activities in the first year of the HIT Collaborative [5]. To frame our evaluation, we used expert-based definitions of implementation strategies, or methods to increase the uptake of evidence-based practices [6, 7], from the Expert Recommendations for Implementing Change (ERIC) project. ERIC defined 73 individual strategies [8] and then used a mixed-methods process called concept mapping [9] to develop conceptually distinct clusters of the strategies [10]. As the Collaborative continued, we had the opportunity to study how implementation strategy use changed over time within a nationwide healthcare system, particularly in the context of a learning collaborative [3].

This evaluation aimed to document (1) how reported implementation strategy use evolved over the first two years of the HIT Collaborative, (2) the changes in the associations between implementation strategies and clinical outcomes over time, and (3) the role of the HIT Collaborative in implementation strategy uptake.


Assessment of implementation strategies

Within VA, the HIT Collaborative was led by the National Hepatitis C Resource Center and the Office of Strategic Integration | Veterans Engineering Resource Center with the support of the National HIV, Hepatitis, and Related Conditions (HHRC) Program Office. These data were collected in service of the HIT Collaborative program evaluation, which was reviewed by the VA Pittsburgh Healthcare System IRB and deemed to be a quality improvement project and approved as such by HHRC. All participation in the evaluation was voluntary.

Using implementation strategies as defined by the ERIC project [8] and the clusters of strategies developed by Waltz et al. [10], we created a survey as previously described [5]. The survey asked whether each of the 73 strategies was used to improve HCV care at the site (yes/no) and, if so, whether the use of each strategy could be attributed to support provided by the HIT Collaborative (yes/no). We emailed providers a link to a web-based survey annually in FY15 (Year 1) and FY16 (Year 2).


The HIT Collaborative provided the contact information for VA HCV providers and HIT Collaborative members (as listed on the self-provided team rosters) from the 130 VA medical "stations" as classified by Population Health Services of the VA [11]. The individuals who were emailed included providers with varying degrees of affiliation with the HIT Collaborative. Potential participants were emailed twice as a group and once individually by the HIT Collaborative Leadership team following a modified Dillman approach [12]. Additionally, the HIT Collaborative Leadership Team encouraged members to complete the assessment on regularly scheduled calls.

At sites with more than one respondent, we retained a single response following a "key informant" technique, in which a knowledgeable individual answers questions for a site [13]. In the first year, we preferentially retained the response from the HCV lead clinician. If this person did not respond, we prioritized responses from the following providers, in descending order of priority: physician, pharmacist, nurse practitioner, physician assistant, other providers, and system redesign staff. In the second year, we prioritized retention of repeat respondents. If the same person did not respond in the second year, we followed the prioritization scheme outlined above. Previous experience with the survey and discussions with HIT team members suggested that any of these individuals would be knowledgeable enough to answer questions about HCV treatment and the use of implementation strategies.
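The retention rule above can be sketched as follows. This is a minimal illustration of the prioritization logic only; the role labels, field names, and function names are hypothetical and not the evaluation's actual code:

```python
# Illustrative sketch of the "key informant" retention rule:
# prefer a Year-1 repeat respondent, otherwise the highest-priority role.
ROLE_PRIORITY = [
    "hcv_lead", "physician", "pharmacist", "nurse_practitioner",
    "physician_assistant", "other_provider", "system_redesign",
]

def _rank(response, repeat_ids):
    # Repeat respondents sort first (0 before 1); ties break on role priority.
    is_repeat = 0 if response["respondent_id"] in repeat_ids else 1
    try:
        role = ROLE_PRIORITY.index(response["role"])
    except ValueError:
        role = len(ROLE_PRIORITY)  # unknown roles sort last
    return (is_repeat, role)

def keep_one_per_site(responses, repeat_ids=None):
    """Retain a single response per site; `responses` is a list of dicts
    with 'site', 'respondent_id', and 'role' keys."""
    repeat_ids = repeat_ids or set()
    kept = {}
    for r in responses:
        best = kept.get(r["site"])
        if best is None or _rank(r, repeat_ids) < _rank(best, repeat_ids):
            kept[r["site"]] = r
    return kept
```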

Data collection

In addition to collecting site-level implementation strategies in each year, respondents provided information regarding their participation in or affiliation with the HIT Collaborative (members vs. non-members), years in VA, and clinical specialty. Additionally, we classified sites using VA site complexity classifications [14]. The ratings are 1a, 1b, 1c, 2, and 3, in descending order of complexity, and are based on site financial resources, number of patients served, acuity, and services provided. The primary clinical outcome of interest was the number of Veterans started on HCV treatment per year at each site, as defined by VA's Population Health website [11].


We first described the provider and site characteristics in each year. For sites with more than one respondent in a given year, we calculated the interrater reliability. We then assessed the endorsement of strategies to determine which strategies were the most commonly used in Year 2 and the change in strategy use between years. We used chi-square tests to assess the statistical significance of the change in the proportion of participants using each strategy between years. The association between the total number of strategies and the total number of treatment starts was assessed using Pearson’s correlation and then linear regression, controlling for site complexity. Next, we assessed which individual strategies were significantly associated with the number of treatment starts using Spearman’s test of correlation. Using the map of strategy clusters from Waltz et al. [10], we arrayed the strategies significantly associated with treatment starts in Years 1 and 2 to show how they differed over time. Sensitivity analyses were conducted to assess whether the findings differed between repeat responders and first-time responders in Year 2 (at the site and individual respondent levels). We also assessed differences in responses by HIT membership status using chi-square tests.
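The correlation measures named above are standard statistics; the actual analysis would typically use a statistics package, but as a minimal, self-contained sketch, Pearson's r and Spearman's rank correlation (Pearson's r applied to rank-transformed data) can be computed as:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def ranks(xs):
    """1-based ranks; tied values share the average of their ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Spearman rank correlation = Pearson on the rank-transformed data."""
    return pearson(ranks(xs), ranks(ys))
```

Spearman's test is the natural choice for the strategy-level associations because treatment-start counts are skewed across sites, making a rank-based measure more robust than Pearson's r.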

For each implementation strategy, we asked participants whether they would attribute their use of the strategy at their site to the HIT Collaborative. We assessed these data by dividing the total number of sites attributing their use of a strategy to the HIT collaborative by the total number of sites endorsing that strategy. We then calculated the proportion of strategies endorsed in each cluster that was attributed to the HIT Collaborative.
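The attribution calculation described here is a simple ratio per strategy. A minimal sketch, assuming a hypothetical data layout mapping each strategy to per-site (endorsed, attributed) flags, might look like:

```python
def attribution_rate(endorsements):
    """For each strategy, divide the number of sites attributing its use to
    the HIT Collaborative by the number of sites endorsing it.
    `endorsements` maps strategy -> list of (endorsed, attributed) per site.
    Strategies endorsed by no site are omitted (rate undefined)."""
    rates = {}
    for strategy, sites in endorsements.items():
        attributed_flags = [attrib for used, attrib in sites if used]
        if attributed_flags:
            rates[strategy] = sum(attributed_flags) / len(attributed_flags)
    return rates
```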


Respondent characteristics

In Year 1 (FY15) and Year 2 (FY16), 62% and 81% of 130 VA sites responded to the surveys, respectively. Of these sites, 69 (53%) responded in both years. The same individual responded in both years in 47 (36%) of these cases. In Year 2, 23 sites had duplicate responses, and the interrater reliability was 0.65. There were 11 sites that only responded in Year 1 and 34 sites that only responded in Year 2. The responding sites in Year 2 were responsible for 84% of all national HCV treatment starts in that year.

Table 1 shows the respondent characteristics in both years. While there was a trend towards more pharmacy providers and fewer primary care providers responding in Year 2 vs. Year 1, this difference was not statistically significant (p = 0.14). Otherwise, the general demographic characteristics of the respondents were the same between years. There was a broad distribution of site complexity represented in both years. Notably, not all respondents were affiliated with members of the HIT Collaborative.

Table 1 Respondent characteristics

The number of patients with HCV and the numbers and percentages treated in each year are illustrated in Table 2. Approximately 20% of patients in the participating sites were treated in Year 1.

Table 2 HCV treatment among all VA sites and responding sites

Association between the total number of strategies endorsed and treatment starts

The FY15 findings were previously published [5] and are presented here for comparison with the FY16 data. A mean of 25 ± 14 strategies was endorsed in Year 1 and 28 ± 14 in Year 2. The total number of strategies endorsed was significantly correlated with the number of treatment starts in both years (Year 1: r = 0.43, p < 0.01; Year 2: r = 0.33, p < 0.01). Sites in the highest vs. lowest quartile of treatment starts endorsed significantly more strategies in both years (Year 1: 33 vs. 15 strategies; Year 2: 34 vs. 20; p < 0.01). The total number of strategies endorsed remained significantly associated with total treatment starts when controlling for site complexity in both years. The adjusted R2 for these models was 0.30 in Year 1 and 0.29 in Year 2.

Specific strategies endorsed in each year

The most commonly used strategies in both years were changing the record system, having the medications on the formulary, using data experts, data warehousing, tailoring strategies, promoting adaptability, engaging patients to be active participants in their care, and intervening with patients to promote uptake/adherence (Table 3). Overall strategy use was largely consistent between the two years; however, four strategies showed statistically significant differential uptake. Those with increased uptake from FY15 to FY16 were tailoring strategies to deliver HCV care (+ 18%), promoting adaptability (+ 20%), sharing knowledge (+ 19%), and using mass media (+ 18%). None of the year-to-year decreases met the threshold for significance. At the cluster level, the evaluative and iterative strategies changed least between the years, while the "engaging consumers" and "adapting and tailoring to the context" clusters showed the largest increases.

Table 3 Strategy endorsement in each year and change between years

Table 4 shows that the strategies significantly associated with HCV treatment changed across the two years and that the difference-making strategies varied by year. In Years 1 and 2, respectively, 28 and 26 strategies were significantly associated with treatment starts, 12 strategies overlapped both years, 16 were unique to Year 1, and 14 were unique to Year 2.

Table 4 Strategies significantly associated with treatment in both years vs. only Year 1 or Year 2

Figure 1 illustrates the strategies significantly associated with treatment starts that were unique to each year, overlaid on the ERIC cluster map, with numbering corresponding to Table 2. The map shows that the significant strategies shifted locations between the years. In the first year of the new clinical innovation's availability, the uptake of treatment was significantly associated with strategies in the "provide interactive assistance," "train and educate stakeholders," and "develop stakeholder interrelationships" clusters. In Year 2, the significant strategies fell in the "use evaluative and iterative strategies" and "adapt and tailor to the context" clusters.

Fig. 1

Strategies associated with treatment starts in Year 1 vs. Year 2 mapped onto strategy clusters

We assessed differences in strategy endorsement between repeat responders and new responders in Year 2. Sites newly responding in Year 2 had strategy endorsement patterns more similar to repeat responders' Year 2 responses than to their Year 1 responses. One exception is that in Year 2, newly responding sites were significantly more likely than repeat sites to endorse "change the record system" (72% vs. 49%, p = 0.01). Otherwise, strategy endorsement appeared very similar to the Year 2 results for sites that had also responded in Year 1.

Respondents who were participating in the HIT Collaborative were significantly more likely to endorse specific strategies (Table 5). The strategies associated with increased treatment starts are highlighted in bold. Eight of the 10 strategies that were more likely to be endorsed by HIT members in Year 1 were also significantly positively associated with treatment starts in that year. The two strategies that were more likely to be endorsed by HIT members in Year 2 were significantly associated with treatment starts in Year 1 but not Year 2.

Table 5 Strategies significantly associated with HIT membership*

Attribution to the HIT Collaborative

Respondents self-reported whether each strategy they endorsed was used as a result of the HIT Collaborative (i.e., would not have been used without it); we assessed attribution in each year and how it changed over time (raw data presented in Additional file 1). Figure 2 shows the total number of sites endorsing each strategy (height of vertical bars) and the proportion attributed to the HIT Collaborative (blue). In Year 2, 54% of all strategy use was attributed to the HIT Collaborative, compared to 41% in Year 1. The ranges of strategy use and attribution were wide. Since the results were similar in both years, Year 2 (FY16) is presented below.

Fig. 2

Strategy use and attribution to the HIT Collaborative in Year 2

Table 6 shows the change between years of strategies attributed to the HIT Collaborative. The cluster least likely to be attributed to the Collaborative was “engaging consumers.” “Training and educating stakeholders” was also unlikely to be attributed to the HIT Collaborative in Year 1 (27%), but the percent attribution increased to 40% in Year 2. There was a 21% increase in the strategies attributable to the HIT in the "evaluative and iterative" cluster between the two years. HIT members were more likely than non-HIT members to attribute seeking the guidance of experts in implementation (29% vs. 9%, p = 0.01) and identifying and preparing champions (36% vs. 16%, p = 0.03) to the Collaborative.

Table 6 Percentage of strategies attributed to the HIT Collaborative by cluster in each year


We previously examined self-reported use of implementation strategies in a national sample and found that the total number of strategies used by a site was associated with the clinical outcome of HCV treatment starts [5]. In this study, we further investigated the use of strategies over time and the associations between site-level implementation strategy use and treatment starts over time. While many of the strategies did not change in use from Year 1 to Year 2, there was a significant increase in the following specific strategies: tailoring strategies, promoting adaptability, sharing knowledge, and using mass media. Moreover, while the total number of strategies used was associated with increased HCV treatment in each year, the specific strategies associated with treatment starts varied over time.

The EPIS Implementation Framework posits that implementation happens along four phases: Exploration, Preparation, Implementation, and Sustainment. The implementation strategies that are appropriate may vary by the stage of implementation [15]. These data support the idea that the implementation strategies associated with successful implementation of a clinical innovation change over time. When the oral HCV medications first became available, successful sites focused on preparation or "pre-implementation." The associated implementation strategies included training and education, as well as developing stakeholder interrelationships and seeking interactive assistance. After sites had established the necessary education and relationships, the most successful sites focused on iterating on these efforts and adapting to the context in Year 2. In other words, the strategies associated with treatment shifted across the ERIC group's concept map between years. The geography of this concept map was developed by implementation experts considering the global similarities of the strategies. The present data suggest that clusters, and strategies within clusters, may be differentially relevant depending on the phase of implementation. For example, strategies from the "train and educate stakeholders" and "develop stakeholder interrelationships" clusters were important in the first year, while strategies within the "use evaluative and iterative strategies" cluster were more closely associated with treatment starts in the second year. A more detailed accounting of the specific phases of implementation present over the reporting period could further clarify the relationships between phases of implementation and the strategies used.
Our finding that the strategies associated with HCV treatment changed from Year 1 to Year 2 supports the notion that successful sites used evolving strategies as the clinical innovation became more available and as the learning collaborative evolved.

These results must be interpreted in the context of the national HIT Collaborative. The timing of the national efforts to improve HCV care also corresponded to a major shift in the treatment of HCV from difficult-to-use, interferon-based treatments to simple, highly efficacious, curative, pill-based treatments. We aimed to assess how the Collaborative influenced the choice of activities to promote HCV care and how much of the strategy uptake related to the HIT Collaborative itself versus independent, local activities in response to the availability of these newer HCV medications. We asked providers to comment on whether strategies would have been done without the Collaborative and assessed how members of the Collaborative employed strategies differently than non-members. We found that approximately half of all implementation efforts nationally were attributed to the HIT Collaborative, meaning that providers felt that half of the activities would not have been done without the HIT Collaborative. Moreover, the activities that members of the Collaborative were more likely to engage in were those that are considered to be core elements of learning/quality improvement collaboratives in the literature [3]. For example, education/training, team-building, and communication with leadership are all essential elements of learning collaboratives and were endorsed more frequently by HIT members than non-members. Thus, these analyses are useful in assessing the role of the learning collaborative in the uptake of the clinical innovation via implementation strategy uptake.

While learning collaboratives are increasingly popular, their effectiveness is relatively untested [3]. These data provide preliminary support for the effectiveness of learning collaboratives. For example, strategies that were difference-making in this sample were often those strategies considered to be core components of learning collaboratives including using data relay, training and education, creating new teams, facilitation/technical assistance, and stakeholder interrelationship strategies. Moreover, the ERIC strategy “facilitate the formation of groups of providers and foster a collaborative learning environment” which specifically refers to learning collaboratives was significantly associated with treatment in both years, suggesting that the learning collaborative itself was associated with increased treatment. The HIT Collaborative specifically focused on building stakeholder interrelationships and using principles of system redesign including rapid cyclic tests of change. Sites frequently endorsed these types of strategies and attributed their uptake to the Collaborative. Additionally, the HIT Collaborative attribution in the "evaluative and iterative" cluster, which was critical to the success in Year 2, increased substantially from Year 1 to 2, indicating that the Collaborative was instrumental to site-level success. Most of the strategies endorsed more often by HIT members vs. non-members were significantly associated with treatment starts. These data thus provide some preliminary support for learning collaboratives as an effective means of increasing the uptake of a clinical innovation or evidence-based practice.

Implementation strategies have historically been difficult to measure. Generally, tracking strategies has been accomplished by observation rather than by self-report on a comprehensive list of strategies [16, 17]. While we previously reported on developing a survey using the ERIC study's definitions of implementation strategies, it remained unclear whether non-implementation scientists would interpret these strategies accurately and reliably. In both years of data collection, we found associations between clinical outcomes and specific implementation strategies. The second year of data collection further demonstrates that providers could interpret and answer questions about implementation strategies. First, there was adequate interrater reliability within sites in both years. Second, there was consistency across the years in that several of the strategies were associated with treatment starts in both. Third, the strategies individually associated with treatment starts were in some cases those supported by the implementation literature. For example, facilitation is a well-studied strategy and was associated with higher treatment rates in Year 1 [18,19,20,21,22]. Fourth, we found that providers were generally able to distinguish between similar strategies. The strategy clusters were designed to group similar strategies, and we did not find strong correlations between endorsement of specific strategies within a cluster. In fact, there was significant variation in endorsement of the strategies within clusters (where the most similar strategies are housed). These findings indicate that such surveys can be used to track implementation strategies across a wide range of provider types, education levels, and geographic locations.

This study has several notable limitations. First, we relied on year-end self-report of implementation and included only one response per site. We found that the results had face validity, as outlined above, and there was adequate interrater reliability when we assessed the reports of sites with more than one respondent. However, future studies would benefit from directly observing site-level implementation or documenting the application of implementation strategies over the course of the reporting time period. Second, it is unclear if the self-reported attribution data related to increased awareness of the HIT Collaborative or social desirability, given that more of the respondents in Year 2 were HIT members. Theoretically, any strategy could have been "correctly" attributed to the Collaborative, since the Collaborative leadership team encouraged and supported any strategy that the sites felt would be effective. A third limitation is that we included limited contextual factors in our associations between implementation strategy use and clinical outcomes. However, participant characteristics were not significantly associated with strategy endorsement. Given the uniformity of structure within VA, this may be less important, but in applying these lessons to non-VHA sites, more contextual information may have to be collected. We were also unable to assess the timing, sequencing, or intensity of implementation strategies within each year. It may be that specific strategies need to be sequenced in a specific order within the year. While the simple listing of strategies allowed us to quickly collect data from across the country, these data do not detail how the strategies were operationalized by sites. Often the application of implementation strategies may vary broadly and lead to difficulties assessing which elements of a strategy are critical or difference-making [23]. We successfully tracked strategies across the years, which is a strength of these analyses.
A final limitation was our limited choice of outcome measures. We considered focusing on the proportion of patients treated as our primary outcome but were concerned that sites with fewer patients would have an artificial advantage, since sites treating half of the patients would look the same, whether they had 20 or 2000 patients to treat. Our findings should be validated in other clinical contexts with other medical problems. Future work in this area could aim to address whether specific combinations of strategies are important or how to use these data to address low-performing sites. Future work could also assess the associations between the sites’ stage of implementation [24] and strategy utilization.


These findings collectively indicate that the strategies associated with the uptake of a clinical innovation change over time from “pre-implementation” strategies including training and education, interactive assistance, and developing stakeholder interrelationships to strategies that are evaluative and iterative and adapt to the context, which indicate a more mature phase of implementation. This research advances the field by providing support for the implementation strategies and clusters developed by the ERIC group. This project demonstrates the utility of deploying surveys of implementation strategies sequentially across the life of a national clinical program, which could provide guidance to other national initiatives.



Abbreviations

ERIC: Expert Recommendations for Implementing Change

FY: Fiscal year

HCV: Hepatitis C virus

HIT: Hepatitis C Innovation Team

IQR: Interquartile range

VA: Department of Veterans Affairs

  1. 1.

    Naghavi MWH, Lozano R, et al. Global, regional, and national age-sex specific all-cause and cause-specific mortality for 240 causes of death, 1990−2013: a systematic analysis for the Global Burden of Disease Study 2013. Lancet. 2015;385(9963):117–71.

    Article  Google Scholar 

  2. 2.

    Wedemeyer H. Towards interferon-free treatment for all HCV genotypes. Lancet. 2015;385(9986):2443–5.

    Article  Google Scholar 

  3. 3.

    Nadeem E, Olin SS, Hill LC, Hoagwood KE, Horwitz SM. Understanding the components of quality improvement collaboratives: a systematic literature review. Milbank Q. 2013;91(2):354–94.

    Article  Google Scholar 

  4. 4.

    Belperio PS, Chartier M, Ross DB, Alaigh P, Shulkin D. Curing hepatitis C virus infection: best practices from the U.S. Department of Veterans Affairs. Ann Intern Med. 2017;167(7):499–504.

    Article  Google Scholar 

  5. 5.

    Rogal SS, Yakovchenko V, Waltz TJ, Powell BJ, Kirchner JE, Proctor EK, et al. The association between implementation strategy use and the uptake of hepatitis C treatment in a national sample. Implement Sci. 2017;12(1):60.

    Article  Google Scholar 

  6. 6.

    Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.

    Article  Google Scholar 

  7. 7.

    Waltz TJ, Powell BJ, Chinman MJ, Smith JL, Matthieu MM, Proctor EK, et al. Expert Recommendations for Implementing Change (ERIC): protocol for a mixed methods study. Implement Sci. 2014;9:39.

    Article  Google Scholar 

  8. 8.

    Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.

    Article  Google Scholar 

  9. 9.

    Kane M, Trochim WMK. Concept mapping for planning and evaluation. Thousand Oaks: Sage; 2007.

    Google Scholar 

  10. 10.

    Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10:109.

    Article  Google Scholar 

  11. 11.

    US Department of Veterans Affairs: Veterans Health Administration Office of Public Health. Population health services website: US Department of Veterans Affairs: Veterans Health Administration Office of Public Health; 2017 [Available from:

  12.

    Thorpe C, Ryan B, McLean SL, Burt A, Stewart M, Brown JB, et al. How to obtain excellent response rates when surveying physicians. Fam Pract. 2009;26(1):65–8.

  13.

    Tremblay M. The key informant technique: a nonethnographic application. Am Anthropol. 1957;59:688–701.

  14.

    VHA Facility Complexity Model. Veterans Health Administration; 2015 [Available from: Accessed 22 Nov 2017.

  15.

    Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Admin Pol Ment Health. 2011;38(1):4–23.

  16.

    Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, Shea C. Tracking implementation strategies: a description of a practical approach and early findings. Health Res Policy Syst. 2017;15(1):15.

  17.

    Boyd MR, Powell BJ, Endicott D, Lewis CC. A method for tracking implementation strategies: an exemplar implementing measurement-based care in community behavioral health clinics. Behav Ther. 2018;49(4):525–37.

  18.

    Kirchner JE, Ritchie MJ, Pitcock JA, Parker LE, Curran GM, Fortney JC. Outcomes of a partnered facilitation strategy to implement primary care-mental health. J Gen Intern Med. 2014;29(Suppl 4):904–12.

  19.

    Ritchie MJ, Parker LE, Edlund CN, Kirchner JE. Using implementation facilitation to foster clinical practice quality and adherence to evidence in challenged settings: a qualitative study. BMC Health Serv Res. 2017;17(1):294.

  20.

    Hysong SJ, Best RG, Pugh JA. Audit and feedback and clinical practice guideline adherence: making feedback actionable. Implement Sci. 2006;1:9.

  21.

    Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;6:CD000259.

  22.

    Ivers NM, Sales A, Colquhoun H, Michie S, Foy R, Francis JJ, et al. No more ‘business as usual’ with audit and feedback interventions: towards an agenda for a reinvigorated intervention. Implement Sci. 2014;9:14.

  23.

    Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG). Designing theoretically-informed implementation interventions. Implement Sci. 2006;1:4.

  24.

    Chamberlain P, Brown CH, Saldana L. Observational measure of implementation progress in community based settings: the Stages of Implementation Completion (SIC). Implement Sci. 2011;6:116.



The contents of this paper are solely the responsibility of the authors and do not represent the views of the U.S. Department of Veterans Affairs or the U.S. Government.


Funding was provided by the HIV, Hepatitis, and Related Conditions Programs to conduct the program evaluation of the VISN Hepatitis Innovation Team Initiative.

Funding for Dr. Rogal’s time was provided in part by AHRQ grant K12 HS019461 (PI: Kapoor) and the support of the VA Pittsburgh Healthcare System. The preparation of this article was supported in part by the Implementation Research Institute (IRI) at the George Warren Brown School of Social Work, Washington University in St. Louis, through awards from the National Institute of Mental Health (5R25MH08091607) and the Department of Veterans Affairs, Health Services Research & Development Service, Quality Enhancement Research Initiative (QUERI). BJP was supported by the National Institute of Mental Health (K01MH113806).

Availability of data and materials

All data generated or analyzed during this study are included in this published article.

Author information




SR, VY, RG, AP, DR, TM, MC, and MJC helped to conceptualize the study design and data collection tool. The analytical plan was developed by SR, VY, TW, BP, JK, RG, AP, TM, and MJC. SR and VY conducted the analyses. All authors contributed substantially to editing the manuscript and read and approved the final version.

Corresponding author

Correspondence to Shari S. Rogal.

Ethics declarations

Ethics approval and consent to participate

The Pittsburgh VA IRB determined that the initiative was exempt under a provision applying to quality improvement. This study was approved by the HIV, Hepatitis and Related Conditions Programs in the Office of Specialty Care Services as a quality improvement project as a part of the program evaluation for the Hepatitis C Innovation Team Collaborative.

Consent for publication

Figure 1 is adapted from a previously published image in the paper by Waltz et al. and is used with permission.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1:

Participant attribution of strategies to the HIT Collaborative in each year. (DOCX 18 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Rogal, S.S., Yakovchenko, V., Waltz, T.J. et al. Longitudinal assessment of the association between implementation strategy use and the uptake of hepatitis C treatment: Year 2. Implementation Sci 14, 36 (2019).

Keywords


  • Learning collaborative
  • Quality improvement
  • Cirrhosis
  • Advanced liver disease
  • Implementation science