The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change

Abstract

Background

Despite growth in implementation research, limited scientific attention has focused on understanding and improving sustainability of health interventions. Models of sustainability have been evolving to reflect challenges in the fit between intervention and context.

Discussion

We examine the development of concepts of sustainability, and respond to two frequent assumptions: 'voltage drop', whereby interventions are expected to yield lower benefits as they move from efficacy to effectiveness to implementation and sustainability, and 'program drift', whereby deviation from manualized protocols is assumed to decrease benefit. We posit that these assumptions limit opportunities to improve care, and instead argue for understanding the changing context of healthcare to continuously refine and improve interventions as they are sustained. Sustainability has evolved from being considered as the endgame of a translational research process to a suggested 'adaptation phase' that integrates and institutionalizes interventions within local organizational and cultural contexts. These recent approaches locate sustainability in the implementation phase of knowledge transfer, but still do not address intervention improvement as a central theme. We propose a Dynamic Sustainability Framework that involves: continued learning and problem solving, ongoing adaptation of interventions with a primary focus on fit between interventions and multi-level contexts, and expectations for ongoing improvement as opposed to diminishing outcomes over time.

Summary

A Dynamic Sustainability Framework provides a foundation for research, policy and practice that supports development and testing of falsifiable hypotheses and continued learning to advance the implementation, transportability and impact of health services research.

Background

As implementation science has grown [1, 2], researchers have advanced from studying the facilitators and barriers that influence uptake of effective programs and policies to investigating strategies to improve uptake. However, studies often evaluate only initial intervention adoption and implementation. Sustained practice change and broader scale-up of interventions [3] are rarely investigated, often because of the constrained timeframes for research set by grant mechanisms, and the budgetary and political pressures that push many decision-makers toward a short-term lens.

Recently, there has been interest in understanding and influencing the sustainability of implemented interventions. While this is progress, frequently used conceptualizations of sustainability implicitly replicate assumptions and limitations inherent in the traditional research-to-practice pathway [4, 5], or in its more recent conceptualization as translational research [6]. These conceptualizations of knowledge translation often assume that interventions are optimized prior to implementation, and that they are largely independent of the context in which they are delivered [7].

The presumed linear model of intervention development, efficacy testing and implementation has produced an armamentarium of efficacious healthcare treatments, preventive strategies, and public health interventions. While these discoveries have advanced a number of health domains, the resulting interventions are often difficult to implement across the myriad of practice settings and even harder to sustain over time, particularly in real-world and low-resource settings [8]. In addition, interventions are traditionally expected to perform worse in real-world practice than in the laboratory or the rarefied clinical trial setting. We argue for a new approach to sustainability that instead integrates the themes of adaptive, contextually sensitive continuous quality improvement (CQI) and a learning healthcare system with the challenge of intervention sustainment.

The purposes of this article are to explicate:

  1. Evolving understandings of sustainability and of related concepts of CQI and the learning healthcare system;

  2. An iterative, dynamic approach to sustainability, termed the 'Dynamic Sustainability Framework' (DSF), that integrates and extends these concepts; and

  3. Implications of this framework for research, policy, and practice.

Given the variation with which terms central to dissemination and implementation research are used, we include a table that lays out working definitions for the central terms of this debate (Table 1).

Table 1 Definitions of key terms used in this paper

Moving beyond 'voltage drop' and 'program drift'

While the traditional linear process of intervention development, derived from pharmaceutical medication development models [13], has often resulted in the creation of initially successful interventions, it may be less helpful in enabling these innovations to maximally benefit health. A linear approach may be particularly challenging to apply to complex, multi-component interventions, psychosocial treatments, treatment of the growing number of people with multimorbid conditions [14, 15], and systemic approaches to care [15]. Linear approaches place a premium on creating and 'freezing' an intervention, developing manuals to ensure its consistent delivery with fidelity [16], and then minimizing deviations from the intervention. For example, in the COMMIT national stop-smoking project, expert researchers designed the complex, lengthy intervention protocol based on research evidence, and then gave a several-hundred-page manual of operations to local staff to implement in their communities [17].

The importance of internal validity to the scientific process should not be ignored, but its overemphasis relative to generalizability and adaptation runs the risk of creating interventions that will not fit within different, complex or changing settings and of failing to benefit settings, clinicians, and patient populations who are underrepresented in the intervention testing process [7, 16]. Two key implicit assumptions within the traditional intervention development approach may limit ultimate progress toward intervention sustainability and population impact:

First, interventions are often developed with the idea that they can be optimally constructed, manualized, and then tested in a single form applicable across settings and over time. Efficacy trials are designed to screen out noise in the system (patient comorbidities, competing demands and skill variance of clinicians, resource limitations, varying motivations of patients) [7], and thus maximize outcomes. As interventions move to effectiveness testing and into implementation, the individual benefit of the intervention is expected to drop, owing to the added complexity of heterogeneous patients, providers and settings. This is referred to as 'voltage drop' (Figure 1). The assumption of 'voltage drop' results in missed opportunities to refine and improve the intervention, instead concluding that the declining benefit is expected and acceptable, and that the best possible outcome is that achieved at the efficacy stage.

Figure 1

Program drift and voltage drop. Illustrating the concepts of 'program drift', in which the expected effect of an intervention is presumed to decrease over time as practitioners adapt the delivery of the intervention (A), and 'voltage drop', in which the effect of an intervention is presumed to decrease as testing moves from efficacy to effectiveness to dissemination and implementation (D&I) research stages (B).

Second, the assumption that interventions can be optimally constructed in the early stages of the development and testing process, independent of context, suggests that, even at the stages of implementation and sustainability, change to the intervention is expected to have negative consequences, and that the further a practitioner deviates from the manual, the lower the benefit. This is the concept of 'program drift' (Figure 1). Delivering the intervention within an efficacy trial may require adherence to protocols that are challenging to deliver within real-world practice. Fidelity ratings then assume that 100% fidelity to original protocols will yield optimal outcomes, and effort is expended to ensure that practitioners do not deviate from the manual. Where clinicians do deviate from the protocol, the field expects that the resulting 'program drift' will compromise outcomes [12]. This over-reliance on quality assurance to prevent 'program drift' puts extensive pressure on real-world practices to adhere to intervention protocols without evidence that this adherence will lead to optimal outcomes. Quality assurance may thus inadvertently hamper sustainability and the ongoing improvement, customization and optimization of interventions, to the detriment of population health.
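To make these two assumptions concrete, the following minimal sketch (illustrative only; the decay forms and rates are our assumptions, not estimates from any study) expresses 'voltage drop' as a fixed fractional loss at each translational stage and 'program drift' as exponential decay with time since implementation, mirroring the curves in Figure 1:

```python
import math

def voltage_drop(efficacy_effect: float, stage: int, loss_per_stage: float = 0.2) -> float:
    """'Voltage drop' assumption: each translational stage (0 = efficacy,
    1 = effectiveness, 2 = D&I) loses a fixed, hypothetical fraction of effect."""
    return efficacy_effect * (1.0 - loss_per_stage) ** stage

def program_drift(initial_effect: float, months: int, drift_rate: float = 0.02) -> float:
    """'Program drift' assumption: benefit decays exponentially as
    practitioners deviate from the manualized protocol over time."""
    return initial_effect * math.exp(-drift_rate * months)

for stage, label in enumerate(("efficacy", "effectiveness", "D&I")):
    print(f"{label:>13}: effect = {voltage_drop(0.50, stage):.3f}")
for months in (0, 12, 24, 36):
    print(f"month {months:>2}: effect = {program_drift(0.50, months):.3f}")
```

Under these assumed rates, a 0.50 efficacy-stage effect falls to 0.32 by the D&I stage, and to roughly 0.24 after 36 months of drift; the remainder of this section rejects the inevitability of both patterns.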

In contrast, we reject the notion that an intervention can be optimized prior to implementation, and explicitly reject the validity of 'program drift' and 'voltage drop'. Rather, we suggest that the most compelling evidence on the maximal benefit of any intervention can only be realized through ongoing development, evaluation and refinement in diverse populations and systems [18]. Instead of viewing contextual factors as interfering with the delivery of an effective intervention and needing to be controlled, we see the opportunity to learn about the optimal fit of an intervention to different care settings [2]. For example, strategies have been developed to adjust organizational characteristics (e.g., culture, climate, structure) to enable improved fit between the intervention and the setting [19]; harnessing an understanding of context can enable beneficial adaptation of the intervention and improve sustainability. Unless these assumptions are rejected, we reify early-phase interventions tested in the most artificial settings, treat quality assurance of interventions as the optimal outcome, and miss opportunities for continued learning and development.

Understanding and advancing sustainability research

As the field of implementation science has matured [20, 21], more emphasis has been placed on understanding sustainability. Researchers have recognized that implementation of interventions, which can often require substantial resources, is meaningless without successful long-term use. Following Rabin et al.'s glossary for dissemination and implementation research in health [9], we draw a distinction between 'implementation', which relates to the initial process of embedding interventions within settings, and 'sustainability', which relates to the extent to which these interventions can continue to be delivered over time, be institutionalized within settings, and have the necessary capacity built to support their delivery.

Recent articles [22–24] have advanced the idea of an adaptation phase that bridges from the initial implementation effort to a longer-term sustainability phase. They argue for the need to examine the fit between the practice setting and the intervention and to make the changes necessary to improve the integration of the intervention into ongoing care processes. This is consistent with the institutional theory of organizations, which argues that the final stage of innovation requires the 'institutionalizing' of the new practice so that it becomes a working part of the organization [25].

As a consequence, assessment of organizational characteristics (e.g., structure, climate, culture, resources) is seen as an essential component of sustainability, and indeed, the fit between context and the intervention is at the center of a sustainability phase [24]. There has also been an emphasis on planning for sustainability much earlier in the intervention process [26]. Recent approaches to sustainability locate key efforts squarely in the implementation phase, arguing that once a practice has been implemented within a care system, those who manage the delivery of that practice should turn their attention to ensuring that the practice can be maintained over time [6, 24]. Authors typically suggest that this entails attention to issues of long-term financing, training of the workforce, supervision, and organizational support for the practice [26, 27]. A characteristic of this approach is to postpone emphasis on sustainability until after implementation is well underway, assuming that implementation and sustainability are sufficiently independent.

Authors have also highlighted the utility of assessing outcomes of those who have received the practice, something infrequently collected in routine practice [24, 28]. Seldom is it demonstrated that continued delivery of an intervention confers benefit on the patient population that receives the intervention or the system that delivers it (e.g., cost containment, efficiency of care, quality metrics). Measurement of outcomes over time to determine continued benefit has been shown to support sustainability of the practice [29, 30].

Recent implementation projects have created new tools and scales to study sustainability, including needs assessments, long-term action plans, tracking of program adaptation, financial planning, mapping of community networks, and measurement of the degree to which practices are integrated and institutionalized into service systems [31–33]. While this emerging focus on sustainability is an advance, many studies still assume a largely static service delivery system that needs to be assessed only at key time points, but not in an ongoing manner. To better reflect complexity and change within the system and in context, a more dynamic approach to sustainability is needed.

Framework

The dynamic sustainability framework (DSF)

As Heraclitus observed, 'The only constant is change.' The Dynamic Sustainability Framework has developed out of our evolving thinking and our collective experience in conducting and advancing implementation science, where we have seen inattention to constant change limit the extent to which implemented interventions are sustained over time in complex clinical and community settings. The DSF (Figure 2) emphasizes that change exists in the use of interventions over time, in the characteristics of practice settings, and in the broader system that establishes the context for how care is delivered. As classical thinking eloquently captures, change affects the ability of health interventions to be optimally used and sustained over time. This dynamism exists in the evidence base that links interventions to health outcomes, as judged by the continual stream of new journal publications adding to the available evidence on intervention effectiveness, as well as by ongoing practice surveillance systems that capture intervention impact. Dynamism exists in the interventions themselves, as evidence-based interventions undergo ad hoc adaptation and experimentation in practice. Furthermore, it exists in a constantly changing multi-level context [34], both internal to a clinical or community setting and in the broader care system, be it an organization, community, county, state or country.

Figure 2

The dynamic sustainability framework. Illustrating the goal of maximizing the fit between interventions, practice settings, and the broader ecological system over time (represented by T0, T1,…,Tn), each of which has constituent components that may vary.

The DSF, like many implementation models, centers on a few major elements: the intervention, the context in which the intervention is delivered, and the broader ecological system within which practice settings exist and operate. Distinct from those models, however, is the consideration of these elements over time. The intervention, as shown in the figure, often includes a set of individual components chosen for their ability to effect behavioral or biochemical change; an assumed set of characteristics defining who should deliver the intervention; targeted, patient-centered outcomes that the intervention should generate through its use; and a delivery platform (e.g., face-to-face, telephonic, web-based, mobile health app). Other constructs may also define the intervention.

The DSF anchors the ultimate benefit of the intervention in terms of its ability to fit within a practice setting, typically a clinical or community setting. This context carries its own set of characteristics, including human and capital resources, information systems, organizational culture, climate and structure, and processes for training and supervision of staff. The DSF, consistent with other models, argues that these practice characteristics will directly influence the ability of the intervention to reach the patient population that could benefit, and thus measurement of these contextual constructs is paramount to resolving fit.

At a third level, the DSF identifies the ecological system as an additional driver of the successful implementation and sustainability of an intervention. The ecological system consists of many practice settings that influence those working to incorporate a particular intervention, as well as the legislative and regulatory environment, characteristics of local, regional, state and national markets, and characteristics of the broad population. The ecological system is influenced by changes to available interventions and practice settings, and in turn, influences them.

Specific to the DSF, as emphasized by the dotted lines in Figure 2, is the expectation that change is constant at each of these levels (and ripples across multiple levels); thus the capacity of an intervention to be sustained over time lies in the measured, negotiated, and reciprocal fit of the intervention within the practice setting, and of the practice setting within the larger ecological system. The DSF suggests that optimal fit requires that characteristics of the intervention, practice setting, and ecological system be consistently tracked, using valid, reliable and relevant measures, and expects that interventions, settings and the ecological system will change over time, particularly where data suggest improvements that would better meet the needs of patients, the skills and resources within the practice setting, and the larger ecology.
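As one way to operationalize such tracking, the sketch below (a toy example; the measure names, the common 0-1 scale, and the scoring rule are our assumptions rather than validated instruments) represents each DSF level as a time-stamped set of measured characteristics and scores the fit between two levels as their agreement on shared measures:

```python
from dataclasses import dataclass

@dataclass
class LevelSnapshot:
    """Measured characteristics of one DSF level at one time point (T0, T1, ..., Tn)."""
    level: str        # "intervention", "practice_setting", or "ecological_system"
    time_point: int
    measures: dict    # measure name -> score on a common 0-1 scale (hypothetical)

def fit(a: LevelSnapshot, b: LevelSnapshot) -> float:
    """Toy fit score: 1 minus the mean absolute gap on measures the two
    levels share (e.g., required vs. available staff capacity)."""
    shared = set(a.measures) & set(b.measures)
    if not shared:
        return 0.0
    gap = sum(abs(a.measures[k] - b.measures[k]) for k in shared) / len(shared)
    return 1.0 - gap

intervention = LevelSnapshot("intervention", 0, {"staff_skill_required": 0.8, "visit_length": 0.6})
setting = LevelSnapshot("practice_setting", 0, {"staff_skill_required": 0.5, "visit_length": 0.6})
print(f"fit at T0 = {fit(intervention, setting):.2f}")  # 0.85: the skill gap dominates
```

In practice, the same structure could be re-scored at each time point T0, T1, ..., Tn to yield a fit trajectory that flags when adaptation is needed.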

The DSF is intended to suggest a new paradigm for considering the long-term use and ongoing improvement of interventions, recognizing the limitations of the evidence base made available through efficacy and effectiveness trials, and allowing that continuous exposure of the intervention to new populations, new contexts, and new innovations can result in continued improvement of outcomes, thus minimizing the perils of 'program drift' and 'voltage drop'. Indeed, the DSF posits that ongoing quality improvement of interventions is the ultimate aim, not quality assurance of them. To be clear, we see value in ensuring appropriate quality assurance within healthcare systems where clear assessments of an appropriate standard of care are made through knowledge of core intervention components. However, the DSF recognizes the limitations of intervention evidence drawn solely from clinical trials and argues that quality improvement processes focused on intervention optimization are ultimately more relevant to achieving sustainment.

The DSF, which has benefitted from the authors' ongoing dialogue with the implementation science community about the challenge of sustainability, follows the spirit of a number of existing models that emphasize three things: the importance of context, the need for ongoing evaluation and decision-making, and the goal of continuous improvement. These include Wandersman's Getting to Outcomes model [31], continuous quality improvement (CQI) [34], system dynamics [35], complexity theory [36], adaptive management [37], and the Evidence Integration Triangle [30]. In addition, the DSF is consistent with alternative views of organizational development [38] and the principles of systems science [39]. What distinguishes the DSF from many of these other models is the emphasis on omnipresent change, and the central goal of continuously optimizing the fit between the intervention and a dynamic delivery context to achieve maximal benefit. The DSF is anchored around the following seven tenets, for which we think there is evidence, but which we recommend be explicitly tested in this context:

An intervention cannot be fully optimized prior to implementation, or even prior to the onset of a 'sustainability phase'

Interventions benefit from ongoing optimization as they are applied in different contexts [37]. The evidence supporting the benefits of health interventions arises from trials that represent a very small slice of the diversity of demographics, preferences, and health status of the population at large [7, 40], and we should not expect evidence collected in one set of narrow, relatively optimal circumstances to apply perfectly in other, vastly different contexts [41]. A corollary of this tenet is that, other things being equal, quality improvement approaches that involve adjusting and refining programs should be more effective than 'quality assurance' procedures that emphasize fidelity to an initial protocol.

Interventions can be continually improved, boosting sustainment in practice, and can enable ongoing learning among developers, interventionists, researchers and patients

There is tremendous opportunity to aggregate evidence on the real-world impact of interventions when used in practice. We can apply the models of continual refinement that have been the cornerstone of software development (e.g., Firefox, Reaper, iTunes) and of Web 2.0 sites (e.g., Wikipedia, Facebook) [42]. By augmenting trial data with practice-based evidence, we can understand much more about what works for whom, the question underlying personalized medicine [43, 44]. This articulation of the DSF suggests the need for a long-term plan to commit resources for training and ongoing improvement. One implication of the DSF is that intervention impact can also be enhanced through increases in efficiency. The field has developed a plethora of multi-component interventions, often without studies that determine the minimal set of components needed to ensure benefit [45]. The DSF, congruent with the Consolidated Framework for Implementation Research [46], emphasizes the importance of streamlining interventions by peeling away components that may not be central to improving outcomes, or by adapting intervention components to a particular context.

Ongoing feedback on interventions should use practical, relevant measures of progress

Too often, intervention trials focus on markers that are psychometrically valid but of limited relevance to patients and clinicians [3, 47]. For example, very specific, intervention-related symptomatic scales may be most sensitive to change, but there is little guarantee that the measured change translates into a tangible, functional benefit for the patient [48]. The DSF thus suggests the use of measures, such as checklists, that are relevant to the desired outcomes of patients, sensitive to the 'fit' between interventions and context, and feasible to implement [49]. Across each of the changes in Figure 2, we see available streams of data that can offer leverage points for improving interventions. Environmental changes, for example, can be tracked via population surveys and market and claims data. Practice changes can be captured through electronic health records, claims data and practice surveys. Evidence reviews can provide key information on knowledge changes, and policy changes can be tracked via available federal and non-profit sources (e.g., CMS, the Kaiser Family Foundation). This is consistent with the CQI model, but with specific emphasis on the ongoing refinement of the intervention to counteract the assumptions of 'program drift'.
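The data streams named above can be organized into a simple monitoring configuration; the sketch below merely restates those sources in machine-readable form, and the grouping itself is an illustrative assumption:

```python
# Illustrative mapping of DSF change types to routinely available data streams,
# restating the sources named in the text; not an exhaustive specification.
FEEDBACK_STREAMS = {
    "environmental_change": ["population surveys", "market data", "claims data"],
    "practice_change": ["electronic health records", "claims data", "practice surveys"],
    "knowledge_change": ["evidence reviews"],
    "policy_change": ["CMS releases", "Kaiser Family Foundation trackers"],
}

for change_type, sources in FEEDBACK_STREAMS.items():
    print(f"{change_type}: monitor via {', '.join(sources)}")
```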

Voltage drop is NOT inevitable

We reject the assumption that the more diverse and complex a patient population is, the smaller the benefit of the intervention, referred to above as 'voltage drop' [11]. This assumption stems from an expectation that intervention studies require control of the environment to isolate a treatment effect [7]. If we embrace CQI of the specific health intervention, we expect that, with more experience, we will be better able to adapt interventions to contexts and patients. As we learn more about what works and what does not, and adjust protocols accordingly, the 'voltage' could be maintained or even increase over time. This echoes the computer industry, where each new release of a hardware or software line is expected to be better than the prior version. It also finds consonance with the evolution of the flu vaccine, which is refined each season in response to the changing nature of the influenza virus. A culture of improvement is central to ongoing intervention use and treats improvement of the intervention as central to the sustainability process [33].
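A minimal simulation, using rates chosen purely for illustration, contrasts the two worldviews: contextual change erodes a fixed fraction of effect each year ('drift'), while periodic CQI adaptation recovers or exceeds it, so the 'voltage' need not fall:

```python
def effect_trajectory(initial: float, years: int, drift: float, cqi_gain: float) -> list:
    """Yearly intervention effect: 'drift' is the fraction of effect lost to
    contextual change; 'cqi_gain' is the absolute recovery from adaptation."""
    effects, e = [], initial
    for _ in range(years):
        e = max(0.0, e * (1.0 - drift) + cqi_gain)
        effects.append(round(e, 3))
    return effects

# Static quality-assurance view: fidelity enforced, no adaptation.
print(effect_trajectory(0.50, 5, drift=0.10, cqi_gain=0.00))  # declines each year
# DSF view: the same contextual drift, offset by yearly CQI refinement.
print(effect_trajectory(0.50, 5, drift=0.10, cqi_gain=0.08))  # holds, then rises
```

With these illustrative rates, the unadapted effect declines year over year, while the adapted effect rises toward an equilibrium of cqi_gain/drift = 0.8, above its efficacy-stage starting point.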

Programs should be more likely to be maintained when there is strong 'fit’ between the program and the implementation setting

The concept of 'fit' has been discussed by other authors (e.g., Estabrooks, Glasgow and Dzewaltowski [50]) and goes back at least as far as Rogers' Diffusion of Innovations [51], where the concept of reinvention evoked the notion of departing from the original intervention concept to 'create' a new version suited to the preferences and constraints of the local context. Fit is a multi-level construct and involves alignment along multiple dimensions [50]. The DSF posits that fit will likely change over time, owing to changes in the way an intervention is delivered, in the characteristics of patients, providers and settings, and in the broader ecological system within which healthcare settings reside. Attention to this fit, through ongoing assessment and quality improvement efforts, should improve sustainment and ultimately identify opportunities for intervention improvement.

Organizational learning should be a core value of the implementation setting

While training and ongoing analyses exist in many organizations, the demands imposed by multiple levels of change require learning to be central to organizational activity if interventions are to become sustainable. The context both within an organization and in the broader ecological system is constantly changing, and requires a 'learning organization' [52] with problem-solving capacity at multiple levels [53]. Organizational learning should also target appropriate adaptation of evidence-based interventions, possibly in rapid learning cycles [44, 54], followed by ongoing assessment and feedback loops. The DSF is congruent with the concepts of the learning healthcare system, again with the emphasis on learning how to better develop, deliver and sustain interventions.

Ongoing stakeholder involvement throughout should lead to better sustainability

Continuously engaging stakeholders throughout the planning, implementation and adaptation processes should help increase the fit between the intervention and the local context, and help address evolving issues that might interfere with sustainability. Just as researchers have proposed intervention development processes [12] that focus on the ultimate site where interventions will be delivered, we argue that partnership among all relevant stakeholders is essential to maintaining and improving interventions within care settings.

As Figure 3 depicts, we view sustainability as akin to the challenge of fitting a puzzle piece within a large, evolving tableau. Without sensitivity to the characteristics of the intervention, the practice setting and the larger system, there is little expectation that the intervention will fit well within the setting, and as the context changes (noted by the changing shape in the figure), sustainment becomes harder and harder to achieve. However, by examining and adapting the intervention to a changing context, we believe that sustainment is not only possible, but that the utility of the intervention can be optimized. The experience of delivering the intervention in vivo over time serves to inform the ongoing evolution of the intervention (noted by the change in shape of the puzzle piece). By concentrating on the dynamic 'fit' between an intervention and its delivery context as the core ingredient underlying sustainability, we embrace opportunities to refine and improve the intervention.

Figure 3

Using the dynamic sustainability framework as an engine for quality improvement. The DSF depicts a dynamic view of sustainability, which allows for the evolution of an intervention within a changing delivery system. The changes in the shape of the puzzle pieces and of the contexts reflect the ongoing change to interventions, practice settings, and care systems, and show the use of quality improvement methods to optimize the 'fit' and improve the public health benefit of sustained use of interventions.

Contrasting static and dynamic views of sustainability

Table 2 compares static views of sustainability with the DSF, offering sharp contrasts in data collection and analysis, in opportunities for knowledge development, and in the incorporation of the 'noise' within healthcare contexts. Rather than seeking to simplify the phenomenon of study, either by avoiding adaptation of interventions or by assuming the context to be unchanging, the DSF embraces change as a central influence on sustainability. Adaptation is expected, and even encouraged. Assessment of care settings and outcomes is ongoing and incorporated within practice, and staffing and policy changes are incorporated in sustainability planning. Perhaps the biggest contrast between the static and dynamic views is that the static view limits the lessons that can in turn provide feedback to other areas of science; the DSF envisions an abundance of ongoing evidence that can be cycled back to continuously improve intervention design, testing, and ongoing system change.

Table 2 Contrasting static views of sustainability with the dynamic sustainability framework

Discussion

Implications of the DSF

This initial formulation of the DSF has implications for future practice, research and policy. For practice, the DSF highlights the need for continuous assessment of the local context, not just prior to implementation. This enables care settings to better manage the fit between their resources, their needs and the interventions, including generating consistent feedback on how interventions are delivered to diverse patients and on how patients fare as a result. The collection and analysis of this information allows practitioners to make informed decisions about how best to utilize existing interventions, allows potential enhancements to the interventions to be made and shared, and offers better information on which to base decisions to cease delivering interventions that confer no benefit. The intention is to recognize and support rapid-learning, real-time problem-solving organizations [54] that are full partners in the generation of knowledge, not just its application. Thus, the DSF promotes the use of multiple methods of planning for sustainability, including simulation modeling of the impact of different decisions, pilot testing of adaptations within local contexts, and continued experimentation. Perhaps an even greater benefit to practice would come through the pooling of data across a larger set of sites, practitioners and patients, something done with success in chronic disease care [55, 56].
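As a miniature sketch of what such simulation modeling might look like (the benefit probabilities, panel size, and run count below are hypothetical), the following code estimates how often a piloted adaptation would outperform the current protocol in a given practice before the change is committed to:

```python
import random

def prob_adaptation_wins(n_patients: int = 500, p_current: float = 0.40,
                         p_adapted: float = 0.46, n_runs: int = 2000,
                         seed: int = 1) -> float:
    """Monte Carlo estimate of the chance that a piloted adaptation yields
    more patient successes than the current protocol in one practice."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_runs):
        current = sum(rng.random() < p_current for _ in range(n_patients))
        adapted = sum(rng.random() < p_adapted for _ in range(n_patients))
        wins += adapted > current
    return wins / n_runs

print(f"P(adaptation outperforms current protocol) ~ {prob_adaptation_wins():.2f}")
```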

For research, the DSF dispels the notion that intervention development, refinement and improvement are completed prior to real-world implementation. In contrast, we suggest that development and refinement are never complete. Rather, sustainability is the process of managing and supporting the evolution of an intervention within a changing context. We recommend (and welcome testing of the idea) that programs that monitor context and adjust accordingly should do better in the long term. In addition, we envision research studies testing whether settings and programs using ongoing CQI or other means of feedback-driven improvement perform better over time. More broadly, we see the DSF as changing the notion of a linear transition from research to practice into a shared process of continual experimentation and analysis, using both practice settings and ecological systems to track changes and assess the evolving fit between interventions and practice settings. Principles of 'crowd sourcing' made popular within the IT industry, and embodied in collaboratively developed products like Firefox and Wikipedia, blur the lines between research and practice. We see this in the evolution of the electronic health record, and in the rapidly evolving patient health record, which is dramatically changing the scenario from one in which the medical expert makes all the decisions to one of collaborative care and shared decision-making [57, 58]. Instead of a small team of researchers developing an 'optimal', static product a priori, a large and often virtual community including users and consumers continuously upgrades dynamic products.

This initial formulation of the DSF also has implications for policy. Incentives are needed to support the ongoing adaptation of interventions, particularly where evidence is limited; these should include monitoring of progress and documentation of adaptations, using quality measures relevant to stakeholders and patients. Research funders must also determine how to support longer-term projects related to sustainability with flexible research designs, since the ultimate benefit of integrating and modifying interventions may not be evident for many years. Further, infrastructure to support the pooling of 'practice-based evidence' will be needed to ensure sufficient information is available about the long-term use and adaptation of interventions. The DSF aligns directly with a number of existing policy initiatives at national, state and local levels, including the advance of Patient-Centered Medical Homes, Accountable Care Organizations, Pay for Performance initiatives, and support for local demonstration projects.

We recognize that this conceptualization of the DSF should, consistent with its internal logic, be refined and improved over time. Whether this happens through testing of the tenets laid out in the previous section or through others' theoretical or empirical contributions, we offer the DSF as the beginning of a longer debate. For example, while Figure 2 shows three levels (intervention, practice setting, ecological system), we appreciate that many more levels of the system exist than we have depicted in the figure. We see further specification of the interrelationships of those levels as a useful area for further research and development. In addition, we see utility in aligning the DSF with alternative methods of developing and testing dissemination and implementation interventions that depart from the prevailing linear model. To present examples of how the DSF might be used to consider the sustainability of various types of interventions, we have included Table 3.

Table 3 Illustrative examples of the use of DSF for different types of interventions

Summary

It is time to embrace the culture of a learning healthcare system [44, 59] to promote the sustainability of interventions that are optimized and customized to the myriad of clinical and community settings. The enormous changes in health systems in the past few years give particular salience to a conceptualization of sustainability as a dynamic process, and provide an unparalleled opportunity to test and refine the principles offered in this paper. Without this emphasis, we reify the past by asserting the primacy of guidelines based on evidence from rarefied clinical trial settings, and give second-class status to ongoing learning in real-world settings. Treating researchers as the only ones permitted to generate new knowledge limits the opportunities to consistently improve the care we provide. In the past, these limitations were technological; we lacked the tools and the sources of information to drive rapid improvements. In this era of 'crowd sourcing', of exponentially expanding processing power and global connectedness, we no longer need to adhere to the view that, once created, interventions and healthcare settings must be 'frozen' to optimize effectiveness. Instead, we propose the DSF as a means of reconfiguring the research-practice-policy interface, in which the best possible information is gathered and used in real time to inform policy, improve practice, and answer the highest-priority research questions. Only then will the promise of a learning healthcare system be reached.

References

  1. Greenhalgh T, Robert G, Macfarlane F: Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004, 82 (4): 581-629. 10.1111/j.0887-378X.2004.00325.x.

  2. Glasgow RE, Chambers D: Developing robust, sustainable implementation systems using rigorous, rapid, and relevant science. Clin Transl Sci. 2012, 5 (1): 48-55. 10.1111/j.1752-8062.2011.00383.x.

  3. Haines A, Kuruvilla S, Borchert M: Bridging the implementation gap between knowledge and action for health. Bull World Health Organ. 2004, 82 (10): 724-732.

  4. Flay BR: Efficacy and effectiveness trials (and other phases of research) in the development of health promotion programs. Prev Med. 1986, 15: 451-474. 10.1016/0091-7435(86)90024-1.

  5. Greenwald P, Cullen JW: The new emphasis in cancer control. J Natl Cancer Inst. 1985, 74: 543-551.

  6. Stirman SW, Kimberly J, Cook N: The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012, 7: 17. 10.1186/1748-5908-7-17.

  7. Rounsaville BJ, Carroll KM, Onken LS: A stage model of behavioral therapies research: getting started and moving on from stage 1. Clin Psychol. 2001, 8 (2): 133-142.

  8. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C: National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health. 2012, 102 (7): 1274-1281. 10.2105/AJPH.2012.300755.

  9. Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL: A glossary for dissemination and implementation research in health. J Public Health Manag Pract. 2008, 14 (2): 117-123. 10.1097/01.PHH.0000311888.06252.bb.

  10. Aarons GA, Hurlburt M, Horwitz SM: Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011, 38: 4-23. 10.1007/s10488-010-0327-7.

  11. Kilbourne AM, Neumann MS, Pincus HA: Implementing evidence-based interventions in health care: application of the replicating effective programs framework. Implement Sci. 2007, 2: 42. 10.1186/1748-5908-2-42.

  12. Weisz JR, Kazdin AE: Evidence-Based Psychotherapies for Children and Adolescents. 2010, New York: Guilford Press.

  13. Kessler R, Glasgow RE: A proposal to speed translation of healthcare research into practice: dramatic change is needed. Am J Prev Med. 2011, 40 (6): 637-644. 10.1016/j.amepre.2011.02.023.

  14. Kaitin KI: Deconstructing the drug development process: the new face of innovation. Clin Pharmacol Ther. 2010, 87 (3): 356-361. 10.1038/clpt.2009.293.

  15. Craig P, Dieppe P, Macintyre S: Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008, 337: a1655. 10.1136/bmj.a1655.

  16. Prochaska JO, Evers KE, Prochaska JM: Efficacy and effectiveness trials: examples from smoking cessation and bullying prevention. J Health Psychol. 2007, 12 (1): 170-178.

  17. COMMIT Research Group: Community intervention trial for smoking cessation (COMMIT): summary of design and intervention. J Natl Cancer Inst. 1991, 83: 1620-1628.

  18. Cohen DJ, Crabtree BF, Etz RS: Fidelity versus flexibility: translating evidence-based research into practice. Am J Prev Med. 2008, 35: S381-S389. 10.1016/j.amepre.2008.08.005.

  19. Glisson C, Schoenwald SK, Hemmelgarn A: Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. J Consult Clin Psychol. 2010, 78 (4): 537-550.

  20. Brownson RC, Colditz GA, Proctor EK: Dissemination and Implementation Research in Health. 2012, New York: Oxford University Press.

  21. Chambers DA: Advancing sustainability research: challenging existing paradigms. J Public Health Dent. 2011, 71 (s1): S99-S100.

  22. Shediac-Rizkallah MC, Bone LR: Planning for the sustainability of community-based health programs: conceptual frameworks and future directions for research, practice and policy. Health Educ Res. 1998, 13 (1): 87-108. 10.1093/her/13.1.87.

  23. Scheirer MA: Is sustainability possible? A review and commentary on empirical studies of program sustainability. Am J Eval. 2005, 26 (3): 320-347. 10.1177/1098214005278752.

  24. Scheirer MA, Dearing JW: An agenda for research on the sustainability of public health programs. Am J Public Health. 2011, 101 (11): 2059-2067. 10.2105/AJPH.2011.300193.

  25. DiMaggio PJ, Powell WW: The iron cage revisited: institutional isomorphism and collective rationality in organizational fields. Am Sociol Rev. 1983, 48 (2): 147-160. 10.2307/2095101.

  26. Ory MG, Lee SM, Mier N: The science of sustaining health behavior change: the health maintenance consortium. Am J Health Behav. 2010, 34 (6): 647-659.

  27. Swerissen H, Crisp BR: The sustainability of health promotion interventions for different levels of social organization. Health Promot Int. 2004, 19 (1): 123-130. 10.1093/heapro/dah113.

  28. Greenhalgh J, Meadows K: The effectiveness of the use of patient-based measures of health in routine practice in improving the process and outcomes of patient care: a literature review. J Eval Clin Pract. 1999, 5 (4): 401-416. 10.1046/j.1365-2753.1999.00209.x.

  29. Duckers ML, Wagner C, Vos L: Understanding organisational development, sustainability, and diffusion of innovations within hospitals participating in a multilevel quality collaborative. Implement Sci. 2011, 6 (1): 18. 10.1186/1748-5908-6-18.

  30. Glasgow RE, Green LW, Taylor M: An evidence integration triangle for aligning science with policy and practice. Am J Prev Med. 2012, 42 (6): 646-654. 10.1016/j.amepre.2012.02.016.

  31. Wandersman A, Imm P, Chinman M: Getting to Outcomes: Methods and Tools for Planning, Evaluation and Accountability. 1999, Rockville, MD: Center for Substance Abuse Prevention.

  32. Estabrooks PA, Smith-Ray RL, Dzewaltowski DA: Sustainability of evidence-based community-based physical activity programs for older adults: lessons from Active for Life. Transl Behav Med. 2011, 1 (2): 1-8.

  33. Wilson KD, Kurz RS: Bridging implementation and institutionalization within organizations: proposed employment of continuous quality improvement to further dissemination. J Public Health Manag Pract. 2008, 14 (2): 109-116. 10.1097/01.PHH.0000311887.06252.5f.

  34. McLaughlin CP, Kaluzny AD: Continuous Quality Improvement in Health Care. 2004, Sudbury, MA: Jones and Bartlett.

  35. Homer JB, Hirsch GB: System dynamics modeling for public health: background and opportunities. Am J Public Health. 2006, 96 (3): 452-458. 10.2105/AJPH.2005.062059.

  36. Plsek PE, Greenhalgh T: The challenge of complexity in health care. BMJ. 2001, 323: 625-628. 10.1136/bmj.323.7313.625.

  37. Ebi K: Climate change and health risks: assessing and responding to them through 'adaptive management'. Health Aff. 2011, 30 (5): 924-930. 10.1377/hlthaff.2011.0071.

  38. Bushe GR, Marshak RJ: Revisioning organizational development: diagnostic and dialogic premises and patterns of practice. J Appl Behav Sci. 2009, 45 (3): 348-368. 10.1177/0021886309335070.

  39. Checkland P: Soft systems methodology: a thirty year retrospective. Syst Res. 2000, 17: S11-S58. 10.1002/1099-1743(200011)17:1+<::AID-SRES374>3.0.CO;2-O.

  40. Stange KC, Breslau ES, Dietrich AJ: State-of-the-art and future directions in multilevel interventions across the cancer control continuum. J Natl Cancer Inst Monogr. 2012, 44: 20-31.

  41. Green LW: Making research relevant: if it is an evidence-based practice, where's the practice-based evidence?. Fam Pract. 2008, 25 (Suppl 1): i20-i24.

  42. Bughin J, Chui M, Johnson B: The next step in open innovation. The McKinsey Quarterly. 2012, http://www.mckinseyquarterly.com/next_step_in_open_innovation_2155.

  43. Khoury MJ, Gwinn M, Glasgow RE: A population approach to precision medicine. Am J Prev Med. 2012, 42 (6): 639-645. 10.1016/j.amepre.2012.02.012.

  44. Etheredge LM: A rapid-learning health system: what would a rapid-learning health system look like, and how might we get there?. Health Aff. 2007, 26 (2): w107-w118. 10.1377/hlthaff.26.2.w107.

  45. Meissner HI, Vernon SW, Rimer BK, Wilson KM, Rakowski W, Briss PA: The future of research that promotes cancer screening. Cancer. 2004, 101 (5 Suppl): 1251-1259.

  46. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC: Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009, 4: 50. 10.1186/1748-5908-4-50.

  47. Glasgow RE, Kaplan R, Ockene J, Society of Behavioral Medicine Health Policy Committee: Patient-reported measures of psychosocial issues and health behavior should be added to electronic health records. Health Aff. 2012, 31 (3): 497-504. 10.1377/hlthaff.2010.1295.

  48. Tanenbaum S: Evidence-based practice as mental health policy: three controversies and a caveat. Health Aff. 2005, 24 (1): 163-173. 10.1377/hlthaff.24.1.163.

  49. Rabin BA, Purcell P, Naveed S: Advancing the application, quality and harmonization of implementation science measures. Implement Sci. 2012, 7: 119. 10.1186/1748-5908-7-119.

  50. Estabrooks P, Glasgow RE, Dzewaltowski DA: Physical activity promotion through primary care. JAMA. 2003, 289 (22): 2913-2916. 10.1001/jama.289.22.2913.

  51. Rogers EM: Diffusion of Innovations. 4th edition. 1995, New York: Free Press.

  52. Senge PM, Kleiner A, Roberts C: The Dance of Change: The Challenges of Sustaining Momentum in Learning Organizations. 1999, New York: Currency/Doubleday.

  53. Lorig KR, Holman HR: Self-management education: history, definition, outcomes, and mechanisms. Ann Behav Med. 2003, 26: 1-7.

  54. Institute of Medicine: A Foundation for Evidence-Driven Practice: A Rapid Learning System for Cancer Care. Workshop Summary. 2010, Washington, DC: National Academies Press.

  55. Wagner EH, Austin BT, Von Korff M: Organizing care for patients with chronic illness. Milbank Q. 1996, 74: 511-544. 10.2307/3350391.

  56. Unutzer J, Powers D, Katon W: From establishing an evidence-based practice to implementation in real-world settings: IMPACT as a case study. Psychiatr Clin North Am. 2005, 28 (4): 1079-1092. 10.1016/j.psc.2005.09.001.

  57. Detmer D, Bloomrosen M, Raymond B: Integrated personal health records: transformative tools for consumer-centric care. BMC Med Inform Decis Mak. 2008, 8 (45): 1-14.

  58. PatientsLikeMe: PatientsLikeMe website. 2012, http://patientslikeme.com. Accessed March 15, 2012.

  59. Institute of Medicine: Best Care at Lower Cost: The Path to Continuously Learning Health Care in America. Consensus Report. 2012, Washington, DC: National Academies Press.

Author information

Corresponding author

Correspondence to David A Chambers.

Competing interests

REG is a member of the Senior Advisory Board of Implementation Science; all decisions regarding this paper were made by the editors. The authors declare no other competing interests.

Authors’ contributions

DAC conceptualized and wrote the manuscript. REG and KCS provided input on the conceptual framework and edited the manuscript. All authors read and approved the final manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Cite this article

Chambers, D.A., Glasgow, R.E. & Stange, K.C. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implementation Sci 8, 117 (2013). https://doi.org/10.1186/1748-5908-8-117
