Studying de-implementation in health: an analysis of funded research grants

Abstract

Background

Studying de-implementation—defined herein as reducing or stopping the use of a health service or practice provided to patients by healthcare practitioners and systems—has gained traction in recent years. De-implementing ineffective, unproven, harmful, overused, inappropriate, and/or low-value health services and practices is important for mitigating patient harm, improving processes of care, and reducing healthcare costs. A better understanding of the state-of-the-science is needed to guide future objectives and funding initiatives. To this end, we characterized de-implementation research grants funded by the United States (US) National Institutes of Health (NIH) and the Agency for Healthcare Research and Quality (AHRQ).

Methods

We used systematic methods to search, identify, and describe de-implementation research grants funded across all 27 NIH Institutes and Centers (ICs) and AHRQ from fiscal year 2000 through 2017. Eleven key terms and three funding opportunity announcements were used to search for research grants in the NIH Query, View and Report (QVR) system. Two coders identified eligible grants based on inclusion/exclusion criteria. A codebook was developed, pilot tested, and revised before coding the full grant applications of the final sample.

Results

A total of 1277 grants were identified through the QVR system; 542 remained after removing duplicates. After the multistep eligibility assessment and review process, 20 grant applications were coded. Most grants were funded by NIH (n = 15), with fewer funded by AHRQ (n = 5), and a majority were funded between fiscal years 2015 and 2016 (n = 11). Grant proposals focused on de-implementing a range of health services and practices (e.g., medications, therapies, screening tests) across various health areas (e.g., cancer, cardiovascular disease) and delivery settings (e.g., hospitals, nursing homes, schools). Grants proposed to use a variety of study designs and research methods (e.g., experimental, observational, mixed methods) to accomplish study aims.

Conclusions

Based on the systematic portfolio analysis of NIH- and AHRQ-funded research grants over the past 17 years, relatively few have focused on studying the de-implementation of ineffective, unproven, harmful, overused, inappropriate, and/or low-value health services and practices provided to patients by healthcare practitioners and systems. Strategies for raising the profile and growing the field of research on de-implementation are discussed.

Background

In recent years, as healthcare costs have continued to rise [1], wasteful spending has been identified [2, 3], and more robust evidence about health practices and programs has become available, issues pertaining broadly to reducing (frequency and/or intensity) or stopping (ceasing) the use of harmful, ineffective, low-value, and/or unproven health services and practices have become more salient [4,5,6]. Indeed, overuse of health services and practices is quite costly: a report from the Institute of Medicine (IOM) estimated that waste in healthcare accounted for approximately $750 billion in 2009. Further, Berwick and Hackbarth [2] estimated that overtreatment accounted for upwards of $226 billion in wasteful spending in 2011. Rates of overuse vary widely by health area, patient population, and type of health service or practice [5, 7,8,9]. Among a sample of 2106 physicians in the US, participants considered approximately 20% of overall medical care to be unnecessary, including prescription medications (22%), tests (24.9%), and procedures (11.1%) [10]. Overuse of health services and practices also harms patients, imposing financial costs, emotional distress, anxiety, physical discomfort, adverse events, incidental findings, and diminished quality of life, among other burdens [10,11,12,13,14].

With the increasing recognition of these issues, there now exist specialty conferences and tracks focused on overuse (e.g., the Preventing Overdiagnosis Conference (http://www.preventingoverdiagnosis.net)) and professional society campaigns (e.g., the American Board of Internal Medicine’s [ABIM] Choosing Wisely campaign [15]). Moreover, the number of commentaries and empirical studies on medical reversals [16,17,18], overuse (including overuse of screening, testing, and treatment) [4, 5, 8], inappropriate use or misuse [13], and low-value care [19, 20] is increasing, as are publications on the de-implementation of specific health services and practices. Niven and colleagues identified 43 unique terms relevant to de-adoption, operationalized as “the discontinuation of a clinical practice after it was previously adopted,” among a sample of 109 articles [21]. Such variability in the use of terms is similar to that in implementation science, as reported by McKibbon and colleagues [22] in 2010.

Considerations for studying de-implementation, and identification of multilevel, contextual factors that may facilitate or impede de-implementation, have been discussed in the literature. For example, Prasad and Ioannidis described de-implementation processes that may vary as a function of the type of evidence for the practice, including (1) medical practices for which existing evidence is contradictory, (2) medical practices that are unproven, and (3) medical practices that are novel despite widespread use [17]. Importantly, Prasad and Ioannidis point to the need for more rigorous and replicable studies as a prerequisite to justify broader adoption, implementation, and routine use of health services and practices. Montini and Graham explored historical, economic, professional, and societal factors associated with de-implementation using radical mastectomy as a case study [23]. Niven and colleagues outlined ethical considerations for the discontinuation of health services and practices, including issues pertaining to beneficence, non-maleficence, justice, and autonomy [24]. Several studies have also focused on understanding factors associated with de-implementation and on developing strategies to facilitate it (e.g., [19, 25,26,27,28]).

To complement ongoing efforts to study de-implementation, we used systematic methods to identify and analyze research grants funded by the US National Institutes of Health (NIH) and the US Agency for Healthcare Research and Quality (AHRQ). Consistent with the general goals of portfolio analyses, our objective was to identify and describe funded research studies on de-implementation. Such data are critical for assessing the current state-of-the-science, synthesizing findings across health areas and delivery settings, and informing targeted efforts needed to advance research in this area.

Methods

Consistent with best practices in portfolio analyses, we used the NIH internal-use-only Query, View and Report (QVR) system to identify funded research grants on de-implementation across all 27 NIH Institutes and Centers (ICs) and AHRQ. The analysis included a selective text query involving key search terms along with specific criteria to find the most relevant grants. We limited our search to research-specific grants (vs. conference grants, for example), including the R-series mechanisms (research proposals; R01, R21, R03, R56 [29]) and the K-series mechanisms (Career Development Awards, which include training objectives and a study proposal; K08, K12, K23, K24 [29]), that were funded between fiscal year 2000 and February 2017. In addition, we reviewed all research grants funded through targeted funding opportunity announcements (FOAs, including program announcements (PAs); program announcements with special receipt, referral, and/or review considerations (PARs); and requests for applications (RFAs)) with at least a partial (but not exclusive) focus on areas related to de-implementation (e.g., implementation science, healthcare delivery).

Search strategy

We relied on the review by Niven and colleagues [21] to inform our selection of search terms for grants on de-implementation. Table 1 displays the 11 terms used in the search: choosing wisely [15], de-adopt% [21], decrease use [21], de-implement% [21, 23], de-prescrib% [30, 31], disincent% [21], disinvest% [32, 33], exnovat% [34], low-value [13, 19, 20], medical reversal [14, 18], and undiffus% [35], with the “%” notation capturing all tenses and endings of the given base word. The 11 key terms were searched in the abstract, title, and specific aims of each grant; grant documents were subsequently extracted from the QVR system for full-text coding.
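
For readers who wish to approximate this screening step outside the QVR system, the sketch below shows one way the wildcard terms could be applied to plain-text grant titles or abstracts. It is a minimal illustration under stated assumptions, not the QVR query itself: the QVR system's internal query syntax is not public, and the example abstract is fabricated.

```python
import re

# The 11 search terms from Table 1; "%" means "any tense/ending of the base word."
SEARCH_TERMS = [
    "choosing wisely", "de-adopt%", "decrease use", "de-implement%",
    "de-prescrib%", "disincent%", "disinvest%", "exnovat%",
    "low-value", "medical reversal", "undiffus%",
]

def term_to_pattern(term: str) -> re.Pattern:
    """Translate a '%'-suffixed base word into a case-insensitive regex."""
    if term.endswith("%"):
        # e.g., "de-implement%" matches de-implement, de-implementing, de-implementation, ...
        return re.compile(r"\b" + re.escape(term[:-1]) + r"\w*", re.IGNORECASE)
    return re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)

PATTERNS = [term_to_pattern(t) for t in SEARCH_TERMS]

def matches_any_term(text: str) -> bool:
    """Return True if the text (title, abstract, or specific aims) hits any term."""
    return any(p.search(text) for p in PATTERNS)

# Fabricated example: flags on "de-implementing" via the de-implement% term.
print(matches_any_term("We propose strategies for de-implementing low-value imaging."))
```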

Table 1 Search terms (n = 11), definition and/or key reference, and targeted funding opportunity announcements (FOA; n = 3)

In addition to the 11 key words, we included funded grants from three specific FOAs related to the study of de-implementation in our search strategy. Funding opportunity announcements are displayed in Table 1 and include (1) NIH Dissemination and Implementation Research in Health (R01, R21, and R03 mechanisms), (2) Research Answers to the National Cancer Institute’s (NCI) Provocative Questions (R01 and R21), and (3) the National Heart, Lung, and Blood Institute’s (NHLBI) Research Career Development Programs in T4 Implementation Research (K12). We searched all R-series and K-series grants funded from these announcements across all years they were accepting applications.

Eligibility criteria

We used a sequential process to identify eligible grants for full text extraction and coding and to exclude ineligible grants unrelated to de-implementation. Initially, one author (WN) reviewed all titles and excluded irrelevant grants; a second author (AK) reviewed a randomly selected sample of 10% of the excluded grants for quality control. Next, one author (WN) reviewed the abstract and specific aims of the remaining grants and excluded those considered out-of-scope; again, a second author (AK) reviewed a randomly selected sample of 10% of the excluded grants for quality control. Finally, one author (WN) reviewed the entire research plan of each grant for an explicit focus on de-implementation and excluded those deemed irrelevant. The final sample of grants was coded.
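
The authors do not specify how the 10% quality-control subsets were drawn; the snippet below is a minimal sketch of one plausible approach (simple random sampling with an arbitrary seed), using fabricated grant identifiers rather than study data.

```python
import random

def qc_sample(excluded_ids, fraction=0.10, seed=2017):
    """Draw a random quality-control subset of excluded grants for re-review.

    The 10% fraction follows the procedure described above; the sampling
    method and seed are illustrative assumptions, not the authors' protocol.
    """
    rng = random.Random(seed)
    k = max(1, round(len(excluded_ids) * fraction))
    return rng.sample(excluded_ids, k)

# Example with fabricated IDs: re-review ~10% of the 398 title-level exclusions.
excluded = ["GRANT-{:04d}".format(i) for i in range(398)]
print(len(qc_sample(excluded)))  # 40
```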

Codebook development and coding process

The codebook was developed through an iterative process that included a review of other NIH-specific portfolio analysis codebooks and publications [36,37,38], resources from NIH’s Office of Portfolio Analysis [39], a review of the literature on de-implementation [17, 21, 23, 35], and discussion among the study team. A randomly selected subset of four grants was double-coded by two authors (WN and AK) to pilot the codebook; the codebook was then discussed, refined, and finalized. The final version of the portfolio analysis codebook (Appendix 1) included eight domains and 36 codes with a “select all that apply” response option. Final domains (and select examples of codes) include (1) overall study objective related to de-implementation (e.g., understand or characterize factors influencing de-implementation; develop strategies to facilitate de-implementation), (2) health area (e.g., cancer, cardiovascular disease), (3) continuum of care (e.g., prevention, treatment), (4) practice or program (e.g., medication, screening test), (5) target patient population (e.g., children, adults), (6) study setting (e.g., hospitals, schools), (7) study design and research methods (e.g., experimental, observational), and (8) data source (e.g., primary, secondary).
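
As one way to operationalize the codebook during data entry, the sketch below represents each domain as a controlled vocabulary and checks that every assigned code is valid. The abbreviated domain and code names are paraphrased from Appendix 1, and the structure itself is an illustrative assumption rather than the authors' coding instrument.

```python
# Abbreviated, illustrative representation of the eight-domain, 36-code codebook
# (full wording in Appendix 1); "select all that apply" means each domain maps
# to a set of zero or more codes per grant.
CODEBOOK = {
    "objective": {"understand_factors", "test_strategies"},
    "health_area": {"cancer", "cardiovascular", "geriatric", "hormone",
                    "infectious", "kidney", "mental_health", "neurological",
                    "multiple", "not_specified"},
    "continuum": {"prevention", "screening", "diagnosis", "treatment", "surveillance"},
    "practice": {"drugs_medications_therapies", "preventive_screening_tests", "other"},
    "patient_population": {"children", "adults", "older_adults"},
    "setting": {"clinical_care", "hospital", "nursing_home", "school"},
    "design": {"experimental", "measurement", "mixed_methods", "observational",
               "qualitative", "quasi_experimental", "systems_science"},
    "data_source": {"primary", "secondary"},
}

def validate_coding(coding: dict) -> None:
    """Raise if a coded grant uses codes not defined in the codebook."""
    for domain, codes in coding.items():
        unknown = set(codes) - CODEBOOK[domain]
        if unknown:
            raise ValueError(f"Invalid codes for {domain}: {unknown}")

# Example: a fabricated grant coded on two domains passes validation.
validate_coding({"objective": {"test_strategies"}, "setting": {"hospital"}})
```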

Data analysis

Descriptive data for eligible grants were extracted from the QVR system, including administrative details (e.g., funding institute/agency, grant mechanism, year awarded), awardee information (e.g., principal investigator’s (PI) primary affiliation, institution type), funding information (e.g., total amount of funding awarded [USD], FOA, study section review), and publications associated with each grant (e.g., overall, journal) to characterize the portfolio of grants included in analyses.

Full grant application files were downloaded from the NIH platform that provides electronic access to complete grant files. The final set of grants was coded by two authors (WN and AK) using the final version of the codebook; codes were compared and discrepancies were discussed until consensus was reached, as applicable. Each coder read each grant application twice: first to gain familiarity with the content and second to code. Frequency and descriptive statistics were used to characterize the overall sample of grants by each of the eight domains and 36 codes, respectively.
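
Because every domain allowed “select all that apply” responses, the reported counts are per code rather than mutually exclusive categories. The sketch below shows one way such multi-label frequencies could be tabulated; the grant IDs and code labels are fabricated for illustration and are not study data.

```python
import pandas as pd

# Fabricated multi-label coding data: each grant may carry several codes
# within a domain (here, the "overall study objective" domain).
coded = pd.DataFrame({
    "grant_id": ["G01", "G02", "G03"],
    "objective": [
        ["understand_factors"],
        ["understand_factors", "test_strategies"],
        ["test_strategies"],
    ],
})

# Explode the list column so each grant-code pair becomes one row, then count
# how many grants received each code (counts can sum to more than the number
# of grants because codes are not mutually exclusive).
counts = coded.explode("objective")["objective"].value_counts()
print(counts)
```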

Results

Search and selection of de-implementation grants

Figure 1 displays the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) [40] flow diagram, adapted for this portfolio analysis. A total of 1277 grants were retrieved using the search terms and targeted FOAs listed in Table 1. After removing duplicates, the titles of the remaining 542 grants were reviewed for general relevance to de-implementation. Of these, 398 grants were deemed unrelated and subsequently removed; example titles of excluded grants include “The Crystal Optimizer: Kinetic Control of Protein Crystallization,” “Role of Mitochondria in HIV Lipoatrophy,” and “Droplet Cell Array Assays.” The abstracts and specific aims of the remaining 144 grants were further reviewed for relevance to de-implementation. A total of 124 grants were excluded, as they did not have an explicit focus on de-implementation. Examples of excluded grants included those proposing to estimate the effect of health policy reform on patients’ utilization of care services, examine factors influencing providers’ use of new effective drugs, and estimate the impact of payment reform on incidence of hospital-associated infections. The remaining grants (n = 20) were included in the final sample for the portfolio analysis. A copy of each full grant application was reviewed by two study authors (WN and AK). Titles of the 20 grants are listed in Appendix 2.

Fig. 1

Flow diagram of identification, screening, eligibility, and inclusion of grants for portfolio analysis on de-implementation of health services and practices. Flow diagram adapted from the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement. aSearch includes the following: National Institutes of Health and Agency for Healthcare Research and Quality; years 2000–2017; all awarded and funded grants; activity codes for all research grants (R series) and career development awards (K series); free text search in abstract, specific aims, title, and summary statement: disinvest%, medical reversal, de-implement%, de-adopt%, exnovat%, low value, undiffus%, “decrease use,” disincentiv%, “choosing wisely,” and de-prescrib% (combined with “or” and % searching all tenses of the base word). FOA query includes the following: all funded grants from the Dissemination and Implementation Research in Health (DIRH) FOAs, PAR-06-039, PAR-07-086, PAR-06-520, PAR-06-521, PAR-10-038, PAR-10-039, PAR-10-040, PAR-13-055, PAR-13-056, PAR-13-054, PAR-16-238, PAR-16-236, and PAR-16-237. RFA query includes funded grants from the Provocative Questions RFAs, CA-13-024 and CA-13-025 (group E, question 3), CA-15-008, and CA-15-009 (question 12). bNumber of unique projects, after removing duplicates (included amended applications, duplicate entries due to multiple principal investigators, etc.). cFirst round of quality control: examined grant titles and study sections of grants. dSecond round of quality control: examined abstract and specific aims of grants. eExclusion reasons: broad focus on variation in patient outcomes, quality of care, or cost; no specific focus on decreasing or stopping use of health services or practices; and examination of impact of health policy or reimbursement changes on utilization of health services or patient outcomes (e.g., reduction in hospital-associated infections) not specific to de-implementation

Descriptives of de-implementation grants

Table 2 displays the descriptives of the 20 de-implementation grants included for full coding and analysis. Fifteen grants were funded by NIH and five were funded by AHRQ. Most grants utilized an R-series research mechanism (n = 17), including R01 (n = 12), R21 (n = 3), R03 (n = 1), and R56 (n = 1), with fewer utilizing a K-series career development award mechanism (n = 3; K08 = 2; K24 = 1). A little more than half (n = 11) were awarded funding between fiscal years 2015 and 2016, reflecting a marked increase within the past few years. Most were awarded to academic institutions (n = 18) compared to a research organization (n = 1) or an independent hospital (n = 1). Principal investigators’ (PIs’) affiliations were with schools/colleges of medicine (n = 18), schools/colleges of public health (n = 8), and schools/colleges of pharmacy (n = 1). Among the sample of 25 PIs (including co-PIs for five grants), 10 held a medical degree (MD), 12 held a doctoral degree (PhD or ScD), and three held a dual degree (MD/PhD). Per NIH career-phase classification, 10 were new investigators and two were early stage investigators. Table 3 displays the amount of money (USD) awarded for all 20 grants, stratified by R-series or K-series and by specific mechanism. A total of $16.5M in direct costs was awarded across the 20 grants: $14.9M for R-series grants and $1.5M for K-series grants.

Table 2 Descriptives of de-implementation grants (N = 20)
Table 3 Amount of direct costs (USD) awarded for de-implementation grants (N = 20)

Grants were funded under a range of FOAs (Appendix 2). Examples include generally broad FOAs, such as the AHRQ Health Services Research Projects (R01 mechanism; PA-13-045; PA-14-291), the AHRQ Mentored Clinical Scientist Research Career Development Award (K08; PA-13-039), and the Research Project Grant (R01; PA-13-302), as well as more narrowly focused FOAs, such as Pilot Clinical Trials for the Spectrum of Alzheimer’s Disease and Age-related Cognitive Decline (R01; PAR-16-365), Dissemination and Implementation Research in Health (DIRH; R21; PA-13-054), and Research Answers to NCI’s Provocative Questions (R01; RFA-CA-15-008). Four de-implementation grants were funded through the DIRH FOA, representing 2% of the 201 grants funded through this FOA across the R01, R21, and R03 mechanisms and all years the FOA was available. Notably, this was the first time the PIs of these four grants had been funded through the DIRH PAR. Given the range of FOAs, variability was observed in the study sections through which grants were reviewed. For example, three grants were reviewed by the Health Services Organization and Delivery Study Section (HSOD), and two each by the Health Care Research Training (HCRT) and Healthcare Systems and Value Research (HSVR) study sections.

Using the bibliometric function in the QVR database, we identified 64 articles that acknowledged at least one of the 20 de-implementation grants as a source of funding (data not shown). Nine grants were associated with at least one publication (mean = 7 publications per grant; median = 3; range = 1–26). Collectively, the 64 articles were published across 37 journals, most of which published only a single article. Five of the 37 journals published three or more articles, including Journal of Oncology Practice (n = 7), Statistics in Medicine (n = 6), Health Affairs (n = 5), Health Services Research (n = 3), and Medical Care (n = 3). When compared to the list of top 20 dissemination and implementation (D&I) journals [41], only seven of those journals overlapped with the 37 journals identified herein. This discrepancy may be due in part to the specialty clinical areas in which the de-implementation grants were funded (e.g., cancer, cardiovascular) compared to the more general areas encompassed in many of the top 20 D&I journals. Of note, however, three of the seven overlapping journals (Health Affairs, Medical Care, and Health Services Research) published three or more articles from at least one of the de-implementation grants, perhaps reflecting a growing recognition that these issues cut across health domains and delivery settings.

Table 4 displays the results of the 20 applications that were reviewed and coded. The overall study objective domain included two codes: (1) understanding, describing, and/or characterizing factors influencing de-implementation and (2) developing, evaluating, and/or testing strategies to facilitate de-implementation, with grants double-coded, where applicable. Among the sample of 20 grants, 14 focused on understanding, describing, and/or characterizing factors influencing de-implementation, and 15 focused on developing, evaluating, and/or testing strategies to facilitate de-implementation.

Table 4 Results of portfolio analysis of de-implementation grants (N = 20)

Five grants had an exclusive focus on understanding, describing, and/or characterizing factors influencing de-implementation. For example, one of these five grants proposed to develop and validate a predictive model to identify frail older adults for whom colonoscopy would not be recommended, as a first step toward developing a clinical decision support tool for providers to reduce unnecessary colonoscopy procedures. Another grant proposed to use social network analyses to examine multilevel factors associated with the abandonment of inappropriate radiation therapy for cancer patients.

Six grants had an exclusive focus on developing, evaluating, and/or testing strategies to facilitate de-implementation. For example, one grant proposed to conduct a group-randomized controlled trial to assess the impact of an electronic clinical quality measure on reducing overuse of preventive services in primary care settings. Another study proposed to examine the impact of FDA-issued “black box” warning labels and restricted reimbursement on decreasing providers’ use of Epogen to treat anemia in patients with end-stage renal disease. Nine of the 20 grants were coded as having a dual focus on understanding, describing, and/or characterizing factors influencing de-implementation and developing, evaluating, and/or testing strategies to facilitate de-implementation. For example, one grant proposed to study how social networks relate to providers’ recommendation for routine breast cancer screening (including those for whom routine screening is not recommended), and subsequently use agent-based modeling to simulate interventions for changing providers’ screening behavior. Another grant proposed to identify patient- and provider-level factors influencing hormone replacement therapy (HRT) continuation or discontinuation, and to examine temporal trends in HRT prescribing behavior post-Women’s Health Initiative study. As a final example, one grant proposed to understand factors influencing providers’ use of antibiotics for skin and soft tissue infections in the emergency department, and to test the impact of a multi-component antibiotic stewardship intervention on decreasing inappropriate antibiotic prescribing behavior.

The de-implementation grants included a range of both acute and chronic conditions. Although most health areas were the focus of only one grant each, several grants focused on cancer (n = 8), infectious diseases (n = 3), and mental health (n = 2), and most targeted the treatment phase (n = 14) of the care continuum. Many grants focused on reducing or stopping the use of drugs, medications, or therapies (n = 15; e.g., potentially inappropriate medications (PIMs), including those defined by the American Geriatrics Society’s Beers Criteria and the Screening Tool of Older People’s Prescriptions (STOPP) criteria [42,43,44]; non-curative chemotherapy; antibiotics), with comparatively fewer focused on reducing or stopping the use of preventive or screening tests (n = 8; e.g., colorectal cancer screening, breast cancer screening, use of imaging and biomarkers for post-treatment surveillance). Grants proposed to study the de-implementation of a health service or practice provided to adults (i.e., 18–64 years old; n = 12), older adults (i.e., 65+ years old; n = 11), and children (i.e., < 18 years old; n = 2). Most proposals focused on clinical settings (i.e., clinical care = 16, hospital = 4, nursing home/assisted living facility = 2), with only one in a non-clinical setting (i.e., school = 1).

A variety of study designs and research methods were proposed across the 20 grants. Somewhat surprisingly, however, given the relatively recent emergence of this field of inquiry, twelve grants (60%) proposed to use an experimental (i.e., randomized controlled trial (RCT), cluster RCT (cRCT), pragmatic RCT (pRCT)) or quasi-experimental (i.e., regression discontinuity, natural experiment, interrupted time series) design in their study. Other grants proposed observational (n = 7; prospective or retrospective) or mixed methods designs (n = 4).

Discussion

Relatively few research grants funded by NIH and AHRQ have focused explicitly and/or exclusively on de-implementation, defined as reducing (frequency and/or intensity) or stopping the use or delivery of health services or practices that are ineffective, unproven, harmful, overused, inappropriate, and/or low-value by practitioners and delivery systems to patients. Among the sample of 542 non-duplicative grants, only 20 (3.6%) focused on understanding factors associated with de-implementation and/or testing strategies to facilitate de-implementation. It is encouraging and important to note, however, that most of these de-implementation grants were funded relatively recently (n = 11 in fiscal years 2015–2016), perhaps reflecting the beginning of an upward trend.

The relatively small number of de-implementation grants identified herein is rather surprising considering the large population of grants from which they were sampled and the fact that we used 11 key search terms and three specific FOAs across all 27 NIH ICs and AHRQ over a 17-year timeframe. In comparison, a recent portfolio analysis of D&I research grants limited to one NIH IC (i.e., NCI) identified 67 funded grants over a 10-year timeframe [38]. A separate analysis of D&I research grants funded across nine NIH ICs identified 76 funded grants over a 7-year timeframe [45]. Although uncommon, NIH-wide portfolio analyses have been conducted on single health areas or diseases (e.g., sickle cell disease [46]; cutaneous wounds [47]); such analyses still identified more grants (n = 247 and n = 91, respectively) over shorter timeframes (6 years and 1 year, respectively) than the 20 grants over 17 years reported herein.

Study findings reflect trends reported in the limited yet growing literature on de-implementation. For example, among the current sample of de-implementation grants, most focused on drugs, medications, or therapies (n = 15) and, to a lesser extent, on preventive or screening tests (n = 8) in healthcare delivery settings. Consistent with the scoping review by Niven and colleagues [21], as well as literature on prevalence of overuse of health services and practices [8, 48], drugs, medications, and therapies tend to be examined most frequently compared to other services or practices (e.g., behavioral interventions). Future research is needed to understand how strategies for de-implementation may vary as a function of the type of health service or practice (e.g., medical intervention, public health intervention, psychological intervention).

Given the relatively nascent state of the field, a surprising number of grants included experimental or quasi-experimental designs. However, these findings align with a review of published de-implementation studies [21]. Moreover, a systematic review by Tabak and colleagues found that 95 (83%) of the studies testing implementation strategies proposed to use an experimental design and 13 (11%) proposed to use a quasi-experimental design [49]. Experimental and quasi-experimental designs are the most appropriate designs for testing strategies to facilitate de-implementation, which was the overall objective of many of the de-implementation grants. The range of study designs and research methods proposed in the sample of 20 de-implementation grants is encouraging, to the extent that they reflect the best type of design needed to answer the diverse types of questions in de-implementation. Strategies proposed in the grants overlapped with the multilevel classifications identified by Colla and colleagues [19] for reducing the use of low-value services (e.g., patient-, provider-, system-, and policy-level strategies).

Overall, results indicate there is a need for increasing the submission and receipt of research grants on de-implementation within the context of two major biomedical research funding agencies in the US. To increase the number and scope of studies in de-implementation, several targeted efforts on behalf of the research, practice, policy, and funding communities may be warranted. Table 5 summarizes these recommendations and provides examples from other initiatives of how each may be pursued.

Table 5 Recommendations for raising the profile of research on de-implementation in health

First, as with any emerging area of inquiry, multifaceted approaches are needed to raise awareness and increase interest among the research community, most likely through the traditional forums of conference presentations, publications, meetings, and working groups, some of which are already underway (e.g., [50, 51]). Second, although several funding announcements are available with an explicit focus on de-implementation (e.g., Reducing Overscreening in Breast, Cervical, and Colorectal Cancers among Older Adults; R01, PA-17-110; Dissemination and Implementation Research in Health; R01, PAR-16-238), additional funding opportunities within and across NIH ICs and AHRQ, as well as other government funding agencies and private foundations, may be needed.

Third, consensus meetings on terminology, definitions, measurement, processes, and outcomes are needed to establish a solid foundation for how best to study de-implementation to move this area of inquiry forward. A recognition of the historical roots of studying overuse, underuse, and misuse, including landmark studies and reports (e.g., [13, 52,53,54,55]), as well as the contribution of other disciplines (e.g., clinical psychology, social psychology, public policy, health economics) in understanding and facilitating de-implementation, will serve efforts to advance research in this area well. Consistent with the overall tenets of implementation research, which emphasizes the use of diverse study designs (e.g., experimental, quasi-experimental, observational, modeling), research methods (e.g., qualitative, quantitative, mixed methods), partnerships, context, and generalizability, research on de-implementation may similarly seek to incorporate such perspectives.

Fourth, better coordination with ongoing de-implementation initiatives and key stakeholders is essential for advancing research on de-implementation. Natural partnerships may include those with the Choosing Wisely campaign [15] and the Canadian Deprescribing Network [56], among others. Fifth, researchers and funders should leverage forthcoming policy and practice changes as an opportunity to conduct “embedded research” on de-implementation. Embedded research studies are those nested prospectively within ongoing or forthcoming policy and practice changes as such efforts unfold (e.g., the Oregon Health Insurance Experiment [57,58,59]). Changes in policy, whether “small p” policy changes within an organization or “big p” policy changes at the local, state, or federal level [60, 61], may present a particularly opportune time to study how policy termination affects the de-implementation of health services and practices. Across all recommendations, carefully crafted messages will need to convey the importance and urgency of additional research in this area without inadvertently promulgating misconceptions about withholding appropriate health services and practices.

Several limitations of this study should be noted. Although we used 11 different terms identified from a recent systematic review [21], relevant published literature [14, 17, 35, 62], and targeted FOAs, some grants may not have been captured in our search. In the absence of clear, consensus-based conceptualizations, we relied on published examples of de-implementation protocols and studies (e.g., [25, 27, 63,64,65,66]) as well as recent reviews, thought pieces, and conceptualizations of de-implementation [14, 17, 21] to guide our selection of funded grants and subsequent coding process. It is possible that the sample of 20 grants is an underestimate of the total number of grants on de-implementation, to the extent that some grants may include several items in a survey or questions in a semi-structured interview about de-implementation, but do not have a predominant or exclusive focus on de-implementation. Future research is needed to identify similarities and differences between the study of de-implementation of health practices and programs and related areas of inquiry, such as healthcare delivery, implementation science, improvement science, and others. Although related, the predictors, processes, strategies, constructs, and outcomes involved in studying the reduction or cessation of an established health practice or program may be different than those involved in studying the increase or initiation of a new health practice or program, respectively.

Finally, given that our access to full copies of grant applications is limited to two US federally funded research entities (i.e., NIH and AHRQ), these findings may not generalize to other US-funded entities (e.g., Patient-Centered Outcomes Research Institute (PCORI), Centers for Disease Control and Prevention (CDC)) or non-US research-funding organizations.

Conclusions

Over the past 17 years, relatively few research grants on the de-implementation of health services and practices have been funded across NIH and AHRQ. Collaboration among researchers, practitioners, policymakers, patients, and funding agencies is needed to raise the profile of research on de-implementation across health areas, services, practices, and settings. Moving forward, the 20 grants reported herein provide a snapshot of the status of US-funded research on de-implementation and highlight an opportunity for more activity in this area of inquiry.

Abbreviations

ABIM:

American Board of Internal Medicine

AHRQ:

Agency for Healthcare Research and Quality

cRCT:

Cluster randomized controlled trial

HRT:

Hormone replacement therapy

ICs:

Institutes and Centers

NCI:

National Cancer Institute

NIH:

National Institutes of Health

PA:

Program announcement

PAR:

Program announcement with special receipt, referral, and/or review considerations

PI:

Principal investigator

pRCT:

Pragmatic randomized controlled trial

PRISMA:

Preferred Reporting Items for Systematic Reviews and Meta-Analyses

QVR:

Query, View and Report

RCT:

Randomized controlled trial

RFA:

Request for applications

References

  1. Martin AB, Hartman M, Benson J, Catlin A. National Health Spending in 2014: faster growth driven by coverage expansion and prescription drug spending. Health Aff (Millwood). 2016;35:150–60.

  2. Berwick DM, Hackbarth AD. Eliminating waste in US health care. JAMA. 2012;307:1513–6.

  3. Bentley TG, Effros RM, Palar K, Keeler EB. Waste in the US health care system: a conceptual framework. Milbank Q. 2008;86:629–59.

  4. Morgan DJ, Brownlee S, Leppin AL, Kressin N, Dhruva SS, Levin L, Landon BE, Zezza MA, Schmidt H, Saini V, Elshaug AG. Setting a research agenda for medical overuse. BMJ. 2015;351:h4534.

  5. Morgan DJ, Dhruva SS, Wright SM, Korenstein D. 2016 update on medical overuse: a systematic review. JAMA Intern Med. 2016;

  6. Sinha P. Don't just do something, stand there! JAMA Intern Med. 2017;

  7. Keyhani S, Falk R, Howell EA, Bishop T, Korenstein D. Overuse and systems of care: a systematic review. Med Care. 2013;51:503–8.

  8. Korenstein D, Falk R, Howell EA, Bishop T, Keyhani S. Overuse of health care services in the United States: an understudied problem. Arch Intern Med. 2012;172:171–8.

  9. Baxi SS, Kale M, Keyhani S, Roman BR, Yang A, Derosa AP, Korenstein D. Overuse of health Care Services in the Management of cancer: a systematic review. Med Care. 2017;55:723–33.

  10. Lyu H, Xu T, Brotman D, Mayer-Blackwell B, Cooper M, Daniel M, Wick EC, Saini V, Brownlee S, Makary MA. Overtreatment in the United States. PLoS One. 2017;12:e0181970.

  11. Hicks LK. Reframing overuse in health care: time to focus on the harms. J Oncol Pract. 2015;11:168–70.

  12. Lipitz-Snyderman A, Bach PB. Overuse of health care services: when less is more ... More or less. JAMA Intern Med. 2013;173:1277–8.

  13. Chassin MR, Galvin RW. The urgent need to improve health care quality: Institute of Medicine National Roundtable on health care quality. JAMA. 1998;280:1000–5.

  14. Cifu AS, Prasad VK. Medical debates and medical reversal. J Gen Intern Med. 2015;30:1729–30.

  15. Cassel CK, Guest JA. Choosing wisely: helping physicians and patients make smart decisions about their care. JAMA. 2012;307:1801–2.

  16. Prasad V, Cifu A. Medical reversal: why we must raise the bar before adopting new technologies. Yale J Biol Med. 2011;84:471–8.

  17. Prasad V, Ioannidis JP. Evidence-based de-implementation for contradicted, unproven, and aspiring healthcare practices. Implement Sci. 2014;9:1.

  18. Prasad VK, Cifu AS. Ending medical reversal: improving outcomes, saving lives: JHU Press; 2015.

  19. Colla CH. Swimming against the current—what might work to reduce low-value care? N Engl J Med. 2014;371:1280–3.

  20. Schwartz AL, Landon BE, Elshaug AG, Chernew ME, McWilliams JM. Measuring low-value care in Medicare. JAMA Intern Med. 2014;174:1067–76.

  21. Niven DJ, Mrklas KJ, Holodinsky JK, Straus SE, Hemmelgarn BR, Jeffs LP, Stelfox HT. Towards understanding the de-adoption of low-value clinical practices: a scoping review. BMC Med. 2015;13:255.

  22. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, Haynes RB, Straus SE. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a tower of Babel? Implement Sci. 2010;5:16.

  23. Montini T, Graham ID. “Entrenched practices and other biases”: unpacking the historical, economic, professional, and social resistance to de-implementation. Implement Sci. 2015;10:24.

  24. Niven DJ, Leigh JP, Stelfox HT. Ethical considerations in the de-adoption of ineffective or harmful aspects of healthcare. Healthc Manage Forum. 2016;29:214–7.

  25. Covinsky KE, Redberg RF. An intervention to reduce use of low-value imaging tests. JAMA Intern Med. 2016;176:198.

  26. Clyne B, Smith SM, Hughes CM, Boland F, Bradley MC, Cooper JA, Fahey T. Effectiveness of a multifaceted intervention for potentially inappropriate prescribing in older patients in primary care: a cluster-randomized controlled trial (OPTI-SCRIPT study). Ann Fam Med. 2015;13:545–53.

  27. Colla CH, Mainor AJ, Hargreaves C, Sequist T, Morden N. Interventions aimed at reducing use of low-value health services a systematic review. Med Care Res Rev. 2016:1077558716656970.

  28. Silverstein W, Lass E, Born K, Morinville A, Levinson W, Tannenbaum C. A survey of primary care patients’ readiness to engage in the de-adoption practices recommended by choosing wisely Canada. BMC Res Notes. 2016;9:301.

  29. Types of Grant Programs [http://grants.nih.gov/grants/funding/funding_program.htm#RSeries].

  30. Reeve E, Gnjidic D, Long J, Hilmer S. A systematic review of the emerging definition of ‘deprescribing’ with network analysis: implications for future research and clinical practice. Br J Clin Pharmacol. 2015;80:1254–68.

  31. Thompson W, Farrell B. Deprescribing: what is it and what does the evidence tell us? Can J Hosp Pharm. 2013;66:201.

  32. Elshaug AG, Hiller JE, Tunis SR, Moss JR. Challenges in Australian policy processes for disinvestment from existing, ineffective health care practices. Aust New Zealand Health Policy. 2007;4:23.

  33. Garner S, Littlejohns P. Disinvestment from low value clinical interventions: NICEly done? BMJ (Online). 2011;343

  34. Rodriguez HP, Henke RM, Bibi S, Ramsay PP, Shortell SM. The Exnovation of chronic care management processes by physician organizations. Milbank Q. 2016;94:626–53.

  35. Davidoff F. On the undiffusion of established practices. JAMA Intern Med. 2015;175:809–11.

  36. Alfano CM, Bluethmann SM, Tesauro G, Perna F, Agurs-Collins T, Elena JW, Ross SA, O'Connell M, Bowles HR, Greenberg D, Nebeling L. NCI funding trends and priorities in physical activity and energy balance research among cancer survivors. J Natl Cancer Inst. 2016;108

  37. Ramirez AS, Galica K, Blake KD, Chou WY, Hesse BW. Cancer communication science funding trends, 2000-2012. J Natl Cancer Inst Monogr. 2013;2013:133–9.

  38. Neta G, Sanchez MA, Chambers DA, Phillips SM, Leyva B, Cynkin L, Farrell MM, Heurtin-Roberts S, Vinson C. Implementation science in cancer prevention and control: a decade of grant funding by the National Cancer Institute and future directions. Implement Sci. 2015;10:4.

  39. Office of Portfolio Analysis [https://dpcpsi.nih.gov/opa].

  40. Moher D, Liberati A, Tetzlaff J, Altman DG, Group P. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6:e1000097.

  41. Norton WE, Lungeanu A, Chambers DA, Contractor N. Mapping the growing discipline of dissemination and implementation science in health. Scientometrics. 2017;

  42. Gallagher P, O’Mahony D. STOPP (screening tool of older persons’ potentially inappropriate prescriptions): application to acutely ill elderly patients and comparison with beers’ criteria. Age Ageing. 2008;37:673–9.

  43. O'Mahony D, O'Sullivan D, Byrne S, O'Connor MN, Ryan C, Gallagher P. STOPP/START criteria for potentially inappropriate prescribing in older people: version 2. Age Ageing. 2014:afu145.

  44. Campanelli CM. American Geriatrics Society updated beers criteria for potentially inappropriate medication use in older adults: the American Geriatrics Society 2012 beers criteria update expert panel. J Am Geriatr Soc. 2012;60:616.

  45. Tinkle M, Kimball R, Haozous EA, Shuster G, Meize-Grochowski R. Dissemination and implementation research funded by the US National Institutes of Health, 2005-2012. Nurs Res Pract. 2013;2013:909606.

  46. Gavini N, Hoots WK, Mensah GA, Hanspal M. An analysis of the NIH-supported sickle cell disease research portfolio. Blood Cells Mol Dis. 2015;54:198–205.

  47. Richmond NA, Lamel SA, Davidson JM, Martins-Green M, Sen CK, Tomic-Canic M, Vivas AC, Braun LR, Kirsner RS. US-national institutes of health-funded research for cutaneous wounds in 2012. Wound Repair Regen. 2013;21:789–92.

  48. Colla CH, Morden NE, Sequist TD, Schpero WL, Rosenthal MB. Choosing wisely: prevalence and correlates of low-value health care services in the United States. J Gen Intern Med. 2015;30:221–8.

  49. Tabak RG, Ramsey AT, Baumann AA, Kryzer E, Montgomery K, Lewis E, Padek M, Powell BJ, Brownson RC. Variation in research designs used to test the effectiveness of dissemination and implementation strategies: a systematic review. In: Annual conference on the science of dissemination and implementation in health. Washington, D.C.; 2015.

  50. Norton WE, Harris R, Kramer BK. De-implementation: exploring multi-level strategies for reducing Overdiagnosis and overtreatment. In: Preventing Overdiagnosis conference. Barcelona, Spain; 2016.

  51. Boyce C, Mensah GA; Center for Translation Research and Implementation Science, National Heart, Lung, and Blood Institute. De-implementation scientific challenges and prospects: to have, to hold, or to let go. 2017.

  52. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, Kerr EA. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348:2635–45.

  53. Lohr KN, Brook RH, Kamberg CJ, Goldberg GA, Leibowitz A, Keesey J, Reboussin D, Newhouse JP. Use of medical care in the RAND health insurance experiment: diagnosis-and service-specific analyses in a randomized controlled trial. Med Care. 1986;24:S1–S87.

  54. Kahan JP, Bernstein SJ, Leape LL, Hilborne LH, Park RE, Parker L, Kamberg CJ, Brook RH. Measuring the necessity of medical procedures. Med Care. 1994;32:357–65.

  55. Brook RH, Chassin MR, Fink A, Solomon DH, Kosecoff J, Park RE. A method for the detailed assessment of the appropriateness of medical technologies. Int J Technol Assess Health Care. 1986;2:53–63.

  56. Tannenbaum C, Farrell B, Shaw J, Morgan S, Trimble J, Currie J, Turner J, Rochon P, Silvius J. An ecological approach to reducing potentially inappropriate medication use: Canadian Deprescribing network. Can J Aging. 2017;36:97–107.

  57. Finkelstein A, Taubman S, Wright B, Bernstein M, Gruber J, Newhouse JP, Allen H, Baicker K, Group OHS. The Oregon health insurance experiment: evidence from the first year. Q J Econ. 2012:qjs020.

  58. Allen H, Baicker K, Taubman S, Wright B, Finkelstein A. The Oregon health insurance experiment: when limited policy resources provide research opportunities. J Health Polit Policy Law. 2013;38:1183–92.

  59. Baicker K, Taubman SL, Allen HL, Bernstein M, Gruber JH, Newhouse JP, Schneider EC, Wright BJ, Zaslavsky AM, Finkelstein AN. The Oregon experiment—effects of Medicaid on clinical outcomes. N Engl J Med. 2013;368:1713–22.

  60. Brownson RC, Chriqui JF, Stamatakis KA. Understanding evidence-based public health policy. Am J Public Health. 2009;99:1576–83.

  61. Purtle J, Peters R, Brownson RC. A review of policy dissemination and implementation research funded by the National Institutes of Health, 2007-2014. Implement Sci. 2016;11:1.

  62. Zelmer J. De-prescribing: when less is more in healthcare. Healthc Policy. 2016;11:6–10.

  63. Aron DC, Lowery J, Tseng CL, Conlin P, Kahwati L. De-implementation of inappropriately tight control (of hypoglycemia) for health: protocol with an example of a research grant application. Implement Sci. 2014;9:58.

  64. Fenton JJ, Kravitz RL, Jerant A, Paterniti DA, Bang H, Williams D, Epstein RM, Franks P. Promoting patient-centered counseling to reduce use of low-value diagnostic tests: a randomized clinical trial. JAMA Intern Med. 2016;176:191–7.

  65. Martin P, Tamblyn R, Ahmed S, Benedetti A, Tannenbaum C. A consumer-targeted, pharmacist-led, educational intervention to reduce inappropriate medication use in community older adults (D-PRESCRIBE trial): study protocol for a cluster randomized controlled trial. Trials. 2015;16:266.

  66. Winchester DE, Schmalfuss C, Helfrich CD, Beyth RJ. A specialty-specific, multimodality educational quality improvement initiative to deimplement rarely appropriate myocardial perfusion imaging. Open Heart. 2017;4:e000589.

  67. Rogers EM. Diffusion of innovations: Simon and Schuster; 2010.

  68. Norton WE, Kennedy A, Chambers DA. Studying de-implementation in health: an analysis of funded research grants. Implement Sci. in press;

  69. Kimberly JR. Managerial innovation. Handb Organ Des. 1981;1:104.

Acknowledgements

The authors are grateful to Gracie and Ginger Norton and Zahra and Junior Kennedy for their support on earlier versions of the manuscript.

Funding

Not applicable.

Availability of data and materials

Limited data (e.g., researcher, organization, title, abstract) on the funded research grants reported herein are available from the publicly accessible and searchable database, NIH Research Portfolio Online Reporting Tools (RePORT), https://projectreporter.nih.gov/. Detailed data (e.g., specific aims, research plan) on the funded research grants are not available in the public domain and are only available to extramural staff at the NIH and other HHS/federal agencies.

Author information

Authors and Affiliations

Authors

Contributions

WEN, AEK, and DAC participated in the study development, study design, coordination of data collection, data analysis and interpretation, and manuscript writing and editing. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Wynne E. Norton.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

WEN is on the editorial board for the journal Implementation Science. AEK and DAC declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix 1

Portfolio Analysis Codebook

  1. What are the overall study objectives? Select all that apply. (Variable: Objectives)

     a. Understand, describe, and/or characterize factors influencing de-implementation
     b. Develop, evaluate, and/or test strategies or approaches to facilitate de-implementation

  2. What health domain, condition, disease, or area does the grant focus on? Select all that apply. (Variable: Domain)

     a. Cancer
     b. Cardiovascular disease
     c. Geriatric syndromes (e.g., delirium, falls, functional decline, etc.)
     d. Hormone imbalance (e.g., hormone replacement therapy)
     e. Infectious diseases (e.g., urinary tract infections)
     f. Kidney disease
     g. Mental health
     h. Neurological (e.g., Alzheimer’s disease)
     i. Multiple (e.g., multiple preventive services not specified)
     j. Not specified

  3. What part of the continuum of care does the study address? Select all that apply. (Variable: Continuum)

     a. Prevention
     b. Screening and/or detection
     c. Diagnosis
     d. Treatment
     e. Surveillance

  4. What is the type of practice, program, intervention, or innovation that is the focus of de-implementation? Select all that apply. (Variable: Practice)

     a. Drugs, medications, or therapies (e.g., opioids, antibiotics, chemotherapy, dialysis)
     b. Preventive or screening tests (e.g., mammograms, prostate specific antigen [PSA])
     c. Other: _____________________________________

  5. What is the target patient population (i.e., for whom is the practice, program, intervention, or innovation being de-implemented)? Select all that apply. (Variable: PtPop)

     a. Children (<18 years old)
     b. Adults (18–64 years old or non-Medicaid population)
     c. Older adults (≥65 years old or Medicaid population)

  6. In what type of organization, delivery system, or setting does the study take place? Select all that apply. (Variable: Setting)

     a. Clinical care (e.g., primary care, specialty care)
     b. Hospital
     c. Nursing home or assisted living facility
     d. School (e.g., elementary school)

  7. What type of study design and research methods are used? Select all that apply. (Variable: Study)

     a. Experimental (e.g., randomized controlled trial [RCT], pragmatic RCT [pRCT], cluster/group RCT [cRCT])
     b. Measurement or algorithm development or validation
     c. Mixed methods (i.e., collection and integration of qualitative and quantitative data; e.g., sequential exploratory design)
     d. Observational (e.g., prospective, retrospective)
     e. Qualitative (e.g., focus groups, semi-structured interviews)
     f. Quasi-experimental (e.g., regression discontinuity, interrupted time series)
     g. Systems science (e.g., simulation modeling, social network analysis)

  8. What is the proposed source of data collection and analysis? Select all that apply. (Variable: Data)

     a. Primary data collection and analysis (i.e., research team collecting data for purpose of study; e.g., surveys, focus groups)
     b. Secondary data collection and analysis (i.e., research team leveraging existing or routinely-collected data for purpose of study; e.g., claims data, chart review, pharmacy refill)

Appendix 2

Table 6 Funding agency, de-implementation grant titles, and funding opportunity announcement (FOA; N = 20)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Norton, W.E., Kennedy, A.E. & Chambers, D.A. Studying de-implementation in health: an analysis of funded research grants. Implementation Sci 12, 144 (2017). https://doi.org/10.1186/s13012-017-0655-z
