  • Systematic review
  • Open access

Implementation of electronic prospective surveillance models in cancer care: a scoping review

Abstract

Background

Electronic prospective surveillance models (ePSMs) for cancer rehabilitation include routine monitoring of the development of treatment toxicities and impairments via electronic patient-reported outcomes. Implementing ePSMs to address the knowledge-to-practice gap between the high incidence of impairments and low uptake of rehabilitation services is a top priority in cancer care.

Methods

We conducted a scoping review to understand the state of the evidence concerning the implementation of ePSMs in oncology. Seven electronic databases were searched from inception to February 2021. All articles were screened and extracted by two independent reviewers. Data regarding the implementation strategies, outcomes, and determinants were extracted. The Expert Recommendations for Implementing Change taxonomy and the implementation outcomes taxonomy guided the synthesis of the implementation strategies and outcomes, respectively. The Consolidated Framework for Implementation Research guided the synthesis of determinants based on five domains (intervention characteristics, individual characteristics, inner setting, outer setting, and process).

Results

Of the 5122 records identified, 46 interventions met inclusion criteria. The common implementation strategies employed were “conduct educational meetings,” “distribute educational materials,” “change record systems,” and “intervene with patients to enhance uptake and adherence.” Feasibility and acceptability were the prominent outcomes used to assess implementation. The complexity, relative advantage, design quality, and packaging were major implementation determinants at the intervention level. Knowledge was key at the individual level. At the inner setting level, major determinants were the implementation climate and readiness for implementation. At the outer setting level, meeting the needs of patients was the primary determinant. Engaging various stakeholders was key at the process level.

Conclusions

This review provides a comprehensive summary of what is known concerning the implementation of ePSMs. The results can inform future implementation and evaluation of ePSMs, including planning for key determinants, selecting implementation strategies, and considering outcomes alongside local contextual factors to guide the implementation process.


Introduction

Cancer is one of the most prevalent, disabling, and costly conditions affecting people worldwide [1, 2]. People with cancer experience deleterious changes to wellbeing including physical, functional, and psychosocial challenges [3, 4]. The presence of cancer-related impairments decreases quality of life and diminishes cancer survivors’ ability to participate in work and life roles meaningfully [5, 6]. Therefore, supportive care strategies to manage treatment-related adverse effects and improve quality of life have become a priority in cancer survivorship research.

Despite the high prevalence of cancer-related impairments, adverse effects of treatments often go undetected and existing interventions to manage these impairments are underutilized [7, 8]. As such, there have been several calls to develop new approaches to care delivery, such as implementing a Prospective Surveillance Model (PSM) into standard care [9, 10]. A PSM includes routine assessment of cancer survivors’ needs and function throughout the cancer care continuum. It may facilitate early identification and intervention to manage anticipated and serious treatment-related adverse effects [9, 10].

Emerging technologies offer a potentially cost-effective and patient-centered solution to implement a PSM into clinical practice. An electronic PSM (ePSM) includes remote monitoring of patients at specified time points throughout their care using electronic patient-reported outcomes (ePROs) [9, 10]. ePROs provide a direct measurement of patient experiences and have been shown to be feasible and provide a reliable estimate of patients’ health and needs [11, 12]. An ePSM may also include an automated triage system to provide education and self-management materials and assist the oncology team with the assessment and synthesis of patient data to improve patient-provider conversations and help clinicians make appropriate referrals. Therefore, an ePSM has the potential to provide timely access to information and services to manage treatment-related symptoms and reduce rates of disability and dysfunction [9, 13].

While randomized controlled trials have demonstrated that ePSMs improve quality of life, decrease symptom distress and emergency room visits, and are associated with increased survival [14, 15], less is known about the implementation of ePSMs into routine care. Known barriers to implementation include a lack of resources for designing the system, ambiguity around appropriate risk stratification criteria to guide referral pathways, and time constraints for providers to address needs arising from ePRO scores [11, 12, 16].

Using an implementation science approach to move evidence-based practices such as an ePSM into routine clinical care has been identified as a priority for future research in cancer survivorship [17]. A comprehensive summary of the reported barriers and facilitators to implementing ePSMs, as well as the implementation strategies and corresponding outcomes that have been utilized, is necessary to facilitate ePSM use in routine cancer care. This scoping review aimed to provide a comprehensive synthesis of the approach to implementation reported in studies evaluating the use of ePSMs in oncology.

Methods

We conducted a scoping review following guidance from the Joanna Briggs Institute [18] and the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) reporting recommendations (Additional file 1). The following research questions guided this review:

  1. What theories, models, and frameworks (TMFs) have been used to guide the implementation planning and evaluation of ePSMs in oncology?

  2. What implementation strategies have been used to promote the implementation of an ePSM in oncology?

  3. What outcomes have been used to assess the success of the implementation of ePSMs in oncology?

  4. What is known about the determinants (barriers and facilitators) of the implementation of ePSMs in oncology?

Data sources and search strategy

A search was performed in Medline ALL (Medline and Epub Ahead of Print and In-Process, In-Data-Review & Other Non-Indexed Citations), Embase Classic + Embase, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews, Emcare, and PsycInfo (all on the OvidSP platform), and CINAHL on EBSCOhost, from inception to February 2021. Each search strategy combined controlled vocabulary and text words, adapted to each database’s search syntax. The search was restricted to human studies in adults (18 years and older) and excluded books and conference proceedings. There were no language restrictions (see Additional file 2 for all search strategies). Reference lists of relevant reviews and included studies were hand searched, and authors of relevant conference abstracts were contacted for full texts.

Study eligibility criteria

Eligible studies described the real-world implementation of an ePSM for adult cancer survivors (age 18 years and older). For this review, an ePSM must have included the routine collection of ePROs as surveillance to monitor and act on patients’ responses. “Routine” was defined as the systematic use of outcome measure(s) in clinical practice with every eligible patient as part of a standardized assessment [19], as previously reported [16]. Given that the objective of this review was to identify existing data related to implementation to inform future implementation efforts, we included articles reporting on studies that (1) explicitly used implementation science in their design, data collection, and analysis, or (2) reported on the implementation of an ePSM for routine care without using an implementation science approach. The latter were included because, although these studies may not have used implementation science explicitly, the approaches used to facilitate implementation (i.e., strategies), the outcomes collected, and the barriers and facilitators reported provided relevant data that could inform future approaches to implementation. However, studies reporting on the preliminary development of an ePSM (e.g., proof-of-concept) were excluded, as were studies on the routine collection of ePROs that did not include the option to act on patients’ responses (e.g., establishment of a longitudinal cohort or research database). Experimental, observational, qualitative, and mixed methods studies were included, while opinion pieces, guidelines, and published conference abstracts were excluded.

Study selection

After duplicates were removed, identified citations were exported to Covidence systematic review software. Two reviewers independently screened each title and abstract. The full texts of all potentially eligible articles were then retrieved and assessed independently by the two reviewers. Disagreements were resolved through discussion during bi-weekly meetings.

Data extraction

Relevant study information was extracted, including ePSM system characteristics and implementation details (e.g., TMFs, implementation strategies, outcomes, and barriers and facilitators). Two reviewers extracted data from all studies independently and in duplicate, and disagreements were resolved through discussion during bi-weekly meetings.

Data synthesis

A descriptive analysis was used to summarize the characteristics of the included studies, the TMFs utilized, implementation strategies used, the outcomes measured, and barriers and facilitators reported. Articles reporting on the same implementation project were analyzed as a single ePSM intervention; however, these studies were reported separately when the same ePSM system was adapted and implemented in different populations or settings. This decision was made as these studies may have used different implementation strategies, assessed different outcomes, or reported different determinants.

Before data analysis, all coded data on TMFs, strategies, outcomes, and determinants were reviewed by two independent reviewers, and disagreements were resolved through discussion. TMFs were categorized as (1) classic theories, which originate from fields outside of implementation science; (2) implementation theories, which implementation researchers have developed; (3) process models, which describe and/or guide the process of translating research into practice; (4) determinant frameworks, which describe factors that may impact implementation; and (5) evaluation frameworks, which specify aspects of implementation that could be evaluated to determine implementation success [20].

The Expert Recommendations for Implementing Change (ERIC) taxonomy [21] was used to label the implementation strategies described by the included articles. Team members extracted the specific terminology used to describe strategies in each study and coded the strategy based on definitions provided by the ERIC project. Each study was coded into one or more of 73 discrete implementation strategies which belong to one of nine thematic clusters, including (1) the use of evaluative and iterative strategies, (2) providing interactive assistance, (3) adapting and tailoring to the context, (4) developing stakeholder interrelationships, (5) training and educating stakeholders, (6) supporting clinicians, (7) engaging consumers, (8) utilizing financial strategies, and (9) changing the infrastructure [22]. For data coding, the definitions from the original ERIC list were slightly adapted for an ePSM intervention (see Additional file 3). For example, changing record systems involved integrating the ePSM into the electronic medical record or a patient portal. Intervening with patients to enhance adherence and uptake involved using system alerts to patients based on inactivity or using in-person reminders to complete ePROs when patients attend a clinic visit. Lastly, changing equipment encompassed setting up computer stations or obtaining tablets for the clinic for patients to complete their screening questions.
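This coding step can be sketched as a simple lookup from extracted strategy descriptions to ERIC discrete strategies and their thematic clusters. The strategy and cluster names below come from the ERIC taxonomy as used in this review; the phrase-to-strategy mappings are hypothetical illustrations, not the review's actual codebook.

```python
# Illustrative sketch of the ERIC coding step (not the authors' actual tool).
# Cluster assignments follow the ERIC taxonomy; phrases are hypothetical.

ERIC_CLUSTER = {
    "Change record systems": "Change infrastructure",
    "Change physical structure and equipment": "Change infrastructure",
    "Intervene with patients to enhance uptake and adherence": "Engage consumers",
    "Conduct educational meetings": "Train and educate stakeholders",
}

# Hypothetical extracted phrases mapped to ERIC discrete strategy labels,
# mirroring the adapted definitions described above.
PHRASE_TO_STRATEGY = {
    "integrated the ePSM into the electronic medical record": "Change record systems",
    "sent system alerts to patients based on inactivity": "Intervene with patients to enhance uptake and adherence",
    "set up tablets in clinic for symptom screening": "Change physical structure and equipment",
}

def code_study(extracted_phrases):
    """Return the set of (discrete strategy, cluster) codes for one study."""
    codes = set()
    for phrase in extracted_phrases:
        strategy = PHRASE_TO_STRATEGY.get(phrase)
        if strategy:
            codes.add((strategy, ERIC_CLUSTER[strategy]))
    return codes

codes = code_study([
    "integrated the ePSM into the electronic medical record",
    "set up tablets in clinic for symptom screening",
])
```

In practice this coding was done by the reviewers against the full 73-strategy ERIC list; the lookup above only illustrates the many-phrases-to-one-strategy, one-strategy-to-one-cluster structure of the taxonomy.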

Proctor’s implementation outcomes taxonomy was used to categorize the outcomes used to assess implementation, including (1) acceptability, (2) adoption, (3) appropriateness, (4) feasibility, (5) fidelity, (6) cost, (7) reach/penetration, and (8) sustainability [23], following guidance for the use of these outcomes for projects using patient-reported outcomes by Stover and colleagues [24]. The definitions from the implementation outcomes taxonomy were adapted for an ePSM intervention to facilitate the categorization (see Additional file 4). This provided specific measures for evaluating the implementation of an ePSM that could be used to resolve discrepancies between the terminology utilized by the included studies and the implementation outcomes taxonomy. For instance, while studies may report on the feasibility of an ePSM by assessing ePRO completion rates, Stover et al. [24] categorized this measure as fidelity. Similarly, while studies may report on the acceptability of an ePSM by assessing perceptions regarding the fit of the system with the patient population, Stover et al. [24] categorized this measure as appropriateness.
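The two reclassifications described above can be expressed as a small recoding table that maps a study's own terminology, together with the measure used, onto Proctor's categories. The measure labels below are hypothetical shorthand for illustration only.

```python
# Illustrative sketch of recoding study-reported outcome terms into
# Proctor's taxonomy, following the two examples from Stover et al.
# described above. Measure labels are hypothetical shorthand.

RECODE = {
    # (term used by the study, measure assessed) -> Proctor outcome category
    ("feasibility", "ePRO completion rate"): "fidelity",
    ("acceptability", "fit with patient population"): "appropriateness",
}

def proctor_category(reported_term, measure):
    # Keep the study's own term when no reclassification applies.
    return RECODE.get((reported_term, measure), reported_term)
```

For example, `proctor_category("feasibility", "ePRO completion rate")` returns `"fidelity"`, resolving the discrepancy between a study's terminology and the taxonomy.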

Reported barriers and facilitators to implementation were analyzed according to the Consolidated Framework for Implementation Research (CFIR) [25], a widely used determinant framework that includes 39 constructs within five domains (characteristics of the intervention, inner setting, outer setting, characteristics of individuals, and the process of implementation). The CFIR codebook template [26], which provides descriptions for each construct, guided the classification of the barriers and facilitators. First, one author (CL) categorized the barriers and facilitators extracted based on the five CFIR domains, and then coded the data according to the CFIR constructs within each domain. A second coder (KT) reviewed all the coded data and both coders met to discuss any necessary refinements.

Results

The database search yielded 4996 records, and 126 records were identified through reference checking of included articles and relevant reviews (Fig. 1). Following the removal of duplicates, 3446 citations underwent title and abstract screening, and 394 full-text articles were reviewed. Of these, 63 articles describing 46 interventions met all inclusion criteria (Table 1). While publication years ranged from 2005 to 2021, the majority were published in the last 5 years (n = 43, 68%). Of the 46 interventions included, nearly half (n = 22, 48%) were conducted in Europe [27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48], followed by North America (n = 20, 43%) [14, 15, 49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66], Australia (n = 3, 7%) [67,68,69], and the Philippines (n = 1, 2%) [70]. Most interventions targeted patients with a mix of cancer types (n = 24, 52%) [14, 15, 27, 30, 35, 37, 40, 41, 46, 47, 51, 52, 54, 55, 60,61,62,63, 65,66,67,68,69,70], followed by a focus on head and neck (n = 4, 8%) [31, 32, 48, 53], gynecologic (n = 3, 7%) [49, 57, 58], lung (n = 3, 7%) [39, 56, 59], and breast (n = 3, 7%) [28, 42, 64] cancers. Of the 46 ePSM studies, 33% (n = 15) explicitly used implementation science in their design, data collection, or analysis [30, 32, 35, 41, 44, 46, 50, 51, 53,54,55, 58, 60, 67, 69], while 67% (n = 31) reported on the implementation of an ePSM but did not use an implementation science approach [14, 15, 27,28,29, 31, 33, 34, 37,38,39,40, 42, 43, 47,48,49, 52, 56, 57, 59, 61,62,63,64,65,66, 68, 70,71,72]. 
Overall, 57% (n = 26) used a non-randomized experimental or quality improvement design [29, 31,32,33,34, 37,38,39, 41, 43, 44, 47,48,49,50, 52, 55,56,57,58,59, 62, 64, 65, 67, 68], 26% (n = 12) used a randomized experimental design [14, 15, 27, 28, 35, 40, 42, 46, 54, 70, 71, 73], 9% (n = 4) were descriptive case reports on the implementation of the intervention [53, 60, 61, 63], 4% (n = 2) used an observational design [30, 74], and 4% (n = 2) solely used a qualitative design [36, 66]. Notably, 30% (n = 14) of the studies included an additional qualitative component to their design [29, 31, 32, 35, 38, 39, 42, 45, 48, 50, 51, 67,68,69]. Within included studies, 41% (n = 19) focused exclusively on patients on active treatment [14, 15, 27, 29, 37, 40, 41, 44, 45, 47, 50, 54,55,56,57, 61, 62, 64, 65], 39% (n = 18) included patients during active treatment as well as follow-up surveillance [28, 30, 31, 33, 34, 38, 39, 43, 51,52,53, 58, 60, 63, 66,67,68,69], 11% (n = 5) were exclusively during follow-up surveillance [32, 36, 42, 46, 48], 4% (n = 2) during the postoperative period [49, 59], and 4% (n = 2) during palliative care [35, 70].
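The study-flow figures above can be sanity-checked with a few lines of arithmetic. The counts are taken directly from the text; the duplicate count is derived, not reported.

```python
# Arithmetic check of the reported study-flow counts and percentages.

identified = 4996 + 126        # database records + reference checking
assert identified == 5122      # total records identified

duplicates_removed = identified - 3446  # 3446 citations entered screening
interventions = 46

def pct(n, total=interventions):
    """Percentage of the 46 included interventions, rounded as in the text."""
    return round(100 * n / total)

assert pct(22) == 48  # Europe
assert pct(20) == 43  # North America
assert pct(26) == 57  # non-randomized experimental / quality improvement
```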

Fig. 1 Preferred Reporting Items for Systematic Reviews and Meta-Analyses flow diagram

Table 1 Characteristics of included studies

ePSM intervention characteristics

Just over half of the interventions (n = 25, 54%) did not have a fixed (e.g., weekly or monthly) surveillance schedule, most often asking patients to complete ePROs at any outpatient visit [14, 15, 28, 30,31,32,33,34, 36, 41, 46, 48, 51,52,53, 56,57,58, 60, 61, 63, 64, 66, 69, 70]. Some interventions allowed clinicians to personalize the frequency of reporting or allowed patients to report based on their own preference (n = 3, 7%) [28, 46, 70]. Among interventions with fixed surveillance schedules, the reporting frequency ranged from daily (n = 6, 13%) [35, 38,39,40, 43, 59] and weekly or bi-weekly (n = 11, 24%) [27, 37, 44, 45, 47, 49, 50, 54, 62, 65, 68] to monthly (n = 3, 7%) [29, 55, 67] and quarterly (n = 1, 2%) [42]. The duration of surveillance ranged from 1 month (n = 2, 4%) [50, 59], through greater than 1 to 6 months (n = 13, 28%) [15, 27, 28, 35, 38, 41, 43, 45, 49, 57, 64, 65, 70] and greater than 6 months to 2 years (n = 5, 11%) [36, 42, 54,55,56], up to 5 years after completing treatment (n = 1, 2%) [31]. Over half of the interventions did not specify a fixed duration of surveillance (n = 25, 54%), instead describing that patients were followed until they completed treatment or were no longer being followed by the oncology team [14, 29, 30, 32,33,34, 37, 39, 40, 44, 46,47,48, 51,52,53, 58, 60,61,62,63, 66,67,68,69].

The ePSM system features specified for each study in Table 1 are further described in Additional file 5. The most common patient-targeted features included automatically providing patients with self-management material to address symptoms (n = 17, 37%) [15, 27, 28, 30, 35, 38,39,40, 43, 46, 47, 50, 54, 59, 64, 67, 68], the option to view how scores had changed over time (n = 10, 22%) [15, 27, 30, 35, 38, 40, 43, 47, 59, 69], and an automated message on remotely used systems informing patients that their scores were not being actively monitored by their provider, along with contact information should further support be required (n = 9, 20%) [14, 27, 35, 45, 49, 50, 52, 56, 57]. Other features included the ability to message providers or administrators to ask questions or request an e-consult (n = 4, 9%) [42, 47, 59, 70]; general education about treatments and potential side effects and/or information about patients’ legal rights (n = 3, 7%) [28, 59, 66]; and the ability to view their circle of care, including a list of attending physicians and their contact information (n = 2, 4%) [40, 70].

The most common provider-targeted features included the option to view summary reports of patients’ symptoms, including graphs indicating symptom thresholds and severity (n = 41, 89%) [14, 15, 27, 29, 31,32,33,34,35,36,37,38,39,40,41,42, 44, 45, 47,48,49,50,51,52,53,54, 56,57,58,59,60,61,62,63,64,65,66,67,68,69,70], alerts for symptoms that had breached a specified threshold (n = 15, 33%) [14, 27, 35, 38,39,40, 43, 49, 50, 52, 54, 55, 57, 58, 67], the provision of recommended actions and referrals to facilitate symptom management (n = 5, 11%) [40, 54, 62, 63, 67], and the ability to send messages to patients, such as reminders, prescriptions, and appointment schedules (n = 3, 7%) [35, 58, 70].

Implementation theories, models, and frameworks

Ten studies (22%) reported using a theory, model, or framework to guide implementation planning or evaluation. Process models were used by six studies (14%) [39, 40, 51, 53, 73, 83], such as the Medical Research Council framework for the development of complex interventions and the Knowledge-to-Action Framework [87, 88]. Models from the quality improvement literature were used by two studies (5%) [15, 63]. The integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework [89] and the implementation outcomes taxonomy [23] were the only determinant and evaluation frameworks utilized, respectively [50, 69]. Lastly, classic theories were used by two studies (5%) [51, 70], including the Diffusion of Innovations theory [90] and Self-Determination Theory.

Implementation strategies

A total of 26 discrete implementation strategies were described within the included studies, with 153 reports of use across the 46 interventions. The median number of discrete implementation strategies reported per intervention was 3 (interquartile range 2–4). The implementation strategies used among the included interventions are displayed in Additional file 6. Of the 153 reports of use, the most frequently used strategies fell within the cluster of train and educate stakeholders (n = 55, 36%) [14, 15, 27,28,29,30,31, 33,34,35, 39,40,41,42,43,44,45, 47, 49,50,51, 54, 56, 57, 60,61,62,63,64, 68,69,70, 72], followed by change infrastructure (n = 28, 18%) [27, 29, 31, 32, 36, 44, 48, 51,52,53, 55,56,57,58, 60,61,62,63,64,65, 67,68,69,70], engage consumers (n = 24, 16%) [15, 27, 30, 31, 37,38,39, 43, 49, 50, 53,54,55,56,57,58, 60, 62, 64,65,66,67,68], develop stakeholder interrelationships (n = 21, 14%) [27, 30, 34, 38,39,40, 50, 51, 55, 60, 63, 68, 69], use evaluative and iterative strategies (n = 12, 8%) [29, 30, 40, 51, 60, 63, 69], provide interactive assistance (n = 8, 5%) [28, 30, 34, 43, 44, 46, 55, 69], support clinicians (n = 3, 2%) [31, 50], and utilize financial strategies (n = 1, 1%) [30].

Among the 46 ePSM interventions, the most common discrete implementation strategies utilized included conduct educational meetings (n = 25, 54%) [14, 15, 28,29,30,31, 35, 39, 40, 42, 43, 45, 47, 49,50,51, 54, 56, 57, 60,61,62, 68, 69, 72], distribute educational materials (n = 20, 43%) [14, 28, 30, 33,34,35, 40, 41, 44, 47, 50, 51, 57, 60,61,62, 64, 68, 69, 72], change record systems (n = 19, 41%) [27, 29, 31, 32, 36, 44, 51,52,53, 55, 58, 60,61,62, 65, 67,68,69,70], intervene with patients to enhance adherence and uptake (n = 19, 41%) [15, 31, 37, 43, 49, 53,54,55,56,57,58, 60, 62, 64,65,66,67,68], change physical structure and equipment (n = 9, 20%) [31, 32, 48, 56, 57, 60, 61, 63, 64], and provide local technical assistance (n = 8, 17%) [28, 30, 34, 43, 44, 46, 55, 69].

Implementation outcomes

The median number of implementation outcomes measured per study was 3, ranging from 1 to 6. The most frequently reported outcomes were feasibility (n = 33, 72%) [14, 15, 27,28,29, 31,32,33,34, 37,38,39,40,41,42, 44, 45, 48, 49, 52,53,54,55,56,57, 61,62,63, 65,66,67,68,69] and acceptability (n = 31, 67%) [29, 32,33,34,35,36,37,38,39, 41, 45, 47,48,49,50,51, 54,55,56,57, 59,60,61,62,63,64,65,66,67,68,69], followed by appropriateness (n = 18, 39%) [31, 34,35,36, 38, 39, 45, 47, 50, 51, 54, 55, 60, 61, 63, 66,67,68], fidelity (n = 18, 39%) [29, 30, 35, 38, 40, 44, 46, 48, 49, 55, 56, 58, 60, 61, 65, 67, 69, 70], and penetration (n = 16, 35%) [15, 32, 33, 35, 41, 46, 48, 51,52,53, 56, 57, 60, 63, 69, 70]. Very few studies reported on cost (n = 4, 9%) [35, 40, 46], adoption (n = 2, 4%) [30, 63], or sustainability (n = 1, 2%) [63]. Studies used various approaches to measure implementation outcomes, including the use of surveys (n = 26, 57%) [27, 29,30,31,32, 34, 35, 39,40,41, 45, 47,48,49, 51, 54,55,56,57, 59, 60, 62,63,64, 68, 69], ePSM system data and analytics (n = 23, 50%) [15, 28, 29, 32,33,34,35, 37, 38, 40, 41, 44, 46, 49, 52, 56, 61,62,63, 65, 67, 68, 70], qualitative interviews or focus groups (n = 19, 41%) [27, 29, 31, 32, 35, 36, 38, 39, 42, 45, 50, 55, 61,62,63, 65, 66, 68, 69], administrative data (n = 5, 11%) [27, 32, 35, 41, 63], and field notes and observations (n = 3, 7%) [32, 34, 69].

Implementation barriers and facilitators

Operationalized definitions for each CFIR domain and construct, synthesized descriptions for the barriers and facilitators identified, and the proportion of studies coded within each construct are outlined in Table 2. The most commonly reported domains were intervention characteristics (n = 29, 63%) [27, 29,30,31,32,33,34, 36, 38, 39, 41, 45, 47, 49,50,51, 54,55,56,57, 60,61,62,63,64,65,66, 68, 69], inner setting (n = 22, 48%) [29,30,31,32, 34, 36, 38, 42, 45, 50,51,52, 55, 56, 60,61,62,63, 65, 66, 68, 69], and outer setting (n = 19, 41%) [29,30,31,32, 34, 36, 38, 39, 45, 54, 59, 61,62,63,64,65,66, 68, 69]. The characteristics of individuals (n = 16, 35%) [29,30,31,32, 34, 36, 38, 39, 45, 54, 59, 61,62,63,64,65,66, 68, 69] and process (n = 14, 30%) [30,31,32, 34, 36, 38, 39, 50, 51, 61,62,63, 68, 69] were less frequently reported. A total of 17 of the 39 CFIR constructs were identified across the 46 interventions. The barriers and facilitators in the context of the five CFIR domains and the most relevant constructs are presented below.

Table 2 Determinants to implementation reported by the included interventions

Intervention characteristics

The most common constructs for intervention characteristics (i.e., key attributes of the ePSM being implemented) were complexity and relative advantage. Within complexity, barriers centered on the complexity of the surveillance system design. From a provider perspective, this included a high volume of patient responses or alerts about patients’ symptoms [31, 32, 45, 47, 50, 54, 61, 66, 68] and difficulty interpreting symptom scores [31, 34, 41, 62]. From a patient perspective, complex systems made it challenging to understand what was being asked of them [31, 34, 41, 62]. Difficulty navigating the system was a barrier for both patients and providers [32,33,34, 36, 39, 61, 62, 68]. Conversely, facilitators included perceptions that the duration and frequency of completing the ePROs were appropriate [31, 34, 45, 50, 54, 57, 60, 65], the ability to understand the questions asked [34, 49, 54, 64], and perceptions that the system was easy to use [27, 33, 38, 39, 45, 47, 49, 54, 56, 57, 62, 64, 65].

For relative advantage, barriers included perceptions that the ePROs and/or the self-management material were redundant and/or conflicting with assessments and information provided by the oncology team during clinic visits [39, 50, 51, 55, 61, 62, 68]. Facilitators included perceptions that the ePSM improved symptom identification and management [29, 32, 33, 38, 39, 45, 47, 51, 57, 61, 62, 65, 66], improved communication and quality of discussions between patients and providers [29, 33, 45, 51, 54, 56, 57, 60, 63,64,65,66], and allowed the provider to personalize the clinic visit based on the ePRO scores [45, 61, 65, 66, 68].

Inner setting

The most common determinants within the inner setting (i.e., the specific organizational and cultural contexts in which ePSMs are implemented) were the implementation climate and readiness for implementation. The implementation climate most often related to the compatibility between the ePSM and existing workflows. Barriers included not integrating the ePSM with the electronic medical record, as clinic staff had to log into a separate system to view patients’ ePRO results [29, 45, 50, 62]. Additional barriers included perceptions that implementing an ePSM would increase workload due to having to review ePRO results before a clinic visit, potential challenges integrating the management of symptom alerts into existing communication channels, and the potential to prolong visit times [31, 36, 38, 42, 50, 51, 61, 63, 66, 68, 69]. Conversely, facilitators included integrating the ePSM with the electronic medical record [31, 32, 45, 52, 61, 62] and perceptions that clinic staff workloads did not increase as a result of implementing an ePSM [42, 65].

Barriers related to the readiness for implementation involved a lack of resources to implement the ePSM. This included reports of insufficient time for clinicians to use ePRO scores during clinic visits [32, 36, 56, 62, 69], and concerns that the center would not have the necessary resources to respond to symptoms identified by the ePSM [31, 63, 69]. Studies reported a lack of information related to the ePSM to facilitate its use, such as explanations about ePRO scores and guidance for assessing and managing high scores [29, 62, 63]. Facilitators included having clear, supportive, and committed leadership from senior staff and managers [36, 51, 60, 61, 69], as well as the availability and involvement of volunteers to provide education and support to patients completing ePROs in-clinic [63].

Outer setting

Barriers and facilitators for the outer setting (i.e., the broader context within which an organization implementing an ePSM is situated) were almost exclusively related to the extent to which patients’ needs were met by the setting that implemented the ePSM (i.e., patient needs and resources). Barriers included perceptions that the ePROs and self-management material lacked usefulness [32, 36, 45, 66, 68], particularly when patients’ responses to the ePROs were not mentioned during their clinic visit [36, 61, 62], as well as perceptions that the ePROs and self-management material were not sufficiently tailored to the individual patient [30, 68]. Facilitators included perceptions that the ePROs were relevant and meaningful for patients [29, 34, 38, 45, 54, 59, 62, 64,65,66], as well as perceptions that using the system reassured patients about their wellbeing and provided them with a sense of empowerment and control [31, 38, 39, 54, 62, 68]. Additionally, facilitators included beliefs that the ePSM gave patients and providers greater attention to and insight into symptoms, including helping patients remember their symptoms between clinic visits and helping staff provide appropriate referrals [29, 31, 32, 45, 63,64,65,66, 68].

Characteristics of individuals

The most common determinants were knowledge and beliefs and other personal attributes. Barriers related to knowledge and beliefs included a lack of knowledge among patients and clinic staff about the ePSM features and how to complete the ePROs [29, 36, 50, 69], as well as beliefs that the use of ePROs was not valuable [34, 61]. Facilitators included an understanding of the content and features of the ePSM [50, 56], as well as of when and how to complete the ePROs [50]. Additional facilitators included strong professional values around using ePROs in clinical practice and beliefs that symptom management is within a provider’s scope of responsibility [61, 63, 69].

For other personal attributes of patients, barriers included a lack of comfort and experience with technology [30, 41, 47, 54, 72], limited access to reliable internet or electronic devices [48, 55], and feeling too ill to report symptoms [29, 57, 62]. Conversely, patients with prior experience using connected technologies were more likely to find the system usable and to use it [48, 54, 57].

Process

The most common determinants for the process domain (i.e., the stages and active change processes used to implement ePSMs) were engaging stakeholders and patients. Barriers included perceptions that educational strategies such as handouts were not used by patients or that the information provided was not clear [39, 50, 62], that patients did not receive sufficient training and were unaware of various features of the system [39, 50], and that patients had to register through their health care provider rather than being able to self-register [30]. Facilitators included engaging a broad group of stakeholders, including respected peers [34, 51, 63, 69]; perceptions that the duration and timing of education and training for patients or staff were appropriate [34, 39, 50]; and beliefs that the ePSM was clearly explained to stakeholders [32, 61, 62]. Additional facilitators were building patient and clinician capacity and confidence to use the system through quality education and training and the availability of support to resolve practical and technical issues [34, 50, 51, 69], as well as the use of reminders prompting patients and clinicians to use the system [32, 38, 62].

Discussion

This scoping review synthesized 46 ePSMs to summarize how these interventions have been implemented in routine cancer care. The findings provide a foundation for informing and improving the implementation of ePSMs, including selecting implementation strategies, planning for barriers and facilitators, and evaluating key implementation outcomes.

The use of TMFs has been strongly advocated in implementation science to guide the planning, process, and evaluation of moving evidence-based practices into action. However, only a minority of included studies reported using one. This may be partly because many included studies did not identify as implementation science studies and were instead descriptions of implementation in practice. Implementation science is a relatively young field, and we anticipate that the use of TMFs will increase. Their use in future implementation efforts may provide a better understanding of the steps taken during implementation and of how or why implementation was or was not successful.

Feasibility and acceptability were commonly reported implementation outcomes, while adoption, cost, and sustainability were seldom reported. Implementing an intervention involves various steps, and certain outcomes may be prioritized during different phases of implementation [99]. Capturing outcomes such as feasibility and acceptability is recommended before or during the initial implementation of an intervention [23]. The scarcity of reports on adoption, cost, and sustainability can be explained in part by the fact that research on the implementation of ePSMs is still in its infancy; however, these outcomes should be a major focus of future implementation efforts. Many articles in this review reported on implementing an ePSM in a single setting, rather than investigating the scale or spread across oncology clinics (i.e., adoption). While sustainability can be assessed during the early stages of implementation to identify areas that require improvement [100], research typically focuses on the early stages of implementation, and little attention is paid to sustaining interventions [91]. This may also explain why few of the identified implementation strategies focused on the sustainability of ePSMs.

Recently published clinical practice guidelines on the role of patient-reported outcome measures (PROMs) in oncology highlight the need for improved evidence regarding optimal implementation strategies [92]. Our findings provide a list of implementation strategies used for ePSMs, their frequency of use, their action targets, and when they were used in the implementation process. The most frequently used categories of implementation strategies were educating stakeholders, changing infrastructure, and engaging patients. Interestingly, there is only moderate alignment between the most commonly used strategies and the determinants of implementation reported in the included studies. Within the field of implementation science, it is recommended that an assessment of barriers and facilitators inform the selection of implementation strategies; future studies should therefore carefully consider local contextual determinants before embarking on an implementation project.

The most frequently reported determinant domains were intervention characteristics and the outer and inner settings. At the level of the intervention, implementers should address the complexity of the system by ensuring that patients and providers find it clear and easy to use and perceive the duration and frequency of reporting as acceptable. This could be achieved using strategies within the clusters of engaging consumers and adapting and tailoring to context; however, these were not among the most commonly reported strategies in this review. Likewise, when designing the system, it is essential that repeated ePROs be displayed longitudinally using clear graphical depictions of the patient’s status over time. To address identified barriers to ePSM implementation, particular emphasis should be placed on highlighting the relative advantages of using the ePSM compared to existing clinical practices (e.g., improved symptom management through early identification and communication).

Key factors influencing implementation from the inner setting domain were related to the implementation climate and readiness for implementation; however, evaluative and iterative strategies such as conducting a local needs assessment and assessing for readiness were seldom reported in the included studies, reflecting an area of opportunity for future work. A recent systematic review emphasizes the importance of assessing the implementation climate, demonstrating that features such as organizational culture, leadership, and resources influence the implementation of interventions in healthcare settings [93]. Factors such as management support, organizational priorities, and organizational buy-in have been identified as key factors for sustaining cancer survivorship interventions [94, 95]. Therefore, implementers should consider how the implementation of an ePSM can be integrated within existing workflows and other electronic systems used in the setting and obtain support from senior leadership.

Meeting the needs of patients was another critical determinant of successful implementation. Implementers should consider whether patients find the screening questions and information relevant to their cancer care, as well as the types and levels of resources available to patients whose needs for support are identified by the ePSM. While sites may have concerns about a lack of dedicated programs and cancer rehabilitation clinicians to meet the needs of patients [7], as was reported in several studies in this review, directing patients to self-management resources and eHealth interventions may address many accessibility barriers [95,96,97]. Furthermore, since many of the facilitators identified rely on a contextual understanding of patient/provider needs, preferences, and existing workflows, engaging a broad group of stakeholders throughout implementation using a flexible and iterative approach is likely key to successful ePSM implementation. Strategies in the clusters of adapting and tailoring to context, developing stakeholder interrelationships, and changing infrastructure will be important for addressing these determinants. Within included studies, the role of healthcare coverage and health system arrangement (as part of the outer setting) was not identified as a determinant of implementation, likely because nearly all studies were conducted within a single system. To scale and spread these interventions to other jurisdictions, one would expect these outer setting factors to be of great importance. Cost was reported as a barrier in two studies conducted in Europe; however, only four studies collected and reported cost as an outcome. Future studies examining the implementation of an ePSM should consider capturing perspectives from patients, providers, and administrators on costs, policies, and regulatory environments that may hinder or enable implementation.

Waltz et al. [22] identified implementation strategies in “Go-Zone” quadrants, in which strategies rated as most important for implementation and most feasible were placed into quadrant 1. Interestingly, most of these strategies fell within the cluster of use evaluative and iterative strategies, with the most highly rated for both importance and feasibility being “assess readiness and identify barriers and facilitators,” “audit and feedback,” and “purposefully re-examine implementation.” Within this review, the most often reported strategies were within the “train and educate stakeholders” cluster, although knowledge was not identified as a critical barrier, nor was education or training an important facilitator. While some strategies, such as conducting ongoing training and providing ongoing consultation, were rated as highly important by the expert group, others, such as developing and distributing educational materials and conducting educational meetings and outreach visits, were rated highly for feasibility but ranked lower in importance. It is important to note that these rankings were based on expert opinion, and to date, there is scant literature to objectively determine which implementation strategies are “best”.

Previous reviews have identified similar facilitators to implementation, such as the acceptability to patients of reporting symptoms, the ability of PROMs to enable earlier detection of symptoms, and improved patient-provider communication [16]. Previous reviews have also identified similar barriers, such as limited patient and clinician time, insufficient knowledge to interpret and act on scores, and challenges integrating PROMs into workflows [16, 97]. However, two critical factors differentiate our scoping review from previous reviews. First, our study adds to the literature on determinants of routine PROM use by applying a well-known implementation science framework (i.e., the CFIR) to categorize barriers and facilitators. This provides a comprehensive understanding of implementation determinants and may facilitate the selection of strategies to address these domains and constructs. Second, our review focused solely on electronic reporting of symptoms and included additional findings related to implementation, such as the strategies, outcomes, and TMFs used.

Given the novelty of implementation research for ePSMs, this review may not have captured every potential strategy or determinant of implementation. As many of the included studies did not identify as implementation science studies, other implementation strategies may have been used but not reported, and the focus on acceptability and feasibility outcomes, as opposed to adoption, cost, and sustainability, may be explained in part by the inclusion of these studies. Additionally, wide variation was found in the characteristics of the ePSMs, including the ePROs used, the patient populations examined, and the treatment phases under study. The relevant determinants, and thus the most appropriate strategies, may vary greatly by the characteristics of the ePSM. For example, most ePSMs included in this review were designed exclusively for patients on active treatment who were receiving chemotherapy, and few studies examined use in the palliative care setting. Future research may provide insight into similarities and differences in implementation across patient populations and settings and provide recommendations for adapting the implementation of ePSMs to meet unique needs.

Conclusions

This scoping review provides a foundation for future planning and evaluation of the implementation of ePSMs in oncology. These findings can facilitate the selection of implementation strategies; however, future studies should test the effectiveness of these strategies. Advancing this knowledge through high-quality implementation science research will provide robust evidence on the effects of various strategies and their mechanisms of action for successful implementation [98]. The findings also highlight the need to consider using implementation science TMFs and provide insight into implementation determinants that researchers and implementers should consider.

Availability of data and materials

The dataset used and analyzed during the current study is available from the corresponding author on reasonable request.

Abbreviations

PSM: Prospective Surveillance Model

ePSM: Electronic Prospective Surveillance Model

ePROs: Electronic patient-reported outcomes

TMF: Theory, model, or framework

ERIC: Expert Recommendations for Implementing Change

CFIR: Consolidated Framework for Implementation Research

PROMs: Patient-reported outcome measures

References

  1. Fitzmaurice C, Allen C, Barber RM, Barregard L, Bhutta ZA, Brenner H, et al. Global, regional, and national cancer incidence, mortality, years of life lost, years lived with disability, and disability-adjusted life-years for 32 cancer groups, 1990 to 2015. JAMA Oncol. 2017;3:524.

  2. de Oliveira C, Weir S, Rangrej J, Krahn MD, Mittmann N, Hoch JS, et al. The economic burden of cancer care in Canada: a population-based cost study. CMAJ Open. 2018;6:E1-10.

  3. Weaver KE, Forsythe LP, Reeve BB, Alfano CM, Rodriguez JL, Sabatino SA, et al. Mental and Physical Health-Related Quality of Life among US Cancer Survivors: Population Estimates from the 2010 National Health Interview Survey. Cancer Epidemiol Biomarkers Prev. 2012;21:2108–17.

  4. Mayer DK, Nasso SF, Earp JA. Defining cancer survivors, their needs, and perspectives on survivorship health care in the USA. Lancet Oncol. 2017;18:e11–8.

  5. Hansen DG, Larsen PV, Holm LV, Rottmann N, Bergholdt SH, Søndergaard J. Association between unmet needs and quality of life of cancer patients: a population-based study. Acta Oncol. 2013;52:391–9.

  6. Cheng KKF, Wong WH, Koh C. Unmet needs mediate the relationship between symptoms and quality of life in breast cancer survivors. Support Care Cancer. 2016;24:2025–33.

  7. Stubblefield MD. The underutilization of rehabilitation to treat physical impairments in breast cancer survivors. PM&R. 2017;9:S317–23.

  8. Cheville AL, Kornblith AB, Basford JR. An examination of the causes for the underutilization of rehabilitation services among people with advanced cancer. Am J Phys Med Rehabil. 2011;90:S27-37.

  9. Alfano CM, Cheville AL, Mustian K. Developing high-quality cancer rehabilitation programs: a timely need. Am Soc Clin Oncol Educ Book. 2016;36:241–9.

  10. Alfano CM, Pergolotti M. Next-generation cancer rehabilitation: a giant step forward for patient care. Rehabil Nurs. 2018;43:186–94.

  11. LeBlanc TW, Abernethy AP. Patient-reported outcomes in cancer care — hearing the patient voice at greater volume. Nat Rev Clin Oncol. 2017;14:763–72 (Nature Publishing Group).

  12. Bennett AV, Jensen RE, Basch E. Electronic patient-reported outcome systems in oncology clinical practice. CA Cancer J Clin. 2012;62:336–47.

  13. Alfano CM, Zucker DS, Pergolotti M, Ness KK, Jones LW, Price ND, et al. A precision medicine approach to improve cancer rehabilitation’s impact and integration with cancer care and optimize patient wellness. Curr Phys Med Rehabil Rep. 2017;5:64–73.

  14. Basch E, Deal AM, Kris MG, Scher HI, Hudis CA, Sabbatini P, et al. Symptom monitoring with patient-reported outcomes during routine cancer treatment: a randomized controlled trial. J Clin Oncol. 2016;34:557–65.

  15. Berry DL, Hong F, Halpenny B, Partridge AH, Fann JR, Wolpin S, et al. Electronic self-report assessment for cancer and self-care support: Results of a multicenter randomized trial. J Clin Oncol. 2014;32:199–205.

  16. Howell D, Molloy S, Wilkinson K, Green E, Orchard K, Wang K, et al. Patient-reported outcomes in routine cancer clinical practice: a scoping review of use, impact on health outcomes, and implementation factors. Ann Oncol. 2015;26:1846–58.

  17. Pergolotti M, Alfano CM, Cernich AN, Yabroff KR, Manning PR, de Moor JS, et al. A health services research agenda to fully integrate cancer rehabilitation into oncology care. Cancer. 2019;125:3908–16.

  18. Peters MDJ, Godfrey CM, Khalil H, McInerney P, Parker D, Soares CB. Guidance for conducting systematic scoping reviews. Int J Evid Based Healthc. 2015;13:141–6.

  19. Colquhoun H, Letts L, Law M, MacDermid J, Edwards M. Feasibility of the Canadian occupational performance measure for routine use. Br J Occup Ther. 2010;73:48–54.

  20. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:1–13.

  21. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.

  22. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10:1–8.

  23. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38:65–76.

  24. Stover AM, Haverman L, van Oers HA, Greenhalgh J, Potter CM, Ahmed S, et al. Using an implementation science approach to implement and evaluate patient-reported outcome measures (PROM) initiatives in routine care settings. Qual Life Res. 2021;30:3015–33.

  25. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

  26. Tools and Templates - Consolidated Framework for Implementation Research. Cfirguide.org. https://cfirguide.org/tools/tools-and-templates/

  27. Absolom K, Warrington L, Hudson E, Hewison J, Morris C, Holch P, et al. Phase III Randomized Controlled Trial of eRAPID: eHealth Intervention During Chemotherapy. J Clin Oncol. 2021;39:734–47.

  28. Børøsund E, Cvancarova M, Moore SM, Ekstedt M, Ruland CM. Comparing effects in regular practice of e-communication and web-based self-management support among breast cancer patients: preliminary results from a randomized controlled trial. J Med Internet Res. 2014;16:e295.

  29. Baeksted C, Pappot H, Nissen A, Hjollund NH, Mitchell SA, Basch E, et al. Feasibility and acceptability of electronic symptom surveillance with clinician feedback using the patient-reported outcomes version of common terminology criteria for adverse events (PRO-CTCAE) in Danish prostate cancer patients. J Patient Rep Outcomes. 2017;1:1–11.

  30. Matthijs de Wit L, van Uden-Kraan CF, Lissenberg-Witte BI, Melissant HC, Fleuren MAH, Cuijpers P, et al. Adoption and implementation of a web-based self-management application “Oncokompas” in routine cancer care: a national pilot study. Support Care Cancer. 2019;27:2911–20.

  31. Dronkers EAC, Baatenburg de Jong RJ, van der Poel EF, Sewnaik A, Offerman MPJ. Keys to successful implementation of routine symptom monitoring in head and neck oncology with “Healthcare Monitor” and patients’ perspectives of quality of care. Head Neck. 2020;42:3590–600.

  32. Duman-Lubberding S, van Uden-Kraan CF, Jansen F, Witte BI, Eerenstein SEJ, van Weert S, et al. Durable usage of patient-reported outcome measures in clinical practice to monitor health-related quality of life in head and neck cancer patients. Support Care Cancer. 2017;25:3775–83.

  33. Erharter A, Giesinger J, Kemmler G, Schauer-Maurer G, Stockhammer G, Muigg A, et al. Implementation of computer-based quality-of-life monitoring in brain tumor outpatients in routine clinical practice. J Pain Symptom Manage. 2010;39:219–29.

  34. Fernández-Méndez R, Rastall RJ, Sage WA, Oberg I, Bullen G, Charge AL, et al. Quality improvement of neuro-oncology services: Integrating the routine collection of patient-reported, health-related quality-of-life measures. Neurooncol Pract. 2019;6:226–36.

  35. Hackett J, Allsop MJ, Taylor S, Bennett MI, Bewick BM. Using information and communication technologies to improve the management of pain from advanced cancer in the community: qualitative study of the experience of implementation for patients and health professionals in a trial. Health Informatics J. 2020;26:2435–45.

  36. Thestrup Hansen S, Kjerholt M, Friis Christensen S, Brodersen J, Hølge-Hazelton B. Nurses’ experiences when introducing patient-reported outcome measures in an outpatient clinic: an interpretive description study. Cancer Nurs. 2021;44:E108–20.

  37. Hauth F, Bizu V, App R, Lautenbacher H, Tenev A, Bitzer M, et al. Electronic patient-reported outcome measures in radiation oncology: initial experience after workflow implementation. JMIR Mhealth Uhealth. 2019;7:1–10.

  38. Maguire R, Connaghan J, Arber A, Klepacz N, Blyth KG, McPhelim J, et al. Advanced symptom management system for patients with malignant pleural mesothelioma (ASyMSmeso): Mixed Methods study. J Med Internet Res. 2020;22:e19180.

  39. Maguire R, Ream E, Richardson A, Connaghan J, Johnston B, Kotronoulas G, et al. Development of a novel remote patient monitoring system: the advanced symptom management system for radiotherapy to improve the symptom experience of patients with lung cancer receiving radiotherapy. Cancer Nurs. 2015;38:E37-47.

  40. Maguire R, McCann L, Kotronoulas G, Kearney N, Ream E, Armes J, et al. Real time remote symptom monitoring during chemotherapy for cancer: European multicentre randomised controlled trial (eSMART). The BMJ. 2021;374:1–14.

  41. Mouillet G, Falcoz A, Fritzsch J, Almotlak H, Jacoulet P, Pivot X, et al. Feasibility of health-related quality of life (HRQoL) assessment for cancer patients using electronic patient-reported outcome (ePRO) in daily clinical practice. Qual Life Res. 2021;30:3255–66.

  42. Riis CL, Stie M, Bechmann T, Jensen PT, Coulter A, Moller S, et al. ePRO-based individual follow-up care for women treated for early breast cancer: impact on service use and workflows. J Cancer Surviv. 2021;15:485–96.

  43. Sundberg K, Wengström Y, Blomberg K, Hälleberg-Nyman M, Frank C, Langius-Eklöf A. Early detection and management of symptoms using an interactive smartphone application (Interaktor) during radiotherapy for prostate cancer. Support Care Cancer. 2017;25:2195–204.

  44. Taarnhøj GA, Lindberg H, Dohn LH, Omland LH, Hjøllund NH, Johansen C, et al. Electronic reporting of patient-reported outcomes in a fragile and comorbid population during cancer therapy - a feasibility study. Health Qual Life Outcomes. 2020;18:1–9.

  45. Tolstrup LK, Pappot H, Bastholt L, Zwisler AD, Dieperink KB. Patient-reported outcomes during immunotherapy for metastatic melanoma: mixed methods study of patients’ and clinicians’ experiences. J Med Internet Res. 2020;22:1–11.

  46. van der Hout A, van Uden-Kraan CF, Holtmaat K, Jansen F, Lissenberg-Witte BI, Nieuwenhuijzen GAP, et al. Role of eHealth application Oncokompas in supporting self-management of symptoms and health-related quality of life in cancer survivors: a randomised, controlled trial. Lancet Oncol. 2020;21:80–94.

  47. van Eenbergen MC, van den Hurk C, Mols F, van de Poll-Franse LV. Usability of an online application for reporting the burden of side effects in cancer patients. Support Care Cancer. 2019;27:3411–9.

  48. Zebralla V, Muller J, Wald T, Boehm A, Wichmann G, Berger T, et al. Obtaining patient-reported outcomes electronically with “OncoFunction” in head and neck cancer patients during aftercare. Front Oncol. 2020;10:549915.

  49. Cowan RA, Suidan RS, Andikyan V, Rezk YA, Einstein MH, Chang K, et al. Electronic patient-reported outcomes from home in patients recovering from major gynecologic cancer surgery: a prospective study measuring symptoms and health-related quality of life. Gynecol Oncol. 2016;143:362–6.

  50. Biran N, Kouyate RA, Yucel E, McGovern GE, Schoenthaler AM, Durling OG, et al. Adaptation and evaluation of a symptom-monitoring digital health intervention for patients with relapsed and refractory multiple myeloma: pilot mixed methods implementation study. JMIR Form Res. 2020;4:1–14.

  51. Howell D, Rosberger Z, Mayer C, Faria R, Hamel M, Snider A, et al. Personalized symptom management: a quality improvement collaborative for implementation of patient reported outcomes (PROs) in “real-world” oncology multisite practices. J Patient Rep Outcomes. 2020;4:47.

  52. Garcia SF, Wortman K, Cella D, Wagner LI, Bass M, Kircher S, et al. Implementing electronic health record–integrated screening of patient-reported symptoms and supportive care needs in a comprehensive cancer center. Cancer. 2019;125:4059–68.

  53. Strachna O, Cohen MA, Allison MM, Pfister DG, Lee NY, Wong RJ, et al. Case study of the integration of electronic patient-reported outcomes as standard of care in a head and neck oncology practice: Obstacles and opportunities. Cancer. 2021;127:359–71.

  54. Basch E, Stover AM, Schrag D, Chung A, Jansen J, Henson S, et al. Clinical utility and user perceptions of a digital system for electronic patient-reported symptom monitoring during routine cancer care: findings from the PRO-TECT Trial. JCO Clin Cancer Inform. 2020;4:947–57.

  55. Naughton MJ, Salani R, Peng J, Lustberg M, DeGraffinreid C, Moon J, et al. Feasibility of implementing a text-based symptom-monitoring program of endometrial, ovarian, and breast cancer patients during treatment. Qual Life Res. 2020;14:14.

  56. Basch E, Iasonos A, Barz A, Culkin A, Kris MG, Artz D, et al. Long-term toxicity monitoring via electronic patient-reported outcomes in patients receiving chemotherapy. J Clin Oncol. 2007;25:5374–80.

  57. Basch E, Artz D, Dulko D, Scher K, Sabbatini P, Hensley M, et al. Patient online self-reporting of toxicity symptoms during chemotherapy. J Clin Oncol. 2005;23:3552–61.

  58. Wagner LI, Schink J, Bass M, Patel S, Diaz MV, Rothrock N, et al. Bringing PROMIS to practice: brief and precise symptom screening in ambulatory cancer care. Cancer. 2015;121:927–34.

  59. Kneuertz PJ, Jagadesh N, Perkins A, Fitzgerald M, Moffatt-Bruce SD, Merritt RE, et al. Improving patient engagement, adherence, and satisfaction in lung cancer surgery with implementation of a mobile device platform for patient reported outcomes. J Thorac Dis. 2020;12:6883–91.

  60. Li M, Macedo A, Crawford S, Bagha S, Leung YW, Zimmermann C, et al. Easier said than done: Keys to successful implementation of the distress assessment and response tool (DART) program. J Oncol Pract. 2016;12:e513–26.

  61. Rotenstein LS, Agarwal A, O’Neil K, Kelly A, Keaty M, Whitehouse C, et al. Implementing patient-reported outcome surveys as part of routine care: lessons from an academic radiation oncology department. J Am Med Inform Assoc. 2017;24:964–8.

  62. Wu AW, White SM, Blackford AL, Wolff AC, Carducci MA, Herman JM, et al. Improving an electronic system for measuring PROs in routine oncology practice. J Cancer Surviv. 2016;10:573–82.

  63. Dudgeon D, King S, Howell D, Green E, Gilbert J, Hughes E, et al. Cancer Care Ontario’s experience with implementation of routine physical and psychological symptom distress screening. Psychooncology. 2012;21:357–64.

  64. Abernethy AP, Herndon JE, Wheeler JL, Day JM, Hood L, Patwardhan M, et al. Feasibility and acceptability to patients of a longitudinal system for evaluating cancer-related symptoms and quality of life: pilot study of an e/tablet data-collection system in academic oncology. J Pain Symptom Manage. 2009;37:1027–38.

  65. Zylla DM, Gilmore GE, Steele GL, Eklund JP, Wood CM, Stover AM, et al. Collection of electronic patient-reported symptoms in patients with advanced cancer using Epic MyChart surveys. Support Care Cancer. 2020;28:3153–63.

  66. Mark TL, Johnson G, Fortner B, Ryan K. The benefits and challenges of using computer-assisted symptom assessments in oncology clinics: results of a qualitative assessment. Technol Cancer Res Treat. 2008;7:401–6.

  67. Girgis A, Durcinoska I, Arnold A, Descallar J, Kaadan N, Koh ES, et al. Web-Based Patient-Reported Outcome Measures for Personalized Treatment and Care (PROMPT-Care): multicenter pragmatic nonrandomized trial. J Med Internet Res. 2020;22:1–14.

  68. Girgis A, Durcinoska I, Levesque JV, Gerges M, Sandell T, Arnold A, et al. eHealth system for collecting and utilizing patient reported outcome measures for personalized treatment and care (PROMPT-Care) among cancer patients: Mixed methods approach to evaluate feasibility and acceptability. J Med Internet Res. 2017;19:1–13.

  69. Roberts NA, Janda M, Stover AM, Alexander KE, Wyld D, Mudge A, et al. The utility of the implementation science framework “Integrated Promoting Action on Research Implementation in Health Services” (i-PARIHS) and the facilitator role for introducing patient-reported outcome measures (PROMs) in a medical oncology outpatient. Qual Life Res. 2020;21:21.

  70. Bacorro WR, Balid-Attwell SA, Sogono PG, Escuadra CJT, Reyes-Gibby C, Que JC, et al. Factors in sustained compliance to a symptom-reporting mobile application: implications for clinical implementation. J Hosp Manag Health Policy. 2018;2:19.

  71. Tolstrup LK, Bastholt L, Dieperink KB, Möller S, Zwisler AD, Pappot H. The use of patient-reported outcomes to detect adverse events in metastatic melanoma patients receiving immunotherapy: a randomized controlled pilot trial. J Patient Rep Outcomes. 2020;4:8.

  72. Thestrup Hansen S, Kjerholt M, Friis Christensen S, Brodersen J, Hølge-Hazelton B. “I am sure that they use my PROM data for something important.” A qualitative study about patients’ experiences from a hematologic outpatient clinic. Cancer Nurs. 2020;43:E273–82.

  73. Roberts NA, Mudge A, Alexander K, Wyld D, Janda M. The iPROMOS protocol: a stepped-wedge study to implement routine patient-reported outcomes in a medical oncology outpatient setting. BMJ Open. 2019;9:e027046.

  74. Howell D, Li M, Sutradhar R, Gu S, Iqbal J, O’Brien MA, et al. Integration of patient-reported outcomes (PROs) for personalized symptom management in “real-world” oncology practices: a population-based cohort comparison study of impact on healthcare utilization. Support Care Cancer. 2020;28:4933–42.

  75. Absolom K, Holch P, Warrington L, Samy F, Hulme C, Hewison J, et al. Electronic patient self-Reporting of adverse-events: Patient information and advice (eRAPID): a randomised controlled trial in systemic cancer treatment. BMC Cancer. 2017;17:1–16.

  76. Holch P, Warrington L, Bamforth LCA, Keding A, Ziegler LE, Absolom K, et al. Development of an integrated electronic platform for patient self-report and management of adverse events during cancer treatment. Ann Oncol. 2017;28:2305–11.

  77. Berry DL, Blonquist TM, Patel RA, Halpenny B, McReynolds J. Exposure to a patient-centered, web-based intervention for managing cancer symptom and quality of life issues: impact on symptom distress. J Med Internet Res. 2015;17:e136.

  78. Andikyan V, Rezk Y, Einstein MH, Gualtiere G, Leitao MM, Sonoda Y, et al. A prospective study of the feasibility and acceptability of a Web-based, electronic patient-reported outcome system in assessing patient recovery after major gynecologic cancer surgery. Gynecol Oncol. 2012;127:273–7.

  79. Green E, Yuen D, Chasen M, Amernic H, Shabestari O, Brundage M, et al. Oncology nurses’ attitudes toward the Edmonton Symptom Assessment System: results from a large cancer care ontario study. Oncol Nurs Forum. 2017;44:116–25.

  80. Pereira JL, Chasen MR, Molloy S, Amernic H, Brundage MD, Green E, et al. Cancer care professionals’ attitudes toward systematic standardized symptom assessment and the Edmonton Symptom Assessment System after large-scale population-based implementation in Ontario, Canada. J Pain Symptom Manage. 2016;51:662–72.e8.

  81. Girgis A, Delaney GP, Arnold A, Miller AA, Levesque JV, Kaadan N, et al. Development and feasibility testing of PROMPT-Care, an eHealth system for collection and use of patient-reported outcome measures for personalized treatment and care: a study protocol. JMIR Res Protoc. 2016;5:e227.

  82. Girgis A, Durcinoska I, Gerges M, Kaadan N, Arnold A, Descallar J, et al. Study protocol for a controlled trial of an eHealth system utilising patient reported outcome measures for personalised treatment and care: PROMPT-Care 2.0. BMC Cancer. 2018;18:845.

  83. Allsop MJ, Wright-Hughes A, Black K, Hartley S, Fletcher M, Ziegler LE, et al. Improving the management of pain from advanced cancer in the community: study protocol for a pragmatic multicentre randomised controlled trial. BMJ Open. 2018;8:e021965.

  84. Maguire R, Fox PA, McCann L, Miaskowski C, Kotronoulas G, Miller M, et al. The eSMART study protocol: a randomised controlled trial to evaluate electronic symptom management using the advanced symptom management system (ASyMS) remote technology for patients with cancer. BMJ Open. 2017;7:e015016.

  85. Furlong E, Darley A, Fox P, Buick A, Kotronoulas G, Miller M, et al. Adaptation and implementation of a mobile phone-based remote symptom monitoring system for people with cancer in Europe. JMIR Cancer. 2019;5:e10813.

  86. Snyder CF, Blackford AL, Wolff AC, Carducci MA, Herman JM, Wu AW. Feasibility and value of PatientViewpoint: a web system for patient-reported outcomes assessment in clinical practice. Psychooncology. 2013;22:895–901.

  87. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:979–83.

  88. Graham ID, Logan J, Harrison MB, Straus SE, Tetroe J, Caswell W, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26:13–24.

  89. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2016;11:33.

  90. Rogers EM. Diffusion of innovations. 5th ed. New York: Free Press; 2003.

  91. Proctor E, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, et al. Sustainability of evidence-based healthcare: Research agenda, methodological advances, and infrastructure support. Implement Sci. 2015;10:1–13.

  92. Di Maio M, Basch E, Denis F, Fallowfield LJ, Ganz PA, Howell D, et al. The role of patient-reported outcome measures in the continuum of cancer clinical care: ESMO Clinical Practice Guideline. Ann Oncol. 2022;33:878–92.

  93. Li SA, Jeffs L, Barwick M, Stevens B. Organizational contextual features that influence the implementation of evidence-based practices across healthcare settings: a systematic integrative review. Syst Rev. 2018;7:72.

  94. Urquhart R, Kendell C, Cornelissen E, Powell BJ, Madden LL, Kissmann G, et al. Identifying factors influencing sustainability of innovations in cancer survivorship care: a qualitative study. BMJ Open. 2021;11:1–11.

  95. Jansen F, van Uden-Kraan CF, van Zwieten V, Witte BI, Verdonck-de Leeuw IM. Cancer survivors’ perceived need for supportive care and their attitude towards self-management and eHealth. Support Care Cancer. 2015;23:1679–88.

  96. Haberlin C, O’Dwyer T, Mockler D, Moran J, O’Donnell DM, Broderick J. The use of eHealth to promote physical activity in cancer survivors: a systematic review. Support Care Cancer. 2018;26:3323–36.

  97. Nguyen H, Butow P, Dhillon H, Sundaresan P. A review of the barriers to using Patient-Reported Outcomes (PROs) and Patient-Reported Outcome Measures (PROMs) in routine cancer care. J Med Radiat Sci. 2021;68(2):186–95.

  98. Wolfenden L, Foy R, Presseau J, Grimshaw JM, Ivers NM, Powell BJ, et al. Designing and undertaking randomised implementation trials: guide for researchers. BMJ. 2021;372:m3721.

  99. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3:1–12.

  100. NHS Institute for Innovation and Improvement. Sustainability Model and Guide. 2010. https://www.england.nhs.uk/improvement-hub/wp-content/uploads/sites/44/2017/11/NHS-Sustainability-Model-2010.pdf. Accessed 7 Mar 2023.

Acknowledgements

None.

Funding

This manuscript was supported by funding from the Canadian Cancer Society/Canadian Institutes of Health Research, Grant/Award Numbers: 706699 (CCS) and 02022–000 (CIHR) (Contact: Jennifer Jones, PhD, Princess Margaret Cancer Centre, Jennifer.jones@uhn.ca). The Canadian Cancer Society/Canadian Institutes of Health Research had no role in the preparation of this report or in the decision to submit it for publication. The statements made here are those of the authors.

Author information

Authors and Affiliations

Authors

Contributions

CL and SNS conceptualized the purpose of this scoping review. CL, SNS, and JMJ developed the study protocol. RF developed and conducted the search across all electronic databases. CL, SNS, KT, MA, AB, and JK carried out all aspects of data collection. CL, KT, and SNS carried out data analysis and synthesis. CL and SNS wrote the first draft of the manuscript, and revisions were made together with KT. JMJ, DML, KLC, TR, JG, MA, AB, and RF assisted with minor revisions. All authors approved the final version and revisions.

Corresponding author

Correspondence to Christian J. Lopez.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews Checklist.

Additional file 2.

Search Strategies.

Additional file 3.

Adapted ERIC descriptions for implementation strategies used in the included interventions.

Additional file 4.

Adapted descriptions for implementation outcomes reported by the included interventions.

Additional file 5.

Description of electronic prospective surveillance model system features.

Additional file 6.

Clusters and discrete implementation strategies used in the included interventions.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Lopez, C.J., Teggart, K., Ahmed, M. et al. Implementation of electronic prospective surveillance models in cancer care: a scoping review. Implementation Sci 18, 11 (2023). https://doi.org/10.1186/s13012-023-01265-4


Keywords