Pediatric eMental healthcare technologies: a systematic review of implementation foci in research studies, and government and organizational documents

Abstract

Background

Researchers, healthcare planners, and policymakers convey a sense of urgency in using eMental healthcare technologies to improve pediatric mental healthcare availability and access. Yet, different stakeholders may focus on different aspects of implementation. We conducted a systematic review to identify implementation foci in research studies and government/organizational documents for eMental healthcare technologies for pediatric mental healthcare.

Methods

A search of eleven electronic databases and grey literature was conducted. We included research studies and documents from organization and government websites if the focus included eMental healthcare technology for children/adolescents (0–18 years), and implementation was studied and reported (research studies) or goals/recommendations regarding implementation were made (documents). We assessed study quality using the Mixed Methods Appraisal Tool and document quality using the Appraisal of Guidelines for Research & Evaluation II. Implementation information was grouped according to Proctor and colleagues’ implementation outcomes—acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, and sustainability—and grouped separately for studies and documents.

Results

Twenty research studies and nine government/organizational documents met eligibility criteria. These articles represented implementation of eMental healthcare technologies in the USA (14 studies), United Kingdom (2 documents, 3 studies), Canada (2 documents, 1 study), Australia (4 documents, 1 study), New Zealand (1 study), and the Netherlands (1 document). The quality of research studies was excellent (n = 11), good (n = 6), poor (n = 2), and very poor (n = 1). These eMental health studies focused on the acceptability (70%, n = 14) and appropriateness (50%, n = 10) of eMental healthcare technologies to users and mental healthcare professionals. The quality of government and organizational documents was high (n = 2), medium (n = 6), and low (n = 1). These documents focused on cost (100%, n = 9), penetration (89%, n = 8), feasibility (78%, n = 7), and sustainability (67%, n = 6) of implementing eMental healthcare technology.

Conclusion

To date, research studies have largely focused on acceptability and appropriateness, while government/organizational documents state goals and recommendations regarding costs, feasibility, and sustainability of eMental healthcare technologies. These differences suggest that the research evidence available for pediatric eMental healthcare technologies does not reflect the focus of governments and organizations. Partnerships between researchers, healthcare planners, and policymakers may help to align implementation research with policy development, decision-making, and funding foci.

Introduction

The global prevalence of mental disorders in children and adolescents is reported to be as high as 30% [1,2,3,4]. Under-diagnosis and under-treatment of childhood mental disorders are well-documented concerns [2, 5,6,7]. The current distribution, demand, structure, and costs that underpin pediatric mental healthcare services make them relatively unavailable to many of those who need them [8]. Electronic mental healthcare (eMental healthcare) technologies, which broadly include Internet-, mobile-, and computer-based programs and resources as well as mobile phone applications, are considered promising approaches to enable more efficient use of mental healthcare resources, lower access barriers to traditional face-to-face mental healthcare, and provide flexibility in terms of standardization and personalization, interactivity, and consumer engagement [9,10,11,12,13,14].

Researchers (e.g., those developing and/or evaluating eMental healthcare technologies), healthcare planners (e.g., administrators in agencies either using or desiring to use eMental healthcare technologies), and policymakers (e.g., individuals with authority to set eMental healthcare policy for a healthcare organization or system) all convey interest and a sense of urgency in using eMental healthcare technologies to improve pediatric mental healthcare availability and access. To date, these three stakeholder groups have focused on different aspects of implementation. Researchers have studied user satisfaction [13, 15, 16] to determine that eMental healthcare technologies are acceptable to children and adolescents, their parents, and mental healthcare professionals. Healthcare planners and policymakers have discussed issues such as the cost and feasibility of eMental healthcare technologies to deliver pediatric mental healthcare [17]. Current priorities that are relevant to all three stakeholder groups are generating evidence to demonstrate how eMental healthcare technologies can be optimally incorporated within an existing healthcare system and how technology implementation can be supported within an organization or system (e.g., governance, policy, funding) [18,19,20]. These priorities may be optimally achieved through collaborations between researchers, healthcare planners, and policymakers that aim to generate evidence for integrating and supporting eMental healthcare technologies in healthcare systems. To provide recommendations for such collaborations, we conducted a systematic review to identify what aspects of implementation have been studied and reported on for pediatric eMental healthcare technologies and what implementation goals/recommendations are present in government and organizational documents relating to eMental healthcare technologies for pediatric mental healthcare.
Our aim was to identify and compare current areas of focus among research studies and government/organizational documents and to use these areas of focus to propose recommendations for implementation research, policies, and funding.

Methods

Design

We systematically reviewed the literature to identify research studies and government and organizational documents with information regarding the implementation of eMental healthcare technologies for children and adolescents. We used a protocol that was developed a priori to define the objective, outline the search strategy, establish selection (inclusion/exclusion) criteria, determine implementation findings, guide the data collection process, and define the analysis. We followed the PRISMA statement checklist for reporting [21].

Search strategy

A research librarian developed and implemented the systematic search strategies, restricted to English-language records. The search was conducted in 11 electronic bibliographic databases: Medline, CINAHL, Embase, EBM Reviews, ProQuest Theses and Dissertations, Ovid MEDLINE In-Process & Other Non-Indexed Citations, OVID HealthStar, Cochrane Database of Systematic Reviews, Health Technology Assessment Database, ACP Journal Club, and SocIndex. The final Medline strategy is provided (see Additional file 1). Search terms focused on population and technology parameters used to screen for study eligibility. We also included terms related to “attitudes,” “preferences,” and “diffusion of innovation.” Thus, the search strategy was broad in order to identify potentially eligible studies that may not have been indexed using specific implementation science terms. The search was executed in each database from inception to September 30, 2015; inclusion was restricted to studies and documents published after 2005.

To identify unpublished research and research-in-progress, we searched Google, the U.S. National Institutes of Health Clinical Trials database, the Australian New Zealand Clinical Trials Registry, the International Clinical Trials Registry Platform, and the UK Clinical Trials Gateway. To identify government and organizational documents, we conducted a two-pronged search: (1) targeted Google searches for relevant government, health, and technology organizations having clearly stated goals and/or funding relating to eHealth and behavioural technologies and (2) recommendations from members of our team. Overall, we created a list of 38 government and organizational websites, which was reviewed by the research team (see Additional file 2 for the full list). We also sought documents from key contacts responsible for leadership, policy, research, and information technology considered to be influential in the use of eHealth technologies. These contacts represented Canada’s Mental Health Commission e-mental health steering committee and the Ontario Centre of Excellence for Child and Youth Mental Health, the Australian Government’s Mental Health Commission and Department of Health and Ageing e-mental health expert advisory committee, the Netherlands’ Dutch Association of Mental Health and Addiction Care, New Zealand’s National Health IT Board, and the United Kingdom’s National Collaborating Centre for Mental Health and Mental Health Network. Reference lists of included studies and documents were also searched.

Criteria for including studies and documents for this review

We restricted the study and government/organizational document inclusion to countries from the largest English-speaking eHealth markets [22]. These countries were Australia, Canada, the Netherlands, New Zealand, the United Kingdom, and the USA. Studies of any design were eligible for inclusion.

Population of interest

Studies and government/organizational documents that included children and/or adolescents (0–18 years) as participants (studies only) or a population of interest (documents) were considered for inclusion. Government and organizational documents that focused on eMental healthcare technologies for all ages (including children and/or adolescents) were also eligible for inclusion.

eMental healthcare technology

Studies and government/organizational documents were eligible for inclusion if they evaluated/focused on eMental healthcare technology that met our definition: Internet-, computer-, or mobile-based programs and applications (‘apps’). Studies and documents of eMental healthcare technologies focused exclusively on phone calls or teleconferencing were excluded from the review as these technologies did not meet our definition of eMental healthcare. Studies and documents that focused on eMental healthcare technology use with parents of children with a mental health need or pediatric healthcare professionals were eligible for inclusion.

Implementation findings, goals, and/or recommendations

All outcomes, goals, and recommendations relating to implementation were considered. We used Proctor and colleagues’ eight outcome categories for implementation research [23] as a framework to identify outcomes, objectives, goals, and recommendations of interest in the studies and documents. To be included, a study needed to evaluate and report on at least one of the eight categories, and a document needed to contain at least one goal and/or recommendation that related to a category. The eight categories are as follows: acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, and sustainability [23]. These outcomes have shaped traditional mental health services integration in routine care [24,25,26,27,28] and provide a common taxonomy for examining eMental healthcare technology implementation. For the purpose of this review, the categories were defined as follows: acceptability, a measure of satisfaction with the technology (including attitudes, functionality, preferences, and user experience); adoption, the intention, initial decision, or action to try or employ technology (e.g., uptake and utilization); appropriateness, the perceived fit, relevance, usefulness/helpfulness, or compatibility of the technology for a given practice setting, professional, or user and/or perceived fit of a technology to address a particular issue or problem; cost, the financial impact of an implementation effort (including measures of cost-effectiveness or cost-benefit); feasibility, the extent to which a technology could be successfully used or carried out within a setting (including utility, compatibility, and barriers); fidelity, the degree to which a technology was implemented as it was intended such as adherence; penetration, the integration of a practice within a service setting and its subsystems (e.g., “spread” or “reach”); and sustainability, the extent to which a newly implemented treatment is maintained or integrated within a service setting’s ongoing, stable operations [23].

Screening for eligibility

Studies and documents were organized and screened using EndNote X7.2.1. Studies and documents were first screened at the title and abstract level (stage 1 screening) to determine whether they met the inclusion criteria. At stage 1, two reviewers (NDG, AS) independently screened the title and abstract for the first 100 studies/documents in the library and subsequently calculated inter-rater agreement with the kappa statistic [29]. The agreement was not sufficiently high (Cohen’s kappa, κ = 0.70), and we determined that “implementation” was too unclear a term to guide decisions about whether studies and documents provided information on implementation outcomes, objectives, goals, and/or recommendations. We introduced Proctor’s implementation outcomes framework [23] to the review protocol at this time, and two reviewers independently screened another 100 studies/documents in the library. This screening resulted in “almost perfect agreement” (Cohen’s kappa, κ = 0.84) [30], indicating consensus on the definition of implementation. The remaining studies/documents in the library were then divided in two, with each reviewer screening half at the title and abstract level. Any studies/documents for which eligibility could not be determined from the title and abstract progressed to a full-text review (stage 2 screening). Any discrepancies were discussed between the reviewers and taken to a third party (ASN) if no agreement could be reached.
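Cohen’s kappa corrects raw percent agreement for the agreement two reviewers would reach by chance. A minimal sketch of the calculation (the screening decisions shown are hypothetical, not data from this review):

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' include/exclude screening decisions."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: product of each rater's marginal proportions, summed over categories
    categories = set(rater1) | set(rater2)
    p_e = sum((rater1.count(c) / n) * (rater2.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical decisions for 10 records (1 = include, 0 = exclude); kappa ≈ 0.78 here
r1 = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
r2 = [1, 1, 0, 0, 0, 0, 0, 0, 1, 0]
```

Note that high raw agreement can still yield modest kappa when one category dominates, which is why kappa, not percent agreement, is the conventional screening statistic.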

Data extraction

Data were extracted into a standardized form (Microsoft Excel; Microsoft, Redmond, Washington, USA). Extracted data were: (1) key article characteristics (e.g., author, date of publication, country); (2) study or document objectives; (3) technology type(s) (Internet-, mobile-, or computer-based) and services/treatments delivered; (4) details about research study design; (5) target population/study participants; and (6) implementation outcomes and findings from studies and implementation goals and/or recommendations from government/organizational documents according to Proctor et al. (acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, and sustainability) [23]. Included studies and documents were divided between two reviewers (NDG, AS) who extracted the data from their respective half and then checked the other’s extraction for accuracy and completeness. Discrepancies were resolved by discussion and/or by contacting corresponding authors of included studies/documents for clarification.

Quality assessment

The quality of the research studies was assessed using the Mixed Methods Appraisal Tool (MMAT) [31]. The MMAT is applicable to quantitative, qualitative, and mixed methods studies. The scoring scale ranges from 0 (low quality) to 100 (high quality) and has been pilot tested for reliability in systematic reviews [32]. Ratings are specific to particular methodologies and are based on control of confounding factors, completeness of outcome data, minimization of selection bias, representativeness of sample, appropriateness of measures, response and withdrawal rates, appropriateness of study design to answer the research questions, and consideration of limitations. Two reviewers (NDG, AS) independently completed the MMAT for each included study and inter-rater agreement was considered “almost perfect” (Cohen’s kappa, κ = 0.81) [30]. Discrepancies were resolved by a third party (ASN).
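Assuming the 2011 version of the MMAT was used, each study is rated against four design-specific criteria and the summary score is the percentage of criteria met, which is why reported scores fall on a 0/25/50/75/100 grid. A sketch of this arithmetic (the quality labels reflect this review’s apparent mapping, not wording from the tool itself):

```python
def mmat_score(criteria_met, total_criteria=4):
    """MMAT summary score: percentage of quality criteria met (0, 25, 50, 75, or 100 when total_criteria=4)."""
    assert 0 <= criteria_met <= total_criteria
    return round(100 * criteria_met / total_criteria)

# Assumed label mapping, inferred from the scores reported in the Results
LABELS = {100: "excellent", 75: "good", 50: "poor", 0: "very poor"}
```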

The quality of government and organizational documents was assessed using the Appraisal of Guidelines for Research & Evaluation II (AGREE II) [33]. AGREE II assesses six domains: scope and purpose, stakeholder involvement, rigor of development, clarity of presentation, applicability, and editorial independence. While originally intended for clinical practice guidelines, most domains are applicable to government and organizational documents. Two domains, “clarity of presentation” and “applicability,” have criteria specific to guideline recommendations; however, because we were interested in document recommendations, these domains were applicable to our use. The AGREE II provides an overall quality score from 0% (low quality) to 100% (high quality), and a recommendation as to whether the document is recommended: (1) for use, (2) for use with modifications, or (3) not for use. Two reviewers (NDG, AS) independently completed the AGREE II for each document. Domain and overall quality scores were calculated by summing the scores given by the two reviewers [34]. Recommendations for document use were decided by consensus between the two reviewers. Discrepancies in recommendations for document use were resolved by a third party (ASN).
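The AGREE II user’s manual converts summed item ratings into percentages by scaling each score against its minimum and maximum possible values; assuming this review followed that convention, a domain with $n$ items rated on the tool’s 1–7 scale by two appraisers is scored as:

```latex
\text{scaled domain score} =
\frac{\text{obtained score} - \text{minimum possible score}}
     {\text{maximum possible score} - \text{minimum possible score}} \times 100\%,
```

where, for two appraisers, the minimum possible score is $1 \times n \times 2$ and the maximum possible score is $7 \times n \times 2$.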

Data analysis

Data analysis followed two approaches. First, two reviewers (NDG, AS) conducted a narrative (descriptive) synthesis [35] to produce a summary of the research studies and government/organizational documents included in the review. This summary encompassed the five domains for which we extracted data. Second, implementation data from each research study and government/organizational document were grouped according to the eight implementation outcomes framework [23]. A cell remained empty if there was no relevant data pertaining to an implementation outcome. Data categorization was reviewed and discussed by NDG, ASN, and LW until consensus was achieved, and all data were coded into appropriate outcomes.

As a final step, a template approach to text analysis [36] was undertaken by two reviewers (NDG, AS). This approach involved bringing together the narrative synthesis and grouped implementation data under the implementation outcomes framework. This analytic step allowed the research team to identify the implementation foci of research studies as compared to government/organizational documents so that recommendations for implementation research, policies, and funding could be formulated.
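The grouping described above amounts to building a source-by-outcome matrix whose cells stay empty when no data map to an outcome. A minimal sketch of that tabulation (the sources and codings shown are hypothetical):

```python
# The eight implementation outcomes from Proctor and colleagues' framework
OUTCOMES = ("acceptability", "adoption", "appropriateness", "cost",
            "feasibility", "fidelity", "penetration", "sustainability")

def tabulate(coded_findings):
    """Build a source-by-outcome matrix; cells with no coded data remain empty strings."""
    matrix = {}
    for source, outcome, finding in coded_findings:
        assert outcome in OUTCOMES, f"unknown outcome: {outcome}"
        row = matrix.setdefault(source, {o: "" for o in OUTCOMES})
        row[outcome] = finding
    return matrix

# Hypothetical codings for illustration only
codings = [
    ("Study A", "acceptability", "adolescents reported high satisfaction"),
    ("Document B", "cost", "recommends a sustainable funding model"),
]
```

Empty cells in the resulting matrix are what make the contrast between study foci and document foci visible at a glance.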

Results

Literature search and selection

As shown in Fig. 1, after the removal of duplicates, the literature search identified 3818 articles for screening: 3737 research articles and 81 government and organizational documents. A total of 3058 articles were excluded after screening the titles and abstracts. The full texts of the remaining 760 articles were reviewed. Of these, 29 articles were included in the review: 9 government/organizational documents and 20 research studies.

Fig. 1

Literature search flow diagram

Research study and government/organizational document characteristics

As shown in Table 1, of the nine government and organizational documents, five focused primarily on eMental healthcare services for youth [17, 37,38,39,40] while the remaining four discussed eMental healthcare for the general population, including youth [41,42,43,44]. Although some reports contained research elements (i.e., literature reviews, stakeholder interviews, surveys) [38,39,40, 42, 44], they were categorized as government/organizational documents given their affiliations. Four documents originated from Australia [37, 38, 43, 44], two from the United Kingdom [39, 42], two from Canada [17, 40], and one from the Netherlands [41]. Following quality assessment, one document was found to be of poor quality (25%) and was not recommended for use [41], six documents were found to be of medium quality (range 58–83%) and were recommended for use with modifications [17, 37,38,39, 42, 44], and two documents were found to be of high quality (92%) and were recommended for use without modifications [40, 43] (see Additional file 3).

Table 1 Government and organizational reports on eMental healthcare implementation

Table 2 outlines the 20 research studies included in the review. Fourteen studies were from the USA [45,46,47,48,49,50,51,52,53,54,55,56,57,58], three from the United Kingdom [59,60,61], and one each from Australia [62], Canada [63], and New Zealand [64]. Across the studies, fourteen examined eMental healthcare technologies to be used by children and adolescents [45, 46, 48,49,50,51, 54,55,56, 59,60,61,62, 64], three examined technologies to be used by healthcare professionals when interacting with pediatric patients [47, 53, 63], and three examined technologies to be used by parents on behalf of their child [52, 57, 58]. One study was assessed as being of extremely poor quality, receiving an MMAT score of 0 [50], two studies were of poor quality and received a score of 50 [48, 49], six were of good quality and received a score of 75 [47, 51, 53, 56, 58, 61], and the remaining 11 were of excellent quality with a score of 100 [45, 46, 52, 54, 55, 57, 59, 60, 62,63,64]. An additional file shows details of the quality assessment (see Additional file 4).

Table 2 Research studies that have examined the implementation of eMental healthcare technologies, listed in order of publication date

Implementation outcomes

Table 3 presents the implementation outcomes and findings from research studies and implementation goals and recommendations from government and organizational documents, organized according to Proctor and colleagues’ implementation outcomes.

Table 3 Implementation outcomes investigated by research studies and addressed/recommended in government and organizational documents

Acceptability

Fourteen research studies (70%) examined acceptability (ten quantitatively [46, 49,50,51,52,53,54, 56, 58, 64], three qualitatively [48, 59, 62], and one using both qualitative and quantitative methods [45]). The majority of studies were of good or excellent quality [45, 46, 51,52,53,54, 56, 58, 59, 62, 64]; three were of poor quality [48,49,50]. How acceptability was defined varied considerably across studies: satisfaction with the technology [46, 56, 64], functionality of the technology [62], attitudes towards the technology [50], technology preferences [51, 59], user experience using the technology [52, 56, 58, 59], and acceptability of the technology [45, 48, 54]. Studies of acceptability reported that participants responded favorably to eMental healthcare technologies [45, 46, 48,49,50, 53, 54, 58, 62, 64] and liked that they provided autonomy, convenience, anonymity, and accessibility [51, 52, 56, 59, 62, 64]—particularly when technologies were designed specifically for youth [54, 56, 64].

Five government/organizational documents (56%) included goals or recommendations relating to the acceptability of eMental healthcare technology. Documents were mainly of medium quality (range 58–83%) [17, 37, 39, 44], although one document was of high quality [43]. These documents described the need for better incorporation of consumer preferences during design planning to ensure such services are user-centered and individualized [17, 39, 43, 44] and emphasized the need to increase public awareness of the acceptability of eMental healthcare technology [17, 37, 43].

Adoption

Five studies (25%), of good or excellent quality, measured adoption of eMental health technology for pediatric mental healthcare [46, 47, 53, 55, 57]: pre-post measurements of treatment attendance [46], screening rates [47, 55], number of intake appointments [57], and uptake into healthcare practice [53]. Studies showed that children and adolescents receiving the eMental healthcare technology demonstrated significantly higher rates of attendance [46], and electronic technologies were associated with improved clinician completion rates [57]; however, studies concerned with healthcare professional adoption showed moderate screening rates and uptake into practice [47, 53, 55]. Government and organizational documents did not address actual eMental healthcare technology adoption, although they did discuss the intent to use technology to provide mental healthcare.

Appropriateness

Appropriateness was examined in ten studies (50%) [46, 47, 50, 52, 59,60,61,62,63,64]. One study was of poor quality [50], and the remaining were of good or excellent quality [46, 47, 52, 59,60,61,62,63,64]. The definition of appropriateness varied considerably across studies: appropriateness [46, 47, 50, 52, 60, 61, 63], relevance [59, 63], usefulness [59, 61,62,63,64], suitability [59, 61, 64], and perceived fit [64]. Of these studies, five measured this construct qualitatively [47, 52, 59, 60, 62], four quantitatively [46, 50, 63, 64], and one using both qualitative and quantitative measures [61]. Most studies examined the appropriateness of eMental healthcare technology for healthcare professionals and their settings [46, 47, 50, 52, 60,61,62,63], while some studies examined the appropriateness of the technology for children and adolescents [59, 61, 64]. Computer-based treatments and online management systems were deemed appropriate to mental healthcare practices in that the technologies allowed for less preparation time and provided facilitation and appointment planning [52, 62]. Adolescents found eMental healthcare helpful for improving appointment attendance [46], and one study demonstrated healthcare professional competency and compatibility at employing the technology within practice [50]. Healthcare professionals regarded a tool’s use and appropriateness as very important for successful implementation [63] and believed eMental healthcare had the potential to be helpful for children and adolescents, especially for mild to moderate problems [61]. However, some studies showed that healthcare professionals perceived interference with the therapeutic relationship due to eMental healthcare technologies [47, 60]. 
As with acceptability, the accessibility, flexibility, and anonymity of eMental healthcare were factors that influenced treatment preference for web-based over face-to-face interventions [59], and eMental healthcare was found to be at least as good as treatment as usual in primary healthcare sites [64].

Five government and organizational documents (56%) addressed appropriateness. One document was of high quality [40], while the remaining were of medium quality (range 58–83%) [17, 38, 42, 44]. Documents considered appropriateness from the perspectives that (1) additional research and development should be conducted to assure healthcare quality and safety standards are not compromised [17, 40, 42, 44], (2) new eMental healthcare technologies should be integrated and evaluated within existing health and technology policies (including IT aspects) [17, 40, 42, 44], and (3) technologies need to be accessible to rural, regional, and indigenous communities [38, 40] to ensure appropriateness to these populations.

Cost

Of the 20 research studies, one qualitative study (5%) of good quality considered cost issues from system and organization perspectives and described the need to address cost coverage in terms of who covers treatment costs and how to integrate third party payers [56].

All nine government and organizational documents (100%) cited cost as a major component of future eMental healthcare implementation efforts [17, 37,38,39,40,41,42,43,44]. One of these documents was of low quality [41], six were of medium quality (range 58–83%) [17, 37,38,39, 42, 44], and two were of high quality [40, 43]. Across the government and organizational documents, discussion covered start-up costs for developing or implementing technology into practice [17, 37, 38, 40], reimbursement to consumers from health insurance companies [17, 39, 44], the cost-effectiveness of blended care models [41], and billing requirements for healthcare professionals providing eMental healthcare services [17, 39, 44]. Overall, the government/organizational literature recommended establishing and evaluating a sustainable funding model to address high development and continuing maintenance costs [17, 37, 39, 42,43,44] and allocating more government funds to further support research and development [42].

Feasibility

Two studies (10%) quantitatively measured eMental healthcare technology feasibility [48, 63]. These studies were of poor [48] and excellent [63] quality. The definition of feasibility did not vary between the studies. While one study found eMental healthcare technology to be feasible [48], the other found that healthcare professionals perceived professional development and workload as feasibility challenges (for example, changes to workload may be required when adopting an eMental healthcare technology within a professional’s clinical workflow) [63].

Seven government/organizational documents (78%) of high [40, 43] and medium quality (range 58–83%) [17, 37, 38, 42, 44] described the need to address feasibility if implementation of eMental healthcare technologies is to be successful [17, 37, 38, 40, 42,43,44]. Document recommendations included ensuring training and education programs for healthcare professionals and organizations as a means to ensure technology feasibility. That is, although a technology may be appropriate for a given setting, it may not be feasible to implement it if resources and training are not available [17, 37, 38, 40, 42,43,44].

Fidelity

Three studies (15%) examined fidelity by quantifying adherence to the technology [48, 57] and reasons for non-completion [64]. Two of these studies were of excellent quality [57, 64] and one was of poor quality [48]. Studies reporting on fidelity found either perfect [48] or good [57, 64] adherence to the technology protocol, with reasons for non-completion including technical glitches and lack of time and/or interest [64].

Recommendations from seven government and organizational documents (78%) concentrated on the importance of fidelity in implementation evaluations [17, 38,39,40,41, 43, 44]. Of these documents, one was of poor quality [41], four were of medium quality (range 58–83%) [17, 38, 39, 44], and two were of high quality [40, 43]. Government and organizational documents also described concerns about dropout and lack of follow-up with the use of eMental healthcare technologies [17, 38], and the need for evaluations to determine if such technologies are being delivered as intended and are as innovative as described [17, 38,39,40,41, 43, 44].

Penetration and sustainability

Penetration and sustainability of eMental healthcare technologies were not considered in the research studies included in this review, but were addressed in government and organizational documents. Eight documents (89%) described recommendations related to the penetration of eMental healthcare technologies [17, 37,38,39,40, 42,43,44], while six described sustainability recommendations [17, 38, 40, 42,43,44]. Six of the documents recommending penetration outcomes were of medium quality (range 58–83%) [17, 37,38,39, 42, 44] and two were of high quality [40, 43]. Four documents examining sustainability outcomes were of medium quality (range 58–83%) [17, 38, 42, 44], and two were of high quality [40, 43]. Penetration recommendations included linking traditional and eMental healthcare services together at multiple points using a stepped care model to allow for cross-referral [17, 37,38,39,40, 42, 43] and conducting further research, development, and evaluation in routine clinical settings [39, 44] with large sample sizes to ensure scalability [17]. Sustainability recommendations included the need for policy reform by way of establishing standards for privacy and security [17, 40, 42, 44], devising a national eMental healthcare strategy/protocol and creating stricter governance to facilitate technology implementation [38, 42, 43], and using technology to foster collaboration [17].

Discussion

There is increasing interest and a sense of urgency among researchers, healthcare planners, and policymakers to use eMental healthcare technologies in pediatric mental healthcare to improve the availability of, and access to, care. To date, however, these stakeholders have tended to focus on different aspects of implementation. If effective eMental healthcare services are to become a core component of routine service delivery, these different areas of focus need to be identified so that research, policies, and funding can be aligned. We undertook a systematic review to identify research studies and government and organizational documents describing eMental healthcare technology implementation in healthcare systems for children and adolescents, and to explore which areas of implementation researchers, healthcare planners, and policymakers have historically focused on. This approach allowed us to identify areas of focus in the current eMental healthcare landscape in terms of implementation outcomes among different stakeholder groups and to use these areas of focus to propose recommendations for implementation research, policies, and funding. The takeaway points of this review are as follows:

Implementation foci differ between researchers, healthcare planners, and policymakers

We found multiple differences in implementation outcome foci between government and organizational literature and research studies. Consistent with other reviews [18, 65], the research studies in this review predominantly focused on patient and clinician outcomes (e.g., use, satisfaction, acceptability), whereas the areas of focus in government and organizational reports were cost, penetration, and feasibility. Differences in underlying positions between stakeholder groups may lead to opposing criteria for successful eMental healthcare implementation and/or may jeopardize stakeholder engagement [66]. Evidence on the success of these initiatives is not disseminated in a separate, asocial and apolitical bubble [67]; it is often produced by, and in turn feeds back into, the political process of deciding priorities and allocating resources to pursue them.

Distinct knowledge holders often have differing mentalities: governing bodies are typically incentivized to achieve the lowest cost and the most efficient use of resources, while researchers tend to be incentivized by innovation and/or validation of their work [68, 69]. Thus, the differences in underlying priorities may reflect the different values and goals of governing bodies and researchers. For example, given these incentives, government and organization priorities may naturally gravitate towards the long-term benefits of eMental healthcare (i.e., penetration and sustainability), as this aligns with a mandate to weigh costs against benefits. Given that eMental healthcare is a relatively new field, research may be appropriately focused on establishing an eMental healthcare innovation as acceptable and appropriate to users before studying other implementation constructs. As the field matures, there is an opportunity to study implementation outcomes, such as cost, penetration, and feasibility, that would be of interest and value to governments and organizations [23, 70]. Moving forward, alignment of funding interests and partnerships between stakeholder groups may allow for the development of funding models to study outcomes that require longer-term investigation and substantial effort, such as fidelity, penetration, and sustainability. Future research endeavors may also benefit from using a structured, theory-driven methodology to compile, evaluate, and integrate eMental healthcare information, such as the Health Information Technologies—Academic and Commercial Evaluation (HIT-ACE) methodology [71].

A taxonomy for implementation outcomes is needed for the eMental healthcare technology field

Consistent with the broader literature, this review reflects nonstandard usage of terminology and a lack of consensus on a taxonomy for implementation metrics [72,73,74]. For example, there was overlap and inconsistency in the operationalization and discussion of appropriateness and acceptability; adoption was often used interchangeably with uptake; fidelity was often referred to as adherence; feasibility was described as acceptability; and sustainability was reflected in the literature by varying but congruent definitions, such as maintenance, incorporation, and integration [23, 75,76,77,78].

A recent review of instruments to measure implementation in mental healthcare found an uneven distribution of instruments across implementation outcomes [79]. Although constructs such as acceptability and adoption are well represented among instruments, other constructs are either underdeveloped or do not lend themselves easily to instrumentation, yielding few instruments that measure the full range of implementation outcomes. More groundwork is needed to enable consistency in the definition and measurement of implementation constructs rather than considering them at an abstract level. This will help to align areas of focus and discussions between researchers, healthcare planners, and policymakers.

Limitations

This systematic review has several limitations. First, our search results may be limited because “behavioral health” and its derivatives were not used in our search string. While most eMental healthcare studies are unlikely to be indexed solely under these terms, some may be, and those studies would not have been included in our review. Additionally, while our search to identify unpublished research and research-in-progress was extensive, we did not search other databases, such as the NIH Reporter, which may have yielded additional eMental healthcare technology studies. While some of the studies in the NIH Reporter may also have been registered in the registries we searched, others may not have been.

Another limitation is our decision to restrict inclusion of government and organizational reports to countries with known expertise and knowledge in the area of eMental healthcare. This review’s findings may therefore not generalize to low- and middle-income countries, where mental healthcare systems may be organized differently and technologies used in different ways. That none of our included government/organizational documents originated in the USA, while 14 of our research studies did, may indicate an unrepresentative sample and may explain the lack of alignment between the groups of literature we studied. Further, not all of the government and organizational websites we reviewed may have made implementation information available on the web. It was not feasible for the research team to contact all organizations by email or phone to request information that may have existed only in paper-based form.

Limiting our review to English-language documents also restricts the literature included and thus the generalizability of the results. We also believe that there are eMental healthcare technologies being deployed in healthcare systems that have not been scientifically investigated; important implementation data for these technologies would therefore not be available for our review.

The decision to use Proctor and colleagues’ implementation outcomes as a guiding framework was not made a priori. Specific implementation outcome terms were therefore not included in our search string, which could have resulted in potentially relevant studies being missed if they were indexed solely under implementation outcome terms. However, the search strategy was developed using terms specific to our population and interventions of interest, and studies of eMental healthcare technologies for children and/or adolescents are more likely to be indexed according to these terms, with or without additional implementation terms. Our descriptive approach to synthesizing implementation may also be criticized; however, it allowed us to identify an important perspective on how eMental healthcare is currently being discussed at the governmental/organizational level alongside research developments.

Finally, the inconsistent use of terminology [72,73,74] across the literature in this review required us to make judgment calls about how to categorize implementation outcomes. There also remains conceptual ambiguity and overlap among implementation outcomes (e.g., acceptability and appropriateness), which could have resulted in some factors arguably being assigned to different constructs [23, 79]. However, the difficulty of grouping these metrics within the implementation outcomes further confirms the lack of agreement regarding constructs hypothesized to affect implementation success and the identifiable measures of these constructs [80].

Conclusion

This systematic review identified differing implementation foci between research studies and government/organizational documents, and a lack of consistent implementation taxonomies and metrics. These differences mean that the research evidence available to support eMental healthcare technologies for pediatric mental healthcare is not well aligned with the focus of governments and organizations. Using the results of this review as a guide, partnerships between researchers, healthcare planners, and policymakers may help to align implementation research, policies, and funding foci.

References

  1. Belfer ML. Child and adolescent mental disorders: the magnitude of the problem across the globe. J Child Psychol Psychiatry. 2008;49(3):226–36. doi:10.1111/j.1469-7610.2007.01855.x.

  2. Waddell C, McEwan K, Shepherd CA, Offord DR, Hua JM. A public health strategy to improve the mental health of Canadian children. Can J Psychiatry. 2005;50(4):226–33. doi:10.1177/070674370505000406.

  3. Steel Z, Marnane C, Iranpour C, et al. The global prevalence of common mental disorders: a systematic review and meta-analysis 1980-2013. Int J Epidemiol. 2014;43(2):476–93. doi:10.1093/ije/dyu038.

  4. Polanczyk GV, Salum GA, Sugaya LS, Caye A, Rohde LA. Annual research review: a meta-analysis of the worldwide prevalence of mental disorders in children and adolescents. J Child Psychol Psychiatry. 2015;56(3):345–65. doi:10.1111/jcpp.12381.

  5. Merikangas KR, He J, Brody D, Fisher PW, Bourdon K, Koretz DS. Prevalence and treatment of mental disorders among US children in the 2001-2004 NHANES. Pediatrics. 2010;125(1):75–81. doi:10.1542/peds.2008-2598.

  6. World Health Organization. Mental Health Atlas 2014. Geneva, Switzerland: World Health Organization; 2013. http://www.who.int/entity/mental_health/evidence/atlas/executive_summary_en.pdf?ua=1. Accessed 8 Jul 2016.

  7. Pelletier L, O’Donnell S, Dykxhoorn J, McRae L, Patten S. Under-diagnosis of mood disorders in Canada. Epidemiol Psychiatr Sci. 2016;6:1–10. doi:10.1017/S2045796016000329.

  8. Hickie IB, McGorry PD. Increased access to evidence-based primary mental health care: will the implementation match the rhetoric? Med J Aust. 2007;187(2):100–3.

  9. Riper H, Andersson G, Christensen H, Cuijpers P, Lange A, Eysenbach G. Theme issue on e-mental health: a growing field in internet research. J Med Internet Res. 2010;12(5):e74. doi:10.2196/jmir.1713.

  10. Christensen H, Hickie IB. E-mental health: a new era in delivery of mental health services. Med J Aust. 2010;192(11 Suppl):S2–3.

  11. Christensen H, Hickie IB. Using e-health applications to deliver new mental health services. Med J Aust. 2010;192(11 Suppl):S53–6.

  12. Lal S, Adair CE. E-mental health: a rapid review of the literature. Psychiatr Serv. 2014;65(1):24–32. doi:10.1176/appi.ps.201300009.

  13. Boydell KM, Hodgins M, Pignatiello A, Teshima J, Edwards H, Willis D. Using technology to deliver mental health services to children and youth: a scoping review. J Can Acad Child Adolesc Psychiatry. 2014;23(2):87–99.

  14. Hollis C, Morriss R, Martin J, et al. Technological innovations in mental healthcare: harnessing the digital revolution. Br J Psychiatry. 2015;206(4):263–5. doi:10.1192/bjp.bp.113142612.

  15. Burns JM, Birrell E, Bismark M, et al. The role of technology in Australian youth mental health reform. Aust Health Rev. 2016. doi:10.1071/AH15115.

  16. Struthers A, Charette C, Bapuji SB, et al. The acceptability of e-mental health services for children, adolescents, and young adults: a systematic search and review. Can J Commun Ment Health. 2015;34(2):1. doi:10.7870/cjcmh-2015-006.

  17. Mental Health Commission of Canada. e-Mental health in Canada: transforming the mental health system using technology. A briefing document. Mental Health Commission of Canada. 2014. http://www.mentalhealthcommission.ca/sites/default/files/MHCC_E-Mental_Health-Briefing_Document_ENG_0.pdf. Accessed 13 Jan 2016.

  18. Meurk C, Leung J, Hall W, Head BW, Whiteford H. Establishing and governing e-Mental health care in Australia: a systematic review of challenges and a call for policy-focussed research. J Med Internet Res. 2016;18(1):e10. doi:10.2196/jmir.4827.

  19. Whiteford H, Harris M, Diminic S. Mental health service system improvement: translating evidence into policy. Aust N Z J Psychiatry. 2013;47(8):703–6. doi:10.1177/0004867413494867.

  20. Young S. Evidence-based policy: a practical guide to doing it better. Pol Stud Rev. 2014;12(2):299–300. doi:10.1111/1478-9302.12053_82.

  21. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: the PRISMA statement. Int J Surg. 2010;8:336–41. doi:10.1016/j.ijsu.2010.02.007.

  22. World Health Organization. Atlas of eHealth country profiles: the use of eHealth in support of universal health coverage: based on the findings of the third global survey on eHealth 2015. Geneva, Switzerland: World Health Organization; 2016. http://apps.who.int/iris/bitstream/10665/204523/1/9789241565219_eng.pdf?ua=1. Accessed 18 Jun 2016.

  23. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76. doi:10.1007/s10488-010-0319-7.

  24. Torrey WC, Bond GR, McHugo GJ, Swain K. Evidence-based practice implementation in community mental health settings: the relative importance of key domains of implementation activity. Adm Policy Ment Health. 2012;39(5):353–64. doi:10.1007/s10488-011-0357-9.

  25. McMillen JC, Lenze SL, Hawley KM, Osborne VA. Revisiting practice-based research networks as a platform for mental health services research. Adm Policy Ment Health. 2009;36(5):308–21. doi:10.1007/s10488-009-022-2.

  26. Brooks H, Pilgrim D, Rogers A. Innovation in mental health services: what are the key components of success? Implement Sci. 2011;6(1):1. doi:10.1186/1748-5908-6-120.

  27. Southam-Gerow MA, Rodriguez A, Chorpita BF, Daleiden EL. Dissemination and implementation of evidence based treatments for youth: challenges and recommendations. Prof Psychol Res Pr. 2012;43(5):527–34.

  28. Stanhope V, Tuchman E, Sinclair W. The implementation of mental health evidence based practices from the educator, clinician and researcher perspective. Clin Soc Work J. 2011;39(4):369–78. doi:10.1007/s10615-010-0309-y.

  29. Altman D. Practical statistics for medical research. London, UK: Chapman and Hall; 1991.

  30. Viera AJ, Garrett JM. Understanding interobserver agreement: the kappa statistic. Fam Med. 2005;37(5):360–3.

  31. Pluye P, Robert E, Cargo M, et al. A mixed methods appraisal tool for systematic mixed studies reviews. Montreal: McGill University; 2011. p. 1–8.

  32. Pace R, Pluye P, Bartlett G, et al. Testing the reliability and efficiency of the pilot Mixed Methods Appraisal Tool (MMAT) for systematic mixed studies review. Int J Nurs Stud. 2012;49:47–53. doi:10.1016/j.ijnurstu.2011.07.002.

  33. Brouwers MC, Kho ME, Browman GP, et al. AGREE II: advancing guideline development, reporting and evaluation in healthcare. Can Med Assoc J. 2010;182(18):E839–42. doi:10.1503/cmaj.090449.

  34. The AGREE Next Steps Consortium. Appraisal of Guidelines for Research & Evaluation II: AGREE II Instrument. 2013. http://www.agreetrust.org/wp-content/uploads/2013/10/AGREE-II-Users-Manual-and-23-item-Instrument_2009_UPDATE_2013.pdf. Accessed 5 Aug 2016.

  35. Popay J, Roberts H, Sowden A, et al. Guidance on the conduct of narrative synthesis in systematic reviews: a product from the ESRC methods programme. 2006.

  36. Crabtree B, Miller W. A template approach to text analysis: developing and using codebooks. In: Miller WL, editor. Doing qualitative research: research methods for primary care. Newbury Park, CA: Sage Publications, Inc.; 1992. p. 93–109.

  37. Hosie A, Vogl G, Carden J, Hoddinott J, Lim S. A way forward: equipping Australia’s mental health system for the next generation: Reach Out Australia EY. 2015. http://www.ey.com/Publication/vwLUAssets/Equipping-Australias-mental-health-system/$FILE/EY-Equipping-Australias-mental-health-system.pdf. Accessed 13 Jan 2016.

  38. The Sax Institute. Strategies for adopting and strengthening e-mental health: Mental Health Commission of New South Wales. 2014. http://nswmentalhealthcommission.com.au/sites/default/files/assets/File/Report%20-%20The%20Sax%20Institute%20E-Mental%20Health%20Evidence%20Review%20cover%20page.pdf. Accessed 13 Jan 2016.

  39. National Collaborating Centre for Mental Health. E-therapies systematic review for children and young people with mental health problems: National Collaborating Centre for Mental Health; MindEd E-portal Consortium; 2014. https://www.minded.org.uk. Accessed 13 Jan 2016.

  40. Boydell KM, Hodgins M, Pignatiello A, Edwards H, Teshima J, Willis D. Using technology to deliver mental health services to children and youth in Ontario: Ontario Centre of Excellence for Child and Youth Mental Health. 2013. http://www.excellenceforchildandyouth.ca/sites/default/files/policy_using_technology_0.pdf. Accessed 13 Jan 2016.

  41. GGZ Nederland. E-mental health in the Netherlands: Dutch Association of Mental Health and Addiction Care; 2013. http://www.ggznederland.nl/uploads/assets/20130514%20Factsheet%20eHealth.pdf. Accessed 13 Jan 2016.

  42. Mental Health Network. The future’s digital: mental health and technology: Mental Health Network NHS Confederation; 2014. http://www.nhsconfed.org/~/media/Confederation/Files/Publications/Documents/the-futures-digital.pdf. Accessed 13 Jan 2016.

  43. Christensen H, Proudfoot J, Woodward A, et al. e-mental health services in Australia 2014: current and future. 2014. https://emhalliance.fedehealth.org.au/wp-content/uploads/sites/42/2014/10/e-Mental-Health-in-Australia-2014.pdf. Accessed 13 Jan 2016.

  44. Department of Health and Ageing. The eHealth readiness of Australia’s medical specialists. Canberra, Australia: Commonwealth of Australia; 2011. http://www.health.gov.au/internet/publications/publishing.nsf/Content/ehealth-readiness-medical-specialists-toc. Accessed 13 Jan 2016.

  45. Gonzales R, Douglas Anglin M, Glik DC. Exploring the feasibility of text messaging to support substance abuse recovery among youth in treatment. Health Educ Res. 2014;29(1):13–22. doi:10.1093/her/cyt094.

  46. Branson CE, Clemmey P, Mukherjee P. Text message reminders to improve outpatient therapy attendance among adolescents: a pilot study. Psychol Serv. 2013;10(3):298–303. doi:10.1037/a0026693.

  47. John R, Buschman P, Chaszar M, Honig J, Mendonca E, Bakken S. Development and evaluation of a PDA-based decision support system for pediatric depression screening. Stud Health Technol Inform. 2007;129(2):1382–6.

  48. Reuland MM, Teachman BA. Interpretation bias modification for youth and their parents: a novel treatment for early adolescent social anxiety. J Anxiety Disord. 2014;28(8):851–64. doi:10.1016/j.janxdis.2014.09.011.

  49. Gladstone T, Marko-Holguin M, Henry J, Fogel J, Diehl A, Van Voorhees BW. Understanding adolescent response to a technology-based depression prevention program. J Clin Child Adolesc Psychol. 2014;43(1):102–14. doi:10.1080/15374416.2013.850697.

  50. Eisen JC, Marko-Holguin M, Fogel J, et al. Pilot study of implementation of an internet-based depression prevention intervention (CATCH-IT) for adolescents in 12 US primary care practices: clinical and management/organizational behavioral perspectives. Prim Care Companion CNS Disord. 2013;15(6). doi:10.4088/PCC.10m01065.

  51. Iloabachie C, Wells C, Goodwin B, et al. Adolescent and parent experiences with a primary care/Internet-based depression prevention intervention (CATCH-IT). Gen Hosp Psychiatry. 2011;33(6):543–55. doi:10.1016/j.genhosppsych.2011.08.004.

  52. Fothergill KE, Gadomski A, Solomon BS, Olson AL, Gaffney CA, Wissow LS. Assessing the impact of a web-based comprehensive somatic and mental health screening tool in pediatric primary care. Acad Pediatr. 2013;13(4):340–7. doi:10.1016/j.acap.2013.04.005.

  53. Han C, Voils C, Williams J. Uptake of web-based clinical resources from the MacArthur Initiative on depression and primary care. Community Ment Health J. 2013;49(2):166–71. doi:10.1007/s10597-011-9461-2.

  54. Diamond G, Levy S, Bevans KB, et al. Development, validation, and utility of internet-based, behavioral health screen for adolescents. Pediatrics. 2010;126(1):e163–70. doi:10.1542/peds.2009-3272.

  55. Fein JA, Pailler ME, Barg FK, et al. Feasibility and effects of a web-based adolescent psychiatric assessment administered by clinical staff in the pediatric emergency department. Arch Pediatr Adolesc Med. 2010;164(12):1112–7. doi:10.1001/archpediatrics.2010.213.

  56. Salloum A, Crawford EA, Lewin AB, Storch EA. Consumers’ and providers’ perceptions of utilizing a computer-assisted cognitive behavioral therapy for childhood anxiety. Behav Cogn Psychother. 2015;43(01):31–41. doi:10.1017/S1352465813000647.

  57. Murphy JM, Masek B, Babcock R, et al. Measuring outcomes in outpatient child psychiatry: the contribution of electronic technologies and parent report. Clin Child Psychol Psychiatry. 2011;16(1):146–60. doi:10.1177/1359104509352895.

  58. Horwitz SM, Hoagwood KE, Garner A, et al. No technological innovation is a panacea: a case series in quality improvement for primary care mental health services. Clin Pediatr (Phila). 2008;47(7):685–92. doi:10.1177/0009922808315215.

  59. Pretorius N, Rowlands L, Ringwood S, Schmidt U. Young people’s perceptions of and reasons for accessing a web-based cognitive behavioural intervention for bulimia nervosa. Eur Eat Disord Rev. 2010;18(3):197–206. doi:10.1002/erv.985.

  60. Hanley T. Developing youth-friendly online counselling services in the United Kingdom: a small scale investigation into the views of practitioners. Couns Psychother Res. 2006;6(3):182–5. doi:10.1080/14733140600857535.

  61. Stallard P, Richardson T, Velleman S. Clinicians’ attitudes towards the use of computerized cognitive behaviour therapy (cCBT) with children and adolescents. Behav Cogn Psychother. 2010;38(5):545–60. doi:10.1017/S1352465810000421.

  62. Hetrick SE, Dellosa MK, Simmons MB, Phillips L. Development and pilot testing of an online monitoring tool of depression symptoms and side effects for young people being treated for depression. Early Interv Psychiatry. 2015;9(1):66–9. doi:10.1111/eip.12127.

  63. Ahmad F, Norman C, O’Campo P. What is needed to implement a computer-assisted health risk assessment tool? An exploratory concept mapping study. BMC Med Inform Decis Mak. 2012;12:149. doi:10.1186/1472-6947-12-149.

  64. Merry SN, Stasiak K, Shepherd M, Frampton C, Fleming T, Lucassen MF. The effectiveness of SPARX, a computerised self help intervention for adolescents seeking help for depression: randomised controlled non-inferiority trial. BMJ. 2012;344:e2598. doi:10.1136/bmj.e2598.

  65. Mair FS, May C, O’Donnell C, Finch T, Sullivan F, Murray E. Factors that promote or inhibit the implementation of e-health systems: an explanatory systematic review. Bull World Health Organ. 2012;90(5):357–64. doi:10.2471/blt.11.099424.

  66. Owens C, Ley A, Aitken P. Do different stakeholder groups share mental health research priorities? A four‐arm Delphi study. Health Expect. 2008;11(4):418–31. doi:10.1111/j.1369-7625.2008.00492.x.

  67. Greenhalgh T, Robert G, Bate P, Macfarlane F, Kyriakidou O. Diffusion of innovations in health service organisations: a systematic literature review. Milbank Q. 2004;82(4):581–629. doi:10.1111/j.0887-378X.2004.00325.x.

  68. Jansen MW, Van Oers HA, Kok G, De Vries NK. Public health: disconnections between policy, practice and research. Health Res Policy Syst. 2010;8(1):37. doi:10.1186/1478-4505-8-37.

  69. Choi BC, Pang T, Lin V, et al. Can scientists and policy makers work together? J Epidemiol Community Health. 2005;59(8):632–7. doi:10.1136/jech.2004.031765.

  70. Peters DH, Tran NT, Adam T. Implementation research in health: a practical guide. Alliance for Health Policy and Systems Research, World Health Organization; 2013.

  71. Lyon AR, Lewis CC, Melvin A, Boyd M, Nicodimos S, Liu FF, Jungbluth N. Health Information Technologies-Academic and Commercial Evaluation (HIT-ACE) methodology: description and application to clinical feedback systems. Implement Sci. 2016;11(1):128. doi:10.1186/s13012-016-0495-2.

  72. Dixon BE, Zafar A, McGowan JJ. Development of a taxonomy for health information technology. Stud Health Technol Inform. 2007;129(2):616–20.

  73. Pagliari C, Sloan D, Gregor P, et al. What is eHealth (4): a scoping exercise to map the field. J Med Internet Res. 2005;7(1):e9. doi:10.2196/jmir.7.1.e9.

  74. Oh H, Rizo C, Enkin M, Jadad A. What is eHealth (3): a systematic review of published definitions. J Med Internet Res. 2005;7(1):e1. doi:10.2196/jmir.7.1.e1.

  75. Johnson K, Hays C, Center H, Daley C. Building capacity and sustainable prevention innovations: a sustainability planning model. Eval Program Plann. 2004;27(2):135–49.

  76. Turner KM, Sanders MR. Dissemination of evidence-based parenting and family support strategies: learning from the Triple P—Positive Parenting Program system approach. Aggress Violent Behav. 2006;11(2):176–93.

  77. Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL. A glossary for dissemination and implementation research in health. J Public Health Manag Pract. 2008;14(2):117–23. doi:10.1097/01.PHH.000031888.06252.bb.

  78. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7.

  79. Lewis CC, Fischer S, Weiner BJ, Stanick C, Kim M, Martinez RG. Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria. Implement Sci. 2015;10:155. doi:10.1186/s13012-015-0342-x.

  80. Chaudoir SR, Dugan AG, Barr CH. Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level measures. Implement Sci. 2013;8(1):22. doi:10.1186/1748-5908-8-22.
Acknowledgements

The authors would like to thank the librarian who helped develop the search strategy. The authors would also like to thank the corresponding authors of included studies who responded to inquiries requesting further information about their research.

Funding

Funding for this project was provided by the Canadian Institutes of Health Research (CIHR 201404KRS). Drs. Newton and Hartling hold the CIHR New Investigator Awards. Dr. McGrath held a Tier I Canada Research Chair.

Availability of data and materials

The data from included studies and reports analyzed during this review are available from the corresponding author on reasonable request.

Authors’ contributions

All authors contributed extensively to the review and were involved in the design of this work. NDG, AS, and ASN were responsible for the screening, data extraction, and analysis. NDG, ASN, and AS were responsible for the initial drafting of the manuscript and LW, KB, AH, LH, MPD, and PM provided comments on it at all stages. All authors have read and approved the final version of the manuscript.

Competing interests

The authors declare that they have no competing interests.

Consent for publication

Not applicable.

Ethics approval and consent to participate

This systematic review did not require ethics approval nor does it contain any individual person’s data in any form.

Financial disclosure

The authors have no financial relationships relevant to this article to disclose.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author information

Correspondence to Amanda S. Newton.

Additional files

Additional file 1:

Medline search strategy. (DOCX 16 kb)

Additional file 2:

List of government websites and healthcare organizations searched. (DOCX 18 kb)

Additional file 3:

AGREE domain scores for included government and organizational reports. (DOCX 20 kb)

Additional file 4:

Quality assessment scores of research studies using the Mixed Methods Appraisal Tool (MMAT). (DOCX 54 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
About this article

Cite this article

Gehring, N.D., McGrath, P., Wozney, L. et al. Pediatric eMental healthcare technologies: a systematic review of implementation foci in research studies, and government and organizational documents. Implementation Sci 12, 76 (2017) doi:10.1186/s13012-017-0608-6
Keywords

  • eHealth
  • Mental health
  • Implementation science
  • Healthcare planning
  • Organizational innovation
  • Decision-making
  • Healthcare organizations