Table 1 Factors promoting the likelihood of acceptance in or rejection from Implementation Science and Implementation Science Communications, by manuscript type

From: Implementation Science and Implementation Science Communications: our aims, scope, and reporting expectations

| Type of manuscript | Factors promoting the likelihood of acceptance in Implementation Science | Factors promoting the likelihood of rejection from Implementation Science | Required reporting guideline checklist | Possibility of acceptance in Implementation Science Communications |
| --- | --- | --- | --- | --- |
| Effectiveness | Studies that fit our journal scope and that employ rigorous experimental or quasi-experimental designs (i.e., designs eligible for inclusion in Cochrane EPOC reviews) and evaluate the implementation of an evidence-based practice or policy, or the de-implementation of those demonstrated to be of low or no clinical benefit | Studies which lack a rigorous experimental study design, such as quality improvement reports, service evaluations, or uncontrolled before-after studies; studies evaluating the effectiveness of novel clinical, organizational, public health, or policy interventions | CONSORT for trials | Observational outcome studies, for example, those that use case study designs, smaller pilot studies, pre-implementation studies, studies focused on dissemination using innovative approaches, and/or descriptive studies |
| Economic evaluation | Any cost-effectiveness analysis that compares the costs and outcomes of two or more implementation strategies | Cost and cost-consequence analyses where disaggregated costs and outcomes are presented | CHEERS | Costing analyses that do not provide clear effectiveness findings but exemplify methods for economic study in implementation and dissemination; descriptive cost analyses |
| Implementation intervention development reports | Reports prepared and submitted prior to the reporting of the effectiveness of the intervention; plans for (robust) evaluation are made explicit; an empirical and/or theoretical rationale is provided, typically using a stepwise approach | Post hoc submission (submitted after the reporting of the effectiveness of the intervention); no plans for (robust) evaluation; non-transparent linkages between the intervention and the preceding analysis | TIDieR, StaRI | Implementation Science Communications will also consider reports of intervention development after reporting on effectiveness in some cases |
| Methodology | Articles that present methods which may either be completely new or offer an improvement to an existing method; articles reporting empirical comparisons of one or more methodological approaches, or which clearly state what they add to the existing literature | Descriptive accounts of largely established methods without any associated novel methodological insights | N/A | Descriptive implementation methods that offer high-quality application of existing models, theories, and frameworks within specific health settings |
| Implementation pilot and feasibility studies | Studies that fit our journal scope and are conducted with the explicit purpose of assessing feasibility and planning for an intervention that is expected to contribute to existing knowledge; studies indicating how a subsequent study will draw from the pilot study | No justification for conduct; overclaiming on the basis of results | CONSORT pilot and feasibility study checklist | Well-conducted pilot and feasibility studies providing important pilot outcomes: effect size estimates, contextual factors, assessment of determinants of implementation, feasibility, acceptability, and other implementation-focused outcomes; clear plans for further evaluation, or clear reasons given for the absence of such plans |
| Implementation process evaluation | Studies that fit our journal scope and are submitted contemporaneously with or following reports of intervention effectiveness and that take account of the main evaluation outcomes; process evaluations of complex clinical or preventive interventions with substantial implementation challenges (e.g., in the context of clinical trials) may also be published; studies evaluating the fidelity of implementation, mechanisms of impact, and/or contextual influences on implementation and outcomes | Process evaluations submitted in advance of the conduct of the main effectiveness analysis (it cannot be clear whether they are explaining an effect or the absence of an effect); process evaluations that do not take account of the main evaluation outcomes | | Implementation process evaluation reports that reflect lessons learned that may generalize to other work; process evaluations of clinical or preventive interventions |
| Protocols | Protocols for innovative or very large-scale studies that fit our journal scope and inclusion criteria for rigorous study designs, with an emphasis on experimental design; that have been through a competitive peer-review process to receive funding from a nationally or internationally recognized research agency; that have received appropriate ethics review board approval; and that have been submitted within 12 months of ethics approval | Protocols that have not been the subject of peer review by a national or international research agency; protocols that have not received ethics review board approval; protocols for quality improvement or service evaluations which lack a rigorous study design; protocols submitted for studies where data cleaning and analysis have begun | SPIRIT | Protocols for pilot and feasibility studies and smaller-scale studies that fit our journal scope and inclusion criteria for study designs, including quasi-experimental and other designs (protocols for multi-site quality improvement or service evaluations may also be accepted if they meet the other criteria); that have been through a competitive peer-review process to receive funding from a regionally, nationally, or internationally recognized research agency; that have received appropriate ethics review board (or equivalent) approval; and that have been submitted at one of three possible time points: (1) within 3 months of ethics approval, (2) prior to enrolment of the first participant/cluster, or (3) before the end of participant/cluster recruitment (i.e., prior to the commencement of data cleaning or analysis); protocols for systematic reviews and other types of synthesis focused on implementation or dissemination research |
| Qualitative and mixed methods studies | Studies that fit the journal scope and meet applicable criteria for quality and validity | Studies where there are doubts about whether planned data saturation has been achieved; single-site case studies with limited typicality or transferability; studies that fail to link to relevant theory, or that lack contextualization and make little reference to previous relevant qualitative studies or reviews | COREQ or RATS | Studies that focus on smaller samples or rely on descriptive qualitative methods only; mixed methods studies with appropriate designs |
| Short reports | Brief reports of data from original research which present relatively modest advances in knowledge or methods | Reports of meetings, "doing implementation," or "lessons learned" | N/A | The initial scope of Implementation Science Communications may include short meeting reports and brief descriptions of lessons learned |
| Systematic reviews and other syntheses | Systematic reviews and other types of synthesis (such as rapid, realist, or scoping reviews) that fit our journal scope and which may cover issues such as the effects of implementation interventions and influences on the uptake of evidence | Non-systematic or narrative literature reviews that fail to use explicit methods to identify, select, and critically appraise relevant research; reviews and syntheses that fail to adhere to recognized quality and reporting standards | PRISMA; RAMESES for realist reviews | Narrative and other types of reviews may be accepted |
| Research on education in implementation science | Empirical evaluations of training programs and materials for implementation science | Descriptions of educational programs and materials | | Implementation Science Communications may also publish pilot and small-scale studies in this domain |
| Debate | Papers which question or challenge existing implementation policies, practices, evidence, or theory and suggest modifications or alternatives, clearly contextualized in the current literature | Papers which fail to contextualize their argument in, or demonstrate how it builds upon, the existing implementation research literature; papers not based on a systematic review of the literature are unlikely to be accepted in Implementation Science | No checklist required; note, however, a preference for reviews over pure debate | Non-systematic, review-based papers may be accepted in Implementation Science Communications if interesting concepts are discussed and/or the authors make the case that a systematic review is not feasible or appropriate |