
Exploring the interpersonal-, organization-, and system-level factors that influence the implementation and use of an innovation-synoptic reporting-in cancer care



Background

The dominant method of reporting findings from diagnostic and surgical procedures is the narrative report. In cancer care, this report inconsistently provides the information required to understand the cancer and make informed patient care decisions. Another method of reporting, the synoptic report, captures specific data items in a structured manner and contains only items critical for patient care. Research demonstrates that synoptic reports vastly improve the quality of reporting. However, synoptic reporting represents a complex innovation in cancer care, with implementation and use requiring fundamental shifts in physician behaviour and practice, and support from the organization and larger system. The objective of this study is to examine the key interpersonal, organizational, and system-level factors that influence the implementation and use of synoptic reporting in cancer care.


Methods/design

This study involves three initiatives in Nova Scotia, Canada, that have implemented synoptic reporting within their departments/programs. Case study methodology will be used to study these initiatives (the cases) in-depth, explore which factors were barriers or facilitators of implementation and use, examine relationships amongst factors, and uncover which factors appear to be similar and distinct across cases. The cases were selected as they converge and differ with respect to factors that are likely to influence the implementation and use of an innovation in practice. Data will be collected through in-depth interviews, document analysis, observation of training sessions, and examination/use of the synoptic reporting tools. An audit will be performed to determine/quantify use. Analysis will involve production of a case record/history for each case, in-depth analysis of each case, and cross-case analysis, where findings will be compared and contrasted across cases to develop theoretically informed, generalisable knowledge that can be applied to other settings/contexts. Ethical approval was granted for this study.


Discussion

This study will contribute to our knowledge base on the multi-level factors, and the relationships amongst factors in specific contexts, that influence implementation and use of innovations such as synoptic reporting in healthcare. Such knowledge is critical to improving our understanding of implementation processes in clinical settings, and to helping researchers, clinicians, and managers/administrators develop and implement ways to more effectively integrate innovations into routine clinical care.



Background

Cancer treatment and management have become increasingly complex over the past two decades, with therapeutic decisions often based on input from a multidisciplinary team that consists of radiologists, surgeons, pathologists, and oncologists [1]. For patients with suspected or confirmed cancer, clear and thorough recording of diagnostic and surgical procedures and findings supports accurate diagnosis and staging. Such recording also facilitates more accurate prognosis estimates, post-operative management, and adjuvant treatment planning. The dominant method of reporting findings from diagnostic tests/procedures, surgery, and pathology examinations is the narrative report, which is a free-text, descriptive account of the procedure, suspected or confirmed findings, and proposed treatment. Physicians dictate this report, often through automated telephone systems, and professional transcriptionists transcribe the oral description into a written document that is eventually placed into a patient's medical record. Research has demonstrated that narrative reports inconsistently provide the information required to understand the disease and make informed patient care decisions [2-7].

Another method of reporting, the synoptic report, captures data items in a structured manner and contains only items critical for understanding the disease and subsequent impacts on patient care. There is a spectrum of what is generally considered a synoptic report [1], from synoptic-like structured templates without scientifically validated elements to sophisticated electronic systems with drop-down menus, discrete data fields, standardized language, automated coding processes, and strong evidentiary basis. A landmark study in the early 1990s, which audited pathology practice patterns at 532 institutions in three countries, found that the one practice associated with completeness of pathology reporting for colorectal cancer specimens was use of a standardized report or checklist [8]. Since that time, researchers have consistently demonstrated that synoptic reports (even paper-based 'checklist' formats) vastly improve the quality of pathology reporting in colorectal [1, 2, 9-13], breast [1, 9, 14-16], lung [1, 17], prostate [1], pancreatic [18], melanoma [19], and hematolymphoid cancers [20]. More recently, synoptic reporting has been shown to improve the quality of surgical reporting for a variety of malignancies, including colorectal [7], breast [21], thyroid [22], and pancreatic cancers [23], as well as non-malignant operative procedures [24, 25].

Electronic synoptic reporting tools also lead to health system efficiencies compared to the dominant, dictated method of reporting [25-27]. Laflamme et al. [25] showed that with synoptic templates a verified surgical report reached the patient's medical record roughly 800 times faster than with narrative reporting (a mean of 28 minutes versus >14 days). Moreover, the mean time from the end of surgery to initiating the report was substantially shorter with synoptic templates (0.43 hours) than with dictation (9.7 hours). Similar efficiencies were demonstrated in subsequent studies [26, 27]. In a Canadian study, for example, 97% of synoptic reports were finalized, placed in the patient's medical record, and sent to all health professionals involved in the patient's care within 24 hours of surgery, compared to a mean of 90 days for narrative reports [26]. Researchers have also estimated considerable cost savings through the elimination of transcription services [25, 26].
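The order of magnitude of the reported turnaround improvement can be checked with simple arithmetic. The sketch below is purely illustrative and uses only the figures cited from Laflamme et al.; since ">14 days" is a lower bound, the computed ratio is a floor on the true fold-difference.

```python
# Check the order of magnitude of the turnaround improvement: a verified
# synoptic report reached the chart in a mean of 28 minutes, versus more
# than 14 days for a dictated narrative report.
MINUTES_PER_DAY = 24 * 60

synoptic_minutes = 28                      # mean, synoptic template
narrative_minutes = 14 * MINUTES_PER_DAY   # lower bound (> 14 days)

fold_difference = narrative_minutes / synoptic_minutes
# 14 days is only a lower bound, so the true ratio exceeds this value,
# consistent with the roughly 800-fold difference reported in the study.
print(f"At least {fold_difference:.0f}-fold faster")  # At least 720-fold faster
```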

Beyond improving completeness of reporting and availability/immediacy of reports, synoptic reporting tools have the potential to improve quality of care by integrating practice guidelines/best evidence into report templates [21, 26] and providing an efficient, real-time mechanism to generate data from the diagnostic and peri-operative periods [21, 26, 28, 29]. These data may be used to provide real-time performance feedback to physicians and surgeons as well as enable improved process and outcomes measurement. International jurisdictions are increasingly endorsing synoptic reporting, including actively supporting/funding the implementation of synoptic templates [30-32] and providing commendation status to pathology labs that include a synopsis of scientifically validated data elements in their reports [33]. In addition, the professional pathology colleges in Canada, the US, the UK, and Australia have formalized a collaboration to develop common, internationally agreed-upon, standardized cancer reporting protocols [34].

The synoptic report represents a complex innovation (i.e., new knowledge, tool, or practice) in cancer care, with its implementation and use requiring fundamental shifts in physician behaviour and practice culture [35] as well as support from the organization (e.g., changes in institutional policies/processes) and larger system (e.g., governance arrangements, integration with health information technology infrastructure). Despite the demonstrated benefits, some physicians have reported reluctance to use synoptic reporting tools, with concerns including lack of flexibility in reporting complex procedures/cases [36, 37], the prospect of being monitored [36], and discomfort with using information technology [37, 38]. Changing physician reporting practice is a complex undertaking that requires comprehensive approaches at different levels of the health system [39]. This may be particularly true for narrative reporting, a practice that has existed for millennia [40].

Implementing new practices in healthcare organizations

Knowledge translation (KT) research has largely focused on potentially useful strategies (e.g., opinion leaders, academic detailing, reminder systems) for improving the adoption and uptake of evidence (e.g., clinical practice guidelines) into practice [41]. Most of these strategies fall within the realm of individual-level interventions [39, 42-44], with the target being 'autonomous' clinicians who are deemed to be more-or-less independent in their capacity to assemble and apply knowledge to modify their practices [41]. Despite the sizable amount of literature in this area, however, numerous systematic reviews have been unable to demonstrate which of these strategies work best, or even consistently, across clinical settings [43-45]. Many researchers have emphasized the unpredictable, slow, and haphazard nature of research implementation and use processes, with interventions working some of the time in some situations, but not at other times in seemingly similar situations [41, 42, 46], and the reasons for these differences unclear [47].

In reality, many organizational and socio-political (e.g., inter-organizational networks, funding arrangements) factors affect whether individuals in clinical settings actually make changes in their practice [41, 47-50]. Much research has demonstrated the importance of organizational characteristics (e.g., culture, leadership, management support, evaluation/feedback mechanisms, and presence of champions) to implementation efforts in healthcare settings [41, 51-65]. Moreover, many of the defining features of healthcare systems, including the range and diversity of stakeholders, complex governance/resource arrangements, and professional autonomy and specialization of many of its staff, result in many different cultures and norms as well as high levels of interdependency amongst professionals in the system [66, 67].

Consequently, many implementation processes in healthcare organizations will also be characterized by a high degree of interdependency amongst organizational members [68, 69]. Indeed, many innovations introduced in healthcare will require coordinated use by many individuals and professional groups to achieve benefits (electronic medical records are one example). These individuals are situated in organizational relationships wherein the implementation and use of a new tool or practice will ultimately be influenced by many interpersonal processes, including 'coalition building,' rhetoric, and persuasion [70, 71]. Thus, while individual-level interventions are important to change clinical practice, the complex nature of healthcare organizations means individual-level interventions alone cannot change clinical practice in a widespread, sustainable way [39, 48, 72-75].

Understanding the dynamics of innovations in organizations has a long history in management and organizational sciences [76]. Rogers [77] has conceptualized the innovation-decision process as one that unfolds in distinct stages whereby an organization moves from initial awareness or knowledge of an innovation to eventually successfully integrating the innovation into ongoing processes (or, alternatively, rejecting the innovation). Contrary to this perspective, extensive longitudinal study of innovation processes led Van de Ven et al. [78] to describe the 'innovation journey' as a non-linear cycle during which ideas are developed (or adapted) and put into practice by people who, through their relationships and negotiations with others, make the changes necessary to implement the innovation within a specific organizational context. They highlight that people and relationships are instrumental to this journey, which is characterized by many divergent and convergent activities wherein the initial idea often leads to multiple ideas/actions, setbacks and delays occur frequently, staff experience high levels of elation and frustration, notions of success change, and new interdependencies are established that affect the wider organization. This broader 'systems' perspective [78, 79] has recently made its way into KT dialogue and debate [49], challenging the linear view of KT (e.g., the researcher-push model) and moving us toward one that is much more contextual, relational, and 'living' in nature.

Research objective

The objective of this study is to examine the key interpersonal, organizational, and system-level factors (hereafter referred to as 'multi-level' factors) that influence the implementation and use of synoptic reporting in three specific cases of cancer care. The interpersonal level relates to the relational aspects at the level of the implementation team/program: e.g., teamwork and team dynamics, communication, partner engagement, coalition building, power dynamics, and use of rhetoric and persuasion to accomplish goals/tasks. The organizational level relates to institutional (i.e., hospital) factors that influence implementation and behaviour change: e.g., organizational culture, leadership, management, intra-organizational relationships, evaluation capacity/mechanisms, implementation policies and practices, infrastructure, and presence of champions. The system level refers to the broader sociopolitical context: e.g., policies such as financial incentives/disincentives, resource and governance arrangements, and inter-organizational norms and networks.

This study involves three initiatives (the cases) in Nova Scotia, Canada, that have implemented a synoptic reporting tool within their departments/programs. The examination of each case will involve answering the following specific research questions:

  1.

    What, if any, common factors affected implementation and use across cases? How was it that these factors 'transcended' the different contexts (setting, timing, and 'actors' involved)?

  2.

    Are there context-specific factors within each case, which were not found in other cases, that affected implementation and use? If so, what are they and what are their specific relationships to the setting, timing, and actors?

The outcome of this study will be a descriptive and explanatory account of the multi-level factors that influence the implementation and use of synoptic reporting in cancer care.


Methods/design

Case study methodology (CSM) [80, 81] will be used to study the three synoptic reporting cases in-depth, explore which factors were barriers or facilitators of implementation and use, examine relationships amongst factors, and uncover which factors appear to be similar (and distinct) across cases. CSM permits the rigorous study of a contemporary phenomenon within its real-life context [81], and of the complex interactions between the social actors and their actions and environments [82]. Case studies typically focus on 'how' and 'why' questions and explore multiple dimensions of some particular phenomenon. Flyvbjerg [83] argues that such in-depth study (of real cases in specific contexts) may be pivotal to transitioning from a novice to an expert understanding of the phenomenon.

This complexity means that case study researchers deal with distinct contexts whereby there are more variables of interest than data points. As a result, case studies rely on multiple sources of evidence and benefit from knowledge of the literature and existing theoretical perspectives [81]. The use of multiple sources is vital to CSM, as it permits corroboration (i.e., triangulation) of findings and resultant interpretations [81]. The use of existing theoretical perspectives helps guide data collection and analysis. Without a prior theoretical understanding, researchers risk spending considerable time and effort gathering basic information and 'providing description without meaning' (Hartley, cited in [84]).


This research will examine the implementation and use of synoptic reporting tools for cancer care in Nova Scotia, Canada, using an explanatory multiple-case design. Explanatory case studies present data to explain how and why events happened; the researcher interprets phenomena by answering questions of how and why, drawing upon a theoretical basis [81].

Theoretical perspectives

The use of theoretical frameworks/perspectives provides structure to the inquiry and analysis, and helps ensure the findings from the case study are connected to and informed by the broader body of literature. This research is informed by the empirical and theoretical literature on research implementation and the diffusion/management of innovations. In particular, three theoretical frameworks/perspectives have largely informed the design of this study (see Table 1): Promoting Action on Research Implementation in Health Services (PARIHS) [50, 60]; Organizational framework of innovation implementation [59]; and 'Systems' thinking/change [49].

Table 1 Description of the three theoretical perspectives guiding the case study

Importantly, these perspectives were not identified with the aim of determining which is 'best' at explaining implementation and use processes in the cases selected for study. Rather, these perspectives, when taken together, present a range of interpersonal, organizational, and system influences on practice change and thus identify potentially important factors to study.


Case selection

In case study research, limiting one's study to three or four cases will help ensure that a researcher is able to study each case in sufficient detail and depth [80, 89]. In this study, three cases will be studied: Synoptic reporting in the Nova Scotia Breast Screening Program (NSBSP); Synoptic reporting in the Colon Cancer Prevention Program (CCPP); and Synoptic reporting in the Surgical Synoptic Reporting Tools Project (SSRTP).

These cases have been sampled on the basis of replication logic [81] as well as Stake's criteria [80]: relevance to the phenomenon; provision of diversity; and provision of good learning opportunities. Using replication logic, these cases were selected as they converge and differ with respect to factors that, based on the literature and theoretical perspectives, are likely to influence the implementation and use of an innovation in clinical practice. For example, the implementation of all three initiatives has involved formal leadership, relatively small implementation teams, clinical champions, and the development of monitoring and feedback mechanisms. At the same time, the cases represent diverse contexts, including differences in relevant professional groups (e.g., specialties, disease sites), institutions (e.g., academic/tertiary care centres, community hospitals), mode of change (e.g., top-down, bottom-up), implementation support and resource characteristics, and history/timing.

Data collection procedures

This study will use multiple data collection procedures, gathering evidence across cases as well as across the various levels (interpersonal, organization, system) of each case, to gain rich, detailed information about each case and to increase the likelihood of achieving triangulation of data.

Interviews with key informants

One-on-one semi-structured interviews will be conducted with key informants at the different levels of each case. For each case, a minimum of 14 to 16 key informants will be interviewed (see Table 2): users of the synoptic reporting system (e.g., radiologists, gastroenterologists, surgeons); individuals directly involved in planning or carrying out the implementation; organizational members relevant to the initiative; individuals involved at the system level (e.g., funders, policy-makers); and users of the final synoptic report (e.g., oncologists, coders). While the latter group may not have been directly involved in implementation efforts, their acceptance and use of the synoptic report is important to widespread implementation and use. Some informants may be asked to partake in several interviews (e.g., initial and follow-up interviews) depending on the case and data collected.

Table 2 Proposed key informants

Patton [90] and Rubin and Rubin [91] will be used to guide the interview design and research questions. Interview questions will be adapted based on each case's unique context as well as the person being interviewed and his/her role in the implementation. The semi-structured format will permit the interviewer to remain focused so that the research goals are achieved and the participant's time is used efficiently, yet also provide the freedom to probe additional issues that may be pertinent to the current research, but are not specifically addressed by the interview script [90]. Following each interview, the questions and responses will be reviewed to determine whether the issues were addressed in sufficient depth and, if not, questions will be revised before the next interview [91]. Though theoretical perspectives have been used to guide this study, when information arises that conflicts with these perspectives, we will depart from the interview script and explore that particular concept/issue further. In subsequent interviews, that issue will be integrated into the script, if relevant in the context of that specific informant.

One investigator (RU) will conduct all interviews. Each interview will be audiotaped to ensure the data are retrievable and captured in true form, and will be transcribed verbatim by an experienced research coordinator.

Non-participant observation

Non-participant observation [90] will be used to observe training sessions (format, quality of training) and initial surgeon reactions to viewing/using the innovation. These sessions will provide another opportunity to collect data on surgeons' perspectives on the innovation and any barriers that surgeons perceive at the time of training. Observation will be conducted for one case only (SSRTP), since implementation of the surgical synoptic report is ongoing, permitting prospective observation of user training and early support activities.

Document analysis

Document information will be sought out and analyzed for each case. This includes project plans, team/organizational records related to synoptic reporting, training/support manuals, agendas and meeting minutes, formal/informal evaluations conducted, and media or professional articles/newsletters on the initiative. These records will be reviewed to gain an historical and contextual perspective on the initiative and to corroborate and augment evidence from both interviews and observations [81]. Where documentary evidence conflicts with findings from other sources, we will attempt to resolve these contradictions through further inquiry (e.g., follow-up with informants, contact with implementation team).

Physical artifacts

Each synoptic reporting tool will be examined to gain insight into the technical operations related to using the system. This will entail inputting 'test' cases into the system to experience tool use as well as viewing the final synoptic report to observe its design/format. Field notes/perceptions related to these experiences will be used to corroborate and augment evidence (specifically related to system/tool issues) from other sources.

Tool audits

Tool audits will be conducted to determine the proportion of eligible clinicians using the synoptic reporting tool and the proportion of eligible procedures at each institution that were reported using the synoptic reporting tool. This will entail an audit of the synoptic reporting system/database as well as the relevant institutional administrative system (e.g., admission/discharge/transfer or operating room scheduling systems). The latter is required to determine the number of eligible procedures (e.g., endoscopies, surgeries) performed at that institution in a specified period of time. Eligibility criteria for the audits include that the physician/surgeon is a registered user of the synoptic reporting system and that a synoptic template is in use for the specific procedure (e.g., lumpectomy for a malignant breast tumour).
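The two audit measures above can be sketched as a simple calculation. All function names and counts below are hypothetical illustrations, not study data; the denominator for procedure-level uptake would come from the institution's administrative system as described.

```python
# Hypothetical illustration of the two audit proportions: clinician uptake
# (registered users among eligible clinicians) and procedure-level uptake
# (procedures reported synoptically among all eligible procedures).

def audit_proportions(registered_users: int, eligible_clinicians: int,
                      synoptic_reports: int, eligible_procedures: int):
    """Return (clinician uptake, procedure uptake) as fractions of 1."""
    return (registered_users / eligible_clinicians,
            synoptic_reports / eligible_procedures)

# e.g., 18 of 24 eligible surgeons registered on the system; 150 of 200
# eligible procedures in the audit period were reported synoptically
clinician_uptake, procedure_uptake = audit_proportions(18, 24, 150, 200)
print(f"clinician uptake: {clinician_uptake:.0%}, "
      f"procedure uptake: {procedure_uptake:.0%}")
```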


Data analysis

Yin [81] describes a number of important strategies of case study analysis: developing case descriptions, relying on theoretical frameworks/perspectives, using data from multiple sources to augment and triangulate findings, and examining rival explanations (i.e., other plausible explanations for the findings; one rival explanation is that psychological theories, such as the Theory of Planned Behaviour [92], better explain implementation in one or more of the cases studied). In this study, data analysis will involve a three-stage process: production of a case record/history for each case; in-depth analysis of each case; and cross-case analysis. Like other qualitative methodologies, analysis will begin with the first data collected.

The first stage in the analytic process involves case description. That is, a detailed case record (or history) will be constructed for each case, comprising an in-depth description of the history and context of the initiative: the impetus for the initiative, timeline, key milestones and activities, and organization of the project and implementation. This descriptive record will also involve situating the case within its socio-political context, particularly as it pertains to the provincial healthcare environment at the time of implementation.

The second stage will attempt to gain an in-depth understanding of each synoptic reporting initiative and how its experiences relate to the research objective as well as the theoretical literature. This stage will involve four analytic steps, which will be performed separately for each of the three cases:

  1.

    Thematic analysis for each of the following evidence sources: interviews, documentary evidence, and observation. This analysis will follow the thematic analysis approach presented by Braun and Clarke [93], involving coding, collating codes, and generating, reviewing, and refining themes. This approach is similar to the analysis steps outlined by other researchers [89, 91, 94]. NVivo 9 (QSR International, Australia) will be used to help manage the data and aid in coding processes.

  2.

    Cross-source analysis of themes. This analytic process will compare, contrast, and synthesize findings from each source to gain an understanding of how the data from each source corroborates and augments data from other sources, and to identify any areas of inconsistency and potential contradiction.

  3.

    Explanation building to integrate evidence, link the data to theoretical perspectives/literature, and develop a deeper understanding of what occurred [81]. This technique involves iteratively and flexibly moving back and forth between prior knowledge (theoretical perspectives, other literature, the research objective) and emerging, case-specific data to arrive at a theoretically sound explanation of what actually happened and what was important throughout the process. An important aspect of this process is considering and questioning other explanations. This process will enable us to explain the implementation processes and the multi-level factors that influenced implementation and use in each case, and to examine existing theoretical constructs and determine their appropriateness to these contexts. In this way, we will be able to explore existing theoretical perspectives and revise theory, when appropriate.

  4.

    Presentation of findings in relation to the overall objective of the study, the theoretical perspectives, and rival explanations.

The final stage will be to conduct a cross-case analysis to compare and contrast themes between the cases. Each case will be treated as a separate study and findings (similarities and differences) will be compared across cases to develop theoretically informed, generalisable knowledge on implementing innovations in clinical practice that can be applied to other settings and contexts [81].

One critique of CSM is that case studies are subject to confirmation bias, specifically toward confirmation of preconceived ideas [83, 95]. That is, researchers can selectively describe and explain the studied events 'to support a favoured theory by underplaying evidence inconsistent with the theory or supporting an alternative' (p. 164) [95]. To minimize confirmation bias in this study, all members of the research team will participate in components of the analysis and compare their findings. The focus will be to attend to all the evidence collected, display and present the evidence and interpretation(s) separately, consider other plausible interpretations, and seek out additional evidence where inconsistencies or contradictions exist [81]. Moreover, the research team will strive to increase the 'trustworthiness' of this study through detailed documentation and description, including development and maintenance of a case study database (consisting of a complete set of all the data collected, along with the treatment of the data during the research process), maintenance of a chain of evidence (or audit trail), and rich descriptions of each case and its context.


Discussion

Well-conducted case studies can make significant contributions to the knowledge base of a particular area [96] and have the potential to transform practice, either within the case(s) being studied or across similar situations where individuals can learn from the findings [97]. Beyond informing the adoption and implementation of synoptic reporting, we anticipate this study will add to the development and application of theoretical knowledge (particularly 'systems' perspectives) in the growing KT field, and contribute to our knowledge base on the multi-level factors, and the relationships amongst factors in specific contexts, that influence implementation and use of innovations in healthcare organizations. Both contributions are important to improving our understanding of implementation processes in clinical settings, and helping researchers, clinicians, and managers/administrators develop and implement ways to more effectively integrate innovations into routine clinical care. This is especially relevant in the present healthcare environment wherein new knowledge and technologies are growing and changing rapidly, and the treatment and management of many diseases are increasingly complex and multidisciplinary.



Abbreviations

CCPP: Colon Cancer Prevention Program

CSM: Case study methodology

KT: Knowledge translation

NSBSP: Nova Scotia Breast Screening Program

SSRTP: Surgical Synoptic Reporting Tools Project


References

  1. Srigley JR, McGowan T, Maclean A, Raby M, Ross J, Kramer S, Sawka C: Standardized synoptic cancer pathology reporting: a population-based approach. J Surg Oncol. 2009, 99 (8): 517-524. 10.1002/jso.21282.

  2. Beattie GC, McAdam TK, Elliott S, Sloan JM, Irwin ST: Improvement in quality of colorectal cancer pathology reporting with a standardized proforma-a comparative study. Colorectal Dis. 2003, 5 (6): 558-562. 10.1046/j.1463-1318.2003.00466.x.

  3. Bull AD, Biffin AH, Mella J, Radcliffe AG, Stamatakis JD, Steele RJ, Williams GT: Colorectal cancer pathology reporting: a regional audit. J Clin Pathol. 1997, 50 (2): 138-142. 10.1136/jcp.50.2.138.

  4. Lefter LP, Walker SR, Dewhurst F, Turner RW: An audit of operative notes: facts and ways to improve. ANZ J Surg. 2008, 78 (9): 800-802. 10.1111/j.1445-2197.2008.04654.x.

  5. Donahoe L, Bennett S, Temple W, Hilchie-Pye A, Dabbs K, MacIntosh E, Porter G: Completeness of dictated operative reports in breast cancer-the case for synoptic surgical reporting. J Surg Oncol. 2012, doi: 10.1002/jso.23031 [Epub ahead of print].

  6. Verleye L, Ottevanger PB, Kristensen GB, Ehlen T, Johnson N, van der Burg ME, Reed NS, Verheijen RH, Gaarenstroom KN, Mosgaard B: Quality of pathology reports for advanced ovarian cancer: are we missing essential information? An audit of 479 pathology reports from the EORTC-GCG 55971/NCIC-CTG OV13 neoadjuvant trial. Eur J Cancer. 2011, 47 (1): 57-64. 10.1016/j.ejca.2010.08.008.

  7. Edhemovic I, Temple WJ, de Gara CJ, Stuart GC: The computer synoptic operative report-a leap forward in the science of surgery. Ann Surg Oncol. 2004, 11 (10): 941-947. 10.1245/ASO.2004.12.045.

  8. Zarbo RJ: Interinstitutional assessment of colorectal carcinoma surgical pathology report adequacy. A College of American Pathologists Q-Probes study of practice patterns from 532 laboratories and 15,940 reports. Arch Pathol Lab Med. 1992, 116 (11): 1113-1119.

  9. Branston LK, Greening S, Newcombe RG, Daoud R, Abraham JM, Wood F, Dallimore NS, Steward J, Rogers C, Williams GT: The implementation of guidelines and computerised forms improves the completeness of cancer pathology reporting. The CROPS project: a randomised controlled trial in pathology. Eur J Cancer. 2002, 38 (6): 764-772. 10.1016/S0959-8049(01)00258-1.

  10. Cross SS, Feeley KM, Angel CA: The effect of four interventions on the informational content of histopathology reports of resected colorectal carcinomas. J Clin Pathol. 1998, 51 (6): 481-482. 10.1136/jcp.51.6.481.

  11. Rigby K, Brown SR, Lakin G, Balsitis M, Hosie KB: The use of a proforma improves colorectal cancer pathology reporting. Ann R Coll Surg Engl. 1999, 81 (6): 401-403.

  12. Chapuis PH, Chan C, Lin BP, Armstrong K, Armstrong B, Spigelman AD, O'Connell D, Leong D, Dent OF: Pathology reporting of resected colorectal cancers in New South Wales in 2000. ANZ J Surg. 2007, 77 (11): 963-969. 10.1111/j.1445-2197.2007.04291.x.

  13. Messenger DE, McLeod RS, Kirsch R: What impact has the introduction of a synoptic report for rectal cancer had on reporting outcomes for specialist gastrointestinal and nongastrointestinal pathologists? Arch Pathol Lab Med. 2011, 135 (11): 1471-1475. 10.5858/arpa.2010-0558-OA.

  14. Wilkinson NW, Shahryarinejad A, Winston JS, Watroba N, Edge SB: Concordance with breast cancer pathology reporting practice guidelines. J Am Coll Surg. 2003, 196 (1): 38-43. 10.1016/S1072-7515(02)01627-7.

  15. Hammond EH, Flinner RL: Clinically relevant breast cancer reporting: using process measures to improve anatomic pathology reporting. Arch Pathol Lab Med. 1997, 121 (11): 1171-1175.

  16. Austin R, Thompson B, Coory M, Walpole E, Francis G, Fritschi L: Histopathology reporting of breast cancer in Queensland: the impact on the quality of reporting as a result of the introduction of recommendations. Pathology. 2009, 41 (4): 361-365. 10.1080/00313020902884469.

  17. Chamberlain DW, Wenckebach GF, Alexander F, Fraser RS, Kolin A, Newman T: Pathological examination and the reporting of lung cancer specimens. Clin Lung Cancer. 2000, 1 (4): 261-268. 10.3816/CLC.2000.n.008.

  18. Gill AJ, Johns AL, Eckstein R, Samra JS, Kaufman A, Chang DK, Merrett ND, Cosman PH, Smith RC, Biankin AV: Synoptic reporting improves histopathological assessment of pancreatic resection specimens. Pathology. 2009, 41 (2): 161-167. 10.1080/00313020802337329.

  19. Karim RZ, van den Berg KS, Colman MH, McCarthy SW, Thompson JF, Scolyer RA: The advantage of using a synoptic pathology report format for cutaneous melanoma. Histopathology. 2008, 52 (2): 130-138.

  20. Mohanty SK, Piccoli AL, Devine LJ, Patel AA, William GC, Winters SB, Becich MJ, Parwani AV: Synoptic tool for reporting of hematological and lymphoid neoplasms based on World Health Organization classification and College of American Pathologists checklist. BMC Cancer. 2007, 7: 144. 10.1186/1471-2407-7-144.

  21. Temple WJ, Francis WP, Tamano E, Dabbs K, Mack LA, Fields A: Synoptic surgical reporting for breast cancer surgery: an innovation in knowledge translation. Am J Surg. 2010, 199 (6): 770-775. 10.1016/j.amjsurg.2009.07.037.

  22. Chambers AJ, Pasieka JL, Temple WJ: Improvement in the accuracy of reporting key prognostic and anatomic findings during thyroidectomy by using a novel Web-based synoptic operative reporting system. Surgery. 2009, 146 (6): 1090-1098. 10.1016/j.surg.2009.09.032.

  23. Park J, Pillarisetty VG, Brennan MF, Jarnagin WR, D'Angelica MI, Dematteo RP, GC D, Janakos M, Allen PJ: Electronic synoptic operative reporting: assessing the reliability and completeness of synoptic reports for pancreatic resection. J Am Coll Surg. 2010, 211 (3): 308-315. 10.1016/j.jamcollsurg.2010.05.008.

  24. Harvey A, Zhang H, Nixon J, Brown CJ: Comparison of data extraction from standardized versus traditional narrative operative reports for database-related research and quality control. Surgery. 2007, 141 (6): 708-714. 10.1016/j.surg.2007.01.022.

  25. Laflamme MR, Dexter PR, Graham MF, Hui SL, McDonald CJ: Efficiency, comprehensiveness and cost-effectiveness when comparing dictation and electronic templates for operative reports. AMIA Annu Symp Proc. 2005, 2005: 425-429.

  26. Mack LA, Dabbs K, Temple WJ: Synoptic operative record for point of care outcomes: a leap forward in knowledge translation. Eur J Surg Oncol. 2010, 36 (Suppl 1): S44-49.

  27. Cowan DA, Sands MB, Rabizadeh SM, Amos CS, Ford C, Nussbaum R, Stein D, Liegeois NJ: Electronic templates versus dictation for the completion of Mohs micrographic surgery operative notes. Dermatol Surg. 2007, 33 (5): 588-595. 10.1111/j.1524-4725.2007.33120.x.

  28. Caines JS, Schaller GH, Iles SE, Woods ER, Barnes PJ, Johnson AJ, Jones GR, Borgaonkar JN, Rowe JA, Topp TJ: Ten years of breast screening in the Nova Scotia Breast Screening Program, 1991-2001 experience: use of an adaptable stereotactic device in the diagnosis of screening-detected abnormalities. Can Assoc Radiol J. 2005, 56 (2): 82-93.

  29. Rayson D, Payne JI, Abdolell M, Barnes PJ, Macintosh RF, Foley T, Younis T, Burns A, Caines J: Comparison of clinical-pathologic characteristics and outcomes of true interval and screen-detected invasive breast cancer among participants of a Canadian breast screening program: a nested case-control study. Clin Breast Cancer. 2011, 11 (1): 27-32.

  30. Royal College of Pathologists of Australasia: Structured pathology reporting of cancer. []

  31. Canadian Partnership Against Cancer: Synoptic reporting (surgery). []

  32. Cancer Care Ontario: Pathology reporting project. []

  33. American College of Surgeons Commission on Cancer: Cancer Program Standards 2009, revised edition. 2009, Chicago, IL: American College of Surgeons

  34. Canadian Partnership Against Cancer: International Collaboration on Cancer Reporting: communique. []

  35. Urquhart R, Grunfeld E, Porter GA: Synoptic reporting and the quality of cancer care: a review of evidence and Canadian initiatives. Oncology Exchange. 2009, 8 (1): 28-31.

  36. Bjugn R, Casati B, Norstein J: Structured electronic template for histopathology reports on colorectal carcinomas: a joint project by the Cancer Registry of Norway and the Norwegian Society for Pathology. Hum Pathol. 2008, 39 (3): 359-367. 10.1016/j.humpath.2007.06.019.

  37. Cancer Surgery Alberta: WebSMR benefits evaluation. Cancer Surgery Alberta Quarterly. Volume 1. 2008, Calgary, AB, Winter: 1-6.

  38. Praxia Information Intelligence: Canadian Partnership Against Cancer: Synoptic Reporting Tools Project Evaluation. Final Report. 2011, Toronto, ON

  39. Grol R, Grimshaw J: From best evidence to best practice: effective implementation of change in patients' care. Lancet. 2003, 362 (9391): 1225-1230. 10.1016/S0140-6736(03)14546-1.

  40. Foster RS: Breast cancer detection and treatment: a personal and historical perspective. Arch Surg. 2003, 138 (4): 397-408. 10.1001/archsurg.138.4.397.

  41. Stetler CB: Role of the organization in translating research into evidence-based practice. Outcomes Manag. 2003, 7 (3): 97-103.

  42. Eccles M, Grimshaw J, Walker A, Johnston M, Pitts N: Changing the behavior of healthcare professionals: the use of theory in promoting the uptake of research findings. J Clin Epidemiol. 2005, 58 (2): 107-112. 10.1016/j.jclinepi.2004.09.002.

  43. Grimshaw J, Eccles M, Thomas R, MacLennan G, Ramsay C, Fraser C, Vale L: Toward evidence-based quality improvement. Evidence (and its limitations) of the effectiveness of guideline dissemination and implementation strategies 1966-1998. J Gen Intern Med. 2006, 21 (Suppl 2): 14-20.

  44. Grimshaw JM, Shirran L, Thomas RE, Mowatt G, Fraser C, Bero L, Grilli R, Harvey EL, Oxman AD, O'Brien MA: Changing provider behaviour: an overview of systematic reviews of interventions. Med Care. 2001, 39 (8 Suppl 2): II2-45.

  45. Bero LA, Grilli R, Grimshaw J, Harvey E, Oxman AD, Thomson MA: Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. The Cochrane Effective Practice and Organization of Care Review Group. BMJ. 1998, 317 (7156): 465-468. 10.1136/bmj.317.7156.465.

  46. Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG): Designing theoretically-informed implementation interventions. Implement Sci. 2006, 1: 4.

  47. Pawson R, Tilley N: Realist Evaluation. 1997, London: SAGE Publications

  48. Grol R, Bosch MC, Hulscher M, Eccles MP, Wensing M: Planning and studying improvement in patient care: the use of theoretical perspectives. The Milbank Quarterly. 2007, 85 (1): 93-138. 10.1111/j.1468-0009.2007.00478.x.

  49. Kitson AL: The need for systems change: reflections on knowledge translation and organizational change. J Adv Nurs. 2009, 65 (1): 217-228. 10.1111/j.1365-2648.2008.04864.x.

  50. Kitson AL, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A: Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implement Sci. 2008, 3 (1): 1. 10.1186/1748-5908-3-1.

  51. Bradley EH, Herrin J, Mattera J, Holmboe ES, Wang Y, Frederick P, Roumanis SA, Radford MJ, Krumholz HM: Quality improvement efforts and hospital performance: rates of beta-blocker prescription after acute myocardial infarction. Med Care. 2005, 43: 282-292. 10.1097/00005650-200503000-00011.

  52. Bradley EH, Holmboe ES, Mattera J, Roumanis SA, Radford MJ, Krumholz HM: The roles of senior management in quality improvement efforts: what are the key components? J Healthc Manag. 2003, 48: 15-29.

  53. Bradley EH, Holmboe ES, Mattera JA, Roumanis SA, Radford MJ, Krumholz HM: A qualitative study of increasing beta-blocker use after myocardial infarction: why do some hospitals succeed? JAMA. 2001, 285 (20): 2604-2611. 10.1001/jama.285.20.2604.

  54. Cummings GG, Estabrooks CA, Midodzi WK, Wallin L, Hayduk L: Influence of organizational characteristics and context on research utilization. Nurs Res. 2007, 56 (4 Suppl): S24-39.

  55. Damschroder LJ, Banaszak-Holl J, Kowalski CP, Forman J, Saint S, Krein SL: The role of the 'champion' in infection prevention: results from a multisite qualitative study. Qual Saf Health Care. 2009, 18 (6): 434-440. 10.1136/qshc.2009.034199.

  56. Ferlie EB, Shortell SM: Improving the quality of health care in the United Kingdom and the United States: a framework for change. Milbank Q. 2001, 79 (2): 281-315. 10.1111/1468-0009.00206.

  57. Gale BV, Schaffer MA: Organizational readiness for evidence-based practice. J Nurs Adm. 2009, 39 (2): 91-97. 10.1097/NNA.0b013e318195a48d.

  58. Greenhalgh T, Robert G, MacFarlane F, Bate P, Kyriakidou O: Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004, 82 (4): 581-629. 10.1111/j.0887-378X.2004.00325.x.

  59. Helfrich CD, Weiner BJ, McKinney MM, Minasian L: Determinants of implementation effectiveness: adapting a framework for complex innovations. Med Care Res Rev. 2007, 64 (3): 279-303. 10.1177/1077558707299887.

  60. Kitson A, Harvey G, McCormack B: Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care. 1998, 7 (3): 149-158. 10.1136/qshc.7.3.149.

  61. Mitchell JP: Guideline implementation in the department of defense. Chest. 2000, 118 (2 Suppl): 65S-69S.

  62. Rycroft-Malone J, Harvey G, Seers K, Kitson A, McCormack B, Titchen A: An exploration of the factors that influence the implementation of evidence into practice. J Clin Nurs. 2004, 13 (8): 913-924. 10.1111/j.1365-2702.2004.01007.x.

  63. Soo S, Berta W, Baker GR: Role of champions in the implementation of patient safety practice change. Healthc Q. 2009, 12: 123-128.

  64. Stetler CB, McQueen L, Demakis J, Mittman BS: An organizational framework and strategic implementation for system-level change to enhance research-based practice: QUERI Series. Implement Sci. 2008, 3 (1): 30. 10.1186/1748-5908-3-30.

  65. West E: Management matters: the link between hospital organisation and quality of patient care. Qual Health Care. 2001, 10 (1): 40-48. 10.1136/qhc.10.1.40.

  66. Iles V, Sutherland K: Organizational change: a review for health care managers, professionals and researchers. 2001, National Health Service, []

  67. Pollitt C: The struggle for quality: the case of the NHS. Policy and Politics. 1993, 21 (3): 161-170. 10.1332/030557393782331119.

  68. Havelock RG: Planning for Innovation through Dissemination and Utilization of Knowledge. 1969, Ann Arbor, MI: The University of Michigan Institute for Social Research

  69. Leviton LC: Evaluation use: advances, challenges and applications. American Journal of Evaluation. 2003, 24 (4): 525-535.

  70. Russell J, Greenhalgh T, Byrne E, McDonnell J: Recognizing rhetoric in health care policy analysis. J Health Serv Res Policy. 2008, 13 (1): 40-46. 10.1258/jhsrp.2007.006029.

  71. Van de Ven AH, Schomaker MS: Commentary: the rhetoric of evidence-based medicine. Health Care Manage Rev. 2002, 27 (3): 89-91.

  72. Contandriopoulos D, Lemire M, Denis JL, Tremblay E: Knowledge exchange processes in organizations and policy arenas: a narrative systematic review of the literature. Milbank Q. 2010, 88 (4): 444-483. 10.1111/j.1468-0009.2010.00608.x.

  73. Dobrow MJ, Goel V, Upshur RE: Evidence-based health policy: context and utilisation. Soc Sci Med. 2004, 58 (1): 207-217. 10.1016/S0277-9536(03)00166-7.

  74. Gabbay J, le May A: Evidence based guidelines or collectively constructed 'mindlines'? Ethnographic study of knowledge management in primary care. BMJ. 2004, 329 (7473): 1013. 10.1136/bmj.329.7473.1013.

  75. Mitton C, Adair CE, McKenzie E, Patten SB, Waye Perry B: Knowledge transfer and exchange: review and synthesis of the literature. Milbank Q. 2007, 85 (4): 729-768. 10.1111/j.1468-0009.2007.00506.x.

  76. Lawrence PR: How to deal with resistance to change. Harvard Business Review. 1954, Jan-Feb: 4-12.

  77. Rogers EM: Diffusion of Innovations. 2003, New York, NY: Free Press, 5

  78. Van de Ven AH, Polley DE, Garud R, Venkataraman S: The Innovation Journey. 1999, Oxford: Oxford University Press

  79. Senge PM: The Fifth Discipline: The Art and Practice of the Learning Organization. 1990, New York, NY: Doubleday

  80. Stake R: Multiple Case Study Analysis. 2006, New York, NY: Guilford Press

  81. Yin RK: Case Study Research: Design and Methods. 2009, Thousand Oaks, CA: Sage, 4

  82. Hamel J, Dufour S, Fortin D: Case Study Methods. 1993, Thousand Oaks, CA: SAGE Publications

  83. Flyvbjerg B: Five misunderstandings about case-study research. Qual Inq. 2006, 12 (2): 219-245. 10.1177/1077800405284363.

  84. Meyer CB: A case in case study methodology. Field Methods. 2001, 13: 329-352. 10.1177/1525822X0101300402.

  85. Rycroft-Malone J, Seers K, Titchen A, Harvey G, Kitson A, McCormack B: What counts as evidence in evidence-based practice? J Adv Nurs. 2004, 47 (1): 81-90. 10.1111/j.1365-2648.2004.03068.x.

  86. McCormack B, Kitson A, Harvey G, Rycroft-Malone J, Titchen A, Seers K: Getting evidence into practice: the meaning of 'context'. J Adv Nurs. 2002, 38 (1): 94-104. 10.1046/j.1365-2648.2002.02150.x.

  87. Klein KJ, Conn AB, Sorra JS: Implementing computerized technology: an organizational analysis. J Appl Psychol. 2001, 86 (5): 811-824.

  88. Klein KJ, Sorra JS: The challenge of innovation implementation. Acad Manage Rev. 1996, 21 (4): 1055-1080.

  89. Creswell JW: Qualitative Inquiry and Research Design: Choosing among Five Approaches. 2007, Thousand Oaks, CA: SAGE Publications

  90. Patton MQ: Qualitative Research & Evaluation Methods. 2002, Thousand Oaks, CA: SAGE Publications, 3

  91. Rubin H, Rubin I: Qualitative Interviewing: The Art of Hearing Data. 1995, Thousand Oaks, CA: Sage Publications

  92. Ajzen I: The theory of planned behaviour. Organ Behav Hum Decis Process. 1991, 50 (2): 179-211. 10.1016/0749-5978(91)90020-T.

  93. Braun V, Clarke V: Using thematic analysis in psychology. Qual Res Psychol. 2006, 3 (2): 77-101. 10.1191/1478088706qp063oa.

  94. Boyatzis RE: Transforming Qualitative Information: Thematic Analysis and Code Development. 1998, Thousand Oaks, CA: Sage Publications

  95. Odell SJ: Case study methods in international political economy. Int Stud Perspect. 2001, 2: 161-176. 10.1111/1528-3577.00047.

  96. Sorin-Peters R: The case for qualitative case study methodology in aphasia: an introduction. Aphasiology. 2004, 18: 937-949. 10.1080/02687030444000444.

  97. Corcoran PB, Walker KE, Wals AEJ: Case studies, make-your-case studies, and case stories: a critique of case-study methodology in sustainability in higher education. Environmental Education Research. 2004, 10: 7-21.
Acknowledgements


We gratefully acknowledge Dr. Lois Jackson for her valuable input and guidance related to this study. We also acknowledge Margaret Jorgensen (coordinator) for her assistance with this study and Cynthia Kendell for her helpful review of and suggestions on this manuscript. This study is funded by the CIHR/CCNS Team in Access to Colorectal Cancer Services in Nova Scotia (funders: Canadian Institutes of Health Research, Cancer Care Nova Scotia, Nova Scotia Department of Health and Wellness, Capital District Health Authority; Dalhousie University Faculty of Medicine; Dalhousie Medical Research Foundation). Robin Urquhart has also received funding from the Nova Scotia Health Research Foundation to carry out this work. The funding bodies have no role in the design, collection, analysis, and interpretation of data; in the writing of the manuscript; and in the decision to submit the manuscript for publication.

Author information



Corresponding author

Correspondence to Robin Urquhart.

Additional information

Competing interests

GAP is the Project Lead in Nova Scotia for the Surgical Synoptic Reporting Tools Project. He receives no financial compensation for this work. The remaining authors have no conflicts of interest to declare.

Authors' contributions

RU conceived the idea for this study, led the intellectual development and protocol writing, and will be primarily responsible for its conduct. GAP, EG, and JS all contributed to the drafting and development of the study. All authors reviewed and agreed on the final manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

Urquhart, R., Porter, G.A., Grunfeld, E. et al. Exploring the interpersonal-, organization-, and system-level factors that influence the implementation and use of an innovation-synoptic reporting-in cancer care. Implementation Sci 7, 12 (2012).


  • Cancer
  • Synoptic report
  • Innovation
  • Implementation