Investigation of factors influencing the implementation of two shared decision-making interventions in contraceptive care: a qualitative interview study among clinical and administrative staff

Abstract

Background

There is limited evidence on how to implement shared decision-making (SDM) interventions in routine practice. We conducted a qualitative study, embedded within a 2 × 2 factorial cluster randomized controlled trial, to assess the acceptability and feasibility of two interventions for facilitating SDM about contraceptive methods in primary care and family planning clinics. The two SDM interventions comprised a patient-targeted intervention (video and prompt card) and a provider-targeted intervention (encounter decision aids and training).

Methods

Participants were clinical and administrative staff aged 18 years or older who worked in one of the 12 clinics in the intervention arms, had email access, and consented to being audio-recorded. Semi-structured telephone interviews were conducted upon completion of the trial. Audio recordings were transcribed verbatim. Data collection and thematic analysis were informed by the 14 domains of the Theoretical Domains Framework, which are relevant to the successful implementation of provider behavior change interventions.

Results

Interviews (n = 29) indicated that the interventions were not systematically implemented in the majority of clinics. Participants felt the interventions were aligned with their roles and were confident in their skills to use the decision aids. However, the novelty of the interventions, especially the need to modify workflows and change behavior to use them with patients, was an implementation challenge. The interventions were not deeply embedded in clinic routines, and their use was threatened by a lack of understanding of their purpose and effect and by staff absence or turnover. Participants from clinics that had an enthusiastic study champion or a team-based organizational culture found these social supports played a positive role in implementing the interventions.

Conclusions

Variation in capabilities and motivation among clinical and administrative staff, coupled with inconsistent use of the interventions in routine workflow, contributed to suboptimal implementation of the interventions. Future trials may benefit from implementation strategies that embed SDM in the organizational culture of clinical settings.

Background

Shared decision-making (SDM) is a process in which providers and patients exchange information, deliberate about available options together, identify the patient’s preferences, and incorporate those preferences in choosing the option or treatment plan [1]. In contraceptive care, SDM has the potential to improve the quality of patient-provider communication, promote patient autonomy, and enhance health and well-being by supporting the patient to choose the contraception option that matches their informed preferences.

SDM is also a model for operationalizing World Health Organization recommendations to offer “evidence-based, comprehensive information, education and counselling to ensure informed choice,” so that “every individual is ensured an opportunity for their own use of modern contraception…without discrimination” [2].

Patients who reported that "the provider and me together" decided what contraceptive method they would use were more satisfied with the decision-making process than those who reported other roles in decision-making [3]. Other research has found that SDM in contraceptive care is suited to the intimacy and complexity of this particular decision [4]. However, despite the suggested benefits of SDM interventions, and increasing calls from policy makers to use SDM as a strategy to promote patient-centered care [5,6,7], SDM has not been widely implemented in contraceptive care or in other settings.

Implementation researchers have attempted to systematically identify the behaviors that may influence SDM adoption at the individual, organizational, and policy levels [7,8,9,10,11,12,13,14]. For instance, in an evaluation of the implementation of cancer screening SDM interventions in 12 California primary care practices, the clinician’s role was observed to be the most important factor for implementation, in combination with supportive infrastructure and the practice’s dedication to the goal of SDM [15]. However, in the absence of supportive infrastructure (such as an electronic system for automatically mailing decision aids to eligible patients), implementation may be more challenging and require behavioral interventions that encourage adoption at the individual or team level [16,17,18].

The aim of this qualitative study was to explore the feasibility and acceptability of two interventions for facilitating SDM about contraceptive methods with a particular focus on factors that influenced their implementation by clinical and administrative staff. The study was embedded within a 2 × 2 factorial cluster randomized controlled trial (RCT), the Right For Me study, and conducted in 16 primary care and reproductive healthcare clinics in the Northeast United States [19].

Methods

Design

This qualitative study involved semi-structured, one-on-one telephone interviews with staff at clinics involved in the Right For Me trial (ClinicalTrials.gov Identifier: NCT02759939). Methods for the trial are published elsewhere [19]. This study was approved by the Dartmouth College Committee for the Protection of Human Subjects (STUDY00029945).

Theoretical framework

This qualitative study was informed by the Theoretical Domains Framework (TDF), an integrative framework based on psychological theories and key theoretical constructs related to behavior change [20, 21]. The study team used the TDF to inform the design, data collection, and analysis of this qualitative investigation of the implementation, acceptability, feasibility, and sustainability of the Right For Me interventions.

The TDF consists of 14 domains describing processes underlying successful behavior change. These domains map onto the capability, opportunity, motivation-behavior (COM-B) implementation model developed by Michie and colleagues [22]. In this “behavior system,” capability, opportunity, and motivation interact to generate behavior, which in turn feeds back and influences those system components; for behavior change to occur, a person needs “the skills necessary to perform the behaviour, a strong intention to perform the behaviour, and no environmental constraints that make it impossible to perform the behavior” [22].

Interventions

The two interventions comprised a patient-targeted intervention (video + prompt card) and a provider-targeted intervention (decision aids + training) in English and Spanish. These interventions are described in detail elsewhere [19] and summarized in Table 1.

Table 1 Right For Me shared decision-making (SDM) interventions

Implementation context and strategy

The 16 participating clinics were located in the northeastern United States, provided contraceptive counseling, and included Planned Parenthood affiliated clinics. Four were randomly allocated to the control arm (usual care), four to the patient-targeted intervention, four to the provider-targeted intervention, and four to both interventions. The trial was conducted during a 9-month period, with interventions implemented during the final 6 months.

Each clinic had an identified contact whose role as a senior staff member was to liaise with the research team and facilitate implementation of the interventions in their clinic. At the outset of the trial, a member of the research team (KD) visited each clinic and provided a group orientation on the trial objectives, scope, and data collection procedures. After clinics were randomized to trial arms, those clinics assigned to deliver one or more interventions received a similar, follow-up group orientation about the intervention(s). This orientation was facilitated by a presentation slide deck (see Additional file 1) and included instruction on intervention objectives, target audience, supplies provided, parties responsible for implementation-related activities, and intervention maintenance. Rather than providing strong direction on how to integrate interventions into the clinic workflow, this orientation provided examples of possible approaches and encouraged each clinic to collaboratively develop their own implementation strategy, considering the clinic workflow and other routinely used patient or counseling materials. The slide deck remained accessible to clinics via the trial website.

Interview participants

Participants included clinical and administrative (e.g., front desk) staff aged 18 years or older who worked in one of the 12 intervention arm clinics, had email access, and consented to being audio-recorded. Depending on the clinic, physicians, physician assistants, nurse practitioners, and/or healthcare associates provided contraceptive counseling. A sampling frame developed by KD identified each study clinic, its project contact, and the names, demographics, and roles of staff in each clinic. The 12 clinics assigned to an intervention arm had approximately 70 staff potentially involved in implementation.

Each project contact approached their colleagues directly and via posters in the clinic spaces and asked permission to share their email addresses with the research staff. Interested staff were emailed the study information and invited to participate in one telephone interview. Recruitment and interviews were conducted by a qualitative researcher (SM), who sampled participants first based on their role in the clinic (clinical or administrative) by referencing the sampling frame, and then purposively to seek variation in age, gender, race, profession, and years of experience. We also sought extreme case examples of clinic staff who reported having no memory of seeing or using the interventions despite having a clinic role in which they would have been expected to use them as part of their work. We did so by returning to the sampling frame and identifying participants who had either left their position or joined the clinic during the study period and were therefore likely to be less familiar with the interventions.

Study materials

The interview guide was based on the TDF domains, constructs, and definitions provided by Michie et al. [23] and Cane et al. [20]. Development consisted of three stages: (1) drafting a preliminary list of open-ended questions that explored each of the 14 TDF domains and a 7-item demographic questionnaire, (2) adding probes that explored domain constructs, and (3) pilot testing with three health professionals from the Right For Me research team for comprehensibility and relevance. At each development stage, additional feedback was sought from the research team.

Data collection

Telephone interviews were conducted by SM from January to April 2017, immediately after the intervention implementation period. SM was not involved in the design or data collection of the trial or in the development of the interventions undergoing evaluation and had no relationships with the staff invited to participate. Participants were compensated with a $30 Amazon gift card. Data collection continued until (a) all participants in the sampling frame had an opportunity to respond to the request for an interview, (b) the sample demonstrated sufficient variation, and (c) data saturation was achieved (the interviews did not generate new insights regarding implementation factors). Interviews were audio-recorded and transcribed by a professional transcription service. Each transcript was assigned a numeric identifier (e.g., “001”), and minor edits were made to remove potentially identifying information about staff and their clinics.

Data analysis

Thematic analysis principles [24] were used to guide data analysis, which sought to identify the domains most relevant to the implementation, feasibility, acceptability, and sustainability of the interventions in practice. SM led the analysis with support from two public health researchers (KD, RM) and a nurse-researcher (DA). Each researcher first reviewed one quarter of the transcripts while listening to the audio recordings to become immersed in the data, check the transcription for accuracy, and remove names and other potential identifiers. Verification strategies, including involving multiple researchers in codebook development and analysis and using constant comparison, were pursued throughout to ensure validity and avoid subjective bias. We kept memos throughout to facilitate concurrent data collection and analysis, to maintain a data trail, and to document our interpretive choices.

Codebook development

Codebook development at the TDF domain level

At the outset of the study, concurrent with the development of the interview guide, we developed a codebook consisting of the a priori TDF domains and theoretical constructs [20]. We first coded a random sample of transcripts with the codebook and met to discuss our interpretations. Through discussion, we distilled the list of 84 constructs to the 39 that were most relevant in influencing clinic staff behavior. A construct was deemed relevant if (a) it occurred frequently, (b) participants demonstrated conflicting attitudes and beliefs about it, and/or (c) it was associated with strong attitudes and beliefs [21].

Operationalizing context-specific descriptions

To provide consistency in our deductive coding and interpretation, we operationalized the TDF constructs into context-specific descriptions [21]. These descriptions resulted from inductive analysis of the transcripts and were iteratively refined until reliability was achieved between the coders. Our final codebook consisted of both the TDF domains (themes) and context-specific descriptions (sub-themes).

Coding the transcripts

The interviewer (SM) coded all transcripts using the finalized codebook, facilitated by Atlas.ti qualitative analysis software (version 1.6.0 for Mac). RM, KD, and DA then received one-third of the coded transcripts, which they independently coded in duplicate to determine if the TDF domains and constructs were interpreted consistently, and to suggest additional codes. We compared coding and resolved disagreements through iterative discussion. Discrepancies in coding were generally due to different interpretations of constructs or lapses in attention. We met as a team through regular teleconferences to discuss, compare, synthesize, and map relationships between findings; compare our findings to our theoretical framework [20, 23]; and generate interpretive insights about the data. We discussed the data collection and analysis and sought feedback on results in progress with the larger research team via our recurring monthly teleconference, one-on-one phone calls, and a face-to-face workshop with patient and stakeholder partners.

Results

Overview

Interviews were conducted with 29 clinic staff from 11 of the 12 intervention clinics (see Table 2). All participants identified as female and were predominantly White, reflecting the demographics of the region and the predominance of women among professionals in contraceptive care. The majority had a clinical role (n = 16, 55%) or a combined clinical and administrative role (n = 4, 14%) that involved providing contraceptive counseling to patients. Interviews were 20 to 40 min in length. Staff invited to participate from Clinic 4 (decision aids + training) either did not respond to invitations or declined to participate. Reasons for declining included being too busy, feeling they had nothing meaningful to contribute, or being a recent hire to the clinic with no interaction with the interventions.

Table 2 Characteristics of the clinic staff participant sample (n = 29)

Each clinic chose to implement the interventions using a different approach (see Table 3). Some clinics chose to keep the decision aids in an easily accessible space, such as on a desk or in a wall display holder in exam rooms (Clinics 1, 9–12). In most clinics, the video and prompt cards were handed to patients in the waiting room (Clinics 6, 11, 12) or a private exam room (Clinics 7–10) prior to the visit.

Table 3 Characteristics of the clinic settings (n = 11)

When analyzed through the lens of the Theoretical Domains Framework, we observed that each implementation approach was the result of dynamic interaction between multiple domains. It was necessary first for clinic staff to have the capability to implement the interventions, primarily knowledge, skills, memory, and behavioral regulation to use them routinely. Their behavior was also motivated by their social/professional role and identity, beliefs about consequences, and goals. Implementation was also modified by factors that influenced the opportunity to engage in the implementation behavior: social influences (of patients) and their environmental context and resources (see Table 4).

Table 4 Mapping of the Right For Me findings to the Theoretical Domains Framework (TDF)

Capability

A necessary antecedent to implementation was first being aware of the interventions (“Knowledge”) and how to use them (“Procedural Knowledge”) (see Table 5 for representative quotations). While most staff were aware of the decision aids and patient video, some reported that they never watched the video or viewed the prompt card, were absent during the study team orientation, or did not notice the video and prompt card until partway through the implementation period. This led some to be unclear about the purpose of these interventions. In contrast, all clinical staff knew about the purpose of the decision aids and reported tending to use them with fidelity, following the steps provided in the online training (“Explain it, Give it, Use it”). In the context of Clinic 2, however, staff reported using the decision aid after the patient had made a choice, as a way to discuss the benefits and harms of the contraception method:

So they choose the method. We review the medical history with them. Then I’ll say, ‘Oh, we’re going to bring you upstairs to see the practitioner, or we’re going to bring you upstairs to see the nurse and I’m also going to give you our handout to go home with as well, which says a little bit more about the rare side effects and complications of the methods.’ (015, Clinic 2, Clinical role, Decision aids + Training)

Clinical staff perceived that the group orientation and decision aid training reinforced their proficiency in patient communication (“Skills”). There was considerable heterogeneity in how staff engaged with the training; some clinics had all staff, both clinical and administrative, complete the training as a team at the outset of implementation, others asked staff to complete it on their own time, and a minority of staff were unaware that they could have accessed the training themselves.

Table 5 Factors related to “Capability” and representative quotations

Clinical staff across all participating clinics had a strong perception that SDM was already a part of their contraceptive counseling approach (“Professional Role”), was appropriate for their clinical context (“Organizational Culture/Climate”), and was facilitated by their use of existing contraceptive counseling resources (“Resources/Material Resources”):

We didn't need a whole lot of training, because this is what I do all the time. So, I have different decision tools, I found them useful, and I didn't – I've been doing it for 20 years ... I didn't need a lot of training to know how to use these with women. Because I also have been trained in shared decision making, and motivational interviewing, and all of that. (002, Clinic 1, Clinical role, Decision aids + Training)

Descriptions of their contraceptive counseling approaches suggested that they successfully used the decision aids to engage in components of SDM. For instance, participants from reproductive health care clinics explained that they typically placed the decision aid providing an overview of all contraceptive methods on a desk between themselves and the patient and used it as a visual aid while asking open-ended questions to elicit the patient’s preferences and identify the features of a birth control method that would suit the patient’s needs. Despite this, health care providers did not always report taking the next step of asking patients to indicate their preferences in writing (“Procedural Knowledge”). For some, this was due to perceived patient disinterest or discomfort with this step (“Social Influences”) and, in these contexts, patient influences were a factor in staff use of the decision aids: “It makes me think that perhaps the staff stopped even utilizing that aspect of that suggestion over time because people weren’t taking them up on it.” (015, Clinic 2, Clinical, Decision aids + Training).

Clinical staff perceived that getting into the habit of using the decision aids (“Behavioral Regulation”) was relatively easy because of their similarity to existing contraceptive counseling resources. However, participants explained that providing a tablet computer to patients and asking them to watch a video before their clinical visit required a change to existing routines for administrative staff. Three clinics (7, 11, and 12) got into the habit of implementing the prompt cards routinely by handing them out to each patient prior to the visit, while other clinics (5, 6, and 10) chose to place a stack of cards in the waiting room or exam rooms and did not interact with the cards thereafter. Notably, staff from clinics that did not have a plan for using the prompt cards and did not make them part of staff routines were the same staff who were unaware that the prompt cards existed (see Table 3).

Participants who reported gaining proficiency in how to use the interventions typically perceived clear leadership from the project contact, who acted as a liaison with the research team. For instance, after experiencing significant staff turnover, the project contact in Clinic 9 began providing one-on-one, in-person training to staff in how to use the decision aids interactively, using a role-playing technique. Other project contacts had staff review all of the interventions to create common understanding (“Knowledge”), while others worked with staff to develop a plan for how to use them (“Action Planning”).

Clinics implemented the interventions in a way aligned with their work style, contextual needs, and team dynamics (see Table 3). Most clinics developed a plan at the outset of the study for how, where, and when they would integrate the interventions into the clinic workflow, including mental reminders and talking points (“Action Planning”), as one participant described:

The biggest thing was I think creating the space for the [interventions], so like physically and mentally. So rearranging the rooms in a way so that we have those sort of stackable file holders, and putting all of the tear-off sheets in those in an order and in a way that made sense. (016, Clinic 9, Administrative role, Both interventions)

Handing the interventions directly to patients and explaining their purpose was more acceptable and feasible than relying on the patient to use them if interested. In contrast, in the context of Clinic 5, participants used a passive strategy for the video and prompt card because staff perceived it was the patient’s responsibility to engage with them:

We just kind of allowed them to watch the video without any intervention on our end to say, ‘Feel free to watch this or we’d like you to watch this beforehand,’ or ‘You have to watch this on your visit.’ It was kind of freeform. (003, Clinic 5, Administrative role, Video + prompt card)

Motivation

Clinic staff were motivated to implement the interventions if they felt that they were aligned with their existing roles and responsibilities (“Professional Role”) and added value to their work (“Reinforcement”) (see Table 6 for representative quotations). The content of the decision aids reflected what clinical staff would typically discuss with patients, and provided a textual cue to reinforce their “talking points.” One key contextual factor in Clinic 10 was that the project contact changed her implementation approach and encouraged nurses to use the decision aids as part of their responsibilities after observing that they had not adopted them into their professional role.

Table 6 Factors related to “Motivation” and representative quotations

While decision aids were perceived to help clinical staff exercise their existing responsibilities for contraceptive counseling, handing out the videos and prompt cards did not help front desk staff complete their administrative tasks, and this misalignment was a barrier to implementation (“Reinforcement”).

Administrative staff did not know what took place between patients and clinical staff after the patient had watched the video, while clinical staff reflected that they did not know which patients had watched the video in the waiting room: “I’d be curious to know what the study shows as far as patients who took the survey who also said they watch the video. It’s just a tool that was much more hands off for me” (013, Clinic 11, Clinical role, Both interventions). Staff who watched the video had some criticisms of the content, namely that the featured patient was not representative of their patients and sounded “rehearsed,” negatively impacting their motivation to implement it. In contrast, clinical staff involved in contraceptive counseling illustrated the motivating value of directly observing or experiencing a positive effect as a result of using the decision aids (“Reinforcement”). For instance, some staff shared that, after using a decision aid, patients chose a method of contraception that seemed best aligned with their preferences.

Opportunity

The physical context of implementation (“Environmental Context and Resources”) influenced staff members’ motivation and plans to use the interventions. The implementation process was perceived to be easiest in clinics with a self-described “small team” and low caseload, and/or where staff felt they had flexible procedures and infrastructure to adapt the interventions to fit existing routines and their clinic environment (see Table 7 for representative quotations). Implementation was also facilitated in contexts where reproductive health clinics perceived that their organizational routines and priorities were aligned with the goals for the study (“Organizational Culture”). One participant clarified that because their organization follows the “same guidelines and expectations” for informed contraceptive choices “it was just easy and natural. It flow[ed] very naturally for us” (011, Clinic 3, Clinical role, Decision aids + Training).

Table 7 Factors related to “Opportunity” and representative quotations

Clinical staff typically felt that the decision aids fit easily into the clinic or counseling room space, and helped to make appointments shorter by creating a more focused conversation. A minority of clinical staff perceived that decision aids “trigger[ed] more questions, and therefore a 15-minute visit would tend to be 20 or 25 minutes” (017, Clinic 10, Clinical role, Both interventions). However, these staff typically found the longer visits easy to adjust to (“Environmental Stressors”). Staff that received both interventions did not perceive that having multiple components added to their implementation “load,” in part because of the division of labor between clinical (decision aids + training) and administrative (video + prompt card) staff. Rather, comments about time pressure and workload were more common among staff who were already experiencing environmental stressors, such as participating in other research studies or having unexpected staff turnover.

Staff from clinics with existing approved counseling materials used these and the decision aids together, so that one supplemented the other (“Resources/Material Resources”). Few staff felt that using multiple resources was cumbersome in counseling, because the resources were so similar. As described above, a minority of staff at reproductive health care clinics perceived that they already do SDM (“Competence”) and this attitude led them to believe that the decision aids and accompanying training were redundant (“Motivation”). No staff suggested that there were any existing materials that were replaced by or supplemented the video or prompt card.

Finally, one of the core domains that influenced implementation behavior was the “Social Influence” of patients on staff members’ routine use of the interventions. While staff perceived that patients found the decision aids acceptable, they felt that patients had minimal interest in the video and prompt cards, potentially because using them was inconsistent with typical waiting room behavior. Nonetheless, staff felt that both interventions were appropriate for their patient population, in particular for those with lower education and literacy, and those making their own health care decisions for the first time.

Sustainability

When asked if they would want to continue using the interventions now that the trial was complete, staff reported that they would like to continue using the decision aids but had mixed feelings about the video and prompt card. Without the tablets provided by the Right For Me study, most felt that their clinic would be unlikely to continue use of the video in its current format. While staff were keen to continue using the decision aids, those affiliated with a larger organization or network also felt that future implementation decisions would have to be made “high up,” for consistency of content and branding across the organization.

Discussion

Our findings suggest that the decision aids were more acceptable, feasible, and sustainable than the video and prompt cards. Awareness of these interventions, knowing how to use them correctly and competently, integrating the interventions into regular workflow, and having a professional role and organizational culture that supported using the interventions appeared to facilitate intervention implementation. Clinic environments, workflow, and physical space supported implementation of the decision aids, but did not facilitate use of the video and prompt card. While some facilitators are context-specific, our findings suggest that introducing interventions will not be successful without the resources required to modify existing routines and to monitor and sustain behavior change.

In clinics where implementation was described as relatively weak, it appears that the interventions were not considered an essential professional responsibility. While integrating a video via tablet into a busy waiting room may not be a feasible strategy for facilitating SDM, integrating paper-based decision aids into clinical routines may prove more successful. However, doing so requires negotiation and planning about what the task is, who does it, how it gets done, and whether it adds any real value.

Elwyn and colleagues conducted a thematic analysis of qualitative interviews embedded in the intervention phase of a trial of similar clinical encounter decision aids for treatment of knee osteoarthritis [25]. Before using the decision aid, clinicians expressed concern about time pressures, patient resistance, and patient information overload [25]. After minimal training, the same clinicians perceived that the decision aid was acceptable and helpful, and it had changed their usual way of communicating. In the USA, case studies of implementation of clinical encounter decision aids in routine care suggested that physicians may not perceive the decision aids have utility, particularly for patients with low literacy [26]. Lack of suitability for patients was not a factor that emerged from analysis of our clinic staff interviews; rather, participants suggested that patients with low literacy or limited education would benefit most from interventions that facilitate SDM.

Our findings suggest that participants in clinics that implemented both interventions did not experience implementation overload in comparison with those exposed to only one. Similar to a study investigating implementation of a clinical encounter decision aid for circumcision, we observed that gaining the skills to use the decision aid through practice (a learning curve) was necessary [27].

The research team overestimated some clinics’ capacity to self-organize in designing and preparing for implementation even when the interventions were conceptually aligned. However, the solution for bridging this capacity gap is unclear. The MAGIC (making good decisions in collaboration) program, which sought to implement SDM into routine primary and secondary care, similarly observed that clinical teams feel they already involve patients in decisions about their care [13]. In that program, hands-on role-playing that promoted practical skills and exercises to change embedded attitudes helped to show clinicians how SDM differed from their current practice. Changing individual SDM behavior in contraceptive care may thus require more interactive training, such as role-playing, that emphasizes both skills (Capability) and the value of the skill to the individual and their organization (Motivation).

All clinics had a strong perception that SDM was already a part of their organizational culture, and this was facilitated in some clinics by use of existing educational resources. The high acceptability of the decision aids may stem from our extensive provider consultation about their content [28]. Implementing the decision aids was perceived to be a simple initial step (e.g., swapping existing resources for the Right For Me interventions) but was sometimes difficult to remember on a day-to-day basis. Not all staff randomized to the decision aid and training intervention reported completing the online modules or using the decision aids as intended (e.g., using them during the counseling encounter and writing on them). A meta-analysis of six randomized controlled trials conducted in US practice settings [29] similarly observed that few clinicians used clinical encounter decision aids with fidelity. The authors of the meta-analysis observed that, after implementing the interventions, clinicians used them as intended only partially and inconsistently, and that higher fidelity was associated with increased patient knowledge and patient involvement in decision-making [29]. Such findings have led Montori and colleagues to suggest that “the answer is not in” regarding the effect of decision aids on SDM [30].

Staff in clinics exposed to the video and prompt cards had limited awareness of the cards, and perceived that the videos were difficult to integrate into routine workflow and were of limited interest to patients. The implementation process was seen to be easiest in smaller clinics, or where staff felt they had flexible procedures and infrastructure to adapt the interventions to fit routine practice. The success of implementing these new routines also depended on the actions of clinic patients, who could either accept or decline to use the Right For Me interventions. Survey responses from patient participants in the trial will provide further insight into the number of patients who reported using the Right For Me interventions and what proportion would recommend them to a friend (i.e., acceptability). Participants felt that both interventions were most appropriate for their low health literacy patients. However, these attitudes may not be supported by emerging literature. Recent investigations from Australia suggest that generic question sets alone, like those used in our video and prompt card, are not sufficient to support shared decision-making among adults with low literacy [31], and additional strategies may be required to improve understanding of SDM terms and probability concepts [32].

The video and prompt card used in this study were adapted from the “Ask, Share, Know” program previously tested and implemented in an Australian primary care setting [33]. A systematic review of the use of question prompt lists in routine practice highlighted the importance of “endorsement”: when the list is not given to or mentioned by the clinician, studies demonstrate inconsistent findings with respect to patients’ question asking [34]. In our study, this construct was reflected in clinic staff “motivation.” Clinical staff were largely unaware of which patients were exposed to the video and prompt card, and wished to know so they could respond to the patients’ questions and observe whether or not the interventions were useful to contraceptive counseling. Implementation of the video and prompt card may thus require organization-wide or team-based training that increases clinician awareness of and motivation to engage with them.

Our study findings also suggest that there may be differences in implementation practices for patient-targeted interventions implemented by administrative staff (video + prompt card) versus those that are intended for the provider to use with the patient (decision aids). The organizational or institutional context may also play an important role. Sexual and reproductive health clinics and organizations have well-established norms, such as organization-wide counseling protocols and branding. These norms may represent a double-edged sword—they provide the capability, opportunity, and motivation for staff to engage in SDM, but may be inflexible to change.

Limitations of this study include the lack of accompanying observational strategies for assessing implementation success. This meant we were unable to investigate the relationships between staff perceptions and actual implementation. We also took measures to minimize social desirability bias, but some participants may have over-reported positive and under-reported negative perceptions, attitudes, or experiences. In spite of our partnerships with and recruitment support from clinic staff at each study clinic, we received limited interest in participation from some clinicians, leading to no data for Clinic 4 and only one participating staff member each from Clinics 8 and 11. Findings from those clinics should be interpreted in relation to the other settings, not individually.

The strengths of this study include our systematic application of a theoretical framework for behavior change [22] to develop SDM interventions, comprehensive evaluation of their use in routine contraceptive care, and identification of factors that influence their use. Having an independent researcher conduct and analyze the interviews mitigated potential interviewer and reporting bias. Finally, by including administrative staff in the study sample, we gathered data on the implementation experience of stakeholders across clinic organizations. Interviews with administrative staff provided critical data on the feasibility of the video and prompt card interventions, which would not have been collected through interviews with clinical staff alone.

Conclusion

Our results suggest that clinical and administrative staff perceived the clinical encounter decision aids to be more acceptable and feasible to implement than the patient video and prompt card questions. Implementation of interventions that align with existing roles, tasks, and workflow may have greater acceptability, feasibility, and sustainability than those that require new procedures and infrastructure. We demonstrated how the Theoretical Domains Framework can be used to understand the factors that influence implementation of SDM and to create interventions that are theoretically and behaviorally informed. Future studies could build on our findings about the factors that influence implementation of SDM and use the Behavior Change Wheel and COM-B frameworks to characterize and design strategies for implementing our study interventions in different settings [22].

Availability of data and materials

The qualitative datasets generated and/or analyzed during the current study are not publicly available due to privacy and ethics restrictions.

Abbreviations

COM-B: Capability, opportunity, motivation-behavior

MAGIC: Making good decisions in collaboration

PCORI: Patient Centered Outcomes Research Institute

SDM: Shared decision-making

TDF: Theoretical Domains Framework

References

  1. Charles C, Whelan T, Gafni A. What do we mean by partnership in making decisions about treatment? BMJ. 1999 Sep 18;319(7212):780–2.

  2. World Health Organization. Ensuring human rights in the provision of contraceptive information and services: guidance and recommendations [Internet]. Geneva: World Health Organization; 2014. Available from: http://www.ncbi.nlm.nih.gov/books/NBK195054/

  3. Dehlendorf C, Grumbach K, Schmittdiel JA, Steinauer J. Shared decision making in contraceptive counseling. Contraception. 2017;95(5):452–5.

  4. Chen M, Lindley A, Kimport K, Dehlendorf C. An in-depth analysis of the use of shared decision making in contraceptive counseling. Contraception. 2019 Mar;99(3):187–91.

  5. Washington State. Providing high quality, affordable health care to Washingtonians based on the recommendations of the blue ribbon commission on health care costs and access [Internet]. E2S SB 5930 2007. Available from: http://apps.leg.wa.gov/billinfo/summary.aspx?bill=5930&year=2007. Accessed 3 Aug 2015.

  6. Légaré F, Stacey D, Forest P-G, Coutu M-F. Moving SDM forward in Canada: milestones, public involvement, and barriers that remain. Z Evid Fortbild Qual Gesundhwes. 2011;105(4):245–53.

  7. Coulter A, Edwards A, Elwyn G, Thomson R. Implementing shared decision making in the UK. Z Evid Fortbild Qual Gesundhwes. 2011;105(4):300–4.

  8. Munro S, Kornelsen J, Wilcox E, Corbett K, Bansback N, Janssen P. Do women have a choice? Care providers’ and decision makers’ perspectives on barriers to access of health services for birth after a previous caesarean. Birth. 2017;44(2):153–60.

  9. Munro S, Janssen P, Wilcox E, Corbett K, Bansback N, Kornelsen J. Seeking control in the midst of uncertainty: women’s experiences of choosing mode of delivery after caesarean. Women Birth. 2017;20(2):129–36.

  10. Munro S, Kornelsen J, Wilcox E, Kaufman S, Bansback N, Corbett K, et al. Implementation of shared decision-making in healthcare policy and practice: a complex adaptive systems perspective. Evid Policy. 2019; https://doi.org/10.1332/174426419X15468571657773.

  11. Légaré F, Ratté S, Gravel K, Graham ID. Barriers and facilitators to implementing shared decision-making in clinical practice: update of a systematic review of health professionals’ perceptions. Patient Educ Couns. 2008;73(3):526–35.

  12. Légaré F, O’Connor AM, Graham ID, Saucier D, Côté L, Blais J, et al. Primary health care professionals’ views on barriers and facilitators to the implementation of the Ottawa decision support framework in practice. Patient Educ Couns. 2006;63(3):380–90.

  13. Joseph-Williams N, Lloyd A, Edwards A, Stobbart L, Tomson D, Macphail S, et al. Implementing shared decision making in the NHS: lessons from the MAGIC programme. BMJ. 2017;357:j1744.

  14. Joseph-Williams N, Elwyn G, Edwards A. Knowledge is not power for patients: a systematic review and thematic synthesis of patient-reported barriers and facilitators to shared decision making. Patient Educ Couns. 2014;94(3):291–309.

  15. Frosch DL, Singer KJ, Timmermans S. Conducting implementation research in community-based primary care: a qualitative study on integrating patient decision support interventions for cancer screening into routine practice. Health Expect. 2011;14:73–84.

  16. Elwyn G, Scholl I, Tietbohl C, Mann M, Edwards AGK, Clay C, et al. “Many miles to go …”: a systematic review of the implementation of patient decision support interventions into routine clinical practice. BMC Med Inform Decis Mak. 2013;13(Suppl 2):S14.

  17. Scholl I, Hahlweg P, Lindig A, Bokemeyer C, Coym A, Hanken H, et al. Evaluation of a program for routine implementation of shared decision-making in cancer care: study protocol of a stepped wedge cluster randomized trial. Implement Sci. 2018;13:51.

  18. Tan ASL, Mazor KM, McDonald D, Lee SJ, McNeal D, Matlock DD, et al. Designing shared decision-making interventions for dissemination and sustainment: can implementation science help translate shared decision making into routine practice? MDM Policy Pract. 2018;3(2):2381468318808503.

  19. Thompson R, Manski R, Donnelly KZ, Stevens G, Agusti D, Banach M, et al. Right for me: protocol for a cluster randomised trial of two interventions for facilitating shared decision-making about contraceptive methods. BMJ Open. 2017;7(10):e017830.

  20. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7(1):37.

  21. Atkins L, Francis J, Islam R, O’Connor D, Patey A, Ivers N, et al. A guide to using the theoretical domains framework of behaviour change to investigate implementation problems. Implement Sci. 2017;12:77.

  22. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6(1):42.

  23. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A, et al. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14(1):26–33.

  24. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.

  25. Elwyn G, Rasmussen J, Kinsey K, Firth J, Marrin K, Edwards A, et al. On a learning curve for shared decision making: interviews with clinicians using the knee osteoarthritis Option Grid. J Eval Clin Pract. 2018;24(1):56–64.

  26. Scalia P, Elwyn G, Durand M-A. “Provoking conversations”: case studies of organizations where Option Grid™ decision aids have become ‘normalized’. BMC Med Inform Decis Making. 2017;17:124.

  27. Fay M, Grande SW, Donnelly K, Elwyn G. Using option grids: steps toward shared decision-making for neonatal circumcision. Patient Educ Couns. 2016;99(2):236–42.

  28. Donnelly KZ, Foster TC, Thompson R. What matters most? The content and concordance of patients’ and providers’ information priorities for contraceptive decision making. Contraception. 2014;90(3):280–7.

  29. Wyatt KD, Branda ME, Anderson RT, Pencille LJ, Montori VM, Hess EP, et al. Peering into the black box: a meta-analysis of how clinicians use decision aids during clinical encounters. Implement Sci. 2014;9:26.

  30. Montori VM, Kunneman M, Brito JP. Shared decision making and improving health care: the answer is not in. JAMA. 2017;318(7):617–8.

  31. Muscat DM, Shepherd HL, Morony S, Smith SK, Dhillon HM, Trevena L, et al. Can adults with low literacy understand shared decision making questions? A qualitative investigation. Patient Educ Couns. 2016;99(11):1796–802.

  32. Muscat DM, Morony S, Smith SK, Shepherd HL, Dhillon HM, Hayen A, et al. Qualitative insights into the experience of teaching shared decision making within adult education health literacy programmes for lower-literacy learners. Health Expect. 2017;20(6):1393–400.

  33. Shepherd HL, Barratt A, Jones A, Bateson D, Carey K, Trevena LJ, et al. Can consumers learn to ask three questions to improve shared decision making? A feasibility study of the ASK (AskShareKnow) patient-clinician communication model(®) intervention in a primary health-care setting. Health Expect. 2016;19(5):1160–8.

  34. Sansoni JE, Grootemaat P, Duncan C. Question prompt lists in health consultations: a review. Patient Educ Couns. 2015;98(12):1454–64.

Acknowledgments

We thank the study participants for sharing their time and insights. Thank you to Ardis L. Olson and Krishna K. Upadhya for providing expert and professional feedback in preparing this study.

Disclaimers

The views presented in this article are solely the responsibility of the author(s) and do not necessarily represent the views of PCORI, its Board of Governors, or Methodology Committee.

The findings and conclusions in this report are those of the authors and do not necessarily represent the views of Planned Parenthood Federation of America, Inc.

Funding

Research reported in this article was funded through a Patient-Centered Outcomes Research Institute (PCORI) Award (CDR-1403-12221; contact: info@pcori.org). Apart from requiring adherence to Methodology Standards that specify best practices in the design and conduct of patient-centered outcomes research, PCORI had no role in the design of the study; the collection, analysis, and interpretation of data; or the writing of the manuscript.

Author information

Contributions

RT conceived the study and supervised the data collection, analysis, and manuscript write-up. SM led data collection (interviews), coding, and analysis and wrote the manuscript. RM, KD, and DA performed the coding, participated in analysis, and commented on the manuscript in progress. All authors contributed to study design and interpretation of findings, and approved the final manuscript.

Corresponding author

Correspondence to Sarah Munro.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Dartmouth College Committee for the Protection of Human Subjects (STUDY00029945).

Consent for publication

Not applicable.

Competing interests

SM reports personal fees and non-financial support from Dartmouth College during the conduct of the study and non-financial support from Dartmouth College outside the submitted work. RM, KZD, DA, GS, TF, and DJJ report a grant from the Patient-Centered Outcomes Research Institute (PCORI) during the conduct of the study. MB reports personal fees from Dartmouth College during the conduct of the study. MBB reports a grant from PCORI and other payments from Dartmouth College during the conduct of the study. PB reports personal fees and non-financial support from Dartmouth College during the conduct of the study and non-financial support from Dartmouth College outside the submitted work. CCB reports personal fees and non-financial support from Dartmouth College during the conduct of the study. JN reports personal fees from Dartmouth College during the conduct of the study and non-financial support from PCORI outside the submitted work. MN reports personal fees and other payments from Dartmouth College during the conduct of the study. MN also reports a role as a healthcare provider and clinic representative in a clinic participating in the study. HLS reports a role as a developer of the AskShareKnow programme intervention components and related survey items that were adapted for use in the study but has not received any personal income connected to this role. LT reports other payments from Dartmouth College during the conduct of the study. GE reports a grant from PCORI during the conduct of the study and personal fees from Emmi Solutions LLC, Washington State Health Department, Oxford University Press, the National Quality Forum, SciMentum LLC, EBSCO Health, & think LLC and ACCESS Federally Qualified Health Centers outside the submitted work. GE also reports ownership of copyright in the CollaboRATE measure of shared decision-making, the Observer OPTION measure of shared decision-making, and several patient decision aids. RT reports a grant from PCORI during the conduct of the study and non-financial support from PCORI outside the submitted work. RT also reports ownership of copyright in several patient decision aids and a role as an editor of the text “Shared Decision Making in Health Care,” but has not received any personal income connected to this ownership or role.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Orientation materials for clinics implementing the Right For Me interventions.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Cite this article

Munro, S., Manski, R., Donnelly, K.Z. et al. Investigation of factors influencing the implementation of two shared decision-making interventions in contraceptive care: a qualitative interview study among clinical and administrative staff. Implementation Sci 14, 95 (2019). https://doi.org/10.1186/s13012-019-0941-z
