Study protocol: a pragmatic, stepped-wedge trial of tailored support for implementing social determinants of health documentation/action in community health centers, with realist evaluation

Abstract

Background

National leaders recommend documenting social determinants of health, and actions taken to address them, in electronic health records, and a growing body of evidence suggests the health benefits of doing so. However, little evidence exists to guide implementation of social determinants of health documentation/action.

Methods

This paper describes a 5-year, mixed-methods, stepped-wedge trial with realist evaluation, designed to test the impact of providing 30 community health centers with step-by-step guidance on implementing electronic health record-based social determinants of health documentation. This guidance will entail 6 months of tailored support from an interdisciplinary team, including training and technical assistance. We will report on tailored support provided at each of five implementation steps; impact of tailored implementation support; a method for tracking such tailoring; and context-specific pathways through which these tailored strategies effect change. We will track the competencies and resources needed to support the study clinics’ implementation efforts.

Discussion

Results will inform how to tailor implementation strategies to meet local needs in real-world practice settings. Secondary analyses will assess impacts of social determinants of health documentation and referral-making on diabetes outcomes. By learning whether and how scalable, tailored implementation strategies help community health centers adopt social determinants of health documentation and action, this study will yield timely guidance to primary care providers. We are not aware of previous studies exploring implementation strategies that support adoption of electronic health record-based social determinants of health documentation and action, despite the pressing need for such guidance.

Trial registration

clinicaltrials.gov, NCT03607617, registration date: 7/31/2018—retrospectively registered

Background

“Social determinants of health” impact health risks and outcomes [1,2,3,4,5,6,7,8,9,10,11,12]. For example, adverse social determinants (e.g., chronic stress, poverty, lack of access to healthy foods/safe exercise) create barriers to acting on diabetes care recommendations, increasing risks of poor diabetes outcomes [13,14,15,16,17,18,19,20,21]. Through such mechanisms, social determinants contribute to health disparities, hamper efforts to implement guideline-based care, and break the link between care quality and health outcomes [3, 22,23,24,25]. A small but growing body of research shows that documenting patients’ social determinants of health in healthcare settings leads to improved receipt of social services and improved health outcomes [26,27,28,29]. Social determinants documentation in electronic health records can also improve care teams’ ability to track and respond to patients’ social needs systematically [30,31,32,33].

Thus, numerous national leaders now recommend documenting social determinants of health in electronic health records, and taking action to address social determinants of health (e.g., referring patients to social service agencies; adapting care plans as needed) [31, 34,35,36,37,38,39,40]. Since such documentation/action may become required for some care providers, especially those in Accountable Care Organizations [41], many health care delivery systems are exploring ways to incorporate social determinants of health screening/action into routine care [28, 42,43,44], including through more routine documentation of patient-reported social determinants of health in electronic health records [32, 33, 45, 46].

Systematic electronic health record documentation of patients’ social determinants of health needs could help care teams understand potential impacts on their patients’ health and ability to act on care recommendations [28, 40, 47,48,49,50,51] and adjust care plans accordingly (e.g., prescribe medications that do not require refrigeration if a patient is homeless) [29, 52], or intervene to address social determinants of health (e.g., through referrals to community resources) [29, 52,53,54,55,56,57]. Well-documented social determinants of health could also identify needed social service resources [47] and inform health care payment structures that account for the social vulnerability of a clinic’s patient population [58,59,60]. And, while emergent research suggests the health benefits of social determinants documentation/action, improving such documentation in electronic health records will enable further scientific assessment of which social determinants most impact specific patients’ health, and how clinical teams can intervene to address these impacts.

These benefits cannot accrue without effective strategies for implementing social determinants of health data documentation/action, but little evidence yet exists to guide integrating social determinants of health documentation into standard practice [28, 42, 61,62,63,64]. The need for such guidance is especially urgent in primary care community health centers, which serve patients whose health risks are high and whose exposure to adverse social determinants of health is profound [17, 20, 21, 65,66,67]. Although community health centers have long sought to understand and address the social factors that impact health, their efforts have typically been ad hoc and rarely documented in electronic health records [15, 24, 25, 65, 68,69,70,71,72,73,74,75].

Some efforts to help community health centers and other primary care settings adopt systematic social determinants of health documentation in electronic health records are underway. The National Association of Community Health Centers’ “Protocol for Responding to and Assessing Patient Assets, Risks, and Experiences” (PRAPARE) [76] outlines how community health centers can collect patient-reported social determinants of health data and suggests electronic health record-based social determinants of health data documentation tools.

Our team built on PRAPARE in a recent pilot study (R18DK105463) that sought to optimize the documentation and presentation of social determinants of health data within standard electronic health record functions. (We believe this was the first US study on documenting standardized social determinants of health data using electronic health record-based tools in community health centers) [45, 46]. We developed a suite of electronic health record-based social determinants of health data tools [45, 77] and activated them in a network of > 500 community health centers with a shared electronic health record in June 2016. These tools are described elsewhere [46, 77]. Three pilot study clinics were also given electronic health record tools to facilitate referring patients with social determinants of health needs to community resources. These tools enable staff to give patients information about local services; provide “internal” referrals to social workers, community health workers, etc.; and help patients make appointments with those services. The tools’ lists of available community resources must be manually updated by clinic staff.

Our pilot study demonstrated the feasibility of developing electronic health record tools for social determinants of health documentation/action. It also revealed myriad implementation barriers. Some barriers were similar to those associated with implementing other patient-reported data collection [78,79,80,81,82,83,84], such as difficulties with optimizing workflows/minimizing logistical burdens; staff turnover; adequately training relevant staff; billing for staff time spent collecting and acting on these data; knowing which patient-reported measures are most important; having resources for addressing identified needs; and ensuring that the right staff see the needed data at the right workflow step and can respond to these data [78, 79, 81,82,83,84,85,86,87,88,89,90]. Barriers specific to adoption of social determinants of health documentation/action included the need to change perceptions of healthcare teams’ responsibilities; lack of clarity about how to make social determinants of health-related “referrals”; clinic staff concerns about collecting data on social determinants of health needs when no “action” could be taken to address those needs; limited knowledge of how to use the electronic health record for this purpose; false-positive screening results (e.g., patient has food insecurity, but already accesses a food bank); the initial lack of a method for documenting whether patients want help; and inadequate infrastructure, incentives, and decision support for effective social determinants of health screening/action.

Such barriers could substantially hamper implementation of social determinants of health documentation, thus impeding community health centers’ (and others’) ability to use social determinants of health data. The “ASCEND” trial (1R18DK114701-01, ApproacheS to Community Health Center ImplEmeNtation of Social Determinants of Health Data Collection and Action), described here, will test whether and how providing tailored, scalable, pragmatic implementation support helps community health centers adopt social determinants of health screening documentation/action using electronic health record tools. To our knowledge, no previous trials have formally tested implementation strategies targeting electronic health record-based social determinants of health documentation/action [91]. Secondary analyses will assess impacts of social determinants of health documentation and action on care quality and biomarkers in patients with/at risk for diabetes (an expected subset of screened patients); only a few previous studies have assessed such impacts [26,27,28, 57]. Study results could inform diverse national efforts to increase social determinants of health documentation and action.

This study will directly address dissemination and implementation science priorities by evaluating the impact of providing tailored implementation strategies [92,93,94,95,96], and demonstrating a method for tracking such tailoring [97,98,99,100]. Through this method, we will report on how support was tailored at each implementation step. To augment this information, our realist evaluation will identify context-specific pathways through which these tailored strategies effect change [101, 102]. This study was approved by the Kaiser Permanente Northwest Institutional Review Board.

Methods

This 5-year study began in September 2017. It is being conducted at OCHIN (not an acronym), a non-profit health center-controlled network that hosts and centrally manages an Epic© electronic health record for > 500 primary care community health centers located in 18 states, as of July 2018 [103,104,105]. OCHIN’s electronic health record is shared by its member community clinics, making it the nation’s largest community health center network on a single electronic health record instance. Table 1 shows the characteristics of patients seen at OCHIN community health centers between June 2016 and May 2018. As the table shows, these patients’ socioeconomic risks are reflected in social determinants of health data already collected per federal requirements: 21% are uninsured and 63% are publicly insured; only 37% are white; 31% are of Hispanic ethnicity; 28% are primarily non-English speakers; and 62% are from households < 138% of the federal poverty level.

Table 1 Demographic characteristics of OCHIN patients with an ambulatory visit/office encounter, 6/24/2016–5/17/2018

The social determinants of health data tools in OCHIN’s electronic health record are the “innovation” whose adoption is targeted in this study. The tools were fine-tuned for the current study, based on lessons from the pilot study and formative analyses (described below), and to ensure their alignment with the Epic© electronic health record’s 2018 social determinants of health module. They include options for clinic staff to document social determinants of health data directly into the electronic health record, or for patients to do so through the patient portal or a tablet at the clinic. If patients complete social determinants of health screenings on paper, the data must be entered into the electronic health record by clinic staff. Social determinants of health screening results and past social determinants of health-related referrals are shown in a social determinants of health summary, with positive screening results highlighted visually (Fig. 1). The social determinants of health questions in the tools align with those recommended by several national groups [30, 45, 76, 77].

Fig. 1 Social determinants of health summary view

We will provide step-by-step tailored implementation support to the 30 study community health centers (details below) and evaluate how effectively this intervention supports such clinics’ adoption of social determinants of health screening/action, as documented in the electronic health record. This is a mixed-methods, pragmatic, stepped-wedge, cluster-randomized trial, with a hybrid type 3 implementation-effectiveness design: we focus on adoption of electronic health record documentation of social determinants of health data and processes, and also consider the health impacts of this adoption [106,107,108,109]. Primary outcomes are adoption of electronic health record-based social determinants of health documentation/action; secondary outcomes are the impact of such adoption on the health of adults with/at risk for diabetes. (Study clinics will decide which patients they want to screen; we will conduct secondary analyses among those with diabetes.) Cluster randomization enables controlling for clinic-level characteristics, appropriate to our primary outcomes of clinic-level changes. The stepped-wedge design, with six wedges, enables us to provide the intervention to five community health centers at a time while ensuring that all study clinics eventually receive the intervention, which will help with participation and retention and has advantages over parallel cluster-randomized trials in terms of statistical power [110] (Table 2).

Table 2 Stepped-wedge design
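The stepped-wedge layout in Table 2 can be sketched programmatically. The following is an illustrative sketch only, not study code: it builds the wedge-by-period exposure matrix implied by the design, with a shared baseline period and one crossover per wedge, so every clinic contributes both control and intervention data.

```python
# Illustrative sketch (not the study's actual code) of the stepped-wedge
# design in Table 2: six wedges of five clinics each cross over to the
# intervention one wedge at a time.

N_WEDGES = 6

def exposure_schedule(n_wedges=N_WEDGES):
    """Return a wedge-by-period matrix: 0 = control, 1 = intervention.

    There is one baseline period plus one period per crossover;
    wedge w (0-indexed) switches to the intervention in period w + 1.
    """
    n_periods = n_wedges + 1
    return [
        [1 if period > wedge else 0 for period in range(n_periods)]
        for wedge in range(n_wedges)
    ]

for wedge, row in enumerate(exposure_schedule(), start=1):
    print(f"wedge {wedge}: {row}")
```

Because every wedge contributes observations under both conditions, each clinic serves as its own control, which underlies the stepped-wedge design’s statistical-power advantage noted above.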

Conceptual guide

This study is guided by the “building blocks of primary care” [111], which outlines components essential to high-performing primary care practices, building on Starfield’s four pillars of primary care practice, and elements of the Joint Principles and Primary Care Medical Home recognition standards [112]. Its foundational “building blocks” are (1) engaged leadership, (2) data-driven improvement using electronic health records, (3) empanelment, and (4) team-based care. The intervention directly addresses these four building blocks as they relate to social determinants of health screening/action using electronic health record tools (Tables 3 and 4). A realist evaluation framework will guide our evaluation of the causal processes that lead to intervention outcomes [113]. Measurement of implementation success is guided by the RE-AIM framework [114]. Analysis details are given below.

Table 3 Characteristics of study implementation strategies, per Proctor et al. [116]
Table 4 Implementation support components

Recruitment and randomization

We recruited eight community health center organizations from OCHIN’s membership for formative interviews with clinic staff, targeting clinics with prior social determinants of health documentation in the electronic health record’s social determinants of health data tools. Thirty additional OCHIN member community health centers will be recruited in two waves for the trial portion of the study, targeting those who want to initiate or improve their social determinants of health documentation/action efforts. The first wave of 15 practices was recruited in the spring of 2018 and block-randomized to wedges 1–3 with 5 clinics per wedge. The second wave of 15 practices will be recruited in 2019 and randomized to wedges 4–6 as in wave 1. This two-wave process ensures that no recruited clinics will wait more than a year to receive the intervention, important both for recruitment and because the rapidly changing social determinants of health screening landscape means clinics’ needs and interests may change between recruitment periods. As our primary outcomes can be derived historically from the electronic health record, we will obtain pre-intervention data at all time points as required to evaluate stepped-wedge trials. All study clinics will receive the same intervention; randomization staggers the timing of when the intervention starts (Table 2).
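The block randomization described above (15 clinics per recruitment wave, allocated evenly across three wedges) can be sketched as follows. This is a hypothetical illustration, not the study’s actual randomization procedure, and the clinic identifiers are placeholders.

```python
import random

def randomize_wave(clinics, n_wedges, seed=None):
    """Block-randomize one recruitment wave of clinics to wedges,
    with equal numbers of clinics per wedge (5 per wedge in this study).

    Clinic identifiers are hypothetical placeholders; a fixed seed
    makes the allocation reproducible for audit purposes.
    """
    per_wedge, remainder = divmod(len(clinics), n_wedges)
    assert remainder == 0, "wave size must divide evenly across wedges"
    shuffled = clinics[:]
    random.Random(seed).shuffle(shuffled)
    return {
        wedge + 1: sorted(shuffled[wedge * per_wedge:(wedge + 1) * per_wedge])
        for wedge in range(n_wedges)
    }

wave1 = [f"clinic_{i:02d}" for i in range(1, 16)]  # 15 hypothetical clinics
assignment = randomize_wave(wave1, n_wedges=3, seed=2018)
```

For the second wave, the same routine would apply, with the resulting wedge labels offset to wedges 4–6.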

Study clinics will be asked to identify a clinician champion and/or a “social determinants of health operational champion” to oversee the clinic’s social determinants of health implementation efforts and take part in the intervention’s implementation support activities. The clinics will receive a description of the tasks involved with each role and may select staff for these roles as they deem appropriate.

The implementation support intervention

Implementation support will be provided to one “wedge” of five community health centers at a time by a multi-disciplinary implementation support team, for 6 months per wedge (Tables 3 and 4). Implementation support team members have expertise in social determinants of health, clinic workflows and practice change implementation, and electronic health record use. (If any needed competencies are identified that the implementation support team does not have, we will bring in the needed expertise and document the skills needed to support community health centers’ social determinants of health screening/action adoption.) We expect that implementation support team members will spend approximately 1 h/month in calls with each study clinic, 2 h/month on office hours, 1–2 h/week to discuss the clinics’ progress and needs internally, and 1–2 h/week to respond to clinic emails, for a total of 15–23 h per month to support five clinics. We will document whether more or less time is needed.

The tailored support uses implementation strategies selected for their demonstrated effectiveness at supporting practice change [98, 115,116,117,118,119,120,121,122,123,124,125,126], results from our pilot study, and potential scalability. They include staff training, technical assistance, audit and feedback, goal identification, leadership engagement, practice coaching, peer-to-peer learning, orientation materials, and implementation guides. This approach is based on evidence that practice change is best supported by a combination of implementation strategies, e.g., “change toolkits” are more likely to be adopted if guidance for their use is also provided. Table 3 shows characteristics of each implementation strategy, as per Proctor et al.’s implementation strategy reporting recommendations [117].

In each implementation step, these strategies will be supported by specific materials and interactions with the study clinics (Table 4). This “lesson plan” approach, in which the implementation support team provides each clinic with just the materials needed for their next implementation step (although all materials will be available on a learning management system), is designed to avoid overwhelming the clinics with too much information at once [127].

Implementation strategies 1–4

  1. The clinic action plan. This step-by-step guide to implementing social determinants of health data documentation/action (first two columns, Table 4) was developed based on findings from our pilot study.

  2. Technical assistance—implementing social determinants of health screening. We will provide written materials to support each clinic action plan step. These were informed by social determinants of health implementation guides developed by national groups (e.g., PRAPARE, HealthLeads) [76, 128] with input from these groups and by learnings from our pilot study [46, 77]. The materials include recommendations and decision tools for each step. Figures 2 and 3 are examples of these decision tools. The entire implementation guide is in Additional file 1.

  3. Technical assistance—using the electronic health record. We found no existing social determinants of health implementation guides that emphasize use of electronic health record data tools. Our team developed training materials on the use of the electronic health record’s tools for social determinants of health documentation/action. They include tips on using the social determinants of health tools in workflows; illustrated guides to using the electronic health record tools for social determinants of health screening/action and for monitoring the clinic’s tool use adoption; and information on how to identify community social service agencies to which patients can be referred.

  4. Ongoing technical assistance, tailored problem-solving:

    4a. Bi-monthly hour-long webinars/office hours/peer support: The implementation support team will hold “office hours” via webinar every 2 weeks; study clinics will be encouraged to attend and submit questions in advance. Each webinar will focus on one aspect of social determinants of health adoption, determined by clinic request/the coach’s knowledge of the clinics’ progress. To support peer-to-peer learning, we will ask clinics that have made progress in a given step to present on their success, and encourage discussion across sites.

    4b. Monthly hour-long coaching call: A member of the implementation support team will meet with each clinic’s champion by phone to review the clinic’s progress, ask about barriers/facilitators to social determinants of health documentation/action implementation, and help as needed.

    4c. Email questions: Study clinics will be encouraged to email the implementation support team with questions; the implementation support team will respond within two workdays. Content from these emails and the webinars will be summarized and shared with all clinics in a given wedge via a monthly email.

Fig. 2 Decision tool: which patients does the clinic want to target for social determinants of health screening?

Fig. 3 Social determinants of health workflow planning tool

Tailoring

A growing body of research [97,98,99] supports tailoring implementation support to meet local needs, i.e., customizing implementation support rather than providing a one-size-fits-all strategy. We will tailor implementation support to each clinic’s specific needs and track this customization. The implementation support team will first review each clinic’s baseline data, consider what might address each clinic’s needs, and tailor the implementation support plan as feasible. For example, if a given clinic does not have experience using their own data to drive improvement efforts, the implementation support team will plan to offer additional training on how to do so. During the intervention period, clinics will complete a bi-monthly web-based survey describing their progress. We will track which support strategies the clinics needed at each clinic action plan step, and if a clinic is stuck at a certain step, the implementation support team will identify additional implementation support that might help. For example, if a clinic gets stuck on step 2 after receiving the support listed in Table 4, we might provide additional calls, trainings, or materials to help with specific encountered barriers. We will document the precise implementation steps where study clinics faced barriers, the support provided to address those barriers, and whether that support helped. Thus, the clinic action plan is a pragmatic tool for guiding and tracking the provision of tailored implementation support.

Data collection and analysis

Formative data collection and analysis (year 1; completed)

At the time of writing, we are at the start of study year 2. In study year 1, we measured social determinants of health data collection among all OCHIN community health centers, using extracted electronic health record data. We recruited eight community health center organizations with high social determinants of health documentation rates (as identified in these formative analyses) to take part in exploratory semi-structured interviews. Clinics were asked to identify six staff members who played different roles related to social determinants of health documentation. The interviews explored barriers/facilitators to electronic health record-based social determinants of health data collection/use, and experiences with the electronic health record’s social determinants of health data tools. Results were used to identify needed improvements to the social determinants of health data tools and informed development of the implementation support intervention. Formative data analysis results will be reported in a future publication.

Implementation data collection and analysis (years 2–5)

Quantitative evaluation

All quantitative data will be extracted from study clinics’ shared electronic health record. Outcome measures are guided by the RE-AIM framework [114] (Table 5). Outcomes will be measured monthly in all study clinics at every period. Each wedge provides data points in both control and intervention conditions.

Table 5 Study outcomes

To compare the effect of the intervention with usual practice on social determinants of health outcome measures in a stepped-wedge design, we will utilize generalized linear mixed models with random effects for clinic. Random effects for state will be considered to account for clustering of practices within states. This model will incorporate independent variables, take into account the general time trend, and allow for the intervention effect to grow over time. We will estimate the intervention effect with the within-site difference between social determinants of health collection rates pre- and post-intervention, averaging across practices and accounting for possible secular trends which might confound results. As our statistical tests are specified a priori and our proposed social determinants of health outcome measures are highly related, we will report p values rather than adjust for multiple comparisons [129, 130]. If significant differences in key clinic characteristics between wedges remain post-randomization, we will use propensity score methods to reduce observed bias and thereby minimize external threats to validity [131, 132].
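One explicit specification consistent with this description is the standard stepped-wedge mixed model of Hussey and Hughes, extended with a time-on-treatment term so the intervention effect can grow over time. All notation below is ours, introduced for illustration; the protocol does not specify a model formula.

```latex
% Sketch only; symbols are ours, not the protocol's.
% Y_{it}: social determinants of health documentation outcome
%         for clinic i in month t.
g\bigl(\mathbb{E}[Y_{it} \mid u_i, v_{s(i)}]\bigr)
  = \beta_0
  + \beta_t                       % fixed period effect (secular time trend)
  + \theta\, X_{it}               % X_{it} = 1 once clinic i has crossed over
  + \delta\,(t - t_i)\, X_{it}    % time since clinic i's crossover at t_i;
                                  % lets the intervention effect grow
  + u_i                           % clinic random intercept, u_i \sim N(0, \tau^2)
  + v_{s(i)}                      % optional state random intercept
```

Here g is the link function of the generalized linear mixed model. The within-site pre/post contrast described above corresponds to estimating θ (and δ), while the β_t terms absorb secular trends that might otherwise confound results.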

In secondary analyses focused on a diabetes population, we will measure intervention-associated changes in clinical measures reflecting diabetes risk management (blood pressure, hemoglobin A1c, body mass index, lipids, etc.), rates of incident comorbidities, and rates of patients up-to-date on key diabetes tests (lipid panel annually, hemoglobin A1c within 6 months, eye/foot exams). We hypothesize that patients at intervention clinics for whom social determinants of health data are collected will have significant improvements in these measures by the end of the study period, compared to those at control clinics. Data for these analyses will be extracted from OCHIN’s electronic health record. A similar model will be considered as in the primary analysis.

Realist evaluation

A key priority for implementation science is identifying the mechanisms by which implementation strategies exert their effects [101, 102]. Realist evaluation clarifies which components of a multifaceted intervention work, for whom, and under what conditions [133], to produce change. Assuming that interactions between contextual and mechanistic factors are key to effective cross-setting translation of interventions [134], realist evaluation conceptualizes intervention outcomes as resulting from a relationship between context and mechanism: context + mechanism = outcome. We will disaggregate “mechanism” into resources and reasoning; thus, mechanism (resources) + context → mechanism (reasoning) = outcome [135]. The goal is to identify context-mechanism-outcome configurations that explain the pathways through which the intervention (tailored implementation support) impacts the systematic collection of social determinants of health data, and the integration of such data into care. This framework will guide data collection and analyses, as below and in Fig. 4.

Fig. 4 Realist evaluation model—factors influencing intervention impact on outcomes [135, 136, 162]

In this evaluation, the intervention is the tailored implementation support (not shown in Fig. 4) and the mechanism (resources) are the electronic health record-based data tools. Context can include characteristics of individuals (e.g., roles, attitudes, knowledge), teams (e.g., relationships, team functioning), organizations (e.g., staffing, culture, leadership, resources), and environment (e.g., payor policies, political structures) [134, 136]. Here, context will be measured through (i) a brief baseline survey and (ii) analysis of the exchanges between the clinics and the implementation support team. People (in this case, clinic staff) respond to available resources (mechanism: resources) in different ways [134]. This “response to resources,” or mechanism (reasoning), will be assessed through (i) analysis of interactions between study clinics and the implementation support team (as above) and (ii) a condition-specific card study.

Data collection

To limit burden on clinics, and mitigate potential Hawthorne effects [137], our realist evaluation will primarily use data collected by the implementation support team in the course of its regular activities. The card study (see below) is the one exception.

Baseline survey

Shortly before each wedge of community health centers starts the intervention, the operational champion at each clinic will complete a baseline survey. Lacking validated, easily implemented methods for assessing clinics’ readiness to adopt practice changes, we developed a brief baseline survey specifically designed to assess some aspects of readiness as related to adoption of social determinants of health documentation/action. Informed by the building blocks of primary care, it assesses the clinics’ status in empanelment and team-based care, as well as their access to community health workers/social workers/behaviorists; external policies/incentives that might impact results; recent major disruptive events; other clinic initiatives; and payment models. The survey is available in Additional file 2.

Content of community health center interaction with implementation team

As noted earlier, during each study wedge clinical and/or operational champions from each clinic (as well as any other staff members who are interested in attending) will participate in monthly organization-specific coaching calls and bi-monthly webinars/peer support conversations. With permission, these discussions will be recorded and transcribed. We will track which community health center staff attended these discussions. At monthly calls with each study clinic, we will evaluate each clinic’s progress per the clinic action plan and ask which intervention components were used that month and by whom. We will also document additional support that the study clinics request. We will collect relevant email exchanges between the implementation team and the study clinics, as well as “trouble-tickets” about the social determinants of health tools, as submitted to OCHIN’s member support system. When each wedge ends, we will record a debrief session with the implementation support team to capture their understanding of implementation at each community health center in that wedge.

As shown in Fig. 4, data from the implementation team’s interactions with study clinics, primarily in the form of monthly check-ins and office hours, are a key data source for measurement of both context and mechanism (reasoning). The monthly check-ins are a particularly important data source for the evaluation, as the majority of the rich back-and-forth between the implementation team and clinic staff (questions, conversations, talking through challenges the clinics are facing, and brainstorming ways to address those challenges) happens during these organization-specific interactions.

Card study

We will measure the impact of social determinants of health data on point-of-care decision-making via an electronic health record-embedded card study focused on clinical action (care decisions/referrals) at encounters with patients in each clinic’s target population. Two providers at each study clinic, identified based on how often they see patients whom the clinic is targeting for social determinants of health screening, will be recruited by the clinic’s operational champion to complete a < 1 min survey on all patients in the target population seen in a 3-week period. The provider will complete a “card” after the encounter with a targeted patient, which will ask (1) whether/how social determinants of health data informed clinical decisions/actions; (2) how the social determinants of health data were obtained (e.g., via the electronic health record tools); (3) whether any desired social determinants of health data were unavailable; and (4) estimated time spent looking up social determinants of health data. The questions will not ask for any patient data, and the survey answers will not be saved in the patient chart. We will associate each card with the following information: provider type (MD, DO, PA, NP, behavioral health), whether the patient was seen by their assigned primary care provider, encounter chief complaint, and whether a completed social determinants of health screen was in the patient’s chart at the time of encounter. These data will be collected ≈ 5 months into the 6-month intervention period.

Realist evaluation analysis

We will conduct a mixed-methods convergent comparative “case analysis” [138] in which qualitative and quantitative data will be collected concurrently and used to build understanding of the change process in each case (clinic). Data from each case will be “merged” for analysis, then compared within and across clinics to confirm, expand on, or challenge each site’s findings [116, 138]. Data collection and analysis will be parallel and iterative; analysis will begin at the end of the first wedge and continue as data from each wedge are collected. A grounded theory approach [139,140,141] and immersion-crystallization process [142] will be used to engage deeply with the data and identify emergent themes [143] that will be categorized into context, mechanism, or outcome. Potential configurations of data in these categories will be proposed, then refined as data collection continues, to identify context-specific intervention components that enable effective implementation of social determinants screening documentation/action [144, 145].

Discussion

Myriad national initiatives are underway to begin clinic-based social determinants of health documentation/action. These efforts will likely encounter barriers similar to those associated with adoption of any practice change involving new workflows/electronic health record functionalities, plus barriers specific to social determinants of health activities [77]. However, little empirical evidence guides this implementation; to our knowledge, no previous studies have examined the implementation strategies needed to support adoption of social determinants of health-related practice changes in any setting [91]. Even the Centers for Medicare and Medicaid Services’ innovative Accountable Health Communities initiative [146], designed to test “… whether systematically identifying and addressing the health-related social needs … will impact health care costs and reduce health care utilization,” does not focus on the support needs associated with implementing these activities. The study described here will identify strategies for helping community health centers adopt electronic health record-based documentation of patient-reported social determinants of health needs and actions to address those needs [28, 47, 63, 64]. We will document how this support can be tailored to meet local needs, and the resources and competencies needed to do so. We chose to test support from a centralized, remote team for its scalability.

Using rigorous methods, the study will also yield important knowledge for dissemination and implementation science, as follows:

  • We will test the effectiveness of a set of evidence-based implementation strategies which have helped community health centers adopt new workflows/tools in prior research [4, 20, 98, 115,116,117,118,119,120,121,122,123,124,125,126, 147,148,149,150,151,152,153,154,155,156,157], but that have not been assessed in the context of electronic health record-based social determinants of health documentation/action, either in isolation or in combination. We are not aware of other formal studies of implementation strategies needed to support adoption of this important practice change, despite the need for such guidance.

  • We will assess how interdisciplinary implementation teams support practice change [158]. We will track the competencies that the team uses to help the study clinics (e.g., knowledge of electronic health record systems), plus any competencies that the team identifies as needed, and how those needs were met. We recognize that with our tailored strategy, some clinics will need and receive more intensive support. We will document this carefully in our process evaluation by tracking which strategies are needed and provided, and how much time the implementation team spends, per clinic and overall, to provide the needed support.

  • This study will yield information on how to tailor implementation support strategies to meet local needs [96, 98, 100, 116]. The step-by-step clinic action plan is designed to be a focused, pragmatic tool that both guides study clinics’ change implementation and enables tracking the specific implementation supports provided at each step, and how this support is tailored. Such detail about tailoring of implementation strategies is rarely reported [117, 159, 160]. Rather than estimating implementation barriers a priori, this approach focuses on the implementation strategy changes that are needed in practice. By documenting where a given clinic gets stuck within an overall shared approach, and the subsequent impact of additional support provided, our findings could have relevance both for specific social determinants of health-related implementation approaches and for other implementation efforts involving tailored support.

  • The realist evaluation approach is increasingly used to evaluate complex interventions [135, 136, 161] and is well-suited to pragmatic implementation research due to its emphasis on the impact of context. The focus on identifying the context-specific causal mechanisms through which the tailored support impacts clinic uptake will facilitate appropriate adaptation of successful support strategies to other settings. Furthermore, identifying such causal mechanisms is an implementation science priority, as such mechanisms are infrequently reported.

Conclusion

Despite the known health impacts of social determinants of health, and a national movement urging healthcare providers to identify and act on patients’ social determinant-related needs, little is known about how to help community health centers adopt social determinants of health documentation/action. By learning whether and how scalable, tailored implementation strategies help community health centers adopt these changes, the proposed study will yield timely guidance to community clinics nationwide.

Abbreviations

OCHIN:

[Not an acronym]

PRAPARE:

Protocol for Responding to and Assessing Patient Assets, Risks, and Experiences

RE-AIM:

Reach, Effectiveness, Adoption, Implementation, Maintenance

References

  1. 1.

    Commission on Social Determinants of Health. Closing the gap in a generation: health equity through action on the social determinants of health: Commission on Social Determinants of Health final report. Geneva: World Health Organization; 2008.

  2. 2.

    World Health Organization. Social determinants of health: About social determinants of health. 2017; http://www.who.int/social_determinants/sdh_definition/en/. Accessed 11/29/2017, 2017.

  3. 3.

    U. S. Department of Health & Human Services (DHHS). Healthy People 2010, 2nd Edition. Washington: US Government Printing Office; 2000.

  4. 4.

    Frieden TR. A framework for public health action: the health impact pyramid. Am J Public Health. 2010;100(4):590–5.

  5. 5.

    Fenton. Health Care’s Blind Side: The Overlooked Connection between Social Needs and Good Health: Robert Wood Johnson Foundation; 2011.

  6. 6.

    Woolf SH, Johnson RE, Phillips RL Jr, Philipsen M. Giving everyone the health of the educated: an examination of whether social change would save more lives than medical advances. Am J Public Health. 2007;97(4):679–83.

  7. 7.

    Hammig O, Bauer GF. The social gradient in work and health: a cross-sectional study exploring the relationship between working conditions and health inequalities. BMC Public Health. 2013;13:1170.

  8. 8.

    Krieger N, Kosheleva A, Waterman PD, Chen JT, Beckfield J, Kiang MV. 50-year trends in US socioeconomic inequalities in health: US-born Black and White Americans, 1959-2008. Int J Epidemiol. 2014;43(4):1294–313.

  9. 9.

    Lahiri S, Moure-Eraso R, Flum M, Tilly C, Karasek R, Massawe E. Employment conditions as social determinants of health. Part I: the external domain. New Solut. 2006;16(3):267–88.

  10. 10.

    Moure-Eraso R, Flum M, Lahiri S, Tilly C, Massawe E. A review of employment conditions as social determinants of health part II: the workplace. New Solut. 2006;16(4):429–48.

  11. 11.

    Lahelma E, Laaksonen M, Aittomaki A. Occupational class inequalities in health across employment sectors: the contribution of working conditions. Int Arch Occup Environ Health. 2009;82(2):185–90.

  12. 12.

    Kawachi I, Berkman LF. Neighborhoods and Health. New York: Oxford University Press; 2003.

  13. 13.

    [No authors listed.] Social disadvantage linked to diabetes through chronic inflammation. BMJ. 2013;347:f4368.

  14. 14.

    Hsu CC, Lee CH, Wahlqvist ML, et al. Poverty increases type 2 diabetes incidence and inequality of care despite universal health coverage. Diabetes Care. 2012;35(11):2286–92.

  15. 15.

    Jackson CA, Jones NR, Walker JJ, et al. Area-based socioeconomic status, type 2 diabetes and cardiovascular mortality in Scotland. Diabetologia. 2012;55(11):2938–45.

  16. 16.

    Lipton RB, Liao Y, Cao G, Cooper RS, McGee D. Determinants of incident non-insulin-dependent diabetes mellitus among blacks and whites in a national sample. The NHANES I epidemiologic follow-up study. Am J Epidemiol. 1993;138(10):826–39.

  17. 17.

    Lysy Z, Booth GL, Shah BR, Austin PC, Luo J, Lipscombe LL. The impact of income on the incidence of diabetes: a population-based study. Diabetes Res Clin Pract. 2013;99(3):372–9.

  18. 18.

    Muller G, Kluttig A, Greiser KH, et al. Regional and neighborhood disparities in the odds of type 2 diabetes: results from 5 population-based studies in Germany (DIAB-CORE consortium). Am J Epidemiol. 2013;178(2):221–30.

  19. 19.

    Sacerdote C, Ricceri F, Rolandsson O, et al. Lower educational level is a predictor of incident type 2 diabetes in European countries: the EPIC-InterAct study. Int J Epidemiol. 2012;41(4):1162–73.

  20. 20.

    Stringhini S, Sabia S, Shipley M, et al. Association of socioeconomic position with health behaviors and mortality. JAMA. 2010;303(12):1159–66.

  21. 21.

    Espelt A, Kunst AE, Palencia L, Gnavi R, Borrell C. Twenty years of socio-economic inequalities in type 2 diabetes mellitus prevalence in Spain, 1987-2006. Eur J Pub Health. 2012;22(6):765–71.

  22. 22.

    Hill J, Nielsen M, Fox MH. Understanding the social factors that contribute to diabetes: a means to informing health care and social policies for the chronically ill. Permanente J. 2013;17(2):67–72.

  23. 23.

    Clark ML, Utz SW. Social determinants of type 2 diabetes and health in the United States. World J Diabetes. 2014;5(3):296–304.

  24. 24.

    Behforouz HL, Drain PK, Rhatigan JJ. Rethinking the social history. N Engl J Med. 2014;371(14):1277–9.

  25. 25.

    Hughes LS. Social determinants of health and primary care: intentionality is key to the data we collect and the interventions we pursue. J Am Board Fam Med. 2016;29(3):297–300.

  26. 26.

    Gottlieb LM, Hessler D, Long D, et al. Effects of social needs screening and in-person service navigation on child health: a randomized clinical trial. JAMA Pediatr. 2016;170(11):e162521.

  27. 27.

    Berkowitz SA, Hulberg AC, Standish S, Reznor G, Atlas SJ. Addressing unmet basic resource needs as part of chronic Cardiometabolic Disease Management. JAMA Intern Med. 2017;177(2):244–52.

  28. 28.

    Gottlieb LM, Wing H, Adler NE. A systematic review of interventions on Patients' social and economic needs. Am J Prev Med. 2017;53(5):719–29.

  29. 29.

    Garg A, Toy S, Tripodis Y, Silverstein M, Freeman E. Addressing social determinants of health at well child care visits: a cluster RCT. Pediatrics. 2015;135(2):e296–304.

  30. 30.

    Institute of Medicine. Recommended social and behavioral domains and measures for electronic health records. 2014. http://nationalacademies.org/HMD/Activities/PublicHealth/SocialDeterminantsEHR.aspx. Accessed 2018.

  31. 31.

    Committee on the Recommended S, Behavioral D, Measures for Electronic Health Records - Board on Population H, Public Health P. Capturing Social and Behavioral Domains and Measures in Electronic Health Records PHASE 2. Washington: National Academies Press; 2014.

  32. 32.

    Gottlieb LM, Tirozzi KJ, Manchanda R, Burns AR, Sandel MT. Moving electronic medical records upstream: Incorporating social determinants of health. Am J Prev Med. 2015;48(2):215-18.

  33. 33.

    Pinto AD, Glattstein-Young G, Mohamed A, Bloch G, Leung FH, Glazier RH. Building a foundation to reduce health inequities: routine collection of sociodemographic data in primary care. J Am Board Fam Med. 2016;29(3):348–55.

  34. 34.

    Billioux AV, K.; Anothony, S.; Alley, D. Standardized screening for health-related social needs in clinical settings. The Accountable Health Communities Screening Tool 2017; https://nam.edu/wp-content/uploads/2017/05/Standardized-Screening-for-Health-Related-Social-Needs-in-Clinical-Settings.pdf. Accessed 2018.

  35. 35.

    Centers for Disease Control and Prevention. CMS Timeline of Important MU Dates. Meaningful Use 2016; https://www.cdc.gov/ehrmeaningfuluse/timeline.html. Accessed 02/28/2017.

  36. 36.

    Tagalicod RR, Jacob. Progress on Adoption of Electronic Health Records. 2015; https://www.cms.gov/eHealth/ListServ_Stage3Implementation.html. Accessed 2018.

  37. 37.

    Health Information Technology Advisory Committee (HITAC). Meaningful Use Stage 3 Final Recommendations. 2014. https://www.cdc.gov/ehrmeaningfuluse/index.html. Accessed 2018.

  38. 38.

    Centers for Medicare & Medicaid Services. CMS Quality Strategy 2016. 2016. https://www.cms.gov/medicare/quality-initiatives-patient-assessment-instruments/qualityinitiativesgeninfo/downloads/cms-quality-strategy.pdf. Accessed 2018.

  39. 39.

    The Office of the National Coordinator for Health Information Technology (ONC). Federal Health IT Strategic Plan. 2016; https://www.healthit.gov/sites/default/files/9-5-federalhealthitstratplanfinal_0.pdf. Accessed 2018.

  40. 40.

    Adler NE, Stead WW. Patients in context—EHR capture of social and behavioral determinants of health. N Engl J Med. 2015;372(8):698–701.

  41. 41.

    Centers for Medicare & Medicaid Services. Accountable Care Organizations (ACOs). 2018; https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/ACO/. Accessed 6/7/2018.

  42. 42.

    Thomas-Henkl C, Schulman M. Screening for social determinants of health in populations with complex needs: implementation Considerations. 2017. https://www.chcs.org/media/SDOH-Complex-Care-Screening-Brief-102617.pdf. Accessed 2018.

  43. 43.

    Institute for Alternative Futures. Community Health Centers Leveraging the Social Determinants of Health. In: Alexandria, VA2012: http://www.altfutures.org/pubs/leveragingSDH/IAF-CHCsLeveragingSDH.pdf. Accessed 2018.

  44. 44.

    https://www.commonwealthfund.org/sites/default/files/documents/___media_files_publications_fund_report_2014_may_1749_bachrach_addressing_patients_social_needs_v2.

  45. 45.

    LaForge K, Gold R, Cottrell E, et al. How 6 organizations developed tools and processes for social determinants of health screening in primary care: an overview. J Ambul Care Manage. 2018;41(1):2–14.

  46. 46.

    Gold R, Cottrell E, Bunce A, et al. Developing electronic health record (EHR) strategies related to health center patients’ social determinants of health. J Am Board Fam Med. 2017;30(4):428–47.

  47. 47.

    DeVoe JE, Bazemore AW, Cottrell EK, et al. Perspectives in primary care: a conceptual framework and path for integrating social determinants of health into primary care practice. Ann Fam Med. 2016;14(2):104–8.

  48. 48.

    Daniel H, Bornstein SS, Kane GC. Addressing social determinants to improve patient care and promote health equity: an American College of Physicians Position Paper. Ann Intern Med. 2018;168(8):577–8.

  49. 49.

    Byhoff E, Freund KM, Garg A. Accelerating the implementation of social determinants of health interventions in internal medicine. J Gen Intern Med. 2018;33(2):223–5.

  50. 50.

    Pinto AD, Bloch G. Framework for building primary care capacity to address the social determinants of health. Can Fam Physician. 2017;63(11):e476–82.

  51. 51.

    Tong ST, Liaw WR, Kashiri PL, et al. Clinician experiences with screening for social needs in primary care. J Am Board Fam Med. 2018;31(3):351–63.

  52. 52.

    Gottlieb L, Sandel M, Adler NE. Collecting and applying data on social determinants of health in health care settings. JAMA Intern Med. 2013;173(11):1017–20.

  53. 53.

    Knowles M, Khan S, Palakshappa D, et al. Successes, challenges, and considerations for integrating referral into food insecurity screening in pediatric settings. J Health Care Poor Underserved. 2018;29(1):181–91.

  54. 54.

    Alderwick HAJ, Gottlieb LM, Fichtenberg CM, Adler NE. Social prescribing in the U.S. and England: emerging interventions to address Patients' social needs. Am J Prev Med. 2018;54(5):715–8.

  55. 55.

    Pruitt Z, Emechebe N, Quast T, Taylor P, Bryant K. Expenditure reductions associated with a social service referral program. Popul Health Manag. 2018;21(6):469-76. https://doi.org/10.1089/pop.2017.0199. Epub 2018 Apr 17.

  56. 56.

    Berkowitz SA, Terranova J, Hill C, et al. Meal delivery programs reduce the use of costly health care in dually eligible Medicare and Medicaid beneficiaries. Health Aff (Millwood). 2018;37(4):535–42.

  57. 57.

    McClintock HF, Bogner HR. Incorporating patients’ social determinants of health into hypertension and depression care: a pilot randomized controlled trial. Community Ment Health J. 2017;53(6):703–10.

  58. 58.

    Bachrach DG, J.; Meier, S.; Meerschaert, J.; Brandel, S. Enabling Sustainable Investment in Social Interventions: A Review of Medicaid Managed Care Rate-Setting Tool. 2018; http://www.commonwealthfund.org/publications/fund-reports/2018/jan/social-inteventions-medicaid-managed-care-rate-setting. Accessed 06/05/2018.

  59. 59.

    National Academies of Sciences E, Medicine, National Academies of Sciences E, Medicine. A Proposed Framework for Integration of Quality Performance Measures for Health Literacy, Cultural Competence, and Language Access Services: Proceedings of a Workshop Accounting for Social Risk Factors in Medicare Payment. Washington: The National Academies Press; 2018.

  60. 60.

    Gottlieb L, Tobey R, Cantor J, Hessler D, Adler NE. Integrating social and medical data to improve population health: opportunities and barriers. Health Aff. 2016;35(11):2116–23.

  61. 61.

    Cantor MN, Thorpe L. Integrating data on social determinants of health into electronic health records. Health Aff (Millwood). 2018;37(4):585–90.

  62. 62.

    Beck AF, Cohen AJ, Colvin JD, et al. Perspectives from the Society for Pediatric Research: interventions targeting social needs in pediatric clinical care. Pediatr Res. 2018;84(1):10–21.

  63. 63.

    Cottrell E, Gold R, Likumahuwa S, al. e. Using health information technology to bring social determinants of health into primary care: a conceptual framework to guide research. Journal of Health Care for the Poor and Underserved. 2018. (In Press).

  64. 64.

    Gottlieb L, Cottrell EK, Park B, Clark KD, Gold R, Fichtenberg C. Advancing social prescribing with implementation science. J Am Board Fam Med. 2018;31(3):315–21.

  65. 65.

    Arvantes J. Affordable Care Act Creates Greater Health Care Role for CHCs - Number of Health Centers Expected to Double by 2015. 2010; http://www.aafp.org/news/government-medicine/20101117hcreformchcs.html. Accessed 06/05/2017.

  66. 66.

    Muennig P, Franks P, Jia H, Lubetkin E, Gold MR. The income-associated burden of disease in the United States. Soc Sci Med. 2005;61(9):2018–26.

  67. 67.

    Marmot M, Wilkinson R, editors. Social Determinants of Health. 2nd ed. Oxford: Oxford University Press; 2006.

  68. 68.

    Garg A, Jack B, Zuckerman B. Addressing the social determinants of health within the patient-centered medical home: lessons from pediatrics. JAMA. 2013;309(19):2001–2.

  69. 69.

    Garg A, Sarkar S, Marino M, Onie R, Solomon BS. Linking urban families to community resources in the context of pediatric primary care. Patient Educ Couns. 2010;79(2):251–4.

  70. 70.

    Garg A, Butz AM, Dworkin PH, Lewis RA, Thompson RE, Serwint JR. Improving the management of family psychosocial problems at low-income children's well-child care visits: the WE CARE project. Pediatrics. 2007;120(3):547–58.

  71. 71.

    Page-Reeves J, Kaufman W, Bleecker M, et al. Addressing social determinants of health in a clinic setting: the WellRx pilot in Albuquerque, New Mexico. J Am Board Fam Med. 2016;29(3):414–8.

  72. 72.

    Matthews KA, Adler NE, Forrest CB, Stead WW. Collecting psychosocial “vital signs” in electronic health records: why now? What are they? What’s new for psychology? Am Psychol. 2016;71(6):497–504.

  73. 73.

    Giuse NB, Koonce TY, Kusnoor SV, et al. Institute of medicine measures of social and behavioral determinants of health: a feasibility study. Am J Prev Med. 2017;52(2):199–206.

  74. 74.

    Muller G, Hartwig S, Greiser KH, et al. Gender differences in the association of individual social class and neighbourhood unemployment rate with prevalent type 2 diabetes mellitus: a cross-sectional study from the DIAB-CORE consortium. BMJ Open. 2013;3(6).

  75. 75.

    Unequal treatment: confronting racial and ethnic disparities in health care. Washington: National Academy Press; 2005.

  76. 76.

    National Association of Community Health Centers. PRAPARE. 2016; http://www.nachc.org/research-and-data/prapare/. Accessed 02/23/2017.

  77. 77.

    Gold R. Learnings from Community Health Centers’ Adoption of Social Determinants of Health EHR tools. In: Annals of Family Medicine; 2018.

  78. 78.

    Bryan S, Davis J, Broesch J, et al. Choosing your partner for the PROM: a review of evidence on patient-reported outcome measures for use in primary and community care. Healthcare Policy. 2014;10(2):38–51.

  79. 79.

    Spertus J. Barriers to the use of patient-reported outcomes in clinical care. Circ Cardiovasc Qual Outcomes. 2014;7(1):2–4.

  80. 80.

    Nelson EC, Eftimovska E, Lind C, Hager A, Wasson JH, Lindblad S. Patient reported outcome measures in practice. BMJ. 2015;350:g7818.

  81. 81.

    Boyce MB, Browne JP, Greenhalgh J. The experiences of professionals with using information from patient-reported outcome measures to improve the quality of healthcare: a systematic review of qualitative research. BMJ quality & safety. 2014;23(6):508–18.

  82. 82.

    Hostetter MK, Klein S. Using patient-reported outcomes to improve health care quality: The Commonwealth Fund; 2012.

  83. 83.

    Ivanova JI, Birnbaum HG, Schiller M, Kantor E, Johnstone BM, Swindle RW. Real-world practice patterns, health-care utilization, and costs in patients with low back pain: the long road to guideline-concordant care. Spine J. 2011;11(7):622–32.

  84. 84.

    Ridgeway JL, Beebe TJ, Chute CG, et al. A brief Patient-Reported Outcomes Quality of Life (PROQOL) instrument to improve patient care. PLoS Med. 2013;10(11):e1001548.

  85. 85.

    Jensen RE, Rothrock NE, DeWitt EM, et al. The role of technical advances in the adoption and integration of patient-reported outcomes in clinical care. Med Care. 2015;53(2):153–9.

  86. 86.

    Campbell RJ. The five rights of clinical decision support: CDS tools helpful for meeting meaningful use. J AHIMA. 2013;84(10):42–7.

  87. 87.

    McCullagh LJ, Sofianou A, Kannry J, Mann DM, McGinn TG. User centered clinical decision support tools: adoption across clinician training level. Appl Clin Inform. 2014;5(4):1015–25.

  88. 88.

    Heisey-Grove D, Danehy LN, Consolazio M, Lynch K, Mostashari F. A national study of challenges to electronic health record adoption and meaningful use. Med Care. 2014;52(2):144–8.

  89. 89.

    Rittenhouse DR, Ramsay PP, Casalino LP, McClellan S, Kandel ZK, Shortell SM. Increased health information technology adoption and use among small primary care physician practices over time: a national cohort study. Ann Fam Med. 2017;15(1):56–62.

  90. 90.

    Stehlik J, Rodriguez-Correa C, Spertus JA, et al. Implementation of real-time assessment of patient-reported outcomes in a heart failure clinic: a feasibility study. J Card Fail. 2017;23(11):813–6.

  91. 91.

    Pescheny JV, Pappas Y, Randhawa G. Facilitators and barriers of implementing and delivering social prescribing services: a systematic review. BMC Health Serv Res. 2018;18(1):86.

  92. 92.

    Eccles MP, Armstrong D, Baker R, et al. An implementation research agenda. Implement Sci. 2009;4:18.

  93. 93.

    Institute of Medicine. Initial national priorities for comparative effectiveness research. Washington: The National Academies Press; 2009.

  94. 94.

    Newman K, Van Eerd D, Powell BJ, et al. Identifying priorities in knowledge translation from the perspective of trainees: results from an online survey. Implement Sci. 2015;10:92.

  95. 95.

    Powell BJ, Garcia K, Fernandez ME. Optimizing the cancer control continuum: advancing the science of implementation across the cancer continuum. In: Chambers D, Vinson CA, Norton WE, editors. Advancing the science of implementation across the cancer continuum. New York: Oxford University Press; 2018.

  96. 96.

    Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, Shea C. Tracking implementation strategies: a description of a practical approach and early findings. Health Res Policy Syst. 2017;15(1):15.

  97. 97.

    Baker R, Camosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, Robertson N, Wensing M, Fiander M, Eccles MP, Godycki-Cwirko M, van Lieshout J, Jäger C. Tailored interventions to address determinants of practice. Cochrane Database Syst Rev. 2015;(4). Art. No.: CD005470. https://doi.org/10.1002/14651858.CD005470.pub3.

  98. 98.

    Powell BJ, Beidas RS, Lewis CC, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017;44(2):177–94.

  99. 99.

    Wensing M. The tailored implementation in chronic diseases (TICD) project: introduction and main findings. Implement Sci. 2017;12(1):5.

  100. 100.

    Boyd MR, Powell BJ, Endicott D, Lewis CC. A method for tracking implementation strategies: an exemplar implementing measurement-based care in community behavioral health clinics. Behav Ther. 2018;49(4):525–37.

  101. 101.

    Lewis CC, Klasnja P, Powell BJ, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6:136.

  102. 102.

    National Institutes of Health. 2016; http://grants.nih.gov/grants/guide/pa-files/PAR-16-238.html. Accessed 2018.

  103. 103.

    DeVoe JE, Wallace LS, Fryer GE Jr. Measuring patients’ perceptions of communication with healthcare providers: do differences in demographic and socioeconomic characteristics matter? Health Expect. 2009;12(1):70–80.

  104. 104.

    DeVoe JE, Gold R, Cottrell E, et al. The ADVANCE network: accelerating data value across a national community health center network. J Am Med Inform Assoc. 2014;21(4):271–8.

  105. 105.

    OCHIN. Collaboration and innovation through research. 2018; https://ochin.org/ochin-research/. Accessed 2018.

  106. 106.

    Thorpe KE, Zwarenstein M, Oxman AD, et al. A pragmatic-explanatory continuum indicator summary (PRECIS): a tool to help trial designers. J Clin Epidemiol. 2009;62(5):464–75.

  107. 107.

    Patsopoulos NA. A pragmatic view on pragmatic trials. DialoguesClin Neurosci. 2011;13(2):217–24.

  108. 108.

    Tunis SR, Stryer DB, Clancy CM. Practical clinical trials: increasing the value of clinical research for decision making in clinical and health policy. JAMA. 2003;290(12):1624–32.

  109. 109.

    Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26.

  110. 110.

    Hussey MA, Hughes JP. Design and analysis of stepped wedge cluster randomized trials. Contemp Clin Trials. 2007;28(2):182–91.

  111. 111.

    Bodenheimer T, Ghorob A, Willard-Grace R, Grumbach K. The 10 building blocks of high-performing primary care. Ann Fam Med. 2014;12(2):166–71.

  112. 112.

    National Committee for Quality Assurance. Patient-Centered Medical Home (PCMH) Recognition. 2016; http://www.ncqa.org/programs/recognition/practices/patient-centered-medical-home-pcmh. Accessed 06/05/2018.

  113. Pawson R, Tilley N. Realistic evaluation. London: SAGE Publications; 1997.

  114. RE-AIM framework. 2018; http://www.re-aim.org/. Accessed 2018.

  115. Powell BJ, McMillen JC, Proctor EK, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69(2):123–57.

  116. Powell BJ, Proctor EK, Glisson CA, et al. A mixed methods multiple case study of implementation as usual in children's social service organizations: study protocol. Implement Sci. 2013;8:92.

  117. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.

  118. Godley SH, Garner BR, Smith JE, Meyers RJ, Godley MD. A large-scale dissemination and implementation model for evidence-based treatment and continuing care. Clin Psychol (New York). 2011;18(1):67–83.

  119. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629.

  120. Massoud MR, Nielson GA, Nolan K, Schall MW, Sevin C. A Framework for Spread: From Local Improvements to System-Wide Change. IHI Innovation Series white paper. Cambridge: Institute for Healthcare Improvement; 2006. www.IHI.org

  121. Paina L, Peters DH. Understanding pathways for scaling up health services through the lens of complex adaptive systems. Health Policy Plan. 2012;27(5):365–73.

  122. Lukas CV, Meterko MM, Mohr D, et al. Implementation of a clinical innovation: the case of advanced clinic access in the Department of Veterans Affairs. J Ambul Care Manage. 2008;31(2):94–108.

  123. Solberg LI. Improving medical practice: a conceptual framework. Ann Fam Med. 2007;5(3):251–6.

  124. Janovsky K, Peters DH, Arur A, Sundaram S. Improving health services and strengthening health systems: adopting and implementing innovative strategies. Making health systems work, working paper no. 5. Geneva: World Health Organization, Evidence and Information for Policy; 2006.

  125. Taylor EF, Machta RM, Meyers DS, Genevro J, Peikes DN. Enhancing the primary care team to provide redesigned care: the roles of practice facilitators and care managers. Ann Fam Med. 2013;11(1):80–3.

  126. Munoz M, Pronovost P, Dintzis J, et al. Implementing and evaluating a multicomponent inpatient diabetes management program: putting research into practice. Jt Comm J Qual Patient Saf. 2012;38(5):195–206.

  127. Chung GH, Choi JN, Du J. Tired of innovations? Learned helplessness and fatigue in the context of continuous streams of innovation implementation. J Organ Behav. 2017;38(7):1130–48.

  128. Health Leads Screening Toolkit. 2018; https://healthleadsusa.org/tools-item/health-leads-screening-toolkit/. Accessed 2018.

  129. Rothman KJ. No adjustments are needed for multiple comparisons. Epidemiology. 1990;1(1):43–6.

  130. Perneger TV. What's wrong with Bonferroni adjustments. BMJ. 1998;316(7139):1236–8.

  131. Xu Z, Kalbfleisch JD. Propensity score matching in randomized clinical trials. Biometrics. 2010;66(3):813–23.

  132. Little RJ, Rubin DB. Causal effects in clinical and epidemiological studies via potential outcomes: concepts and analytical approaches. Annu Rev Public Health. 2000;21(1):121–45.

  133. McHugh S, Tracey ML, Riordan F, O'Neill K, Mays N, Kearney PM. Evaluating the implementation of a national clinical programme for diabetes to standardise and improve services: a realist evaluation protocol. Implement Sci. 2016;11:107.

  134. Van Belle S, Wong G, Westhorp G, et al. Can "realist" randomised controlled trials be genuinely realist? Trials. 2016;17(1):313.

  135. Dalkin SM, Greenhalgh J, Jones D, Cunningham B, Lhussier M. What's in a mechanism? Development of a key concept in realist evaluation. Implement Sci. 2015;10:49.

  136. Lacouture A, Breton E, Guichard A, Ridde V. The concept of mechanism from a realist approach: a scoping review to facilitate its operationalization in public health program evaluation. Implement Sci. 2015;10:153.

  137. Hawthorne effect. 2018; https://en.wikipedia.org/wiki/Hawthorne_effect. Accessed 12/26/2018.

  138. Fetters MD, Curry LA, Creswell JW. Achieving integration in mixed methods designs-principles and practices. Health Serv Res. 2013;48(6 Pt 2):2134–56.

  139. Glaser BG, Strauss AL. The discovery of grounded theory: strategies for qualitative research. New York: Aldine de Gruyter; 1999.

  140. Strauss A, Corbin J. Basics of qualitative research: techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks: Sage; 1998.

  141. Charmaz K. Grounded theory methods in social justice research. In: Denzin NK, Lincoln YS, editors. The SAGE handbook of qualitative research. 4th ed. Thousand Oaks: Sage Publications, Inc.; 2011. p. 359–80.

  142. Borkan J. Immersion/crystallization. In: Crabtree BF, Miller WL, editors. Doing qualitative research. 2nd ed. Thousand Oaks: Sage Publications, Inc.; 1999. p. 179–94.

  143. Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Serv Res. 2007;42(4):1758–72.

  144. Byng R, Norman I, Redfern S, Jones R. Exposing the key functions of a complex intervention for shared care in mental health: case study of a process evaluation. BMC Health Serv Res. 2008;8:274.

  145. Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011;101(11):2059–67.

  146. U.S. Centers for Medicare & Medicaid Services. Accountable Health Communities Model. 2018; https://innovation.cms.gov/initiatives/ahcm. Accessed 6/7/18.

  147. Yu JG, Kruszynska YT, Mulford MI, Olefsky JM. A comparison of troglitazone and metformin on insulin requirements in euglycemic intensively insulin-treated type 2 diabetic patients. Diabetes. 1999;48(12):2414–21.

  148. Frieden TR. Forward: CDC health disparities and inequalities report-United States, 2011. MMWR Surveill Summ. 2011;60(Suppl):1–2.

  149. Barnett E, Casper M. A definition of "social environment". Am J Public Health. 2001;91(3):465.

  150. Palloni A, Palloni G. Education, earnings, and diabetes. Health Aff (Millwood). 2012;31(5):1126.

  151. van Lenthe FJ, Borrell LN, Costa G, et al. Neighbourhood unemployment and all cause mortality: a comparison of six countries. J Epidemiol Community Health. 2005;59(3):231–7.

  152. Puig-Barrachina V, Malmusi D, Martínez JM, Benach J. Monitoring social determinants of health inequalities: the impact of unemployment among vulnerable groups. Int J Health Serv. 2011;41(3):459–82.

  153. Mete C. Predictors of elderly mortality: health status, socioeconomic characteristics and social determinants of health. Health Econ. 2005;14(2):135–48.

  154. McCarthy M. Social determinants and inequalities in urban health. Rev Environ Health. 2000;15(1–2):97–108.

  155. Marmot M. Social determinants of health inequalities. Lancet. 2005;365(9464):1099–104.

  156. Kreatsoulas C, Anand SS. The impact of social determinants on cardiovascular disease. Can J Cardiol. 2010;26 Suppl C:8C–13C.

  157. Institute of Medicine: Board on Population Health and Public Health Practice: Committee on the Recommended Social and Behavioral Domains and Measures for Electronic Health Records. Capturing social and behavioral domains in electronic health records: phase 1. Washington: National Academies Press (US); 2014.

  158. Lessard LN, Alcala E, Capitman JA. Pollution, poverty, and potentially preventable childhood morbidity in Central California. J Pediatr. 2016;168:198–204.

  159. Michie S, Fixsen D, Grimshaw JM, Eccles MP. Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implement Sci. 2009;4:40.

  160. Hoffmann TC, Glasziou PP, Boutron I, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.

  161. Moore GF, Audrey S, Barker M, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ. 2015;350:h1258.

  162. Westhorp G. Realist impact evaluation: an introduction. London: Overseas Development Institute; 2014. p. 1–12.

Acknowledgements

The authors deeply appreciate the contributions of Arvin Garg, Jen Devoe, Carly Hood Ronick, Laura-Mae Baldwin, Zach Goldstein, Matthew Stiefel, Thomas Schuch, Danielle Hessler-Jones, Christian Hill, Michelle Proser, Nancy Gordon, and Ranu Pandey. We also want to thank the Oregon Primary Care Association (OPCA) for contributing to the design of the implementation guide.

Funding

This publication was supported by a grant from the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), 1R18DK114701–01.

Availability of data and materials

The datasets generated and/or analyzed during the current study are not publicly available because they are from a privately hosted EHR system, but they are available from the corresponding author on reasonable request.

Author information

RG, EC, MM, and AB designed the study. RG and EE directed the project. MM led all quantitative data aspects of the study. AB and IG led all qualitative data collection and analysis. SC and DW extracted all study data. MM created the social determinants of health tools in EPIC. MD managed all EPIC-related parts of the study. KD recruited formative study clinics and managed all study aspects at OCHIN. RG, NM, NY, MK, and KD developed the implementation guide. KD, JS, NY, and MK managed all study implementation aspects. LG and BP informed all implementation materials. RG wrote the manuscript with input from all authors. All authors read and approved the final manuscript.

Correspondence to Rachel Gold.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Kaiser Permanente Northwest Institutional Review Board.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

Guide to social determinants of health screening and referral-making using the electronic health record. (PDF 4124 kb)

Additional file 2:

Baseline survey. (DOCX 23 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Gold, R., Bunce, A., Cottrell, E. et al. Study protocol: a pragmatic, stepped-wedge trial of tailored support for implementing social determinants of health documentation/action in community health centers, with realist evaluation. Implementation Science 14, 9 (2019). doi:10.1186/s13012-019-0855-9

Keywords

  • Social determinants of health
  • Electronic health records
  • Community health centers
  • Implementation
  • Implementation strategies