
Dynamics behind the scale up of evidence-based obesity prevention: protocol for a multi-site case study of an electronic implementation monitoring system in health promotion practice

Abstract

Background

The effectiveness of many interventions to promote health and prevent disease has been well established. The imperative has therefore shifted from amassing evidence about efficacy to scaling up interventions to maximise population-level health gains. Electronic implementation monitoring, or ‘e-monitoring’, systems have been designed to assist and track the delivery of preventive policies and programs. However, there is little evidence on whether e-monitoring systems improve the dissemination, adoption and ongoing delivery of evidence-based preventive programs. Moreover, given the considerable difficulties e-monitoring systems have faced in the clinical sector, scholars have called for a more sophisticated re-examination of e-monitoring’s role in enhancing implementation.

Methods

In the state of New South Wales (NSW), Australia, the Population Health Information Management System (PHIMS) was created to support the dissemination of obesity prevention programs to 6000 childcare centres and elementary schools across all 15 local health districts. We have established a three-way university-policymaker-practice research partnership to investigate the impact of PHIMS on practice, how PHIMS is used, and how achievement of key performance indicators of program adoption may be associated with local contextual factors. Our methods encompass ethnographic observation, key informant interviews and participatory workshops for data interpretation at a state and local level. We use an on-line social network analysis of the collaborative relationships across local health district health promotion teams to explore the relationship between PHIMS use and the organisational structure of practice.

Discussion

Insights will be sensitised by institutional theory, practice theory and complex adaptive system thinking, among other theories which make sense of socio-technical action. Our working hypothesis is that the science of getting evidence-based programs into practice rests on an in-depth understanding of the role they play in the on-going system of local relationships and multiple accountabilities. Data will be synthesised to produce a typology to characterise local context, PHIMS use and key performance indicator achievement (of program implementation) across the 15 local health districts. Results could be used to continuously align e-monitoring technologies within quality improvement processes to ensure that such technologies enhance practice and innovation. A partnership approach to knowledge production increases the likelihood that findings will be put into practice.


Background

The effectiveness of many interventions aimed at promoting health and preventing disease has been well established [1]. The imperative has thus shifted from amassing evidence of efficacy to delivering interventions at scale to achieve maximum population-level health gains [2]. Governments, funders and organisations working in the preventive health sphere are increasingly developing electronic implementation monitoring, or ‘e-monitoring’, systems to track the distribution of prevention policies, activities and programs. Some organisations contract commercial software companies to tailor existing software programs to their needs, while others create their own bespoke systems [3]. However, despite the continued demand for e-monitoring systems to track policy and program roll out, there is little research describing the development, use and adaptation of information technology systems, generally [4], or for prevention, specifically. There is also no in-depth analysis of how these systems sit within the design of larger on-going processes to increase uptake of evidence-based programs within a complex system of practice with its diverse accountabilities.

Meanwhile, in clinical settings, the use of e-monitoring systems to record the delivery of services has not produced the expected gains in patient outcomes [5]. In the main, electronic systems designed to increase evidence-based practice have not worked as first intended. For example, inefficiencies have arisen in practice because workflows had to be restructured and roles renegotiated [6,7,8]. Given this track record, Greenhalgh and colleagues [6] conducted a systematic review using the meta-narrative technique to synthesise evidence on electronic patient records. They identified nine different perspectives on what the act of e-monitoring means or represents, as encapsulated by different research traditions. These ranged from studies of the impact of e-monitoring technologies on patient outcomes to investigations of how relationships and practices change because of e-monitoring. They recommended that future studies explore the ‘hidden’ and collaborative work of staff and ‘how staff contextualize and prioritize [different types of] knowledge for shared use’ [6]. Notably, only 12 of the 94 studies they reviewed used ethnographic techniques. Greenhalgh and Swinglehurst [9] subsequently recommended ethnography to further knowledge about how e-monitoring implementation technologies are situated in practice.

In the field of preventive policy and program implementation, some large-scale e-monitoring systems have been abandoned due to poor design and incongruity with practice [10, 11]. Other researchers have documented e-monitoring systems so bespoke that they became irrelevant at the end of their projects and were not transferred to other contexts [3]. This is striking because health promotion has a history of initiative-taking and innovation in evaluation [12] and thought leadership in quality improvement [13]. And while Maycock and Hall [14] cautioned against the type of performance monitoring that stifles innovation and locks health promotion too tightly into current approaches to problems, they also called on the field to be proactive in designing new ways forward in practice improvement. This is to ensure that any new implementation, performance and quality management processes are consistent with the philosophy and practice of health promotion, rather than falling passively into imposed ‘business paradigms’ [14].

Within this broader context, we report a study of Australia’s first large-scale e-monitoring system to facilitate the dissemination of evidence-based obesity prevention programs into every primary school and childcare centre across the state of New South Wales (NSW). In contrast to the somewhat disappointing history of e-monitoring systems we have summarised so far, the Population Health Information Management System (PHIMS) has been used state-wide since 2014 and continues to adapt. Following recommendations from Greenhalgh and Swinglehurst [9], we use ethnographic methods. We outline a study protocol to investigate how an e-monitoring system aids obesity prevention program implementation. In particular, the research is intended to be attuned to the diversity of multi-level contexts within which action takes place, to appreciate how context may shape both program implementation and the use of the e-monitoring system [9].

Context

In NSW, health promotion teams in local health districts (LHDs) have been in place since the late 1970s. The role of the teams is to promote and support the physical, mental and social health of all residents in the district, and to create supportive environments for health. On top of this basic level of responsibility and accountability, funds occasionally become available to expand capacity in particular priority domains (e.g. tobacco, falls prevention and nutrition).

In 2008, the National Partnership Agreement on Preventive Health between the Commonwealth Government and the States launched a large-scale effort to deliver setting-based programs to address the increasing prevalence of chronic disease [15]. Three flagship programs of the Healthy Children Initiative (HCI), Go4Fun, Live Life Well at School (LLW@S) and Munch & Move, were scaled up for state-wide delivery [16]. The latter two programs aim to improve primary school environments and early childhood education and care service environments by supporting healthy eating and physical activity [17, 18]. The evidence base on which these programs were designed was established, in part, by a previously implemented at-scale suite of childhood obesity interventions that demonstrated a 1% per year reduction in childhood obesity, against an otherwise increasing prevalence [17, 18].

LLW@S and Munch & Move are delivered at a local level by purposively funded, dedicated HCI positions in health promotion units across all 15 LHDs in NSW. Each LHD receives funds from the NSW Ministry of Health to implement these programs and to reach specified targets for each. The delivery approach involves HCI teams supporting primary schools and services to achieve a number of specified, evidence-based practices aimed at organisational changes to improve food and physical activity environments in those settings. The achievement of these practices is monitored through key performance indicator (KPI) targets defined by the Ministry of Health and written into the overarching service level agreements between the Ministry of Health and the LHDs [19, 20]. The KPIs are implementation targets. That is, the achievement of specified practices and policies (such as water provision instead of sugary drinks and physical activity breaks) represent a fully implemented program which should logically contribute to obesity reduction. The two programs are delivered to all primary schools and centre-based child care services in NSW. Currently, 91% (3320) of child care services and 83% (2126) of all primary schools are participating. The delivery cost is estimated to be $1500 per site per year.
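As a back-of-envelope consistency check (our arithmetic, not the authors’), the participation figures above can be reconciled with the roughly 6000 sites cited in the Background, and the per-site cost implies an indicative annual delivery budget:

```python
# Rough check (ours, not part of the protocol) that the stated participation
# rates are consistent with the ~6000 sites mentioned in the Background.
childcare_participating, childcare_rate = 3320, 0.91
schools_participating, schools_rate = 2126, 0.83

total_childcare = childcare_participating / childcare_rate  # ~3650 services
total_schools = schools_participating / schools_rate        # ~2560 schools
print(round(total_childcare + total_schools))               # ~6210, i.e. roughly 6000 sites

# Implied annual delivery cost at $1500 per participating site per year:
print((childcare_participating + schools_participating) * 1500)  # 8169000, ~$8.2m
```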

The Population Health Information Management System (PHIMS) is an electronic, web-based monitoring system, purpose-built through a collaboration between the Ministry of Health, the NSW Office of Preventive Health and LHD representatives. The two overarching purposes of the system are to (1) assist health promotion practitioners with local HCI program delivery of two programs, LLW@S and Munch & Move, and (2) record the achievement of practices which inform KPI reporting and quality improvement. At the LHD level, health promotion practitioners use PHIMS to support the day-to-day work of program delivery by tracking and planning visits to sites to provide support, documenting their interactions with sites, and recording site-level progress towards KPI achievement (e.g. sites’ achievement of healthy eating and physical activity practices). Data about progress towards KPI achievement is available for LHDs and the Ministry of Health to review in real time, providing up-to-date information about the status of each LHD in relation to their performance against KPIs. The data is used to monitor the quality and extent of program implementation and to inform service improvements. The additional investment in the design and implementation of PHIMS was approximately $2.5 m over two financial years and approximately $400K annually for support and maintenance.
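To make this concrete, the following is a minimal, hypothetical sketch of the kind of site-level record described above; the field names and methods are our illustrative assumptions, not the actual PHIMS schema:

```python
# Illustrative sketch only: an invented record structure for the kind of data
# the protocol says practitioners enter into PHIMS (support visits and progress
# toward evidence-based practices). Not the actual PHIMS data model.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Set

@dataclass
class SiteRecord:
    site_id: str                 # a participating school or childcare service
    lhd: str                     # the local health district supporting the site
    program: str                 # "LLW@S" or "Munch & Move"
    visits: List[date] = field(default_factory=list)
    practices_achieved: Set[str] = field(default_factory=set)

    def log_visit(self, when: date) -> None:
        """Record a support visit to the site."""
        self.visits.append(when)

    def record_practice(self, practice: str) -> None:
        """Record a site-level evidence-based practice as achieved."""
        self.practices_achieved.add(practice)

    def counts_toward_kpi(self, required: Set[str]) -> bool:
        # A site contributes to the LHD's KPI once all specified practices
        # (e.g. water provision, physical activity breaks) are in place.
        return required <= self.practices_achieved

# Example: one site partway through implementation.
site = SiteRecord("service_001", "Example LHD", "Munch & Move")
site.log_visit(date(2017, 3, 1))
site.record_practice("water provision")
print(site.counts_toward_kpi({"water provision", "physical activity breaks"}))  # False
```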

PHIMS is based on the principle of user-centred design [21], and its use is consistent with a ‘tight-loose-tight’ approach to policy and program implementation [22]. That is, agreements between the parties are tight (specific) about the problem and goal, loose (flexible) about how to reach the goal, and tight (specific) about the target to be achieved [22]. This is largely consistent with recommendations from Plsek and Wilson [23] that health care managers should recognise health care settings as complex adaptive systems and act consistently with this theory for quality improvement purposes: amplify feedback about system performance; provide minimal specifications when introducing change, but maximum information about purpose, goal, possibilities and choices; build structures that encourage interaction and sharing about new actions taken and ideas tested; and detect and communicate success and failure rapidly.

Methods

Study design

This is a multi-site case study in every LHD of NSW (n = 15), taking a mixed-methods approach to examining the use of PHIMS to aid the delivery of HCI programs. PHIMS operates at numerous levels of the NSW Health system. We expect the meaning and value of PHIMS, and of performance monitoring more generally, to vary according to how users at different levels interpret the role of PHIMS (see Fig. 1). We adopt an ethnographic approach by which we may understand the multiplicity of users and sites, from LHDs to policy-level decision makers. In addition, we will undertake key informant interviews, participatory workshops for data interpretation and an on-line social network analysis of the collaborative relationships within and across LHD health promotion teams.

Fig. 1 Levels of PHIMS stakeholders and designers’ anticipated use of PHIMS data

Research questions

The primary overarching research question is: How does PHIMS intersect with health promotion practice? We will examine how this is manifest in specific phenomena such as how PHIMS is used in health promotion practice, how PHIMS has shaped health promotion practice, how PHIMS has been embedded into practice, and how much practice is represented in PHIMS. Essentially, we are studying the programs’ scale up through the lens of the recording system. We are also interested in capturing the ‘ripple effects’ of having PHIMS data newly available for use among health policy decision makers.

Research objectives

  1. Describe the diversity of teams and contexts within which PHIMS is used and the differences in use, if any

  2. Observe the breadth and intensity of work that goes into supporting early childhood services and schools to adopt practices and how this translates to data in PHIMS

  3. Explore factors that influence adoption of practices within early childhood services and schools and the extent to which they are captured in PHIMS

  4. Examine how PHIMS sits alongside other methods to structure, organise, record and manage health promotion practice at the local level

  5. Understand how the technology interacts with the process of practice, how roles, routines and activities are impacted and how data are used

  6. Articulate what matters most to health promotion practitioners in their practice—e.g. their values, attitudes and actions

  7. Identify how performance monitoring via PHIMS use has impacted the field of health promotion in NSW from myriad perspectives including, for example, practitioners, health promotion managers, state-level program coordinators and funders, partnering sectors, and policy-level decision makers

Note that ongoing surveillance by NSW Health is tracking target achievement in obesity prevention program roll out (i.e. whether the KPIs about program adoption in schools and childcare are being met) and obesity prevalence rates.

Sites and sample

This study is conducted with HCI teams across the state of NSW, the Office of Preventive Health, which manages HCI at the state level, and policy-level decision makers and administrators of the PHIMS data system located within the Ministry of Health. HCI teams are situated within health promotion units in the 15 LHDs that comprise the health promotion workforce across NSW. We also anticipate collecting insights from other health promotion practitioners in the LHDs (within which the HCI teams are situated) to answer more general questions about the context of practice. Because each LHD operates independently, the staffing structure of each unit varies, with some staff exclusively appointed to HCI and others holding split appointments. The number of staff on each HCI team ranges from 3 to 25 people.

Data collection and analysis

Data collection will occur in four phases. The design of the instruments and data collection approach in each phase is informed by findings from previous phases. Four data collection activities are planned: (1) ethnography with local and state-level HCI teams; (2) semi-structured interviews with policy-level decision makers who play a role in preventive health activities, and/or whose work is informed, in part, by PHIMS data; (3) workshops with health promotion practitioners during which we present data from the previous two phases to validate findings; and, finally, (4) a social network analysis with LHD health promotion units. Analysis will occur on an iterative and ongoing basis, and findings will be shared with the partnership team to gain deeper insights into the study questions, refine the study approach for subsequent phases, and provide feedback to inform quality improvement processes in the delivery of HCI programs.

The study is described to LHDs as a way to understand what happens in practice in rolling out the HCI programs and using PHIMS to plan and record implementation and achievement of the KPIs. The image of an iceberg is used in communicating study objectives and in explaining that the research team would like to understand what goes on ‘beneath the surface’—i.e. what is not being seen and/or counted by PHIMS—and how PHIMS fits into everyday practice. Fieldwork is expected to occupy 12 months.

Ethnographic observations

The purpose of the ethnographies (2–3 days in each LHD) is to observe staff’s day-to-day practice and use of PHIMS at the LHD level, and at the state coordinators’ level, aiming for moderate participation [24]. Field notes will be taken documenting staff interaction with PHIMS, including the frequency, duration and reasons for use; how PHIMS is talked about between staff; and the other types of monitoring tools that staff use to guide practice and collect data. Additionally, we will document the activities that staff engage in to deliver the HCI programs, and collect data on the collaborative nature of practice within the health promotion office, looking specifically at interactions of the HCI staff with other teams in the health promotion units. This information will provide contextual information about the culture and organisation of each LHD to help us better understand variations in PHIMS usage and acceptance across HCI teams.

When possible, we will observe interactions between LHDs. Observing inter-LHD interactions will help to identify what aspects of PHIMS use and embeddedness are common and generalizable, and which are specific to individual sites. Examples of inter-LHD interactions may include monthly conference calls between supervisors, Health Promotion Officer working groups, workshops and whole-of-state meetings. As much as possible, inter-LHD activities will be identified in advance (e.g. whole-of-state meetings, monthly conference calls); however, others will be identified on an ad hoc basis, in collaboration with the Ministry of Health, the Office of Preventive Health and the LHDs. As the field work progresses, we will review findings to identify concepts or questions that require further examination.

The field-based research team comprises three researchers. Each will keep detailed, descriptive field notes of each site visit, observation, meeting and discussion [25]. Interviews will be audiotaped and transcribed. Each researcher will also maintain reflective field notes on the process of conducting this research. When appropriate, we will audio-record observed events (e.g. meetings, discussions with staff) to support the field notes. Researchers will use these recordings to verify the accuracy of their field notes, but the recordings will not be transcribed verbatim unless required.

Interviews

We will conduct semi-structured interviews [26] with policy-level users of PHIMS data to explore different users’ perspectives of the impact of PHIMS use on health promotion practice. Interviews with policy makers will occur after ethnographies, and development of the interview guide will be informed by the initial findings. As opportunities arise during field visits, we will engage participants in conversational interviews [27] and/or follow-up with individual participants via telephone interviews to clarify or expand on observations of activities in the field. These interviews will be audio-recorded and transcribed verbatim.

Group workshops

The research team will present initial findings on what we have observed to the LHD teams and to the directors of those teams, and engage them in interpretation and discussion, in keeping with a participatory approach [28]. The purpose of the workshops is to validate preliminary findings, provide an opportunity for LHDs to address or correct any misunderstandings, and to enable the participants to take part in the interpretation of our findings. Workshops provide an opportunity for further observation and understanding of how results are understood, translated, qualified and challenged. Data collected from workshops will include audio-recordings, notes made by researchers, and workshop materials (e.g. written feedback) that may be produced in the context of the interpretation process.

Additional data sources

We will also collect documents and reports, including, for example, policies, organizational charts for health promotion teams and LHDs, and evaluation summaries that will further inform our understanding of the organization of teams and processes, the local culture of practice, and strategic thinking about the implementation of HCI programs.

Analysis of qualitative data

The qualitative analysis will be an iterative and ongoing process, occurring during and after each phase with each LHD, so that findings from each LHD inform the data collection approach in subsequent LHDs [29]. For the initial analysis, we will use a grounded theory approach to generate a project codebook. First, the existing field notes will be read sequentially, in full, by four team members to obtain a sense of the context and progression of activities over time. Then, each field note will be read line by line, with the researcher highlighting each key concept and writing a note, impression or initial analysis about it [30, 31]. Next, the researcher will begin grouping together similar codes that reflect the same concept. Each researcher will complete this process independently. The researchers will then meet to share initial codes and groupings. As a group, we will discuss the coding, develop definitions and coding criteria, and create the initial codebook. During this stage, each data source will be coded by at least two researchers. We aim to keep the initial codes and groupings broad to enable more detailed analysis in later stages [32]. Subsequent coding will use the initial codebook or derive new, more specific codes, depending on the specific study question.

All coding will occur on an ongoing basis, as data are generated from the field, using NVivo [33] to facilitate the organization and retrieval of codes for the next stage of analysis. The team will meet regularly to review the codes and their application and to refine the codebook as needed. During the coding process, we will mark meaningful stories for later retrieval and analysis using narrative techniques to identify cause-and-consequence thinking and illustrate key practice values [34]. To ensure replicability, we apply triangulation of investigators as well as triangulation of methods [35]. We also endeavour to ensure that the ‘voice’ in the research [36] is clearly identifiable as the researchers’ and that readers are given enough information about the researchers’ choices and interpretations to judge whether the argument made is valid [36]. We are not able to offer direct member checking [35] as a means of obtaining informants’ views of our data, owing to the anonymity required by the partnership model. Instead, we have chosen participatory research methods [28] to gain feedback on our representation of perspectives and to involve participants in data interpretation [37].

Following initial coding, we will create LHD data summaries and then a cross-site typology to describe variation in teams and their engagement with PHIMS, HCI implementation practices and KPI achievement. The adequacy of the typology will be assessed using five criteria (based on Hunt [38]): Is the phenomenon to be classified adequately specified? Is the classification characteristic adequately specified? Are the categories mutually exclusive? Is the typology collectively exhaustive? And finally, is the typology useful?

Social network analysis

Social network analysis (SNA) creates quantitative summary scores for social structures [39]. It is used to quantify how connected practitioners are with each other, how central or isolated some players may be, and why information diffuses across some groups faster than others. In health promotion, social network analysis has been used to convey local collaborative capacity in communities by examining ties among organizations [40]. The professional networks of health promotion practitioners could similarly be associated with the ease of getting things done; hence SNA is used in this study to contextualise the wider team ‘culture’ within which the HCI is being delivered and PHIMS is being used. Specifically, our interest is to describe the diversity of teams and characterise the local relationship infrastructure.

SNA data will be collected via an online survey designed for a user-friendly interface and a maximum of 10 min to complete. The key relationships to be studied will be determined through field-based consultations with HCI staff and our co-partnership team. Examples of relationships may include (a) who people turn to for information, advice or problem solving; (b) who people work with most; and (c) who people feel have ideas and attitudes to practice similar to their own. This will be a complete network survey—that is, it will capture all the specified relationships within a bounded set of people (the local health promotion team, including the HCI team). We will also investigate, in general terms, the amount of contact across LHDs; this question will be followed by a listing of all LHDs and is designed to quantify the amount of collaboration that occurs, particularly among adjacent rural LHDs. We will ask the SNA questions twice, to capture two time periods: ‘now’ and ‘the same time 12 months ago’.

Analysis of the network survey will be conducted using UCINET 6 [41], and network graphs will be drawn using Netdraw [42]. We will calculate the density of each of the relationships. Density is the number of ties present as a proportion of the total possible ties [39]; if everyone knows everyone else, the density score is 100%. We will also compute the two-step reach of the HCI coordinator for each of the relationships. Two-step reach is the proportion of people in the network whom a person can reach within two links, i.e. directly or through one intermediary. It is considered a measure of how quickly a person can mobilise resources or convey information to others. We anticipate that differences in the SNA scores over the 12 months will capture turbulence in some parts of the state (e.g. high staff turnover that potentially disrupts practice).
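As a minimal sketch of these two measures, the following computes them in Python with the networkx library (rather than UCINET/Netdraw, which the protocol specifies); the team roster and ties are invented for illustration:

```python
# Toy example of the two network scores described above: density and
# two-step reach. Node names are hypothetical, not study data.
import networkx as nx

# Hypothetical advice-seeking ties within one LHD health promotion team.
G = nx.Graph()
G.add_edges_from([
    ("coordinator", "officer_a"),
    ("coordinator", "officer_b"),
    ("officer_a", "officer_c"),
    ("officer_b", "officer_d"),
    ("officer_d", "officer_e"),
])

# Density: ties present as a proportion of all possible ties.
# With n nodes there are n*(n-1)/2 possible undirected ties.
density = nx.density(G)

def two_step_reach(graph: nx.Graph, node: str) -> float:
    """Proportion of the other actors reachable within two links of `node`."""
    reached = nx.single_source_shortest_path_length(graph, node, cutoff=2)
    # Exclude the node itself from both numerator and denominator.
    return (len(reached) - 1) / (graph.number_of_nodes() - 1)

print(f"density = {density:.2f}")  # 5 ties / 15 possible = 0.33
print(f"two-step reach of coordinator = {two_step_reach(G, 'coordinator'):.2f}")  # 0.80
```

On this toy network the coordinator reaches four of the five other team members within two links; it is this kind of score that will be compared across relationships and across the two time points.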

Use of theory in the research

Theory will not guide our enquiry directly. Rather, it will be used as a sensitizing tool, as we identify and try to understand any universal patterns in our data. We will use theory for analysis/description and explanation, as opposed to using it for prediction or prescription [43]. This study encompasses a broad range of phenomena so no single theory can be identified as the primary means by which we will understand patterns of interest. In Table 1, we present the range of theories we anticipate will guide our understanding and analysis of the research questions. That said, our work is mainly embedded within sociological theory and an understanding of practice as a social system. For example, practitioners in the system have agency (e.g. thoughts, actions, efficacy). They appraise situations and act and adjust in ways they consider best. Their continuous actions and interactions create a social structure that shifts and responds to incentives and opportunities to act differently.

Table 1 Theories that will be used to analyse different domains within the research program

Expected outputs

We will construct (1) a global sketch of results at each site, including the local SNA; (2) across the LHDs, a summative global typology of how PHIMS sits within practice, as well as a cross-LHD analysis of inter-LHD social network ties; (3) an analysis in which we use qualitative insights about HCI delivery and PHIMS use to make sense of variation in KPI achievement; and (4) a series of papers that answer the research questions. Further, we anticipate that, through the co-production process, the Ministry of Health will use findings and learnings from this project to inform the refinement and development of future e-monitoring systems for health promotion.

Governance and ethics

The research team comprises state-level administrators and policymakers from the Office of Preventive Health and the Ministry of Health, representatives from the LHDs, and university-based researchers. The research team is part of The Australian Prevention Partnership Centre, an innovative $22.5 m partnership between the National Health and Medical Research Council, two state health departments (NSW Health and ACT Health), the Federal Department of Health, and the Hospitals Contribution Fund, a private health insurer. The Partnership Centre was established to develop stronger system-based approaches to chronic disease prevention [44]. Partnership research has long been recognized as a means to facilitate the translation of research findings into policy and practice [45, 46]. We aim to produce knowledge that is meaningful and actionable because it is co-created with, and relevant to, the practitioners, policymakers and administrators who are responsible for the ongoing design and implementation of both HCI and PHIMS. Bi-monthly meetings held throughout the project will provide direction, feedback and insight into the findings. Synthesized, de-identified results (collected by the university-based team members) will be presented to the research partnership team to ensure the anonymity of staff and LHDs. Research ethics approval has been granted by the Royal Prince Alfred Hospital Human Research Ethics Committee (X16-0156 & LNR/16/RPAH/194), and by the research governance offices of each of the 15 LHDs.

Discussion

Our research design offers a cross-sectional ‘snapshot’ of the way different LHDs engage with PHIMS in practice and how this relates to activities directed at achieving KPIs for obesity prevention. The degree to which researchers are invited to observe the particularities of day-to-day practice will likely differ across LHD sites. However, we anticipate that the opportunity to present in-progress findings and offer them for interpretation may open further doors if LHDs feel they are not being fully or fairly represented. Follow-up interviews are also designed to offer further insights over time, to compensate for any distortion arising from the observational period chosen. So while the initial field contact in each site may be less than a week, the relationship with each LHD extends over 12–24 months because of the ongoing involvement in interpretation.

We acknowledge that our approach does not offer a longitudinal view of how PHIMS use and integration into practice may have shifted over time. Nor does it provide an in-depth understanding of each LHD, as one might get from more time in the field at each site. This study is unique in that we are studying an e-monitoring system for implementation in preventive health that has endured. Finding out what makes an enduring system endure is important. But this does not mean that we will not be party to the types of frustrations that may have ‘killed off’ similar e-monitoring attempts elsewhere [47].

Our partnership enables access to the multiple bureaucratic levels and contexts within which PHIMS is used. Such access requires an ongoing commitment to generating and maintaining trust amongst partners. This is particularly important for participants at the LHD level, some of whom, due to turnover and other contextual issues, may not feel ‘on the same page’ as the PHIMS designers and state-level data users. The research project itself offers an opportunity for some otherwise distal groups to engage in dialogue and build joint understanding. The establishment and maintenance of trust is a conscious, all-partner effort, facilitated by the long history and multi-level relationships that support health promotion practice in NSW and by the research approach adopted. The research partnership team also recognises (1) that e-monitoring technology is part of, but not a substitute for, larger processes of continuous quality improvement that need their own stewardship and (2) that, for some participants, obesity prevention is just one of many priorities demanding attention in the field. It is vital, therefore, that we engage in close listening and strive for continual understanding. Flexible adjustment of the research approach is also important as relationships change and as our collective understanding of practice, technology use, implementation and governance grows.

Abbreviations

HCI: Healthy Children Initiative
KPI: Key performance indicator
LHD: Local health district
LLW@S: Live Life Well at School
NSW: New South Wales
PHIMS: Population Health Information Management System

References

  1. Community Preventive Services Task Force. The Community Guide. Centers for Disease Control and Prevention. 2017. https://www.thecommunityguide.org/. Accessed 09 Sept 2017.

  2. Milat AJ, King L, Newson R, Wolfenden L, Rissel C, Bauman A, Redman S. Increasing the scale and adoption of population health interventions: experiences and perspectives of policy makers, practitioners, and researchers. Health Res Policy Syst. 2014; https://doi.org/10.1186/1478-4505-12-18.

  3. Bors PA, Kemner A, Fulton J, Stachecki J, Brennan LK. HKHC community dashboard: design, development, and function of a web-based performance monitoring system. J Public Health Manag Pract. 2015; https://doi.org/10.1097/phh.0000000000000207.

  4. Lyon AR, Lewis CC, Melvin A, Boyd M, Nicodimos S, Liu FF, Jungbluth N. Health information technologies—academic and commercial evaluation (HIT-ACE) methodology: description and application to clinical feedback systems. Implement Sci. 2016; https://doi.org/10.1186/s13012-016-0495-2.

  5. Payne TH. Electronic health records and patient safety: should we be discouraged? BMJ Qual Saf. 2015; https://doi.org/10.1136/bmjqs-2015-004039.

  6. Greenhalgh T, Potts HWW, Wong G, Bark P, Swinglehurst D. Tensions and paradoxes in electronic patient record research: a systematic literature review using the meta-narrative method. Milbank Q. 2009; https://doi.org/10.1111/j.1468-0009.2009.00578.x.

  7. Kruse CS, Kristof C, Jones B, Mitchell E, Martinez A. Barriers to electronic health record adoption: a systematic literature review. J Med Syst. 2016; https://doi.org/10.1007/s10916-016-0628-9.

  8. Mair FS, May C, O'Donnell C, Finch T, Sullivan F, Murray E. Factors that promote or inhibit the implementation of e-health systems: an explanatory systematic review. Bull World Health Organ. 2012; https://doi.org/10.2471/blt.11.099424.

  9. Greenhalgh T, Swinglehurst D. Studying technology use as social practice: the untapped potential of ethnography. BMC Med. 2011; https://doi.org/10.1186/1741-7015-9-45.

  10. Thomas C, Smith B, Wright-DeAguero L. The program evaluation and monitoring system: a key source of data for monitoring evidence-based HIV prevention program processes and outcomes. AIDS Educ Prev. 2006;18(4 Suppl A):74–80.

  11. Majic S. Protest by other means? Sex workers, social movement evolution and the political possibilities of nonprofit service provision. Ithaca, NY: Cornell University; 2010.

  12. Hawe P, Degeling D, Hall J. Evaluating health promotion: a health worker's guide. Sydney: MacLennan & Petty; 1990.

  13. Kahan B, Goodstadt M. Continuous quality improvement and health promotion: can CQI lead to better outcomes? Health Promot Int. 1999; https://doi.org/10.1093/heapro/14.1.83.

  14. Maycock B, Hall SE. The quality management and health promotion practice nexus. Promot Educ. 2003;10(2):58–63.

  15. Council of Australian Governments. National Partnership Agreement on Preventive Health. 2008. http://www.federalfinancialrelations.gov.au/content/npa/health/_archive/health_preventive_national_partnership.pdf. Accessed 8 Sept 2017.

  16. National Partnership Agreement on Preventive Health. Implementation Plan for the Healthy Children Initiative. Commonwealth of Australia. 2012. http://www.federalfinancialrelations.gov.au/content/npa/health/_archive/healthy_workers/healthy_children/NSW_IP_2013.pdf. Accessed 09 May 2017.

  17. Hardy LL, King L, Kelly B, Farrell L, Howlett S. Munch and Move: evaluation of a preschool healthy eating and movement skill program. Int J Behav Nutr Phys Act. 2010; https://doi.org/10.1186/1479-5868-7-80.

  18. Wiggers J, Wolfenden L, Campbell E, Gillham K, Bell C, Sutherland R, Hardy LL, King L, Grunseit A, Milat AJ, Orr N. Good for Kids, Good for Life, 2006-2016: evaluation report. NSW Ministry of Health. 2013. http://www.health.nsw.gov.au/research/Pages/good-for-kids.aspx. Accessed 6 Jan 2017.

  19. New South Wales Ministry of Health. Service Agreement Template. 2016. http://www.health.nsw.gov.au/Performance/Documents/service-agreement-generic.pdf. Accessed 09 May 2017.

  20. Farrell L, Lloyd B, Matthews R, Bravo A, Wiggers J, Rissel C. Applying a performance monitoring framework to increase reach and adoption of children's healthy eating and physical activity programs. Public Health Res Pract. 2014; https://doi.org/10.17061/phrp2511408.

  21. Bødker S. Scenarios in user-centred design—setting the stage for reflection and action. Interact Comput. 2000; https://doi.org/10.1016/S0953-5438(00)00024-2.

  22. Dickens P. Tight-loose-tight: a complexity approach to innovation. Org Develop Pract. 2016;48(4):27–31.

  23. Plsek PE, Wilson T. Complexity, leadership, and management in healthcare organisations. BMJ. 2001;323(7315):746–9.

  24. Spradley JP. Participant observation. Orlando: Harcourt College Publishers; 1980.

  25. Emerson RM, Fretz RI, Shaw LL. Writing ethnographic fieldnotes. Chicago: University of Chicago Press; 2011.

  26. Kvale S. Interviews: an introduction to qualitative research interviewing. Thousand Oaks: Sage; 1996.

  27. Roulston KJ. Conversational interviewing. In: Given LM, editor. The SAGE encyclopedia of qualitative research methods. Thousand Oaks: Sage Publications; 2008.

  28. Nielsen KA, Svensson L, editors. Action research and interactive research: beyond practice and theory. Hamburg: Shaker Verlag; 2006.

  29. Flick U. An introduction to qualitative research. 3rd ed. London: Sage; 2006.

  30. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005; https://doi.org/10.1177/1049732305276687.

  31. Saldana J. The coding manual for qualitative researchers. Thousand Oaks, CA: Sage Publications; 2012.

  32. Miles MB, Huberman AM. Qualitative data analysis: an expanded sourcebook. Beverly Hills, CA: Sage; 1994.

  33. NVivo qualitative data analysis software. Version 10. QSR International Pty Ltd; 2012.

  34. Riley T, Hawe P. Researching practice: the methodological case for narrative inquiry. Health Educ Res. 2005; https://doi.org/10.1093/her/cyg122.

  35. Lincoln YS, Guba EG. Naturalistic inquiry. Beverly Hills, CA: Sage; 1985.

  36. Clifford J, Marcus G. Writing culture: the poetics and politics of ethnography. Berkeley, CA: University of California Press; 1986.

  37. Clifford J. The predicament of culture. Boston, MA: Harvard University Press; 1988.

  38. Hunt SD. Modern marketing theory: critical issues in the philosophy of marketing science. Cincinnati, OH: South-Western Publishing; 1991.

  39. Wasserman S, Faust K. Social network analysis: methods and applications. New York: Cambridge University Press; 1994.

  40. Valente TW, Coronges KA, Stevens GD, Cousineau MR. Collaboration and competition in a children's health initiative coalition. Eval Program Plann. 2008;31(4):392–402.

  41. Borgatti SP, Everett MG, Freeman LC. Ucinet 6 for Windows: software for social network analysis. Harvard, MA: Analytic Technologies; 2002.

  42. Borgatti SP. Netdraw network visualization. Harvard, MA: Analytic Technologies; 2002.

  43. Gregor S. The nature of theory in information systems. MIS Q. 2006;30(3):611–42.

  44. Wutzke S, Redman S, Bauman A, Hawe P, Shiell A, Thackway S, Wilson A. A new model of collaborative research: experiences from one of Australia's NHMRC Partnership Centres for better health. Public Health Res Pract. 2017;27(1):e2711706.

  45. Dawes SS, Cresswell AM, Pardo TA. From "need to know" to "need to share": tangled problems, information boundaries, and the building of public sector knowledge networks. Public Adm Rev. 2009; https://doi.org/10.1111/j.1540-6210.2009.01987_2.x.

  46. Gagliardi AR, Webster F, Brouwers MC, Baxter NN, Finelli A, Gallinger S. How does context influence collaborative decision-making for health services planning, delivery and evaluation? BMC Health Serv Res. 2014; https://doi.org/10.1186/s12913-014-0545-x.

  47. Greenhalgh T, Russell J, Ashcroft RE, Parsons W. Why national eHealth programs need dead philosophers: Wittgensteinian reflections on policymakers' reluctance to learn from history. Milbank Q. 2011; https://doi.org/10.1111/j.1468-0009.2011.00642.x.

  48. Borgatti SP, Halgin DS. On network theory. Organ Sci. 2011;22(5):1168–81.

  49. Scott RW, Ruef M, Mendel P, Caronna CA. Institutional change and healthcare organizations: from professional dominance to managed care. Chicago: University of Chicago Press; 2000.

  50. Axelrod R, Cohen MD. Harnessing complexity. New York: Free Press; 1999.

  51. O'Donnell CR, Tharp RG, Wilson K. Activity settings as the unit of analysis: a theoretical basis for community intervention and development. Am J Community Psychol. 1993; https://doi.org/10.1007/bf00942157.

  52. Feldman MS, Orlikowski WJ. Theorizing practice and practicing theory. Organ Sci. 2011; https://doi.org/10.1287/orsc.1100.0612.

  53. Bourdieu P. Distinction: a social critique of the judgment of taste. Boston, MA: Harvard University Press; 1984.

  54. Gherardi S. How to conduct a practice-based study. Edward Elgar Publishing; 2012.

  55. Gherardi S. Introduction: the critical power of the 'practice lens'. Manag Learn. 2009;40(2):115–28.

  56. May C. Towards a general theory of implementation. Implement Sci. 2013; https://doi.org/10.1186/1748-5908-8-18.

  57. May C, Johnson M, Finch T. Implementation, context and complexity. Implement Sci. 2016; https://doi.org/10.1186/s13012-016-0506-3.

  58. Geertz C. Ethos, world view, and the analysis of sacred symbols. In: The interpretation of cultures. New York: Basic Books; 1973. p. 126–41.

  59. Rapport N, Overing J. Social and cultural anthropology: the key concepts. London: Routledge; 2000.


Acknowledgements

We thank colleagues in the NSW Ministry of Health and local health districts for their input and agreement to develop this project.

Funding

This work is funded by the National Health and Medical Research Council of Australia (NHMRC) through its partnership centre grant scheme (Grant ID: GNT9100001). NSW Health, ACT Health, The Commonwealth Department of Health, The Hospitals Contribution Fund of Australia, and HCF Research Foundation have contributed funds to support this work as part of the NHMRC partnership centre grant scheme.

Availability of data and materials

Not applicable as no datasets were generated or analysed for the production of this protocol.

Author information


Contributions

PH conceptualized the study in response to critical reflection on PHIMS’ development by ST and JM. PH and KC developed the protocol and study approach. SG, VL, AG, AM, ST, LP, CI, JM and MW further refined the study questions and approach. All authors made important contributions to the theoretical approach and design of the study. KC and PH drafted the manuscript. All authors read, revised and approved the final manuscript. PH is guarantor.

Corresponding author

Correspondence to Penelope Hawe.

Ethics declarations

Ethics approval and consent to participate

Research ethics approval has been granted by the Royal Prince Alfred Hospital Human Research Ethics Committee (X16-0156 & LNR/16/RPAH/194), and, in addition, by the research governance offices of each of the 15 LHDs.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Conte, K.P., Groen, S., Loblay, V. et al. Dynamics behind the scale up of evidence-based obesity prevention: protocol for a multi-site case study of an electronic implementation monitoring system in health promotion practice. Implementation Sci 12, 146 (2017). https://doi.org/10.1186/s13012-017-0686-5
