From data to decisions: understanding information flows within regulatory water quality monitoring programs

Mapping information flows

We developed data flow diagrams (DFDs) to illustrate the transmission of water quality data within institutions and to external stakeholders. We specified four elements in the DFDs31: (1) external entities (institutions or departments outside of the system boundaries); (2) processes (transformations of, or changes to, data); (3) data stores (physical storage of data); and (4) data flows (movement of data) (Fig. 1).

Fig. 1: Generalized DFD showing external entities, processes, data stores, and data flows representing the majority of monitoring programs included in this study.

The DFD representations comprise four elements: 1) external entities (shadowed boxes), which are systems, individuals, or institutions outside of the modeled system’s boundaries; 2) processes (rounded rectangles, assigned a unique number), which represent transformations of, or changes to, data; 3) data stores (open-ended rectangles, assigned a unique number, such as D1, D2, etc.), which represent physical storage of data (e.g., paper-based, such as a filing cabinet or notebook, or digital, such as a computer file or database); and 4) data flows (arrows), which depict movement of data.


Our DFDs for the 26 institutions showed that suppliers and surveillance agencies used similar structures for collecting and sharing water quality information (as generalized in Fig. 1). First, institutions selected locations for water sampling (1.0, D1). Then they collected the samples (2.0a,b), recorded information about the water source on the sample containers or in a logbook (D2, D3), and tested the samples in the field and/or laboratory (3.0a,b). Subsequently, they recorded (D3, D4), compiled (4.0a), and transferred or transported (4.0b) test results to a location where they could be digitized (D5). Finally, they summarized data (4.0c) in reports (D6) that were passed to external entities (e.g., senior managers, regulators, ministries or other stakeholders) (6.0). In parallel, they applied the water test results (D3, D4) to guide actions (5.0) that addressed contamination: for example, communicating with water source owners/consumers, or performing corrective actions to the water source/distribution system. DFDs for individual institutions are available online at www.aquaya.org/dfds. Despite using similar structures for collecting and sharing water quality information, suppliers and surveillance agencies are generally responsible to different regulatory institutions (i.e., the Ministry of Water and Ministry of Health, respectively) and monitor different drinking water source types; water suppliers are responsible for monitoring their respective piped distribution networks, whereas surveillance agencies are responsible for monitoring all supplies of drinking water from any source type at the point of consumption within their geographical jurisdiction.
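
For readers who want to work with these flows computationally, the generalized DFD in Fig. 1 can be encoded as a small directed graph. The sketch below (in Python) is purely illustrative: it captures a simplified subset of the processes and data stores described above, using our own encoding rather than any software used by the institutions themselves.

```python
from dataclasses import dataclass

# Illustrative encoding of a simplified subset of the generalized DFD in Fig. 1.
# Node identifiers follow the numbering used in the text (processes 1.0-6.0,
# data stores D1-D6); the structure itself is a hypothetical sketch, not a tool
# used by the monitored institutions.

@dataclass
class Node:
    node_id: str   # e.g., "2.0a" or "D3"
    kind: str      # "external_entity", "process", or "data_store"
    label: str

@dataclass
class DataFlow:
    source: str    # node_id of the origin
    target: str    # node_id of the destination
    payload: str   # description of the data that move

NODES = [
    Node("1.0", "process", "Select sampling locations"),
    Node("D1", "data_store", "Sampling plan"),
    Node("2.0a", "process", "Collect samples"),
    Node("3.0a", "process", "Test samples in field/laboratory"),
    Node("D3", "data_store", "Field/laboratory test records"),
    Node("4.0a", "process", "Compile results"),
    Node("D5", "data_store", "Digitized results"),
    Node("4.0c", "process", "Summarize data"),
    Node("D6", "data_store", "Reports"),
    Node("5.0", "process", "Respond to contamination"),
    Node("6.0", "process", "Report to external entities"),
    Node("REG", "external_entity", "Regulator / ministry / senior management"),
]

FLOWS = [
    DataFlow("D1", "2.0a", "sampling locations"),
    DataFlow("2.0a", "3.0a", "samples and source details"),
    DataFlow("3.0a", "D3", "test results"),
    DataFlow("D3", "4.0a", "results for compilation"),
    DataFlow("4.0a", "D5", "digitized results"),
    DataFlow("D5", "4.0c", "results for summaries"),
    DataFlow("4.0c", "D6", "summary report"),
    DataFlow("D6", "6.0", "report for transmission"),
    DataFlow("6.0", "REG", "compliance report"),
    DataFlow("D3", "5.0", "out-of-compliance results"),
]

# Example query: which data stores or processes feed external reporting (6.0)?
print([f.source for f in FLOWS if f.target == "6.0"])  # ['D6']
```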

Processes

Information flows involved a number of processes: deciding on sample locations, collecting and processing samples, responding to contamination, and reporting the data. Most institutions had at least two different personnel groups (e.g., local management and local lab staff, or local lab staff and central management) responsible for these processes. At one extreme, a municipal water supplier in Ethiopia allocated all of these steps to a single individual. In contrast, four personnel groups conducted these processes for a provincial supplier in Zambia: local management of each town’s water supply system (which were under the jurisdiction of the provincial supplier) decided on the sampling locations; local laboratory staff collected and processed samples; local management transferred and transcribed the data; provincial laboratory staff summarized the data; local management took actions; and provincial management transmitted water quality information to the national regulator.

In Kenya, county public health offices typically had three staff involved in water quality monitoring and reporting: a county Public Health Officer (PHO) responsible for water quality sampling and analysis, a Monitoring and Evaluation (M&E) Officer responsible for digitizing water quality data, and the head county PHO responsible for reviewing and submitting data externally. Kenyan water suppliers typically had a laboratory technician and assistants responsible for water quality sampling, analysis, and data recording; water quality data was then reviewed by upper management (i.e., managing directors). “Monthly water quality reports are sent to the Technical Manager who will also sometimes request to see raw data if there are high levels of contamination. The Managing Director only sees the report if there is a serious contamination issue” (Kenyan Water Supplier).

Data stores

Data stores represent records that contain information regarding the water quality testing program, including sampling plans or guidelines (D1), information about the sample’s location and date of sampling (D2), and written records of the contextual information and test results of a sample (D3–D6), transformed or summarized into different formats. All suppliers in Kenya (4/4) and most throughout MfSW countries (10/11) had written sampling plans with a set schedule (e.g., dates and/or sampling locations on those dates) (Table 1). Water suppliers typically established their sampling plans to meet regulatory requirements for sampling frequencies for distribution networks, often based on the World Health Organization (WHO) recommendations32. Water suppliers generally repeated sampling locations, but also altered sampling patterns when piped water was intermittent, resources were limited, or permissions were required (as in the case of household taps). In contrast, only one of the three surveillance agencies in Kenya and fewer than half of those that participated in MfSW (7/15) had written sampling plans, and even these rarely followed a set schedule or repeated sampling locations. The practice of varying sample collection between different water supply types follows the WHO recommendations for non-piped supplies, which state that every source should be tested every 3–5 years32 (Table 1). In practice, many surveillance agencies selected their sample schedule based on the availability of transportation, staff, and equipment, or on indications of suspected contamination; these constraints on water quality testing are discussed in more detail elsewhere21.

Table 1 Summary of data stores reported by suppliers and surveillance agenciesa.


Most institutions used a combination of paper and digital records to manage data collection and recordkeeping. In the field, data were recorded on the water sample container, on blank paper, or in photocopied recording templates (D2, Table 1). Mobile phone applications for recording data were only used in one of the testing programs, and this was excluded from Table 1 because the phone application was introduced through MfSW20. All institutions eventually transcribed microbial water quality data from paper to computer programs (often driven by the MfSW program requirement for electronic sharing of results with research staff), but this process occurred at different points in the sampling program as determined by computer or internet availability. Hardcopies of monthly and quarterly reports were often maintained in physical folders. Though most institutions responded to test results indicating that water supplies were out of compliance, they generally did not document these follow-up or mitigation actions; therefore, we excluded this activity from the generalized DFD presented in Fig. 1. Among suppliers, management (central or local) was often responsible for reporting to external entities, and among surveillance agencies, health staff or local management were generally responsible for external reporting (Table 1).

The seven Kenyan institutions that participated in follow-up interviews in 2019 all recommended improvements in data compilation. Suppliers highlighted that managerial and M&E staff spent substantial time digitizing results and would benefit from having additional computers and a database (e.g., Excel) to improve internal record-keeping and data sharing. “Hard copies of data are transferred from the lab to the Technical Manager’s office, which is inefficient. This also means that the Technical Manager spends a lot of time inputting data into Excel when this could be done directly by the laboratory staff” (Kenyan Water Supplier). Public health offices faced similar challenges and expressed a desire for transferring data digitally instead of via paper records that were hand-carried from sub-county public health offices to county offices. “One opportunity is to digitize results at the sub-county and county level into an electronic reporting system so reporting can be more efficient” (Kenyan County Public Health Office). In addition, two institutions expressed a desire for an internal computer or internet-based data analysis system and subsequent training that would allow them to examine temporal trends in water supply and quality.
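
As an illustration of the kind of lightweight digitization the interviewed staff described, the following sketch transcribes a few hypothetical logbook entries into a CSV file and produces a simple monthly failure count. The column names, sites, values, and pass/fail rule are invented for the example and do not reflect any institution’s actual records or standards.

```python
import csv
from collections import defaultdict

# Hypothetical logbook entries as they might be transcribed from paper
# (site, month, parameter, result, limit); all values are invented.
records = [
    {"site": "Kiosk A", "month": "2019-03", "parameter": "E. coli (CFU/100 mL)", "result": 0, "limit": 0},
    {"site": "Tap B",   "month": "2019-03", "parameter": "E. coli (CFU/100 mL)", "result": 12, "limit": 0},
    {"site": "Kiosk A", "month": "2019-04", "parameter": "E. coli (CFU/100 mL)", "result": 2, "limit": 0},
]

# Write the transcribed records to a CSV file that laboratory staff could
# maintain directly, avoiding a second round of manual data entry later.
with open("water_quality_records.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)

# Simple monthly summary: count samples and exceedances per month.
summary = defaultdict(lambda: {"samples": 0, "exceedances": 0})
for r in records:
    summary[r["month"]]["samples"] += 1
    if r["result"] > r["limit"]:  # simplified rule for this microbial example
        summary[r["month"]]["exceedances"] += 1

for month, stats in sorted(summary.items()):
    print(month, stats)
```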

Data flows

All institutions reported water quality information to at least one national administrative unit: a health ministry, an environment/water ministry, an independent regulator, or national boards/management bodies (Table 2). Most suppliers reported water quality results to upper management and to a national administrative unit, while surveillance agencies sent data to a wide variety of both local government units and other stakeholders, including health staff, epidemics committees, village committees, non-governmental organizations, and donors (Tables 2 and S1). In some cases, water quality data were a component of a report that included information about other topics (e.g., health or disease data for surveillance agencies, operational performance data for water suppliers). We observed a wide variety of final reporting formats, which were either required by external entities (e.g., health reporting systems) or developed by institutions themselves.

Table 2 Summary of data reporting to external entities, including the number of institutions sharing data (sending data and those that send feedback) and the types of data shared (compliance summaries (C), raw data (R), or other (O)a).


The DFDs depicted in Fig. 2 highlight the processes used by six institutions to report to stakeholders (complete DFDs for all institutions are available online at www.aquaya.org/dfds). Surveillance agencies (top row) had more reporting routes than suppliers (bottom row) (Fig. 2 and Table 2). In addition, surveillance agencies and water suppliers in the same country had differing reporting practices (two Kenyan public health offices are represented in Fig. 2a, b; two Zambian suppliers are represented in Fig. 2e, f). Although institutions regularly shared data with external entities, they rarely received feedback (such as acknowledgement of results, questions about results, or a formal response such as a written summary or rewards/penalties). In Kenya, regulators and managing directors provided feedback to suppliers, while only upper management (i.e., county and sub-county public health officers or directors) provided feedback to surveillance agencies, despite their transmission of data to many other local government units and stakeholders. Routine feedback from upper management consisted of approval of compliance reports before they were sent to external agencies. If compliance reports indicated contamination, upper management generally provided instructions to laboratory personnel and technical managers (suppliers) or public health officers (surveillance agencies) for mitigation (described in Table 3).

Fig. 2: DFDs illustrating water quality information flows to external entities.

These include a a county public health office in Kenya; b a second county public health office in Kenya; c a district health office in Zambia; d a water supplier in Kenya; e a water supplier in Zambia; f a second water supplier in Zambia. WOA Well Owner’s Association, EHT Environmental Health Technician, WSB Water Services Board, KAM Kenya Association of Manufacturers, DHIS District Health Information System (though sharing via DHIS only included the number of samples per month rather than testing results).

Table 3 Number of institutions reporting various actions in response to contamination.


Kenyan institutions reported that current reporting systems did not facilitate data sharing: “Data from the sub-county public health offices is not digitized, which is inefficient. A sub-county Public Health Officer must hand deliver water quality test results to the county public health office” (Kenyan County Public Health Office). Suppliers in Kenya noted a similar challenge: water quality data were typically recorded manually in a logbook and then digitized by laboratory or management staff. Limited access to computers and the internet also prevented efficient data sharing. “We do not have a dedicated computer for our office so we share with other departments. It would be more efficient to have a computer at the laboratory so that data can be digitized immediately” (Kenyan Water Supplier). A national electronic reporting system exists to capture health data from county public health offices (the District Health Information System, DHIS), but the database only allows entry of the number of water quality tests conducted, not the actual test results: “The District Health Information System [DHIS] does not have a water quality component so we do not know the quality of water at the local and country level” (Kenyan County Public Health Office).

To improve data sharing, all three interviewed Kenyan surveillance agencies suggested a regional database or integrated national database to capture water quality data, similar to or integrated within the Ministry of Health’s (MoH) DHIS. A PHO also noted that an online reporting system would standardize water quality data reporting across counties, though internet access was a challenge. “There should be a national database or reporting tool that captures water quality data from the ground” (Kenyan County Public Health Office). It is important to note that the current DHIS system does not capture data from water suppliers, who instead submit data through a different system to the Water Services Regulatory Board (WASREB) under the Ministry of Water. Kenyan institutions also recommended holding WASH stakeholder meetings, separate from regular public health meetings, to discuss water quality results and concerns with all county stakeholders, including communities reliant on point source types. A sub-county PHO emphasized meetings solely dedicated to water quality: “Regular stakeholder meetings would provide an opportunity to prioritize water quality and discuss any issues that arise” (Kenyan sub-County Public Health Office).
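
To make the suggestion of a shared reporting database more concrete, the sketch below outlines one possible minimal schema. The table and column names are our own assumptions for illustration; they are not drawn from DHIS, WARIS, or any system proposed by the interviewees.

```python
import sqlite3

# Hypothetical minimal schema for a national water quality reporting database
# of the kind the interviewed surveillance agencies described.
conn = sqlite3.connect("water_quality_reporting.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS institution (
    institution_id INTEGER PRIMARY KEY,
    name           TEXT NOT NULL,
    county         TEXT NOT NULL,
    type           TEXT NOT NULL          -- 'supplier' or 'surveillance'
);

CREATE TABLE IF NOT EXISTS sample (
    sample_id      INTEGER PRIMARY KEY,
    institution_id INTEGER NOT NULL REFERENCES institution(institution_id),
    source_type    TEXT NOT NULL,         -- e.g. piped network, borehole, kiosk
    location       TEXT NOT NULL,
    collected_on   TEXT NOT NULL          -- ISO date
);

CREATE TABLE IF NOT EXISTS result (
    result_id      INTEGER PRIMARY KEY,
    sample_id      INTEGER NOT NULL REFERENCES sample(sample_id),
    parameter      TEXT NOT NULL,         -- e.g. E. coli, turbidity, residual chlorine
    value          REAL NOT NULL,
    meets_standard INTEGER NOT NULL       -- 1 if within the applicable standard
);
""")
conn.commit()

# Example query a county or national office might run: share of samples meeting
# the standard for one parameter, by county and month. (No rows are inserted
# above, so this prints nothing until data are loaded.)
query = """
SELECT i.county,
       substr(s.collected_on, 1, 7)   AS month,
       COUNT(*)                       AS n_results,
       AVG(r.meets_standard) * 100.0  AS pct_meeting_standard
FROM result r
JOIN sample s      ON s.sample_id = r.sample_id
JOIN institution i ON i.institution_id = s.institution_id
WHERE r.parameter = 'E. coli'
GROUP BY i.county, month;
"""
for row in conn.execute(query):
    print(row)
```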

When contamination was detected, all institutions reported acting on the results by verifying contamination, mitigating risks, and/or engaging with consumers. All suppliers reported verifying contamination and/or mitigating risks, while surveillance agencies engaged with consumers (14/15), and, to some extent, verified contamination (4/15) and/or mitigated risks (5/15) (Table 3). As noted above, however, institutions did not document their response actions.

Case study: policy and practice in Kenya

We compared the policies and regulations for water quality testing and reporting in Kenya with the actual practices of water suppliers and surveillance agencies. Licensed water suppliers are regulated by WASREB under the Ministry of Water and are mandated to report water quality data quarterly and annually to WASREB under section 50 of the 2002 Water Act33. WASREB has established monitoring requirements that include water quality parameters as well as testing frequency and sample numbers based on populations served and volumes of piped water supplied34. Suppliers are required to submit a sampling plan to WASREB for each water treatment facility. According to WASREB documents34, all water supplies must comply with drinking water quality standards established by the Kenya Bureau of Standards, although none of the water suppliers that we interviewed reported penalties for reporting results that did not meet these standards. Notably, the Kenya Bureau of Standards’ drinking water standards35 list many more water quality parameters than are commonly included in supplier testing programs. For Kenyan surveillance agencies, the national MoH oversees county public health offices but has limited legal authority, due to the devolved transfer of responsibilities from national to county governments under the 2010 Constitution of Kenya. County governments are responsible for water and sanitation provision and the allocation of funds for these services. The MoH does not provide water quality parameter or sampling guidelines and instead refers to the WHO’s Drinking Water Quality Guidelines32.

We examined information systems within WASREB and the MoH as they existed in 2019 (Fig. 3). WASREB had an electronic Water Regulation Information System (WARIS) with reporting requirements that included: (i) the number of tests planned and conducted, and (ii) the number of these samples whose results met the required standard for physicochemical (i.e., turbidity, pH, and residual chlorine) and microbial parameters. Suppliers also reported additional utility performance information, such as coverage, continuity, and financial performance. These metrics were processed into annual Impact Reports that rank utility performance on nine key indicators, one of which is water quality36 (Fig. 3a). The water quality indicator included the following metrics for chlorine residual and bacteriological testing: (i) the percentage of tests conducted (i.e., the number of tests conducted divided by the number of tests planned), and (ii) the percentage of samples meeting water quality standards. Water suppliers that did not meet these standards therefore received a low score for this key indicator. Despite knowledge of WASREB’s reporting frameworks, the four Kenyan water suppliers that participated in this study were not complying with WASREB’s schedule for reporting microbial water quality results. Other than lowered performance ratings, none had been penalized for non-compliance; however, in theory, low indicator ratings could result in the dismissal of the water supplier’s managing director.
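
The two water quality metrics in the Impact Reports reduce to simple ratios, illustrated below with invented figures; the function names and example numbers are ours, not WASREB’s.

```python
# Illustrative calculation of the two water quality metrics described for
# WASREB's Impact Reports; the input numbers are hypothetical examples.

def pct_tests_conducted(tests_conducted: int, tests_planned: int) -> float:
    """Percentage of planned tests that were actually carried out."""
    return 100.0 * tests_conducted / tests_planned

def pct_samples_compliant(samples_compliant: int, tests_conducted: int) -> float:
    """Percentage of conducted tests whose results met the standard."""
    return 100.0 * samples_compliant / tests_conducted

# Hypothetical quarterly figures for one utility's bacteriological testing.
planned, conducted, compliant = 120, 90, 81
print(f"Tests conducted: {pct_tests_conducted(conducted, planned):.1f}%")            # 75.0%
print(f"Samples meeting standard: {pct_samples_compliant(compliant, conducted):.1f}%")  # 90.0%
```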

Fig. 3: Example DFDs from institutions in Kenya.

a DFD of information flows in Kenya’s Water Services Regulatory Board (WASREB); b DFD of information flows in Kenya’s Ministry of Health (MoH). WARIS Water Regulation Information System; WSP Water Service Provider; MoW Kenya Ministry of Water; DHIS District Health Information System; ICC Interagency Coordination Committee; WESCOORD Water and Environmental Sanitation Coordination mechanism; TWG Technical Working Group.


To improve information flows, the suppliers suggested that WASREB should conduct more audits: “If WASREB or KEBS [Kenya Bureau of Standards] audited us, we would feel more pressure to sample and submit data” (Kenyan Water Supplier). One supplier suggested adding an emergency reporting component to WARIS: “If there is a cholera outbreak, it is important to report this immediately” (Kenyan Water Supplier). WASREB personnel suggested including a feature in WARIS that allows water suppliers to attach raw data or other supporting documents, as well as re-designing WARIS to allow rural water suppliers not directly regulated by WASREB to submit less detailed water quality data.

Public health offices (surveillance agencies) in Kenya are required to enter monthly health data and the number of water samples tested (but not the results of those tests) into the MoH’s DHIS. Monthly and annual reports are then generated and uploaded back into DHIS for access by a variety of other MoH departments and outside stakeholders (Fig. 3b). However, in practice, none of the surveillance agencies that we revisited in 2019 reported water quality data to the MoH.

During the MfSW program, most participating institutions conducted regular tests, although not always at the frequency required by national guidelines or standards21. Four years afterwards, most (5/7) institutions were no longer conducting routine microbial water quality testing (Table 4). One PHO noted, “We rarely share data because we do not do enough testing” (Kenyan sub-County Public Health Office). “We cannot take the appropriate actions without more data [to manage water safety]” (Kenyan Water Supplier). Institutions attributed the lack of testing to insufficient funding to replace broken laboratory equipment, purchase reagents, and cover transportation (i.e., no vehicle for sampling). “Because we have limited resources, we have not conducted routine water quality testing in over a year. Our office does not have enough water quality results to produce summaries or inform decisions” (Kenyan County Public Health Office). Without regular testing, water quality information is not available to inform decisions.

Table 4 Summary of sampling programs in Kenya.


Two of the four water suppliers in Kenya that had participated in MfSW were still conducting regular water quality testing, although at a lower frequency. The other two suppliers tested only when equipment and reagents were available. The three surveillance agencies only tested water in response to customer complaints or disease outbreaks, though they did not specifically document complaints or responses. “We have not conducted routine water quality sampling or analysis since MfSW. We only conduct water quality testing when there is a customer complaint” (Kenyan sub-County Public Health Office). “Water quality testing is reactionary, so we can only confirm contamination rather than fully understand water quality in our community” (Kenyan County Public Health Office). It was more common for all institutions to test water for basic physico-chemical parameters (pH, turbidity, and residual chlorine) than for microbial parameters (Table 4). “Since running out of consumables after MfSW, we no longer test for bacteriological parameters, only pH and residual chlorine” (Kenyan Water Supplier). Two suppliers reported testing for additional physico-chemical parameters, including alkalinity, conductivity, dissolved oxygen, and various nutrients and heavy metals. All sampling data, including these additional parameters, were included in reports to upper management, but only the parameters required for compliance were included in external reports.

