The quality of healthcare services is firmly on the global health and development agenda: it is explicitly part of SDG target 3.8 on universal health coverage and a key feature of USAID’s Vision for Health Systems Strengthening. To meet these goals, we need data that can support and inform the design of interventions that aim to improve the quality of healthcare. Such data, and analyses of them, could help policymakers make better decisions by revealing areas in need of improvement and identifying high-quality health facilities that could serve as models for others. But herein lies the problem: we currently lack the data that are so crucial for making these policy decisions.
As one solution to this, some have called for more surveys, but those cost time and money. I have another, faster, cheaper solution: let’s find out what can and cannot be learned through data that are already available.
To begin tackling this idea, Elizabeth Lee, Supriya Madhavan, and I completed a proof-of-concept to measure quality using Service Provision Assessments (SPA), health facility surveys conducted as part of the Demographic and Health Surveys Program. Specifically, we used Kenya’s 2010 SPA to construct indicators for the World Health Organization’s six dimensions of quality for antenatal care services: effectiveness, efficiency, accessibility, equity, safety, and acceptability/patient-centeredness. We then looked at variations in quality of care across province, management authority, type of facility, and (for the equity dimension) client’s education level.
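The basic logic of this kind of analysis can be sketched in a few lines of code. The snippet below is purely illustrative, not our actual method or the real SPA variables: it assumes toy facility records with binary indicator items (1 = item present or performed, 0 = absent), scores each quality dimension as the share of its items met, and then averages scores by facility type.

```python
# Hypothetical sketch: aggregating binary SPA-style indicator items into
# dimension scores per facility, then comparing averages across facility types.
# All column names and values here are illustrative, not actual SPA data.
from statistics import mean

# Toy facility records: 1 = indicator item met, 0 = not met.
facilities = [
    {"type": "public",  "safety_items": [1, 0, 1], "access_items": [1, 1]},
    {"type": "private", "safety_items": [1, 1, 1], "access_items": [1, 0]},
    {"type": "faith",   "safety_items": [1, 1, 1], "access_items": [1, 1]},
    {"type": "public",  "safety_items": [0, 0, 1], "access_items": [0, 1]},
]

def dimension_score(items):
    """Score a dimension as the share of its indicator items that are met (0-1)."""
    return mean(items)

# Collect each facility's dimension scores under its management/facility type.
by_type = {}
for f in facilities:
    scores = by_type.setdefault(f["type"], {"safety": [], "access": []})
    scores["safety"].append(dimension_score(f["safety_items"]))
    scores["access"].append(dimension_score(f["access_items"]))

# Average within each type to expose variation across facility groupings.
summary = {t: {dim: round(mean(v), 2) for dim, v in d.items()}
           for t, d in by_type.items()}
print(summary)
```

With real SPA data the same pattern applies, except that the indicator items come from the survey's facility audit and observation modules, and missing items (as noted below) have to be handled explicitly rather than assumed present or absent.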
There are two main takeaways from our study (you can read the complete paper here):
Surveys like the SPA are both promising and challenging for quality measurement. Promising because they are publicly available and can provide at least some insight into problems and variations in quality of care. Yet also challenging because they were not designed specifically to measure quality and therefore don’t capture all important aspects of the WHO-based indicators. Plus, key variables of interest, such as a comprehensive inventory of materials to prevent infections, often have missing data. But at least the SPA data are free, ready to be used, and can provide some evidence and decision support to policymakers.
Our indicators show substantial variation in the quality of antenatal care services in Kenya. As one example, public facilities tended to perform worse than private-for-profit facilities, and faith-based facilities performed well on almost all quality dimensions we considered. In countries like Kenya, where there are a number of interventions aimed at improving maternal health and quality of care, including an expanding results-based-financing program, it’s useful for policymakers to understand quality variations so that programs can be designed for maximum effectiveness and efficiency.
The bottom line is that the level of quality is low, variation across facilities is large, and we urgently need better data and methods if we want to develop effective solutions to these problems. Data sources like the SPA are not perfect, but they can provide some value at low cost and hence should be used more. In fact, because they are standardized and available for 14 countries, analytics developed for one country can be easily adapted to others. But organizers of large-scale survey efforts like SPA and DHS also need to evolve and better align their surveys to emerging policy issues such as quality of care. In addition, donors can play a role by increasing their support for national administrative data systems and pushing for surveys’ interoperability with other data sources, such as household surveys. To assess quality of care, the quality of data must be addressed first.