
Cutting through the Noise: Can Generative AI Enable Personalized Interventions at Scale?

If you have read a proposal for a global development intervention, it probably contained something like, “The community we are working with faces issue X. A common cause of issue X is underlying problem Y. To address this problem, we propose the following program…” While proposals may be packaged with a more compelling narrative, most programs are designed to tackle large problems common across a community. They do this for several understandable reasons. First, households within a community often face similar challenges. For example, rural families in the Sahel may face widespread difficulties such as regular droughts or long distances to a health facility. Second, standardized programs that tackle common issues are more efficient to implement than programs customized for each household.

While households often face similar constraints, they also encounter unique ones. For example, a family in rural sub-Saharan Africa may struggle to buy farm inputs while dealing with an unplanned teen pregnancy, whereas their neighbor may face difficulties paying for medical care and business loans. Next month, the challenges may change—the family may have managed to purchase farming inputs but there could be complications in the pregnancy. Life circumstances change, and one's needs change along with them, but our development programs aren't built to adapt accordingly. This blog explores the challenges of developing targeted interventions that adapt to evolving household needs, and hypothesizes about how generative AI could help.

One way to address the varied needs of a community, without the use of AI, is to give households cash and let them decide how to spend it. GiveDirectly, a nonprofit that delivers cash transfers, has often argued that “Not everyone needs a goat,” and that the development sector should move away from one-size-fits-all services. One of the reasons cash transfers are so impactful is that they tackle the highly prevalent condition of poverty while enabling customizable pathways out of it. 

There are, however, certain development outcomes that cash transfers are not designed to impact. While cash transfers can increase school attendance, they are rarely cost-effective at improving learning outcomes. Money can help kids get to school, but it is unlikely to be what makes them more numerate. Instead, individualized tutoring, even when conducted over a phone call, can boost learning. Additionally, a growing body of evidence suggests that certain forms of cash "plus" programming, in which complementary services or products (e.g., flood risk information, enrollment in a mothers' nutrition group) target specific sectoral outcomes, can increase impact relative to giving transfers alone. The non-cash components, however, may be expensive and hard to scale. It is also costly to figure out which "plus" components are relevant to each family.

How generative AI could help

Generative AI could help deliver inexpensive programs tailored to each household in three ways: (i) it could enhance personalized problem diagnosis, (ii) it could help select the most effective interventions for a family based on that diagnosis, and (iii) once an intervention is selected, it could deliver it in a personalized and adaptive way. Perhaps most importantly, because it could be delivered conversationally over SMS, voice, or an app, it could be scaled rapidly.

For example, following a clinic visit, an AI agent could call a caregiver to ask whether a child is responding well to medication. The same agent could then ask how the family's farm is doing and warn of an upcoming heat wave. If the caregiver raises another topic and mentions that their child is not learning at school, the AI agent could pivot and connect them to a phone-based tutor. If the caregiver complains that their child is not eating enough and is losing weight, the agent could connect them to an AI-powered nutrition coach. These are all examples of the first two ways in which generative AI could enable personalization: diagnosing the unique obstacles facing a family and recommending effective interventions based on that diagnosis.
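To make the "diagnose, then route" idea concrete, here is a minimal, purely illustrative sketch. It is not a description of any existing system: the intervention names, keywords, and the triage function are all hypothetical placeholders, and a real agent would use a language model to interpret free-form speech or SMS rather than keyword matching.

```python
# Illustrative sketch only: a rule-based stand-in for the diagnosis-and-routing
# step an LLM-powered agent might perform. All names below are hypothetical.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Referral:
    topic: str
    intervention: str  # hypothetical follow-up service


# Hypothetical mapping from a diagnosed topic to a "plus" intervention.
INTERVENTIONS = {
    "education": "phone-based tutoring session",
    "nutrition": "AI-powered nutrition coaching",
    "health": "clinic follow-up call",
    "agriculture": "localized heat-wave and planting advisory",
}

# In practice an LLM would classify the caregiver's free-form message; simple
# keyword matching stands in for that classification step here.
KEYWORDS = {
    "education": ["school", "learning", "reading"],
    "nutrition": ["eating", "weight", "appetite"],
    "health": ["medication", "fever", "clinic"],
    "agriculture": ["farm", "crop", "drought"],
}


def triage(message: str) -> Optional[Referral]:
    """Diagnose the topic a caregiver raises and pick a follow-up intervention."""
    text = message.lower()
    for topic, words in KEYWORDS.items():
        if any(word in text for word in words):
            return Referral(topic=topic, intervention=INTERVENTIONS[topic])
    return None  # nothing matched: keep the conversation open-ended


if __name__ == "__main__":
    print(triage("My child is not learning at school"))
    # -> Referral(topic='education', intervention='phone-based tutoring session')
```

The routing logic is trivial by design; the point is that the hard part of such a system would be the diagnosis step and the quality of the interventions it connects people to, not the plumbing.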

Once the intervention is recommended, generative AI could ensure the delivery of that intervention is itself adaptive. To understand what an adaptive intervention is, take the example of coaching. Carlo Ancelotti, one of the most successful soccer managers in history, often talks about how great coaches adapt to their players, and not the other way around. His coaching goal is not just to transfer knowledge but to adapt his style and tactics to unlock the agency of his players. While achieving development outcomes and coaching a soccer team are obviously different, the ability to tailor and adapt information for the intended audience is likely a core trait of a good coach. However, because coaching is customized to each recipient, scaling invariably leads to issues of quality. It is tough to train large numbers of Carlo Ancelottis who operate without a script and provide assistance in a dynamic and adaptive way. This issue of scaling quality isn't limited to soccer or nutrition coaching; it affects interventions like Farmer Field Schools, where hands-on, iterative learning has shown promise in small studies but struggles to maintain impact at larger scales.

The digital and continuous conversational engagement described above would generate much more frequent and richer data on household needs. Instead of relying on infrequent surveys and broad, one-size-fits-all interventions, generative AI could facilitate continuous, personalized engagement with households. Through these engagements, generative AI could enable demand-driven, customized interventions similar to the best case-management systems in high-income countries, but at larger scale and a fraction of the cost.

There are still many open questions

While exciting, these are all still hypotheses. The empirical evidence on whether generative AI can effectively customize problem diagnosis and provide iterative engagement is still emerging, with mixed results. In a recent randomized controlled trial (RCT) conducted in Nairobi, researchers tested the use of a ChatGPT-enabled business coach via WhatsApp for entrepreneurs. The results were uneven: successful entrepreneurs saw a 15 percent increase in profits or revenue, while struggling entrepreneurs experienced a 10 percent decline. This outcome challenges my hypothesis—if the AI coach had adjusted effectively, we might have expected different magnitudes of impact but probably not opposing outcomes across user segments.

One possible explanation for the uneven impacts is that entrepreneurship is a highly dynamic challenge, and the AI coach may have failed to adapt appropriately to struggling entrepreneurs. Specifically, the AI might have been too reluctant to advise failing entrepreneurs to shut down a poorly performing business. Instead, it continued to offer suggestions to keep the business running, sticking to a fairly rigid principle of continuous iteration rather than tailoring its advice to recommend closure when warranted. Perhaps the issue was also that a human was not kept in the loop to correct for these unforeseen behaviors. 

Even if the issues described above are resolved, it is fair to question whether users would consistently engage with an AI tool at scale, especially for personal matters. However, consider the massive adoption of voice assistants like Siri, Google Assistant, and Alexa prior to their integration with generative AI. These often clunky tools are used by 150 million people in the US alone. While these assistants are used for simple tasks like checking the weather, they demonstrate a willingness to engage with digital agents. Moreover, a pilot of a voice-based generative AI tool in Zambia called "Ask Viamo Anything" found more than a third of users asking about health, with 90 percent asking another question within two days. Even in the Nairobi RCT with entrepreneurs, users wished the AI coach a "Happy New Year"—a seemingly unnecessary engagement with a robot. Personally, I have used generative AI for health queries—once asking OpenAI out loud if my newborn's refusal to sleep and wide-eyed stare at midnight was a sign of an infant seizure (thankfully, it was not).

This idea that advances in technology can help tailor solutions to individual needs has been taking root across multiple fields. Precision medicine, for instance, recognizes that people can react differently to the same treatment. Advances in fields like pharmacogenomics now allow for medications to be prescribed based on a person’s genetic profile, optimizing outcomes on an individual basis. Similarly, precision agriculture leverages technologies such as remote sensing and advanced weather forecasting to offer farming recommendations tailored to specific plots. These fields demonstrate how technological leaps can shift from one-size-fits-all solutions to more personalized, targeted interventions at a large scale. 

That said, the idea that generative AI can lead to effective, customized development interventions is a hypothesis requiring rigorous testing. There is no inherent quality in the technology that guarantees impact. As with past technologies, generative AI for the poorest will likely be underinvested in, as it is harder to monetize products for this population. Privacy concerns will also need to be taken seriously and addressed. If an AI agent were to become a trusted tool that helps families navigate unique personal challenges, the resulting data could be used to deliver personalized social services, but it could also be used to inflict targeted harm. Who manages and governs the tool, and the data it collects, needs to be thought through more rigorously.

If these challenges can be overcome, this approach would represent a paradigm shift in how development programs, and public services more broadly, are delivered. 

Disclaimer

CGD blog posts reflect the views of the authors, drawing on prior research and experience in their areas of expertise. CGD is a nonpartisan, independent organization and does not take institutional positions.


Image credit for social media/web: poco_bw / Adobe Stock