A growing number of studies have found that deforestation increases malaria prevalence in humans, suggesting that in some settings forest conservation might belong in a portfolio of anti-malarial interventions. However, previous studies of deforestation and malaria prevalence were based on a small number of countries and observations, commonly relying on cross-sectional analyses of less-than-ideal forest data aggregated at the jurisdictional level. In this paper we combine fourteen years of high-resolution satellite data on forest loss with individual-level survey data on malaria in more than 60,000 rural children in 17 countries in Africa, and on fever in more than 470,000 rural children in 41 countries in Latin America, Africa, and Asia. Adhering to methods that we pre-specified in a pre-analysis plan, we tested ex-ante hypotheses derived from the previous literature. We found no evidence that deforestation increases malaria prevalence, nor that intermediate levels of forest cover are associated with higher malaria prevalence. Our findings differ from those of most previous empirical studies, which found deforestation to be associated with greater malaria prevalence in other contexts. We speculate that this difference may arise because deforestation in Africa is largely driven by the slow expansion of subsistence or smallholder agriculture for domestic use by long-time residents in stable socio-economic settings, rather than by rapid clearing for market-driven agricultural exports by new frontier migrants, as in Latin America and Asia. Our results imply that, at least in Africa, anti-malarial efforts should focus on other proven interventions such as bed nets, indoor spraying, and housing improvements. Forest conservation efforts, in turn, should focus on securing other benefits of forests, including carbon storage, biodiversity habitat, clean water provision, and other goods and services.