When schools in Sierra Leone closed last March, the government was more ready than many to respond. With the school closures of the Ebola epidemic all too recent a memory, the Ministry of Basic and Senior Secondary Education acted fast to mobilise a national distance learning programme via radio. A number of non-state organisations supported this effort, providing content for the broadcast, funding it, and supporting its use with SMS reminders. But radio is a one-way medium, and most students were missing the interaction with teachers that is important for reinforcing their learning.
To increase interactivity, the government included a section of the broadcast where children could call in with questions. In addition, Rising Academies, a private school operator in Sierra Leone with experience providing education to children during the Ebola epidemic, suggested that the radio broadcast be supplemented with teacher phone tutorials. They also proposed that the intervention be evaluated for impact.
To do this, and with support from a grant from the World Bank’s Strategic Impact Evaluation Fund, we designed a randomised controlled trial that assigned 4,399 students from 25 government primary schools to receive, in addition to the standard access to the government’s broadcast that all students received, either
- SMS reminders to listen to the radio (the control group); or
- SMS reminders and weekly, one-on-one tutorials by phone from live school teachers.
Given the rising interest in phone-based learning assessments, we also cross-randomised survey mode, assigning 500 students to be tested in person (in line with COVID-19 guidance) rather than by phone to measure the effectiveness of testing by phone. We successfully tracked 90 percent of students. We supplemented the impact evaluation with qualitative research to help us understand the findings. The results are out in a new working paper, and here are three main takeaways:
1. Teacher phone calls had no effect on test scores
This was contrary to our hypothesis and is robust to controls for student background, to school fixed effects, and to whether the tests were administered by phone or in person. Tutorials did increase student activity and parent engagement, but did not increase retrospectively reported time spent on learning. Parents reported that calls lasted an average of 22 minutes, and that children spent on average just over one hour per day listening to educational radio. Some parents and students found the programme helpful: one student reported that, “You don’t see the teacher but the teacher that was teaching was very nice and sometimes will teach me in Mende just for me to understand. She will give me an assignment and text every week.” Others, less so, mostly because it could be difficult to find a convenient time or conducive conditions for the tutoring calls. As one parent put it, “Do you think the market is a convenient place for her to learn?” (These and other quotes come from a complementary qualitative study.)
2. Private school teachers worked harder than public school teachers—but to no avail
We also randomised whether the teachers making the calls were private school teachers employed by Rising Academies or regular public school teachers. We hypothesised that teachers employed by Rising Academies would have higher implementation fidelity, given that their full-time job might be on the line, whereas for the public school teachers this was just a side gig. This hypothesis was borne out. Out of a maximum of 16 potential calls per child per subject, students in the private school teacher group received an average of ten calls in mathematics and nine calls in language, compared with an average of seven calls in mathematics and six in language for students in the government school teacher group. However, there were no discernible differences in students’ test scores between the groups.
3. Learning assessments by phone may not be reliable
Children surveyed in person scored much worse than children surveyed by phone: 0.44 standard deviations worse in mathematics. (This is a big difference: bigger than the impact reported for the vast majority of interventions. In practice, it means that children interviewed in person answered an average of 7 of 12 maths problems correctly, while children interviewed by phone answered 8.4 of 12 correctly.) Some of that may be due to differences between the students reached in person during the follow-up survey and those who could only be reached by phone. When we rely on random assignment to survey mode (i.e., in person or by phone) to measure this difference, results are inconclusive.
Nevertheless, we do see stark differences in the types of questions kids can answer in person versus by phone, raising concerns about the comparability of the methods. This is an important caveat to the optimism about phone-based assessment at the start of the pandemic. Our qualitative research suggests that the difference in test scores between in-person and phone assessments may, in part, be explained by parents helping children who were surveyed by phone, despite interviewers’ requests that they not do so. One student said that “Yes, my sister was with me during that time… Yes, if they asked me questions and I did not know the answers she would help me”; and one parent admitted that he spoke in his student’s ear “many times and told him the answers.”
At the very least, these results suggest that assessments administered by phone should not be used for decisions about the future of students.
Other interventions involving teacher phone tutorials during the pandemic have had mixed results, with positive impacts in Botswana and Bangladesh and no impact in Kenya. One difference between the interventions that did identify impacts and those that did not is that the former worked with a subset of students who opted into the intervention. Seeking to reach the broader population of all students, as our intervention did, may pose greater challenges. In Kenya, the lowest-performing students, those who might benefit the most from tutoring calls, were the least likely to receive them.
The pandemic is not over. Low vaccination rates, variants of the virus, and surges of COVID-19 cases may lead to future school closures. And beyond this pandemic, schools will continue to close in individual countries for various reasons. This study suggests that radio lessons, even complemented with follow-up from teachers, may not deliver the desired gains. It also shows a need for more experimentation to find ways to help children learn when schools are closed, particularly in environments where students lack access to high-quality internet-based instruction.
CGD blog posts reflect the views of the authors, drawing on prior research and experience in their areas of expertise. CGD is a nonpartisan, independent organization and does not take institutional positions.