Disrupting Education? Experimental Evidence on Technology-Aided Instruction in India
While school enrollment has increased substantially in low- and middle-income countries over the past twenty years, learning levels remain low. Educational technology is widely seen as a promising tool for improving learning outcomes, yet despite the excitement around it, there is limited evidence to date on its impacts and cost-effectiveness. Researchers evaluated a computer-based adaptive learning platform (Mindspark) for secondary school students in urban India to measure the impact of customized learning technology on student test scores. The program increased test scores across all groups of students and was cost-effective compared to traditional schooling models.
Low- and middle-income countries have made impressive progress in improving school enrollment and completion over the last two decades, yet learning levels remain low. In India, for example, over 50 percent of students in grade 5 cannot read at the grade 2 level, despite primary school enrollment rates of over 95 percent. The rapid expansion of education in low- and middle-income countries has brought millions of first-generation learners into school, and these students often lack instructional support when they fall behind the curriculum. Students who fall behind may then learn very little in school if classroom instruction is pitched considerably above their learning level.
While pedagogical interventions that aim to “Teach at the Right Level” with human support have been successful at the primary level, there is very little evidence to date on effective instructional strategies for post-primary settings with a wide range of student learning levels. One promising option for addressing this challenge is to make greater use of technology-aided instruction, particularly if it can deliver individually customized content that teaches at the level of each student. However, while such instruction may have great potential to improve post-primary education, there are few notable successes to date.
Context of the evaluation
The intervention was administered in three stand-alone Mindspark centers in Delhi that serve low-income neighborhoods. The 619 student participants—mostly in grades 6 through 9—came from five public middle schools close to the Mindspark centers. In the population sampled, average student achievement was several grade levels behind grade-appropriate standards, and the gap grew by grade: the average grade 6 student was around 2.5 grade levels below grade 6 standards in math, and by grade 9 this deficit had increased to 4.5 grade levels. The default classroom instruction, based on grade-appropriate textbooks, was therefore likely well above the preparation level of academically weaker students. Additionally, within-grade learning levels varied widely; the highest- and lowest-achieving students in the same grade typically spanned five to six grade levels in preparation, with the majority of students below grade-level standards. Finally, the academically weakest students may have made no academic progress over a given school year despite being enrolled.
Developed by Educational Initiatives, Mindspark is computer-assisted learning (CAL) software that provides students with personalized instruction. At the time of the study, it had been used by over 400,000 students, had a database of over 45,000 test questions, and administered over one million questions across its users every day. Mindspark is interactive and uses games, videos, and activities drawn from an extensive body of high-quality instructional materials to continuously assess students while also providing thorough explanations and feedback. A key feature of the platform is its ability to use data to identify the learning level of every student, deliver customized content targeted at that level, and dynamically adjust to the student’s progress. Mindspark can be delivered through computers, tablets, and smartphones; it can be used online or offline; and it can be implemented in school classrooms, in after-school programs, or through self-guided study.
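The core loop described above—estimate a student’s level, serve content at that level, and adjust as answers come in—can be sketched in a few lines. This is a minimal illustration only, not Mindspark’s actual algorithm; the class name, the single-number level estimate, and the fixed-step update rule are all simplifying assumptions.

```python
# Illustrative sketch of dynamic difficulty adjustment. All names and
# the update rule are hypothetical; real adaptive platforms use far
# richer student models and item banks.

class AdaptiveTutor:
    def __init__(self, start_level=1.0, step=0.25,
                 min_level=1.0, max_level=9.0):
        self.level = start_level   # estimated grade level of the student
        self.step = step           # how far to move after each answer
        self.min_level = min_level
        self.max_level = max_level

    def next_item_level(self):
        """Target content at the current estimate, not the official grade."""
        return self.level

    def record_answer(self, correct):
        """Nudge the estimate up on a correct answer, down otherwise."""
        delta = self.step if correct else -self.step
        self.level = min(self.max_level,
                         max(self.min_level, self.level + delta))

# A grade 6 student starting well below grade level is moved up
# gradually on correct answers; wrong answers move the target back down.
tutor = AdaptiveTutor(start_level=3.5)
for correct in [True, True, False, True]:
    tutor.record_answer(correct)
print(tutor.next_item_level())  # estimate after four answers
```

The point of the sketch is that the content served tracks the student’s estimated level rather than the official grade-level curriculum, which is what lets such a system “teach at the right level” for a classroom spanning several grade levels of preparation.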
Details of the intervention
Researchers partnered with Educational Initiatives to test the impact of the Mindspark platform on student test scores in mathematics and Hindi. The version of Mindspark evaluated provided students with 45 minutes on the CAL software and 45 minutes of instructor-led small-group instruction. Children signed up for the program by selecting a 90-minute slot; each slot included about 15 students and ran six days a week. Typically, parents paid INR 200 (USD 3) per month to send their children to the program.
Among the 619 students recruited for participation, around half were offered a voucher for free attendance at a Mindspark center starting in late 2015. During the self-driven learning period, each child was assigned to a computer with software that provided customized activities on math, Hindi, and English. During the small group instruction, teaching assistants covered core concepts that were not customized to each student’s learning level, and provided time for students to complete homework assignments. To measure the impact of the program on student achievement, researchers tested students in math and Hindi at the beginning and end of the program—a gap of about 4.5 months—at the Mindspark centers.
Results and policy lessons
Among students offered the free voucher, 58 percent attended the Mindspark centers. Being offered a voucher to attend a Mindspark center increased learning levels across all groups of students, and the program was cost-effective compared to other instruction types.
Independently-collected Test Scores: The program improved performance in both math and Hindi across multiple grade levels. Students offered a voucher scored 0.37 standard deviations higher in math, improving by over twice as much as students in the comparison group. Students offered a voucher also scored 0.23 standard deviations higher in Hindi, improving by 2.4 times as much as students in the comparison group. Impacts did not vary significantly by level of initial achievement, gender, or wealth, implying that the program was equally effective in teaching all students. However, the relative impact was much greater for weaker students, since their rate of progress in the comparison group (value-added) was much lower than that of better-performing students, and not distinguishable from zero.
School Test Scores: Researchers also used administrative data on students’ school test scores to measure their performance on material at their official grade level. In Hindi, where the learning gap was smaller and Mindspark presented material at students’ official grade level, the program had positive impacts on school test scores. In math, the program had no effect. Researchers hypothesized that since students were usually several grade levels behind in math, the school exams would still be beyond their learning level, even after the improvements they had made.
Cost-Effectiveness: Mindspark was cost-effective. The per-student cost of the program was around INR 1,000 (around US$15) per month, compared to around INR 1,500 (US$22) per month in per-student spending at the Delhi public schools the students came from. Researchers expect that the per-student cost would fall to under US$2 per month if the program were scaled up to a larger number of students.
These evaluation results are being used by the program implementer, Educational Initiatives, to inform potential scale-ups of the intervention in government schools in multiple states. One at-scale randomized evaluation in Rajasthan is integrating Mindspark into classroom settings within government schools.
Muralidharan, Karthik, Abhijeet Singh and Alejandro J. Ganimian. 2019. "Disrupting Education? Experimental Evidence on Technology-Aided Instruction in India." American Economic Review 109 (4): 1426-60. doi: https://doi.org/10.1257/aer.20171112