Every year, most Ministries of Education (MoEs) conduct a school census.
School directors indicate the resources their school possesses (e.g., books, a library), the characteristics of their teachers (e.g., gender, qualifications) and the number of students enrolled. This data feeds into the Education Management Information System (EMIS) and is an indispensable tool for education stakeholders.
What have MoEs and the DMS team learnt from linking administrative datasets?
The Data Must Speak (DMS) research team at UNICEF Innocenti has worked closely with Ministry of Education partners to co-create and conduct in-depth analyses of administrative data.
First, we linked the information on schools over time, which allowed us to connect important pieces of data, for example, knowing how many girls and boys were enrolled in each grade for several consecutive years. This made it possible to track cohorts of students and know which schools were best at retaining and promoting students. Additionally, where possible, the DMS team linked school census data to exam data to determine which schools were most effective at teaching students.
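Linking school information over time typically amounts to merging census extracts from consecutive years on a school identifier and following each grade into the next. A minimal sketch in Python with pandas — the column names (`school_id`, `grade`, `girls`, `boys`) and enrolment figures are invented for illustration, not the real EMIS schema:

```python
import pandas as pd

# Hypothetical census extracts for two consecutive years.
census_2021 = pd.DataFrame({
    "school_id": ["S01", "S01", "S02"],
    "grade": [1, 2, 1],
    "girls": [40, 35, 25],
    "boys": [42, 30, 28],
})
census_2022 = pd.DataFrame({
    "school_id": ["S01", "S01", "S02"],
    "grade": [2, 3, 2],
    "girls": [36, 33, 20],
    "boys": [38, 27, 24],
})

# Follow a cohort: grade g in year t should reappear as grade g+1 in year t+1.
cohort = census_2021.assign(next_grade=census_2021["grade"] + 1).merge(
    census_2022,
    left_on=["school_id", "next_grade"],
    right_on=["school_id", "grade"],
    suffixes=("_2021", "_2022"),
)

# A crude cohort-retention proxy: next year's enrolment over this year's.
cohort["retention_girls"] = cohort["girls_2022"] / cohort["girls_2021"]
print(cohort[["school_id", "grade_2021", "retention_girls"]])
```

Schools with consistently high retention ratios across grades are candidates for the "best at retaining and promoting students" comparison described above; the same merge, joined to exam records, supports the effectiveness analysis.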
Then, using this data, the research team was able to link key educational inputs (e.g., textbooks, number of teachers) with performance, helping us understand the current state of education as well as its challenges, and identify ways to improve students' school retention and learning.
For instance, in Togo, the DMS research showed that female students are more likely to be promoted to the next grade and to score higher in exams when their teacher is a woman. This is an important finding, as recruiting more female teachers could help close the gender gap in the country, as illustrated in Figure 1 below.
Figure 1: Simulated effects of doubling the share of female teachers on the gender gap in promotion rates and exam success in Togo.
In Madagascar, Nepal and Togo, the DMS research analysed the relationship between student-teacher ratio and students’ performance, which informed MoEs about the potential educational benefits of recruiting more teachers. Various other educational inputs such as canteens, textbooks, or teachers’ qualifications were analysed, and reports from all participating countries are published on our website.
An additional benefit of the DMS co-creation approach (Co-Creation Part 1, Co-Creation Part 2) is the documentation of best practices and recommendations on how to improve overall EMIS data collection, cleaning, merging and analysis. In Côte d'Ivoire and Ghana, these recommendations sparked an initiative to create unique school IDs, facilitating the linking of school information over time.
What is the future of research with schools' administrative datasets?
Integrating other datasets within EMIS could open new opportunities
Although these analyses are informative, they are still only scratching the surface when it comes to the potential of administrative data in informing education policy.
Firstly, the monitoring of the education sector could be improved by adding new layers of information in EMIS. For instance, in Niger, the DMS team linked information on schools with local data on poverty to identify the contexts that schools operate in. This could be done systematically in all countries to measure socio-economic inequalities in education and monitor progress.
Similarly, in Côte d’Ivoire, we used information on the presence of cocoa fields near the school. Geo-coded data on violence could be used in countries with high levels of insecurity to understand how violence affects students, and climate and natural disaster data will become increasingly important in helping schools adapt to the consequences of climate change.
With initial technical support from the DMS research team and our partners, statisticians in MoEs can build their skills to link this data and report these trends regularly to policymakers.
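In practice, this kind of linking often reduces to a join on a shared administrative-area code or name. A minimal sketch, where the district names, drop-out rates, and poverty rates are all invented for illustration:

```python
import pandas as pd

# Hypothetical EMIS school records with a district identifier.
schools = pd.DataFrame({
    "school_id": ["S01", "S02", "S03"],
    "district": ["Dosso", "Tahoua", "Dosso"],
    "dropout_rate": [0.08, 0.15, 0.11],
})

# Hypothetical district-level poverty data from a separate source.
poverty = pd.DataFrame({
    "district": ["Dosso", "Tahoua"],
    "poverty_rate": [0.41, 0.62],
})

# Attach local poverty context to each school record.
linked = schools.merge(poverty, on="district", how="left")

# First look at socio-economic inequality in education outcomes:
# do schools in poorer districts lose more students?
corr = linked["dropout_rate"].corr(linked["poverty_rate"])
print(f"Correlation between district poverty and drop-out: {corr:.2f}")
```

Once a Ministry's statisticians maintain a table like `poverty` (or geo-coded violence and climate data) keyed to the same administrative units as EMIS, refreshing such indicators each census round is a routine join rather than a one-off research exercise.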
Regularly monitoring outcomes of education policies
Secondly, it is clear that EMIS data is underused when it comes to measuring the impact of education policy.
Was a policy regarding mass textbook delivery successful at raising exam success rates? Did the drop-out rate decrease when the MoE expanded the number of canteens? In our analyses, the DMS research showed that EMIS data has the potential to answer these questions and, by working closely with MoE experts, to identify which schools or regions benefitted from specific policies. We can then compare their performance before and after the policy was implemented with that of schools and regions that did not benefit from it, to precisely measure the impacts of these policies.
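The before/after comparison between benefitting and non-benefitting schools described above is, in essence, a difference-in-differences estimate. A minimal sketch on toy data — the school IDs, the textbook-policy flag, and the pass rates are all invented for illustration:

```python
import pandas as pd

# Toy exam pass rates, before and after a textbook-delivery policy,
# for schools that did (1) and did not (0) receive textbooks.
df = pd.DataFrame({
    "school_id": ["A", "A", "B", "B", "C", "C", "D", "D"],
    "received_textbooks": [1, 1, 1, 1, 0, 0, 0, 0],
    "period": ["before", "after"] * 4,
    "pass_rate": [0.50, 0.62, 0.45, 0.58, 0.52, 0.55, 0.48, 0.50],
})

# Difference-in-differences: the change among benefitting schools,
# minus the change among comparison schools over the same period.
means = df.groupby(["received_textbooks", "period"])["pass_rate"].mean()
treated_change = means.loc[(1, "after")] - means.loc[(1, "before")]
control_change = means.loc[(0, "after")] - means.loc[(0, "before")]
did = treated_change - control_change
print(f"Estimated policy effect on pass rate: {did:.3f}")
```

Subtracting the comparison schools' change nets out trends that would have happened anyway (e.g., a nationwide improvement in exam results), isolating the part attributable to the policy, under the usual assumption that the two groups would otherwise have moved in parallel.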
Ideally, these types of analyses would be done systematically by MoEs, through education labs, whenever new policies are implemented. Our current work with MoE partners in various countries facilitates capacity development. The work that DMS and MoE experts are doing to integrate datasets and analyse data could lay the foundation of future education labs, to enhance the transformative impact of data.
Embed impact evaluations within administrative data to test new education policies
Thirdly, EMIS data could be used to test new education policies and measure if they are successful.
Impact evaluations, including Randomized Controlled Trials (RCTs), are considered the gold standard when it comes to program evaluation, but they are usually quite time consuming and expensive to implement. However, in some cases, it is possible to embed RCTs within administrative data collection and save on costs.
For instance, a study conducted by the World Bank in Pakistan embedded an RCT within routine data collection to evaluate the impact of school management committees (SMCs) on school performance. Like the Pakistan study, the DMS research could leverage integrated data of this kind to test new education policies. Rigorously measuring the impact of new policies is critical to improving education systems.
Often, lack of data is seen as a barrier to creating evidence-based policy. However, when it comes to education, the vast majority of existing administrative data is still undervalued and underused.
Harnessing the potential of existing datasets is not only cost-effective but sustains ownership and investment in national data systems. The DMS research plans to continue co-creating our research with MoEs and local academia, to leverage data in innovative ways and inform education policies for greater impact.
The DMS Positive Deviance research aims to mitigate the learning crisis by using existing data to understand the behaviours and practices of exceptional schools (i.e., positive deviant schools). It is co-created and co-implemented with Ministries of Education, partners, and key stakeholders. The DMS research relies on mixed methods and innovative approaches (i.e., positive deviance, behavioural sciences, implementation research, and scaling science) to generate knowledge and practical lessons about ‘what works,’ ‘why,’ and ‘how’ to scale grassroots solutions for national policymakers and the broader international community of education stakeholders.
The DMS research is currently implemented in 14 countries: Brazil, Burkina Faso, Chad, Côte d'Ivoire, Ethiopia, Ghana, the Lao People’s Democratic Republic, Madagascar, Mali, Nepal, Niger, the United Republic of Tanzania, Togo, and Zambia. It is made possible through a coalition of donors: Global Partnership for Education (GPE) and International Development Research Centre (IDRC) Knowledge and Innovation Exchange (KIX), Hewlett Foundation, Jacobs Foundation, Norwegian Agency for Development Cooperation (Norad), Schools2030 initiative (Aga Khan Foundation), and UNICEF internal resources.