Looking Back at 2020 in Education Research
By The Department of Education and Child Protection Research Team
Understanding what works for children’s education is always complex, requiring adaptive thinking, new measurement approaches, and continuous improvement of data-sharing solutions. This was especially true in 2020. The Department of Education and Child Protection’s Research Team continues to adapt and innovate to ensure that it collects quality data to inform its understanding of what helps children learn and thrive. This post highlights recent examples of the innovative and adaptive work done by the team.
Innovative Tool Design for Literacy, Numeracy and Social-Emotional Skill Measurement
In order to make data-driven decisions about what works for children, the measures used to evaluate learning progress and program outcomes must be appropriate for the contexts within which they’re being used. Save the Children is committed to using tools that have been proven to be valid and reliable to evaluate our education programs, and sometimes this means innovating and creating new measurement tools.
Creating Linguistically and Culturally Appropriate Tools for Academic and Social and Emotional Learning in Syria
For children displaced by the ongoing conflict in Syria, service-provider NGOs found that little was known about whether children were acquiring the foundational literacy, numeracy, and social and emotional skills critical for long-term learning and well-being. To address this gap, several organizations administered a variety of academic and social and emotional learning (SEL) assessment tools. However, these tools were developed in Western contexts and had not been proven reliable or valid within the Syrian context.
For example, most assessments were translated from English into Arabic and likely failed to capture important linguistic (e.g., diglossia), curricular (e.g., orthography, script), and cultural (e.g., gender norms) factors within the Syrian context.
To address these problems, Save the Children and our partners at UNICEF and NYU Global TIES for Children created a new model to more accurately gather data about children’s education and wellbeing. The process included four distinct steps:
1. First, the partners reviewed a series of secondary psychometric analyses using five datasets collected from Syrian children in grades two through four. The analyses were designed to assess the construct validity and reliability of the data collected.
2. Second, we built on this work by engaging 17 regional measurement experts and other stakeholders in a four-day workshop to add their own expertise in skills assessment in Arabic and to systematically review and improve the psychometric properties of the tools.
3. Third, we pilot tested the newly developed instrument with over 1,500 children in the Northeast, Northwest, and Euphrates Shield regions of Syria.
4. Fourth, we conducted new psychometric analyses and worked with the team of regional experts to refine and finalize the instrument.
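The post does not specify which reliability statistics the psychometric analyses in steps 1 and 4 used. One common internal-consistency statistic for assessment data of this kind is Cronbach’s alpha; the sketch below computes it from scratch on hypothetical item responses (the data, and the choice of alpha itself, are illustrative assumptions, not details from the Syrian datasets).

```python
# Minimal sketch of an internal-consistency (Cronbach's alpha) check, one
# common statistic when assessing the reliability of assessment data.
# The item responses below are hypothetical.

def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per item,
    aligned by respondent. Returns Cronbach's alpha."""
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents

    def var(xs):                        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_variances = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_variances / var(totals))

# Hypothetical binary (1 = correct) responses: 4 items, 6 children
responses = [
    [1, 1, 0, 1, 0, 1],
    [1, 0, 0, 1, 0, 1],
    [1, 1, 0, 1, 1, 1],
    [1, 1, 0, 0, 0, 1],
]
print(round(cronbach_alpha(responses), 3))  # → 0.823
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, though the actual thresholds and statistics used by the partners are not stated in this post.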
This approach improved on prior efforts to assess learning in the region that had been based on template tools and guidance not created specifically for the local languages and contexts. We plan to use and adapt this model for other contexts across the globe, advocating for rigorous, relevant, and appropriate instrument development with stakeholders across the development and education continuum. By drawing upon a team of experts fluent in local language and highly trained in measurement, combining more learning domains in a similar amount of time used for a single early grade assessment, and testing the psychometric properties of the new measure, we will be able to create customized tools to assess children’s learning and wellbeing in a linguistically and culturally appropriate way.
Improving and Expanding Early Grade Reading Assessments in Laos
Around the world, more than 40 countries use the Early Grade Reading Assessment (EGRA) to measure students’ early grade reading skills through a variety of sub-tasks. A major limitation of the tool’s current structure is that it jumps from reading single words to items that require reading paragraphs of text, leaving a gap in the assessment of intermediate skills at the phrase and short-sentence level. To address this problem, Save the Children developed and tested a revised instrument with 785 Grade 2 children in rural Laos, adding two pilot sub-tasks intended to measure children’s ability to read and understand simple phrases and short sentences.
The first pilot sub-task required children to choose, from four pictures, the one that best matched a short phrase. The second asked children to identify whether a simple sentence was ‘true’ (e.g., ‘Cows eat grass’) or ‘silly’ (e.g., ‘Fish live in trees’). To reduce the possibility of students answering correctly merely by chance, questions were asked in pairs, with students earning a point only when they answered both items in the pair correctly.
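The paired-scoring rule above can be sketched in a few lines. The boolean response encoding is an assumption for illustration; the post does not describe the actual data format.

```python
# Sketch of the guessing-robust paired scoring rule: items are asked in
# pairs, and a child earns a point only when both items in the pair are
# answered correctly. Response encoding (True = correct) is illustrative.

def paired_score(responses):
    """responses: flat list of per-item booleans, with consecutive items
    forming a pair. Returns the number of pairs answered fully correctly."""
    pairs = zip(responses[0::2], responses[1::2])
    return sum(1 for first, second in pairs if first and second)

# Hypothetical child: 3 pairs (6 items); only the first pair fully correct
answers = [True, True, True, False, False, True]
print(paired_score(answers))  # → 1
```

Note why this dampens guessing: with four-option items, a child guessing randomly has a 1-in-4 chance per item but only a 1-in-16 chance of earning the point for a pair.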
In Laos, the validity and reliability results for the pilot sub-tasks were robust. The sentence-level sub-tasks showed more variation in the distribution of student scores, with a lower ceiling than the word-picture recognition task and a lower floor than the paragraph comprehension task. The sentence-level sub-tasks also correlated well with other established sub-tasks and with student characteristics. Given these encouraging results, we plan to continue testing these new sub-tasks in additional countries in 2021.
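The floor and ceiling comparison above amounts to tabulating the share of children at the minimum and maximum possible score on each sub-task. A minimal sketch, on made-up scores (the numbers and sub-task labels are illustrative, not the Laos data):

```python
# Sketch of a floor/ceiling check across sub-tasks: the share of children
# at the minimum (floor) and maximum (ceiling) possible score.
# Scores below are made up for illustration; they are not the Laos data.

def floor_ceiling(scores, max_score, min_score=0):
    """Return (floor, ceiling): fraction of children at min/max score."""
    n = len(scores)
    floor = sum(1 for s in scores if s == min_score) / n
    ceiling = sum(1 for s in scores if s == max_score) / n
    return floor, ceiling

subtasks = {
    "word_picture": ([10, 10, 9, 10, 8, 10, 10, 7], 10),  # clusters at top
    "sentence":     ([4, 7, 2, 9, 5, 1, 8, 6], 10),       # more spread
    "paragraph":    ([0, 0, 3, 0, 5, 0, 2, 0], 10),       # clusters at zero
}
for name, (scores, max_s) in subtasks.items():
    f, c = floor_ceiling(scores, max_s)
    print(f"{name}: floor={f:.0%} ceiling={c:.0%}")
```

A sub-task with a low floor and a low ceiling, like the hypothetical sentence-level scores here, discriminates across a wider range of ability, which is the pattern the pilot results describe.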
[Figure: Distribution of percent correct for word-, paragraph-, and sentence-level sub-tasks]
Adapting to a Remote Assessment for the Return to Learning Program During COVID-19
The combined impact of the Venezuelan refugee crisis, internal conflict, and COVID-19 has been devastating for both Colombian and Venezuelan children’s access to quality education. The evidence and lessons from piloting and validating a remote approach to Save the Children’s Return to Learning (RtL) program in Colombia will inform program expansion in other countries and regions where conflict, natural disaster, and/or health emergencies such as COVID-19 keep children from realizing their right to education.
In response to COVID-19 school closures, SC Colombia needed to find a way to rapidly adapt the RtL program from in-person to remote implementation for children in Arauca and Cali. Programmatic adaptations included modifying content for delivery via podcasts and distributing at-home activity kits.
In addition to adapting program materials for a remote approach, we pivoted to a remote monitoring and evaluation (M&E) approach that includes a baseline-endline assessment and weekly check-in calls with caregivers via WhatsApp. We measure children’s prosocial communication and emotional regulation while they engage in RtL activities using the Social Competence Scale at baseline and endline. We also monitor children’s learning and engagement in activities via weekly check-in calls and comprehension questions based on activity content.
Emerging program results from the first two cohorts show that caregivers reported gains in emotional regulation skills for all children (n=120). Caregiver engagement in the learning content was high: 81% of caregivers in cohort one and 90% of caregivers in cohort two reported helping their child with at-home learning content. Parental engagement, measured by the number of times caregivers reported using at-home content with their children, was directly correlated with higher child wellbeing outcomes. We also found differences in wellbeing outcomes along equity lines of age, sex, socio-economic status (SES), nationality, and location.
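The engagement-outcome relationship described here is a simple bivariate association; one way to quantify it is a Pearson correlation, sketched below on hypothetical numbers (the values, and the use of Pearson rather than whatever statistic the Colombia team actually applied, are illustrative assumptions).

```python
# Minimal Pearson correlation sketch for the kind of engagement-outcome
# relationship described above. All values below are hypothetical.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical: times caregivers used at-home content vs. wellbeing score
engagement = [1, 2, 2, 3, 4, 5, 6, 7]
wellbeing = [50, 52, 55, 58, 60, 66, 70, 73]
r = pearson_r(engagement, wellbeing)
print(f"r = {r:.2f}")  # strongly positive on this illustrative data
```

A correlation like this does not by itself establish that engagement causes better outcomes; the blended-versus-remote comparison planned for 2021 is one way to probe that question more directly.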
Based on these findings, SC Colombia plans to expand the RtL program in 2021, including in informal settlements in La Guajira, a region in northeast Colombia that borders Venezuela. We also plan to pilot a blended approach combining remote and in-person activities, which will let us compare results against those of children in the remote-only program to see whether one model produces better outcomes for children.
Visualizing Data in New Ways with IDELA
One of the most distinctive tools pioneered by the DECP Research and ECCD Teams is the International Development and Early Learning Assessment (IDELA). IDELA is a rigorous tool for measuring children’s learning and development, and robust translation and adaptation guidance for IDELA enables reliable measurement in communities around the world.
The IDELA Network database now represents the largest repository of information about children’s school readiness in the world. In 2020, the IDELA Team took on a new challenge: how to enable better interpretation and use of IDELA data for program improvement, advocacy, and ultimately change for children. The solution needed to be appropriate for a variety of stakeholders, from parents to policy-makers, and allow them to leverage information from the growing IDELA database.
After a year of development, the IDELA Data Explorer launched in August 2020 as a dynamic new data visualization and analysis tool. The Data Explorer is open to the public and shares information about the effectiveness and equity of ECD interventions. This effort continues Save the Children’s commitment to innovation and the creation of global goods for the ECD field.
From a global pandemic to the massive, ongoing displacement of millions of people around the world, 2020 has been a year of challenges for children, and for researchers at Save the Children working with country teams across the globe to find new ways to understand what children need to learn and achieve their full potential. The DECP Research Team remains committed to data-driven research, adaptive thinking, and innovation in order to produce the best outcomes for children, caregivers, and teachers, even in the most challenging contexts.