Promoting Equity through Data: Our New Program Dashboard

Lindsey Wang is a Program Analyst at the Luminos Fund where she is instrumental in program monitoring, evaluation, and reporting. She joined Luminos in 2016 as a Mechanical Engineering graduate of MIT and is currently pursuing a Master in Public Policy at Harvard Kennedy School.

Why build a data dashboard?

COVID-19 has interrupted students’ learning all around the world. Now, more than ever, the international education community needs effective tools to analyze data in real time and spur equitable solutions to close learning gaps. As Luminos’ Program Analyst, I ensure that rigorous data collection and analysis sit at the foundation of program management and support efficient service delivery, because in the end each data point represents an individual or a community. With over 1 billion students slowly returning to school after COVID-19 closures (UNESCO), we need faster feedback loops to identify and address learning gaps and better meet the needs of every student and family.

One year ago, I set out to develop a tool for program managers to leverage the wealth of data collected from the field to drive program delivery. I sought to capture a real-time snapshot of the state of our Second Chance program by integrating both quantitative and qualitative data into our model, thus ensuring that vital institutional knowledge and first-hand observations could be shared across far-reaching geographies. Our solution: a data dashboard that helps us monitor program results in real time and deploy resources more efficiently, built around three simple objectives:

  1. Monitor Key Performance Indicators (KPIs)
  2. Capture a holistic view of the program and drill down to granular insights
  3. Identify struggling students and facilitators

With the data dashboard, we can easily contrast how students performed on the Early Grade Reading and Math Assessments at the beginning and end of our program. The vast majority of students enter our program unable to read even a single word. In 2018-19, students graduated from our program reading an average of 40 words per minute, well above the national average.
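The baseline-to-endline contrast described above can be sketched in a few lines of Python. This is a minimal, illustrative example: the record fields, student IDs, and scores are invented for demonstration and do not reflect the actual Luminos data schema.

```python
from statistics import mean

# Hypothetical assessment records; field names and values are
# illustrative only, not the real Luminos schema.
students = [
    {"id": "S001", "baseline_wpm": 0, "endline_wpm": 38},
    {"id": "S002", "baseline_wpm": 2, "endline_wpm": 45},
    {"id": "S003", "baseline_wpm": 0, "endline_wpm": 41},
]

def wpm_summary(records):
    """Contrast average words-per-minute (WPM) at baseline and endline."""
    baseline = mean(r["baseline_wpm"] for r in records)
    endline = mean(r["endline_wpm"] for r in records)
    # Students who entered the program unable to read a single word.
    nonreaders = sum(1 for r in records if r["baseline_wpm"] == 0)
    return {
        "baseline_avg_wpm": round(baseline, 1),
        "endline_avg_wpm": round(endline, 1),
        "entered_unable_to_read": nonreaders,
        "avg_gain_wpm": round(endline - baseline, 1),
    }

print(wpm_summary(students))
```

A dashboard surfaces exactly this kind of summary automatically, refreshed as new assessment data arrive from the field.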

A dashboard for program management

Given my past experience as an engineer applying user-centered design principles to develop products to support low-resource communities, I believed it critical that we prioritize our intended users (program staff) throughout the dashboard development process to ensure we built a tool that met their needs. To do so, I built iteration and feedback into the design process: at each stage of development, I solicited feedback and co-created elements alongside program staff.

My constant engagement with program staff in Liberia and within the Luminos HQ informed the creation of four distinct dashboard reports:

  • The Program Overview captures an up-to-date snapshot of the program by pairing student enrollment information with internal field reports and spot checks. Program staff can drill down and filter data by program year, region, and implementing partner.
  • The Student Assessments dashboard captures the distribution of scores for literacy, numeracy, and words per minute (WPM) in each phase and allows users to disaggregate data by region, implementing partner, student demographics, or classroom rating.
  • The Classroom Observations dashboard enables users to review a log of classroom observations from field visits conducted by program coordinators. KPIs include facilitator performance and internal measurements of attendance and words read per minute.
  • The Baseline and Endline dashboard compares the results from our external baseline and endline EGRA/EGMA surveys from the program level down to the individual student level.

With the Student Assessments dashboard, we can investigate the distribution of assessment scores across three subjects: literacy, numeracy, and words read per minute (WPM). Our program staff can drill all the way down to the classroom level to identify struggling students.
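The disaggregation behind these reports amounts to grouping the same records by different dimensions. Here is a rough Python sketch of that idea; the regions, classrooms, and scores are invented for illustration and are not drawn from actual program data.

```python
from collections import defaultdict
from statistics import mean

# Illustrative records only; keys and values do not reflect the
# real dashboard's underlying data.
scores = [
    {"region": "Bong", "classroom": "A", "literacy": 62},
    {"region": "Bong", "classroom": "B", "literacy": 48},
    {"region": "Montserrado", "classroom": "C", "literacy": 71},
    {"region": "Montserrado", "classroom": "C", "literacy": 55},
]

def disaggregate(records, key, metric="literacy"):
    """Group assessment records by any dimension (region, classroom,
    implementing partner, ...) and report the average score per group."""
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r[metric])
    return {g: round(mean(v), 1) for g, v in sorted(groups.items())}

print(disaggregate(scores, "region"))     # program-level view by region
print(disaggregate(scores, "classroom"))  # drill down to classroom level
```

The same function serves both the program-level overview and the classroom-level drill-down; only the grouping key changes, which is what makes a single dashboard useful at every zoom level.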

Dashboards in practice

The potential applications of the dashboard are vast. Here is a taste of what I hope to achieve once this new tool is implemented:

  1. Diagnose barriers: Imagine we notice a classroom in which most students scored below the program average. The dashboard allows us to examine this data point in context. Has there been an economic shock in the community that caused parents to withdraw their children to work? We can compare the performance of the classroom in question against other classrooms in the same or neighboring communities to determine if this is a shared phenomenon. Perhaps the issue lies with the facilitator. We can review the classroom observations logged by program coordinators over the prior weeks to determine if the facilitator is struggling to grasp the principles of Second Chance’s activity-based pedagogy.
  2. Map trends: With data stored in a centralized database, we can combine external baseline and endline data with internal midline and phase-level assessments to create a picture of students’ learning trajectories.
  3. Promote equity: Luminos disaggregates data by region, implementing partner, and student demographic information — such as gender — to promote equity in program delivery. The success of our Second Chance program has always depended on strong partnerships with leaders and advocates in the community who help us localize the program to meet students and families where they are. With this dashboard, we can easily assess how different sub-populations are performing and address their specific barriers to learning.
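The first step in both diagnosing barriers and promoting equity is spotting which classrooms or sub-populations lag the program-wide average. A minimal sketch of that screen, with invented classroom averages and an assumed flagging threshold (80% of the program mean, chosen here purely for illustration):

```python
from statistics import mean

# Hypothetical classroom-level average scores; values are invented.
classroom_avgs = {"A": 55.0, "B": 31.0, "C": 63.0, "D": 28.5}

def flag_below_average(avgs, margin=0.8):
    """Flag classrooms scoring below a fraction of the program-wide mean,
    so coordinators know where to investigate first."""
    program_avg = mean(avgs.values())
    threshold = margin * program_avg
    return sorted(c for c, v in avgs.items() if v < threshold)

print(flag_below_average(classroom_avgs))
```

A flag like this is a prompt for investigation, not a verdict: as described above, the next step is reviewing context such as community conditions and classroom observation logs.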

As Second Chance classes gradually reopen from months of school closures and interim distance learning efforts, our team is committed to supporting our students, classroom facilitators, and communities. I am currently training our program coordinators to use the dashboard to inform their management practices in anticipation of classes resuming. We have already begun the orientation process and will dive deeper into each of the dashboard visualizations in the coming months. My hope is that eventually our program staff will bring their own creativity and curiosity to the dashboard and derive unique insights from the data.

The Classroom Observations and Ratings report helps our program coordinators catalog and review their notes from field visits to classrooms.

So, you want to build a data dashboard…

Design is fundamentally an iterative process. Here are some of the lessons we have learned through many rounds of feedback:

  • Build for your end users: Who will use the dashboard, and how will they engage with the tool? User interviews and feedback testing are great ways to make sure you build something your users need rather than what you think they need.
  • Identify KPIs early: It is easy to try to incorporate too much into one dashboard. Enumerating your KPIs early will help avoid scope creep.
  • Address varying levels of data literacy: Make sure to assess the data literacy of your end users and tailor your dashboard to their comfort level with data visualizations.
  • Track your data sources: Especially as you begin to combine datasets, it is critical to track where your data come from, how and by whom they were collected and cleaned, and how often the data are updated.
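One lightweight way to track data sources is a simple provenance record kept alongside the dashboard code. This is a hypothetical sketch, assuming such records are maintained in Python; the field names and example sources are illustrative, not the actual Luminos setup.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DataSource:
    """Minimal provenance record for one dataset feeding a dashboard."""
    name: str
    collected_by: str
    cleaned_by: str
    update_frequency: str   # e.g. "weekly", "per program year"
    last_updated: date

# Illustrative entries only.
sources = [
    DataSource("student_enrollment", "implementing partners",
               "program staff", "weekly", date(2020, 9, 1)),
    DataSource("egra_egma_endline", "external evaluator",
               "external evaluator", "per program year", date(2019, 7, 15)),
]

# Surface datasets that have not been refreshed recently, so stale
# numbers never masquerade as current ones on the dashboard.
stale = [s.name for s in sources if s.last_updated < date(2020, 1, 1)]
print(stale)
```

Even this small amount of bookkeeping answers the key questions when datasets are combined: where the data came from, who touched them, and how fresh they are.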

If you’re interested in learning more about the development of the Luminos dashboard, please contact info@luminosfund.org.

Originally published by The Luminos Fund.