How do you handle a situation where you have built a preschool class, but many of the children do not show up? That was the problem facing the Head Start program run by CAP Tulsa in Oklahoma. In September 2016, 135 of the program’s preschoolers did not appear for the start of the school year even though their parents had enrolled them. In response, CAP Tulsa turned to its data to diagnose the problem and craft a solution, an example of how all Head Start grantees are now expected to build data into their decision-making and continuous-improvement processes.
CAP Tulsa provides care and educational services for newborns to preschoolers. However, when it comes to 4-year-olds, there is competition from preschools offered by the Tulsa school district or local charter schools. To better predict the program’s enrollment, CAP Tulsa’s director of research and innovation, Cindy Decker, and her team developed a statistical model. The model identified common factors among the children who did not show up, such as having an older sibling in elementary school. This suggested that parents may prefer to have their younger child attend a preschool in the same building for convenience. The model also found that new program enrollees or those not receiving behavioral or disability supports were more likely to be no-shows. Armed with this information, staff members started proactively communicating with parents to understand their plans. CAP Tulsa also collaborated with the district and local charters to ensure that the same children were not showing up on multiple rolls. As a result, the number of no-shows decreased from 135 to 99 the following year, reducing disruption in the first weeks of the school year.
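The kind of analysis Decker’s team ran can be illustrated with a small sketch. The code below is hypothetical: it simulates enrollment records with the three binary factors the article mentions (an older sibling in elementary school, being a new enrollee, and receiving behavioral or disability supports) and compares no-show rates across groups. The feature names, base rates, and effect sizes are all invented for illustration; CAP Tulsa’s actual model and data are not public.

```python
import random

random.seed(0)

def simulate_child():
    """Return one hypothetical record:
    (has_older_sibling, is_new_enrollee, receives_supports, no_show)."""
    sib = random.random() < 0.4
    new = random.random() < 0.3
    supports = random.random() < 0.2
    # Invented risk structure mirroring the article's findings: no-show risk
    # rises with an older sibling or new enrollment, and falls when the
    # child receives behavioral/disability supports.
    risk = 0.10 + 0.20 * sib + 0.15 * new - 0.08 * supports
    return sib, new, supports, random.random() < risk

children = [simulate_child() for _ in range(2000)]

def no_show_rate(predicate):
    """No-show rate among children matching the predicate."""
    group = [c for c in children if predicate(c)]
    return sum(c[3] for c in group) / len(group)

for label, idx in [("older sibling", 0),
                   ("new enrollee", 1),
                   ("receives supports", 2)]:
    with_rate = no_show_rate(lambda c, i=idx: c[i])
    without_rate = no_show_rate(lambda c, i=idx: not c[i])
    print(f"{label}: {with_rate:.0%} with vs {without_rate:.0%} without")
```

A real model would also estimate how much each factor matters after controlling for the others (for example, with logistic regression), but even simple group comparisons like these can tell staff which families to call first.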
In addition to this data-driven approach to enrollment, CAP Tulsa uses data in many other ways to strengthen its program, analyzing it to flag problem areas that need attention and to spot successes worth celebrating. This approach is part of a broader push within the Head Start community to improve the use of data. Historically, data was collected primarily for compliance rather than to drive program improvement or better outcomes for children. Recognizing the need for change, the National Head Start Association commissioned a 2016 report, "Moneyball for Head Start," which advocated using data the way statistical analysis is used in sports, a reference to Billy Beane’s approach to assembling competitive baseball teams. The report emphasized the importance of embracing data and called for federal support to facilitate the shift.
In response to this call for change, Head Start released new performance standards in 2016, which required programs to use data in decision-making related to budgeting, teacher coaching, and instruction improvement. This represented a significant shift in mindset, and Head Start provided technical support at various levels to help grantees make this transition. This included practice-based coaching, where data is used to support teacher professional development. Head Start also organized a "data boot camp" to enhance the data-related skills of staff members and technical assistance providers. Overall, the aim is to move away from a compliance-focused approach towards a performance-focused one, where data plays a central role in driving program effectiveness.
CAP Tulsa’s experience shows the power of data to address challenges and improve outcomes in Head Start: programs can better understand enrollment patterns, identify areas for improvement, and make informed decisions. Other grantees have made the same shift, supported by the new performance standards and by technical assistance from Head Start.
"When data was first introduced to Head Start, there was a sense of hesitation and uncertainty," explained Esmirna Valencia, the executive director of Riverside County’s early childhood programs. "We recognized the need to present data in a way that made sense to our staff." Program managers began discussing how they already utilized data in their daily work, even if they didn’t explicitly refer to it as "data-driven decision making." To further support this approach, program leaders recruited staff members who were skilled in understanding and manipulating the existing data management systems used in Riverside County. One such system, ChildPlus, captured a vast array of data points on children and families. In addition to allowing users to generate basic reports, the creators of ChildPlus granted Riverside County access to the entire database, enabling them to generate their own customized reports. Riverside also integrated the database with a visualization program called Tableau, providing them with extensive analytical capabilities. For instance, Riverside now maintains a "dynamic dashboard" that displays enrollment information, allowing managers to quickly identify program capacities, areas that require more children, and the number of potential students awaiting eligibility confirmation.
Aiming to Enhance Teaching Practices
Guilford Child Development Center in North Carolina, another organization funded by Head Start, leverages data to improve teaching practices. Serving approximately 1,200 infants, toddlers, and preschoolers, Guilford relies on the Classroom Assessment Scoring System (CLASS) as a key component of its evaluation process; programs that fall below a certain threshold on CLASS and other metrics must recompete for federal funding. Guilford employs its own trained CLASS assessors, who regularly observe classrooms. Although federal officials do not mandate that programs run their own CLASS assessments, comparing Guilford’s performance with programs statewide and nationwide helps pinpoint where professional development is needed. Robin Sink, an educational coach specialist for the program, emphasizes that while analyzing the numbers is essential, building trust and relationships with teachers matters just as much. "I need to connect with them and establish a foundation of trust," Sink stated. "The process of building relationships is more complex than simply sharing data."

The use of data for continuous improvement extends beyond Head Start managers to teachers, who draw on student assessments to make daily decisions about how best to support their students. In Riverside, for example, Head Start teachers have access to real-time data on their students through a program called Learning Genie. Teachers input observations and assessments, and the program generates interactive reports for educators and parents.
Boris Sanchez, a Head Start teacher in Riverside, uses the program daily to monitor her students’ progress. The data helps her decide which students need individualized attention, which ones can work together in small groups, and how to adapt her lesson plans accordingly. If her students show an interest in butterflies but also need help with letter recognition, Sanchez integrates the two topics. "We merge the technical aspects with the fun elements," she says. She appreciates the focus on data-driven continuous improvement because it matches practices she was already accustomed to. "We always had our checklists. I’m not afraid of data because we were already utilizing it," she adds.