Chapter 11: Measuring Success—Data & Accountability
In the flurry of afterschool activity each day—snacks, enrichment clubs, homework support, STEM challenges, sports, art projects—it’s easy to overlook the quiet power of data. Yet it is precisely this systematic gathering of information, joined with thoughtful analysis, that can tell us whether our programs are working as intended. For California’s afterschool programs supported by the Expanded Learning Opportunities Program (ELO-P), the After School Education and Safety (ASES) program, or the federal 21st Century Community Learning Centers (21st CCLC) initiative, the stakes are high. Funders, families, and community members all want proof that these programs make a tangible difference in the lives of young people. To keep that proof at hand, effective afterschool sites devote significant energy to measuring success across three primary domains: attendance, academic performance, and social-emotional learning (SEL). From the moment students sign in, to that final bus ride home, each data point becomes part of a larger story—a story of how expanded learning can help close opportunity gaps and give children the support they need to thrive.
Setting Key Performance Indicators
The first step on this data-driven journey is setting clear Key Performance Indicators (KPIs). In California, attendance is front and center. Average Daily Attendance (ADA) within the afterschool program is typically expected to remain at or above 85% of funded capacity, reflecting not only the program’s reach but also its ability to engage students consistently. Many sites also track “regular attenders”—students who attend the program at least 30 days over the course of the year—because research shows that higher participation correlates with stronger academic gains, better engagement in school, and fewer discipline problems. Programs like those studied in South Carolina’s 21st CCLC initiative have documented that regular attendees see significant improvements in grades, underscoring why these metrics matter.
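These attendance KPIs reduce to simple arithmetic over a sign-in log. The sketch below is illustrative only: the student IDs, dates, and funded capacity are made up, and a real program would pull this from its attendance system. Only the 85% ADA target and the 30-day "regular attender" threshold come from the requirements described above.

```python
from collections import Counter

# Hypothetical sign-in log: one (student_id, date) entry per day attended.
sign_ins = [
    ("s01", "2024-09-03"), ("s02", "2024-09-03"),
    ("s01", "2024-09-04"), ("s02", "2024-09-04"), ("s03", "2024-09-04"),
]

FUNDED_CAPACITY = 3     # seats the grant funds (toy number for illustration)
REGULAR_THRESHOLD = 30  # days per year that define a "regular attender"

# Average Daily Attendance: mean head count across program days.
by_day = Counter(date for _, date in sign_ins)
ada = sum(by_day.values()) / len(by_day)
ada_pct = 100 * ada / FUNDED_CAPACITY
print(f"ADA: {ada:.1f} students ({ada_pct:.0f}% of funded capacity; target >= 85%)")

# Regular attenders: students with at least REGULAR_THRESHOLD days so far.
days_per_student = Counter(sid for sid, _ in sign_ins)
regulars = [s for s, d in days_per_student.items() if d >= REGULAR_THRESHOLD]
print(f"Regular attenders (30+ days): {len(regulars)}")
```

The same two tallies, run weekly, let a coordinator see both whether the program is hitting its funded-capacity target and which students are on pace to become regular attenders.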
Attendance in afterschool is often intertwined with the broader goal of improving students’ regular school-day attendance. Chronic absenteeism poses a real threat to learning, and afterschool programs that keep kids excited about coming to school have become a critical part of the solution. Nationwide surveys indicate that nearly half of chronically absent 21st CCLC participants improve their school-day attendance after enrolling in consistent out-of-school-time support. One California evaluation found that ASES students attended 1.5% more school days than nonparticipants, a small-sounding number that can translate into many more hours of instruction over the course of a year. With the state’s chronic absence rate having doubled from pre-pandemic levels, this role of afterschool—ensuring children are present and ready to learn—is more vital than ever.
A second core KPI for California programs is academic performance, typically measured through standardized test scores, grades in core subjects, or teacher feedback on classroom engagement. For example, some ASES sites compare fall and spring state assessment results to determine if their participants are making adequate gains in reading or math. Others analyze quarterly report cards, looking for percentage increases in passing grades over time. Teacher surveys can add texture: Does the student show improved motivation in class, or a willingness to complete homework more consistently? Even a simple measure such as “turning in assignments on time” can speak volumes about a student’s evolving sense of responsibility and academic identity. One statewide study of 21st CCLC sites showed that teachers reported better school-day behavior and higher levels of participation from a majority of afterschool participants. The simple act of tying these academic indicators back to the afterschool program underscores the idea that quality time spent after 3 p.m. can directly boost performance before the bell rings the next morning.
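The fall-to-spring comparisons described here are easy to automate. As a minimal sketch (with entirely hypothetical student IDs and scale scores), a coordinator could compute the share of participants posting gains and the average size of those gains:

```python
# Hypothetical fall and spring math scale scores for program participants.
scores = {
    "s01": (2410, 2468),
    "s02": (2455, 2450),
    "s03": (2380, 2431),
}

# Gain per student: spring score minus fall score.
gains = {sid: spring - fall for sid, (fall, spring) in scores.items()}
improved = [sid for sid, g in gains.items() if g > 0]
share = 100 * len(improved) / len(scores)

print(f"{len(improved)} of {len(scores)} participants improved ({share:.0f}%)")
print(f"Average gain: {sum(gains.values()) / len(gains):+.1f} scale points")
```

The same pattern works for report-card data: replace scale scores with letter grades mapped to grade points, and "improved" becomes "raised a grade by at least one letter."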
In recent years, however, schools and communities have grown increasingly aware that data can’t stop at test scores alone. Modern afterschool programs are expected to cultivate the whole child, nurturing creativity, emotional health, leadership, and a sense of belonging. As a result, social-emotional learning (SEL) measures have become the third pillar of data collection. Some programs rely on behavior and discipline metrics—tracking whether program participants see declines in office referrals, suspensions, or chronic misconduct in the school day—while others administer surveys to gauge self-confidence, empathy, or ability to manage emotions. The University of California, Irvine’s multi-year study of promising afterschool programs found that regular participants reported fewer behavior problems and more positive social skills. Research from the Collaborative for Academic, Social, and Emotional Learning (CASEL) shows that high-quality SEL interventions can have transformative effects, including improved attitudes toward school and higher achievement in class. By gathering both qualitative and quantitative SEL data—from pre- and post-surveys to anecdotal teacher reports—programs can confirm that they are truly supporting children’s growth as individuals, not just as students.
Collecting and Analyzing Data
Once the KPIs have been established, the next challenge is implementing a consistent, transparent system to collect the necessary data. This process begins with getting the right tools in place. Some programs still use paper-based sign-in sheets, transcribing attendance counts into spreadsheets at the end of each day, while others have switched to electronic attendance systems for more immediate reporting. Either way, reliable attendance data is essential, since California’s ELO-P and ASES grants require semiannual reporting to the California Department of Education (CDE). Similarly, academic data sharing often depends on carefully negotiated agreements between school-day and afterschool staff, ensuring that afterschool coordinators can review test scores or report cards while maintaining student privacy.
Effective data collection also involves setting up protocols to ensure integrity and accuracy. Afterschool program leaders often provide staff training on how to enter information correctly, set definitions (such as how to mark partial attendance), and maintain confidentiality. Some citywide afterschool systems have even formed data committees to standardize these processes and share tips across multiple sites. For SEL metrics, programs may use validated instruments like the Youth Program Quality Assessment (PQA) or the Assessment of Program Practices Tool (APT), both of which align with California’s overall Quality Standards for expanded learning. Others administer in-house surveys. Regardless of the approach, programs that commit to consistency in definitions and methods find it much easier to spot real trends and to draw meaningful comparisons from one semester—or year—to the next.
This drive for consistency fits neatly into the broader concept of Continuous Quality Improvement (CQI), a cycle of planning, doing, studying, and acting on the results. The CDE encourages ASES and 21st CCLC programs to adopt CQI by reviewing academic, attendance, and SEL data each year, then adjusting programming in response to findings. One ELO-P site in Yuba City described a process in which coordinators, district administrators, and afterschool staff meet quarterly to review data and consider midcourse tweaks: “If test scores in math aren’t rising as we hoped, do we need a different approach to tutoring? Is attendance dropping on certain days or in certain grades? If so, what creative strategies can we implement to re-engage students?” By foregrounding data in conversations with staff, parents, and students, these programs treat data not as a bureaucratic burden but as a roadmap for making better decisions.
Using Data to Inform Improvements
For many school administrators, the most gratifying part of data collection arrives when all those numbers and percentages translate into improved program design. Leaders who see that attendance is lagging on specific weekdays may decide to introduce popular clubs—like cooking, robotics, or dance—on precisely those afternoons, thus incentivizing more consistent turnout. Or perhaps a review of SEL metrics reveals that girls in certain grades do not feel as safe or supported as their peers. In that scenario, the program can create new social circles or mentorship opportunities aimed at fostering an inclusive environment for those students. The beauty of this kind of data use is that it channels real information from children’s experiences into actionable shifts in programming.
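Spotting the weekday-level attendance dips mentioned above is a small grouping exercise. A hedged sketch, assuming hypothetical daily head counts keyed by date:

```python
from collections import defaultdict
from datetime import date

# Hypothetical daily program head counts, keyed by ISO date string.
head_counts = {
    "2024-09-03": 78, "2024-09-04": 80, "2024-09-05": 62,
    "2024-09-10": 76, "2024-09-11": 79, "2024-09-12": 58,
}

# Group head counts by day of the week to reveal recurring dips.
by_weekday = defaultdict(list)
for d, count in head_counts.items():
    by_weekday[date.fromisoformat(d).strftime("%A")].append(count)

for weekday, counts in sorted(by_weekday.items()):
    avg = sum(counts) / len(counts)
    print(f"{weekday}: average {avg:.0f} students")
```

In this toy data, Thursdays lag well behind the other weekdays, which is exactly the kind of pattern that might prompt a program to move a popular club to Thursday afternoons.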
The same logic applies to academics. Data might reveal that, in one site, participants’ reading scores are improving significantly, while math results remain static. Armed with this information, program leaders can examine their math-focused tutoring efforts or consult with daytime teachers to identify more engaging curricula. Another site might see that elementary students show dramatic reading gains, while middle schoolers appear less responsive. That insight can prompt the creation of age-appropriate book clubs, or the adoption of project-based learning activities that resonate with older youth. As the Wallace Foundation has observed in its analysis of citywide afterschool networks, a continuous improvement culture is most effective when program staff at every level learn to use data not as a punitive measure but as a collaborative resource.
Communicating Impact to Stakeholders
Yet collecting data and using it internally is only part of the puzzle. Afterschool programs must also convey their results persuasively to parents, funders, and community stakeholders who rely on these services and who often help sustain them financially. Effective communication can take many forms. Some programs develop internal dashboards that present real-time metrics—attendance rates, number of students regularly attending 30 days or more, academic progress data—so that staff can keep a finger on the pulse of the program’s health. Others publish annual outcome reports, peppered with infographics and anecdotes that make the data come alive for a general audience. Parents might see a brightly colored chart indicating that 75% of students raised a math grade at least one letter over the course of the semester, paired with a testimonial from a mother whose child’s confidence soared after joining the program.
The balance between quantitative and qualitative evidence is key. Numbers indicate scope and scale—how many children are served, how many days of attendance, how many improved test scores. But personal stories highlight the meaningful change behind those numbers. For instance, an annual report might feature a fifth-grade boy who struggled with reading comprehension and rarely turned in homework. Over six months in the afterschool program, with targeted tutoring and a chance to practice reading through an engaging book club, he not only improved his reading grade but found a genuine passion for graphic novels. Such a story, presented alongside the statistic that 70% of participants now meet grade-level reading standards, ties the data to a human face. A synergy arises: The data underscores that the success story is not an outlier, while the story gives the numbers emotional resonance.
When sharing these findings with policymakers or potential funders, it helps to connect back to the broad goals of ELO-P, ASES, or 21st CCLC—namely improved attendance, academic support, and enhanced social-emotional well-being. Some afterschool programs produce polished infographics or short videos that can be posted online or presented at school board meetings. Others invite local business leaders, school administrators, or legislators to “showcase” nights, letting visitors see firsthand how program activities reinforce reading fluency, collaborative learning, and healthy life choices. The impetus behind all of these outreach efforts is that “no data, no dollars,” as one advocacy group bluntly puts it. Communicating data clearly and compellingly is one of the best ways to maintain vital funding streams.
The Bigger Picture: Building on ELO-P Momentum
In California, the timing for data-driven engagement in afterschool could not be more opportune. The state’s ELO-P initiative began with a $1.8 billion investment in 2021–22, then scaled to a $4 billion annual commitment, aiming to expand access to before- and after-school programming for hundreds of thousands of high-need students. Early surveys suggest that 97% of local educational agencies, along with 88% of community-based organizations, have expanded capacity significantly—some doubling or tripling their enrollment. Participants tend to report higher rates of school attendance and deeper feelings of belonging, while families cite greater job stability and peace of mind from knowing their children are in safe, supportive environments after the school day ends. On the academic side, staff and parents highlight improved homework completion and an uptick in students who feel motivated to attend class consistently. The cumulative effect is to narrow opportunity gaps and help protect against the well-documented “summer slide,” which can widen achievement gaps over time.
All of this progress, however, depends on reliable data to keep track of which communities are served, which students are benefiting, and how well the programs align with California’s Quality Standards for expanded learning. The state continues to refine its approach—for instance, by considering how to integrate afterschool attendance data into the larger CALPADS system, or how to systematically track demographic information for ELO-P participants statewide. Many programs still need guidance on linking program attendance with academic outcomes, or on blending multiple funding streams for a stable operational model. Yet the willingness of both state officials and local programs to engage in a continuous improvement cycle speaks well of the future. As one afterschool coordinator put it, “The data is what helps us see our blind spots. It also gives us a chance to celebrate the everyday progress we might otherwise overlook.”
A Culture of Continuous Improvement
For any afterschool leader in California weighing their next move—whether they’re newly applying for an ASES grant or already running an established ELO-P site—the formula is clear: Start with intentional KPIs, build robust systems for data collection, integrate that data into a cycle of improvement, and then showcase the outcomes. The result is a dynamic synergy, where accountability drives higher-quality programming, which in turn yields stronger results in attendance, academics, and social-emotional development. It takes effort to maintain such a system. Staff need training, data specialists may be required, and careful coordination with school-day personnel is crucial. But the payoff is immense. Students who participate in afterschool programs see real gains—better attendance, sharper reading skills, higher motivation to learn—and a supportive environment that fosters their social and emotional growth.
Along the way, data becomes more than a series of spreadsheets; it transforms into a narrative, a chance to link each child’s individual story with the collective arc of a thriving program. This is the promise of truly measuring success: not just checking a compliance box or chasing elusive targets, but bringing evidence to bear on the question that animates every afterschool program—“How do we help our kids succeed?” When the data consistently affirms the positive answer, the entire community benefits. From the classroom teacher who sees fewer discipline issues, to the parent who rests easy knowing her child is in a safe, enriching space, to the policy advocate who needs compelling proof of impact, everyone can look to the numbers and the stories they tell. In this way, California’s expanded learning programs continue forging a path toward equity, guided by the simple but profound goal of making each afternoon count.
Chapter Summary
Data collection and analysis are essential tools for California afterschool programs to demonstrate impact and drive continuous improvement. Effective programs focus on measuring success across three key domains: attendance, academic performance, and social-emotional learning (SEL). By establishing clear Key Performance Indicators (KPIs), implementing consistent data collection systems, and using findings to inform program adjustments, administrators can transform raw numbers into meaningful insights. This data-driven approach helps programs identify strengths and address challenges, whether it's boosting attendance on specific days, enhancing academic support in targeted subjects, or creating more inclusive environments for all students. For California's expanded learning initiatives like ELO-P, ASES, and 21st CCLC, communicating these outcomes to stakeholders—through dashboards, reports, and compelling stories—is equally important, helping secure continued support while demonstrating how afterschool programs narrow opportunity gaps and foster student success both in and beyond the classroom.
Key Takeaways
- Successful afterschool programs track three primary data domains: attendance (maintaining at least 85% of funded capacity), academic performance (through test scores, grades, and teacher feedback), and social-emotional learning (using behavior metrics and validated assessment tools).
- Implementing consistent data collection protocols—from attendance systems to academic data-sharing agreements—enables programs to identify meaningful trends and make accurate comparisons over time.
- The Continuous Quality Improvement (CQI) cycle transforms data from a compliance requirement into a powerful tool for program enhancement, allowing leaders to make targeted adjustments based on evidence rather than assumptions.
- Effective communication of program impact requires balancing quantitative metrics with qualitative stories, creating compelling narratives that resonate with parents, funders, and policymakers.
- California's expanded learning initiatives, particularly the $4 billion ELO-P investment, provide unprecedented opportunities to narrow achievement gaps through data-informed programming that addresses the whole child.
Action Checklist
- Consider reviewing your attendance tracking to ensure it captures both daily attendance and "regular attenders" (students attending 30+ days yearly) for ELO-P and ASES reporting requirements.
- If looking to demonstrate academic impact, you might select just one indicator (like homework completion or reading scores) to track this semester.
- For programs wanting a simple improvement process, quarterly data reviews can help identify one area to enhance and one success to celebrate with stakeholders.
- When communicating with parents or funders, a one-page document pairing a key statistic with a student success story can effectively showcase your program's impact.
- If certain days show lower attendance, you might try scheduling a popular activity (like cooking, robotics, or dance) on that day to naturally boost participation.
- For programs interested in measuring social-emotional growth, tools like the Youth Program Quality Assessment can complement your existing evaluation process.