Over the past several months, we’ve spotlighted organizations that, despite the obstacles in their way, have been able to incorporate virtual delivery into their programs and continue to meet the needs of children and families across the country. From tele-home-visiting for new mothers to remote out-of-school STEM programs and tech-enabled curricula and tutoring, our grantees have been at the forefront of innovation as we approach the one-year mark of in-pandemic education.

Demand for learning opportunities on how to make virtual programs effective has been high, signaling a desire by educators and practitioners both to provide the best possible experience and to learn what works. A grant we provided to The Brazelton Touchpoints Center (BTC) to host a webinar series on Virtual Service Delivery in early childhood resulted in six webinars with an average attendance of 2,745 people, significantly higher than expected. Attendees worked across family-facing sectors, the most frequently reported being early childhood care and education (38%), followed by early intervention (25%), home visiting (17%), and family/parenting support (11%). For practitioners, who reported feeling isolated and overwhelmed by the pandemic’s impact on their work, these webinars were opportunities for connection, learning, and validation.

But despite the proliferation of virtual programs and the field’s desire to learn what works, what eluded many of the direct impact organizations we funded, and us as funders, was evidence that the newly virtual programs so many were piloting actually led to measurable student outcomes.

Lack of funding for research in education

There were many reasons for this lack of clarity. COVID-19 brought classroom-based assessment to a halt, canceling state and federal testing, including NAEP, which many organizations depend on to understand and measure impact. Many grantees reallocated all of their funding to delivering their programs instead of evaluating them, preferring that limited funds be spent on helping their target populations rather than on conducting research. And funders often provided additional funding specifically for program delivery, again leaving organizations unable to measure the impact of their new solutions.

Even before COVID-19, funding for innovation and research in education was woefully low. In 2019, the Department of Education spent $238 million on R&D projects, less than 0.2% of the federal government’s $132 billion R&D spend for that year. And philanthropy follows this pattern: most of the $64 billion given annually to education initiatives funds existing organizations instead of seeding new ideas, according to an analysis by Jason Weeby, formerly of Bellwether Education Partners.

As a foundation, we believe in evidence as a theory for scale, which is why our COVID-19 rapid response grants supported grantees not only in innovating on how they deliver their programs, but also in rethinking how they measure success. Instead of traditional RCTs and yearslong research studies, we encouraged organizations to experiment with short-cycle evaluations to help them quickly understand the impact of their newly virtual programs and plan for the future.

And while it’s still early, we are starting to see evidence of efficacy. Today, I’m excited to spotlight some of these promising early results, as well as shared lessons that may signal a higher probability of success for other virtual programs.

Springboard Collaborative

We previously shared news that an external evaluation of Springboard Collaborative’s summer program, which aims to close the Pre-K through 3rd-grade literacy gap through parent engagement, produced evidence that meets a Tier 2 rating under ESSA. Knowing it wouldn’t be able to run its regular in-person program in Summer 2020, Springboard structured an analysis of its new virtual program, Springboard Learning Accelerator (SLA), comparing i-Ready Lexile scores from before the shutdown (February/March 2020) to after the summer intervention (September/October 2020).

The study leveraged school-based assessment data, as opposed to internal program data, to compare a treatment and control group over a six-month period. Literacy scores were obtained from Springboard’s largest program partner in New York City, a charter network whose overall performance mirrors that of district schools across the country. The study captured data for 175 rising 2nd through 5th graders who participated in SLA; the control group included 439 students at the same schools who did not participate in programming.

Data from analysis of Springboard Learning Accelerator

The findings were not only incredibly positive, but comparable to results from Springboard’s Flagship program. Springboard participants averaged a 256-point gain in reading skills, relative to a control group that grew just 41 points. The study also corroborated an important finding from Springboard’s external evaluation of the Flagship programming: the students furthest behind grade level are the ones making the most progress.

These powerful results occurred despite the virtual SLA program being relatively light-touch compared to the traditional in-school version of Springboard, and ~80% less costly. The findings helped Springboard secure a $3.3M contract from the Massachusetts Department of Education for a statewide tutoring initiative.

“Me and my son were more connected by doing this program together on one table. There was a feeling of achievement every day of this program.”

Parent of Student, BellXcel

BellXcel

BellXcel, which traditionally provides families access to evidence-based summer and afterschool learning, knew that summer learning on the computer was going to be a challenge. In developing BellXcel Remote, in partnership with Scholastic Education, the organization created a model that aligned with best practices in summer learning, including the following elements:

  • Six hours of content per day for a five-week program, with various implementation options
  • Modular and flexible sample schedules that could be condensed or expanded as needed
  • A blend of instructional time from a teacher and independent work by students and families
  • Extended learning kits enabling students to work independently at home while accessing teacher support
  • Focus on one primary lesson per day, alternating between Math and ELA
  • Live instruction plus one-on-one check-ins with students (by phone or computer)
  • SEL, wellness, and STEAM enrichment activities (including virtual field trips)
  • Family engagement opportunities supported by a family guide, family resource portal, and teacher outreach
  • Comprehensive teacher and administrator guides and professional development

BellXcel partnered with its affiliate, the Sperling Center for Research and Innovation (SCRI), to collect and analyze data on scholar, family, and educator experiences with BellXcel Remote. SCRI found evidence that BellXcel Remote helped bridge the gap between the 2019-20 and 2020-21 school years and successfully engaged youth in summer distance learning. End-of-program stakeholder surveys demonstrated that the program kept scholars, families, and staff engaged in a variety of academic, enrichment, and SEL activities.

Over 90% of families and staff agreed that scholars were highly engaged during the program. Scholars themselves (92%) reported being focused on their activities during the remote program, and attendance data collected by partners found an average daily attendance rate of 71%. Nearly all families (97%) said their child enjoyed the program. Over 95% of staff would recommend the program to families/caregivers, would recommend it to other teachers, and found working for the program to be rewarding. And over 90% of scholars said they enjoyed the program, the academic activities, and the enrichment activities. 

LENA Start

LENA, which accelerates child language development by using data from “talk pedometers” (small wearable devices that measure the amount of talk and interaction in a child’s environment), knew it had to rethink its LENA Start group delivery model in order to continue its traditionally in-person parent coaching programs. Past evaluations had confirmed that LENA programs help both families and children gain language skills and improve language quality, but providing coaching in groups was impossible during COVID-19.

In response to the pandemic, LENA Start launched a virtual implementation option, with coaching and classes occurring both synchronously and asynchronously. While LENA wasn’t originally sure the new format would be as effective as in-person coaching, new data reveals that families who participated in virtual LENA Start classes enjoyed the same positive outcomes as families who attended in-person classes.

The virtual participants showed similar gains to pre-pandemic, in-person groups, with lower-talk families increasing conversational turns with their children by 12 percentile points and adult words by 35 percentile points. Children who participated in the virtual program gained language skills twice as fast as their peers, the same accelerated rate as seen with in-person attendance.

The data also affirms the many benefits of virtual programs, especially the ease of attending. Families who participated virtually had slightly higher attendance rates than families who participated in-person, suggesting that virtual classes can reduce traditional barriers to attendance such as transportation or scheduling challenges.

Shared lessons from early virtual success

The three organizations above succeeded not only in quickly implementing virtual programs in response to beneficiary need, but also in measuring those programs’ impact. And despite creating new programming on short notice, all three were able to build something as effective as, if not more effective than, their traditional offerings.

What was the secret behind their success? While it’s too early to tell for sure, here are three shared takeaways that we see from Springboard, BellXcel, and LENA Start.

  • Family engagement is key to success with virtual learning. Treating parents as key users and engaging them in their children’s learning was crucial for all three of these programs. For Springboard, the treatment group was composed of similar students and families, with the very same teachers in the very same schools. Yet the group, which leaned on parents helping their children learn to read at home, made more than six times the progress of the control group. BellXcel and LENA Start also engaged parents as critical contributors to children’s success.
  • Parents need their own tools. In designing its remote summer program, BellXcel placed additional emphasis on family engagement strategies to promote and support learning at home, developing an online parent portal and a comprehensive family guide and providing additional professional development to staff to encourage stronger and more consistent communication with families. Similarly, Springboard developed a literacy assessment that parents could administer at home, giving them visibility into their child’s reading development. And LENA Start continued to provide parents with their own dashboards measuring the quality of their conversations with their children.
  • Create supportive learning environments for youth and families. Seeking to replicate the social capital benefits of the in-person group model, LENA Start gave parents the opportunity to absorb the learning content on their own and then sign on for a live, interactive follow-up discussion. BellXcel complemented its academic programming with engaging and fun enrichment opportunities, giving children and their families flexibility in when and how they engaged with the material. And Springboard ensured that participants felt supported by developing ongoing touchpoints between families and their child’s teacher.

As funders, we believe it is critical for us to support not only innovation, but also research that ensures that innovation has the desired impact on the families and children it hopes to help. Although this is a challenging time for conducting research, doing so is more important than ever given the rapid changes that education has experienced due to the COVID-19 pandemic. We are excited to continue supporting organizations that not only rise to meet the need, but are also dedicated to measuring and evaluating their work with an eye toward impact and continuous improvement.

Join us to hear directly from Springboard, BellXcel, and LENA

Join us at 3pm EST on Thursday, February 25th for “Celebrating Innovation: Short Cycle Evaluation During COVID-19.” This webinar, featuring leaders from LENA, Springboard Collaborative, and BellXcel, will provide more information on how these organizations succeeded in conducting short-cycle evaluations during COVID-19. Panelists will share what they learned about innovation and measurement during a pandemic and delve into how they’re using the results to support their current programs and plan for the future. The webinar will be moderated by Anu Malipatil, Vice President of Education for Overdeck Family Foundation.

Sign Up Now