Best Lessons, Best Mistakes, and What's Ahead in 2020
Posted on Tuesday, January 7th, 2020 by Anu Malipatil
As we look back on 2019, we have a lot to reflect on and share with you all. This past year, we distributed approximately $36M and supported almost 100 grantee partners, reaching over 12.7M students. Additionally, we completed over ten research studies (findings and an overview are available in our 2018 and 2019 mid-year and end-of-year research overviews). Most of all, this past year, we are proud of the time we have spent internally redefining all of our strategies aligned to our High School Readiness North Star, creating greater organizational clarity and coherence.
More than ever, through our new communications efforts, we amplified our work and the work of our grantee partners in 2019. Through more than 33 thought leadership pieces and over ten conference presentations, the Foundation shared strategies and partnerships externally and highlighted the work of many partners we support. As we continue to grow, Overdeck Family Foundation sees it as its responsibility to use our national platform to share the stories of the grantee work that is most profoundly impacting families and kids across the country.
Below, we share 2019’s lessons learned, our “best mistakes,” and a look ahead to 2020 and beyond.
Biggest Lessons
This year was our Foundation’s largest grantmaking year to date, so it’s not surprising that many key lessons surfaced that will guide our work in 2020 and beyond. Below are our top five takeaways from 2019, spanning our five portfolios.
1. Data capacity and R&D remain underfunded by philanthropy. Organizations continue to struggle to attract and retain data talent, limiting their ability to pursue in-house learning agendas, strategic research and development, or data-informed learning and improvement.
- Investing in data capacity support for organizations has been an effective strategy for reducing reliance on outside experts. More of this type of support would help them build their internal ability to understand, learn, and report on organizational outcomes and improvements (e.g., Harvard Strategic Data Project Overdeck Fellows).
2. Promoting standard grantee metrics and measures is difficult, but not impossible, with the right toolset. Several of our portfolios struggle with defining and tracking standard metrics across grantees, especially those that operate outside the K-12 school system, such as organizations that fall into our Early Impact and Inspired Minds portfolios. However, to measure our Foundation’s effectiveness at supporting meaningful impact, we’ve realized that we need to have grantees align on how they define success. We’ve tried to do this in a way that is collaborative and helpful to all involved (versus top-down from our program teams), and have seen promise using the following tactics:
- Convening grantees to provide an organic and collaborative forum to be part of the generative process of creating shared measures.
- Funding researcher-led efforts on shared measurement (e.g., data analytics cohorts) to provide a more structured path forward that is supported by research and field experts, versus being driven by the Foundation.
- Identifying a high-credibility research partner to work directly with organizations on research and assessment measures while also providing technical assistance to support meaningful decision-making, adoption, and use.
- Tapping into state, local, or federal funding streams that reward shared measurement (e.g., Race to the Top or Pay for Success) and encourage grantees to work together in service of a larger goal.
3. Evidence has the potential for a more significant impact if the intended purpose or goal is clear at the outset. Evidence requires buy-in from different audiences, and evidence for “proving” vs. “improving” vs. unlocking government dollars requires different approaches and stakeholders from the launch of the effort. Working together with researchers, practitioners, and intended end users can help drive a greater collective effort towards achieving shared results.
4. One way to help researchers spread their findings is to partner them with high-quality content disseminators that have existing distribution channels. This is particularly effective if a portfolio funds both content creators and content disseminators. For example, our Early Impact portfolio supported cross-grantee collaboration partnerships that brought high-quality, evidence-based parenting content to 18,000 caregivers who would otherwise have missed it. These partnerships helped content creators find new avenues to families, with a symbiotic benefit for content disseminators, who did not have to create the content themselves. And while it could be daunting for research partners used to writing in academic language to modify research into a shorter format and a more conversational tone without losing meaning and precision, our grantees found that doing so opened doors for expanding their audience. We expect to see similar synergy in our grant to Roadtrip Nation under the Exceptional Educators portfolio, which will use documentary-style storytelling to attract attention to the teacher retention crisis and opportunities to fix it, as well as our grant to EdTrust under the Data for Action portfolio, which funds a series of podcasts using Sean Reardon’s SEDA database to spotlight Extra-Ordinary school districts around the country.
5. Scaling from early adopters to an organization’s total addressable market is difficult, even for the most “proven” organizations. Several of our grantees had difficulty meeting their reach expectations, which were modeled on a “hockey stick” growth pattern often seen with technology companies. What these organizations found was that linear or stepwise growth was much more likely, despite initial success and traction in increasing their reach. This was true not just for local organizations but also for national ones, which expected to see “many ways to reach many.” Instead, what they saw were many barriers, including difficulties navigating funding, recruitment challenges, competition for student time, district leadership transitions, and even changing federal policies. These challenges make it more important than ever for foundations to ensure that organizations have sound growth plans and the internal capacity and resources needed to scale before setting large-scale goals. We put many of these organizations in our “Equip” stage and often focus funding specifically on the capacity building they need before they can meet their scaling goals.
Best Mistakes
When I was first putting together my list of Best Mistakes from 2018, I was nervous about how the field would receive it. After all, it’s easier to pretend that everything is perfect rather than take a critical look in the rearview mirror. But after all the gracious feedback I received from both grantees and other funders, I decided this was a worthwhile annual exercise. So, in the spirit of transparency and learning, here is my list of 2019’s best mistakes.
1. Knowledge unto itself doesn’t drive behavior change. Investments in new research and knowledge products often have not translated into new practices and policies. These investments require intentional plans for dissemination and for encouraging adoption by intended audiences. As a foundation, we spend about 20% of our grantmaking dollars funding research and evidence building. Still, we or our partners are often not clear about the end goal: what we expect people to do with the new information and what changes will result.
- Being clear about the audience for the knowledge and what intended behavior change we expect to see, or how we would define success, is the best starting place.
- Defining the difference between knowledge and delivery systems can help us bridge the gap between knowledge and how we expect that knowledge to be understood by audiences or beneficiaries. Understanding the existing behaviors, values, and motivations of beneficiaries can help support and sustain behavior change towards more evidence-based practices.
2. When it comes to research that finds null or adverse effects, a lack of clarity about our Foundation’s communications goals leads to confusion down the line. As a foundation that spends nearly 20% of its annual distribution on research, we haven’t done enough thinking about how to communicate findings with less-than-desirable outcomes. This year, we’ve run into this situation multiple times, and each time felt a little stuck about what to do. On the one hand, we didn’t think that it was appropriate for us to hurt a grantee by publicizing what can be seen as a “bad” result. But on the other hand, it is our job to share what we’re learning with the field to drive evidence-based decision-making.
After much consideration, we decided to take the following steps for all research that we fund:
- Require all grantees to pre-register their research designs in an open registry such as the Registry of Efficacy and Effectiveness Studies (REES) or ClinicalTrials.gov, maintained by the U.S. National Library of Medicine
- At the beginning of all projects, work with grantees to set clear, agreed-upon expectations, including developing strawman communications plans for possible outcomes, including positive, null, and adverse effects
3. Trying to decide between dichotomies in education misses the point. Funding work that focuses on just social-emotional learning or academic learning is too narrow an approach. It’s become more apparent than ever that social-emotional learning and skills are not a trade-off with, but a pathway to, academic achievement. While it’s true that incorporating a social-emotional curriculum into the classroom can take some time away from academic subjects, multiple studies have found that students who develop social-emotional competencies receive higher grades and test scores. Social-emotional skills—like the ability to regulate emotions and build positive relationships with peers and adults—can lead to 11% gains in academic achievement, in addition to improved behavior and well-being. The student-centered schools we fund in the Innovative Schools portfolio outperform their local districts by 15% on average.
Additionally, we’ve found that out-of-school STEM programs in the Inspired Minds portfolio often improve both social-emotional and academic measures, not just one or the other. These programs are in high demand by low-income and minority families. When it comes to math, they can successfully decrease the achievement gap between low‐ and high‐income students while improving work habits, increasing levels of persistence, and leading to better in-school attendance.
We see other dichotomies continuing to take hold in education that overlook the amplified impact possible when both strategies are pursued together: personalized learning vs. standards-based approaches, data for accountability vs. data for learning and improvement, top-down vs. bottom-up, and in-school vs. out-of-school.
- Because we see the integration of these dichotomies as more impactful than either approach on its own, our Innovative Schools portfolio worked in partnership with Bellwether Education Partners and many other funders to complete a report on how to address learning gaps while helping students attain grade-level knowledge. The report, Insights from Ongoing Work to Accelerate Outcomes for Students with Learning Gaps, was funded by the Instructional Materials Funders Group, an informal community of funders that includes Carnegie Corporation of New York, Chan Zuckerberg Initiative, Bill and Melinda Gates Foundation, William and Flora Hewlett Foundation, W. K. Kellogg Foundation, Overdeck Family Foundation, Robin Hood Learning and Technology Fund, and the Charles and Lynn Schusterman Family Foundation. The goal was to create a more in-depth, shared understanding of ways in which personalization can be paired with a commitment to standards-aligned materials to accelerate learning.
Based on its analysis of the existing evidence and learning science, interviews with over 50 stakeholders, and a closer look at 14 model schools, Bellwether hypothesized that there is a path forward that would help students who are behind not only get back on track but also gain grade-level knowledge. This evidence-based approach, called rigorous differentiation, would provide students with instruction grounded in high-quality materials alongside differentiated support. In practice, students experiencing rigorous differentiation would:
- Have equal access to grade-level work;
- See the coherence across different materials and learning experiences;
- Be in an environment that fosters engagement and agency; and
- Have a caring relationship with their teacher, with frequent 1:1 and small group learning opportunities.
We believe this report is a significant first step in figuring out how to combine the benefits of high-quality instructional materials and personalized learning. As such, we are eager to continue an authentic dialogue with communities, curriculum providers, educators, funders, and researchers around a ‘both/and’ versus an ‘either/or’ vision.
Looking Forward to 2020 and Beyond
Ever since I started working at Overdeck Family Foundation, we’ve been grappling with setting a meaningful and measurable goal to track progress towards our organizational mission of measurably enhancing education both inside and outside the classroom. We knew this goal—which we began calling our Foundation’s North Star—would have to be something that both inspired and synthesized our grantmaking. Because we have five distinct portfolio areas, we also knew the North Star would have to be broad enough to encompass our focus areas, while being specific enough to ensure that we were using our grantmaking to make the most impact for kids and families given our existing resources.
We are currently in the process of finalizing our North Star goal and connecting it to each of our existing portfolio areas. As part of this process, we have looked to the evidence base and consulted with partners like Strive Together and the Bill and Melinda Gates Foundation that have undertaken similar work. We come to this work with humility and recognize that, for the North Star to matter, it must help us convey the coherence across our grantmaking and show how respective portfolios all contribute toward a collective goal. Our funding spans birth to high school and encompasses both in-school and out-of-school initiatives, so a common measure of success—drawn from indicators within our existing portfolios—is not a simple thing. However, we believe it is a vital piece to providing us (and our grantees) with a clearer target and a better way to measure the impact of grantmaking across the Foundation. More to come on this in Q2.
In addition to finalizing our North Star, we are also in the process of further solidifying our funding model. We know that systems change efforts are complex, messy, and take a long time to manifest in results. We also know that systems need evidence-based providers of services and programs. Given our unique value proposition around data, research, and evidence, we believe we can provide support to both early- and growth-stage organizations that are focused on creating measurable impact for children and families.
This year, we introduced our two-pronged approach to investing, splitting our funding into grantmaking that 1) helped early-stage initiatives develop and validate their offering and evidence base, and 2) scaled evidence-based projects and organizations. In 2020, we expect to continue with this model while exploring funding options that allow us to take more risk with early-stage organizations and provide more general operating support (GOS) to growth-stage organizations that have demonstrated impact, scale, and cost-effectiveness.
By further aligning our funding model with our goals and our strengths, we expect to 1) encourage innovation by supporting research and development for early-stage efforts; 2) help early-stage efforts validate their impact; 3) equip organizations to scale through capacity-building efforts around research and development, cost analysis, and beneficiary feedback; and 4) grow organizations that have demonstrated cost-effective impact at scale. Ultimately, we view our responsibility as building a deeper evidence base and developing new insights that fuel the adoption of cost-effective programs and solutions by the public sector and in communities across the country.
As you can see from all the above, it’s been an exciting year. We are so thankful to our grantees for all the work they do in the field and are excited to continue partnering with them on what’s to come in 2020.
-Anu Malipatil
Vice President, Education