It is impossible to have a successful applied research project (quantitative or qualitative) without first recruiting a sample. Yet sample recruitment in education has become increasingly difficult.
In part, this difficulty stems from overreliance on the same few school districts and Charter Management Organizations (CMOs) willing to participate in studies. Many researchers who do this work are concentrated in a select few geographies, and they often recruit the same convenience samples based on proximity to their locations. These samples usually consist of large school districts and districts on the U.S. coasts. Such samples save time and money, but they are usually not nationally representative and lack generalizability.
A closer look at sample recruitment
In our own grantmaking at Overdeck Family Foundation, we have repeatedly worked with partners who faced the challenge of recruiting a large enough sample to detect an effect in impact evaluations. For example, in one recently commissioned study, we contacted 40 schools. Initially, few responded, and those who did declined to participate. Given the low response rate from districts, we worked with our grantee to reevaluate the recruitment plan and implement a new strategy informed by feedback from district staff and best practices. The new plan included:
- Compensation for teachers and districts/CMOs to participate:
- $1K if the district signed a data sharing agreement (including sharing existing student outcome data)
- $2K if the district additionally committed to a teacher survey
- $3K if the district additionally committed to a student survey and interviews (teacher and/or student)
- $30 for each teacher who completed a survey and/or interview
- A lower burden for districts:
- Not asking districts to identify classrooms for us to study, but instead identifying them ourselves through survey responses
- Asking districts/CMOs to sign a data sharing agreement rather than a research partnership MOU, which districts/CMOs interpreted as a lower commitment
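The tiered incentives above can be tallied with a rough cost model. This is a minimal sketch: the tier amounts and the $30 teacher payment come from the text, while the per-district teacher counts and the function name are hypothetical assumptions.

```python
# Rough cost model for the tiered incentive structure described above.
# Tier amounts and the $30 teacher payment come from the text; the
# per-district teacher counts below are hypothetical.

TIER_PAYMENTS = {  # one-time payment per district/CMO, by commitment level
    "data_sharing": 1_000,    # signed a data sharing agreement
    "teacher_survey": 2_000,  # additionally committed to a teacher survey
    "student_survey": 3_000,  # additionally committed to student survey/interviews
}
TEACHER_PAYMENT = 30  # per teacher who completed a survey and/or interview

def district_cost(tier: str, teachers_responding: int) -> int:
    """Total incentive cost for one district at a given commitment tier."""
    return TIER_PAYMENTS[tier] + TEACHER_PAYMENT * teachers_responding

# e.g., six districts at the middle tier, ~20 responding teachers each
total = sum(district_cost("teacher_survey", 20) for _ in range(6))
print(total)  # 6 * (2,000 + 600) = 15,600
```

Budgeting this way up front helps a funder see how quickly per-teacher payments dominate as participation grows, which matters when deciding between flat district payments and per-respondent incentives.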
Our new approach resulted in six districts/CMOs signing up to participate shortly after the incentives were implemented.
The figure to the left shows the number of districts/CMOs (40) contacted and the distribution of responses after we tried this new approach.
Best practices for sample recruitment
Given the ongoing challenge of sample recruitment, we have compiled lessons learned and best practices to consider for future grantmaking.
- Recruitment plans: Develop alternative plans for recruitment in case the original is not effective, reevaluate your plan at all stages of recruitment, and allow enough time and money to understand and build the necessary partnerships.
- National vs. local recruitment: Each recruitment effort should be approached with a different strategy:
- To gain a nationally generalizable sample (N = 60 schools) without relying on a convenience sample, researchers Furman and Pustejovsky recommend contacting between 600 and 2,000 schools.
- Local recruitment should put the population of interest at the center of the recruitment effort, for example by using community advisory boards to spread the word.
- Local recruitment can be done effectively by partnering with intermediaries like the Chicago Consortium or nonprofits that already collaborate with the school districts of interest. A research-practice partnership, like the Chicago Consortium, is ideal because both researchers and practitioners are invested, and trust already exists.
- Data infrastructure: Although your data request may seem simple, it may be a heavy lift for school districts, especially if they lack the human (data) capacity, existing work streams, and/or an efficient data infrastructure. Additionally, it’s worth remembering that some districts protect their data out of privacy concerns, fear of judgment, or other reasons, and are unwilling to share it.
- Incentives: All participants (teachers, students, and, when possible, school districts and leaders) should receive incentives. These can be direct compensation and/or in-kind, depending on the budget and what is allowable in that district.
- Consider using tiers of incentives and/or flexible options for how the district can engage in the study. For example, districts can receive extra compensation for committing to interviews in addition to surveys.
- Relationships matter: Avoid cold-calling, and instead leverage your personal and professional networks. People are more likely to respond to people they know and to those who are like them and/or share their background (e.g., teachers respond to teachers).
- Choose a study partner with an established connection to school districts/CMOs.
- Create a school/district/CMO contact list based on your organization’s contacts and have it readily available if you need to expand your recruitment efforts.
- Be aware that recruitment may depend on teacher buy-in, principal buy-in, district buy-in, or some combination.
- Lower the burden on school districts: Districts have multiple initiatives happening simultaneously, and limited human and data capacity.
- Before approaching a district, browse its website to gain information about district scheduling and timeline considerations. Be mindful of deadlines for research submission requests and student testing periods.
- Communication: Explain your research project using layman’s terms and use consistent messaging in a variety of formats (e.g. handouts, emails).
- Try multiple modes of contact. In addition to email, try calling and mailing materials to get the attention of school districts.
- Ensure objectivity. Researchers and funders need to clearly communicate to districts that they do not have an agenda or an anticipated finding, but rather will use the scientific method to conduct research objectively.
- Explain what’s in it for them. Frame the study in terms of alignment to the school district’s priorities. Highlight the value of the study and the benefits of participating, as well as financial incentives that could offset any perceived costs. Also state expectations, opportunity costs, and address expected concerns up-front.
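The national-recruitment guidance above (600 to 2,000 contacts for a 60-school sample) implies contact-to-participation yields of roughly 3 to 10 percent. A minimal back-of-envelope sketch of that planning arithmetic; the function name and the specific yield rates are illustrative assumptions, not figures from the studies cited here:

```python
# Back-of-envelope recruitment planning: how many schools must be
# contacted to reach a target sample, given an assumed
# contact-to-participation yield (in whole percent).

def contacts_needed(target_n: int, yield_pct: int) -> int:
    """Schools to contact so that expected participants reach target_n."""
    # Integer ceiling of target_n / (yield_pct / 100), avoiding float error
    return -(-target_n * 100 // yield_pct)

print(contacts_needed(60, 10))  # 600 contacts at a 10% yield
print(contacts_needed(60, 3))   # 2000 contacts at a 3% yield
```

Running the same arithmetic with a pessimistic yield before recruitment begins helps size the contact list, and the budget for incentives, realistically from the start.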
While these best practices won’t eliminate the difficulty of sample recruitment, we believe they will make researchers’ efforts more efficient and lead to studies that use more representative, and thus more generalizable, samples.
Roschelle, J., Feng, M., Gallagher, H. A., Murphy, R., Harris, C., Kamdar, D., & Trinidad, G. (2014). Recruiting participants for large-scale random assignment experiments in school settings. Grantee Submission.

Tipton, E., Wang, Q., & Spybrook, J. (2019, March). Assessing the relevance of IES-funded Goal 3 and 4 studies to policy populations. Paper presented at the Society for Research on Educational Effectiveness.

Furman, G., & Pustejovsky, J. (2019, March). Assessing sampling methods for generalization from RCTs: Modeling recruitment and participation. Paper presented at the Society for Research on Educational Effectiveness.

Wong, V. (2019, March). The past, present, and future of recruitment and generalization in education. Discussant at the Society for Research on Educational Effectiveness.

Mokher, C., & Pearson, J. L. (2017). The complexities of recruiting participants for a statewide education survey. Survey Practice, 10(4).

Kubicek, K., & Robles, M. (2016, November 11). Resource for integrating community voices into a research study: Community Advisory Board Toolkit. Southern California Clinical and Translational Science Institute grant UL1TR001855.