How J-PAL's Evaluating Social Programs course catalyzed a new approach to impact evaluation in Virginia
In this guest post, Massey Whorley, Director of Innovation and Strategic Initiatives at the Virginia Department of Social Services (VDSS), shares insights from attending J-PAL’s Evaluating Social Programs Course and how it informed current efforts to design and implement randomized evaluations of VDSS programs.
VDSS runs more than a dozen social programs in collaboration with our local partners to improve the lives of 1.7 million Virginians each year. As the Director of Innovation and Strategic Initiatives at VDSS, part of my role is to ensure that the programs we implement are proven to be effective. Considering the amount of investment that goes into our social programs, we want to be certain that we are designing and delivering high-quality human services that help Virginians achieve safety, independence, and overall well-being.
Expectations that social programs will be effective are often based on anecdotal evidence or personal belief. However, we recognized early on that such expectations didn’t necessarily translate into actual impact; we needed a way to collect rigorous evidence to test whether our programs were achieving their objectives.
Take, for example, VDSS’s efforts to increase take-up of the Earned Income Tax Credit (EITC). As part of our mandate, VDSS identifies clients who appear eligible for the credit but did not file in the previous year and notifies them of their potential eligibility. While VDSS sends mailers and emails to encourage take-up, we wanted to determine whether text message reminders could generate higher rates of tax filing and EITC claims. How could we learn which outreach effort was more effective in getting eligible Virginians to claim this crucial benefit?
As an agency, we did not have expertise in conducting impact evaluations, so we began to look for external support. In early 2019, we connected with Mary Ann Bates, Executive Director of J-PAL North America, and were invited to attend a convening with other state and local government agencies. We learned more about J-PAL’s work to promote a culture of evidence and their dedication to building the capacity of agencies like VDSS to be better producers and consumers of rigorous research. Intrigued by what we heard, we applied to J-PAL North America’s State and Local Innovation Initiative for support in evaluating our EITC take-up intervention. And to build our internal capacity on the design and implementation of impact evaluations, we participated in J-PAL’s flagship Executive Education course on evaluating social programs in June 2019.
J-PAL’s Evaluating Social Programs Course provided the essential building blocks for launching our impact evaluation of the EITC take-up intervention. During the five-day course, my colleague and I made monumental progress in fleshing out the details of our randomized evaluation. The course was excellent at providing a broad overview of the material before quickly diving into essential details of how impact evaluations are designed and implemented.
Through small group sessions conducted after each lecture, we explored key concepts through case studies led by J-PAL staff, who acted as Teaching Assistants (TAs) throughout the week. It was extremely useful to split into these small groups and participate in practical conversations with the TAs and other attendees, allowing us to directly apply the concepts we were learning. Our group discussions were enriched by the diverse perspectives of other course participants. Insights from our cohort colleagues from both within and outside the US helped us look at our program and evaluation design through multiple lenses.
After attending J-PAL’s Executive Education course, our work on the EITC evaluation progressed rapidly thanks to the fluency and comfort we gained with the concepts and application of randomized evaluations. We partnered with J-PAL affiliated researcher Dayanand Manoli to quickly and carefully administer a randomized evaluation with 270,000 households in the winter of 2020, and we anticipate results by early fall of 2020. Evaluation findings will provide vital insight into how to improve our outreach efforts to encourage greater EITC take-up before the next tax filing season.
J-PAL’s Evaluating Social Programs Course also provided me with the framework to understand how and where randomized evaluations can help inform programming. After attending the course, I shared my knowledge across state agencies in Virginia, which ignited a conversation about what other opportunities were available to introduce randomized evaluation in our work.
We found one such opportunity in a program that helps residents save money for a down payment on a home. Because resources are limited, the program could become oversubscribed. We are now discussing how such oversubscription could present an opportunity to randomly select who participates in the program and evaluate its true causal impact on housing opportunity outcomes.
To be sure, randomized evaluations are not the perfect tool for every evaluation need at VDSS. But through J-PAL’s Evaluating Social Programs Course, I’ve gained an in-depth understanding of where randomized evaluation can be most useful. And in such cases, I now understand what steps are necessary to bring a randomized evaluation from an idea to reality, ensuring that the programs we implement are improving the lives of the people of Virginia.