Partnering with J-PAL North America: A Practitioner Perspective
Benefits Data Trust (BDT), a nonprofit partner, is collaborating with J-PAL North America to identify effective outreach strategies to enroll low-income households into benefits. We asked Rachel Cahill, Director of Policy at BDT, a few questions about her experience partnering with J-PAL North America to design an evaluation that will answer important questions about BDT’s work.
What made you decide to partner with J-PAL North America?
I think there is real benefit to the partnership. We were already doing this work and had conducted a prior evaluation, although not an RCT, and we were confident that our program helped low-income households apply for benefits. Really we were seeking to answer a related question: is the intervention that we already believe increases SNAP take-up also having an impact on health outcomes?
We got connected through the Camden Coalition and began working with Amy [Finkelstein] and Matt [Notowidigdo], discussing what we already knew, to eventually arrive at the first-order research question in this evaluation: the effect of our program on SNAP take-up.
We didn’t begin with this particular research question. It was sort of a negotiation: Matt and Amy found that there really isn’t much evidence on the effect of outreach and application assistance on benefits enrollment. We bought into that approach.
It was much more of a partnership to decide on the research question.
Can you discuss one or two examples of challenges in delivering your intervention in context of a randomized evaluation and how you worked with J-PAL to overcome them?
There were various challenges. One nuance of the program is that we use data that belongs to the Pennsylvania Department of Human Services (DHS). We had to broker a three-way agreement between BDT, MIT, and DHS, which was a very large barrier to overcome.
The current Data Use Agreement allowed us to use DHS data for outreach but not for research. Ultimately we did overcome that barrier, but it did delay the launch of the evaluation.
Doing an RCT with an entitlement program, we had to develop a design that was really just a wait-list control. It would be much more straightforward to get at BDT’s core question of identifying the effect of the SNAP program by randomly assigning some people to receive SNAP and others not to receive it. However, because SNAP is an entitlement, people who are eligible cannot be denied the program.
Instead, we had to think creatively to develop a high-intensity group, a low-intensity group, and a control group in an encouragement design. We had to negotiate with MIT because we were seeking simplicity in the design. Initially J-PAL proposed a dozen different types of outreach letters to explore many variations on the outreach strategy. BDT explained that we could manage a lot of nuance, but beyond a certain point our ability to manage additional treatment arms would decline.
There was a tradeoff: Do you keep the design simpler or do you have a dozen treatment arms and increase the probability that you make a mistake?
What lessons or insights about your program have emerged from this partnership and this evaluation?
We have learned a lot about doing research. I joked with Amy that I estimated I would spend 20 percent of my time on this project. She asked how I was going to spend 20 percent of my time on this project, and I said to Amy that it takes 20 percent of my time to answer her questions!
To do it right, which J-PAL does, it does take a lot of thought and planning. It really takes multiple people with different types of expertise. This requires a big resource commitment from the organization.
We also have to use some political capital with our state partners. There’s an opportunity cost there: you have a limited number of requests that you can make in a given time. Looking back on things now, I still would have done the evaluation, but I would have allocated more time and more resources.
For example, we thought BDT would be involved for the first 12 months, but we didn’t budget time for the data analysis because we figured that would be MIT’s role. We realized that the engagement doesn’t end when the 12 months ends, and that we still have to follow up concerning things like data collection.
This experience will just make us smarter in doing future research.
There’s been tremendous value in terms of learning on our program.
We very quickly saw that the marketing letter formatted in a particular way for the evaluation increased the response rate by half a percentage point, which may seem small but is a lot in our field of work. This was significant enough that we did not even wait until the end of the RCT; we immediately thought about how to incorporate this letter design in other states besides Pennsylvania.
The discipline of setting up an RCT was also helpful.
How will your organization use the knowledge generated by the evaluation?
Our hypothesis is that the high-touch intervention will increase take-up more than the light touch intervention and certainly more than the control group.
A common misperception in this field of social services is that just sending out a letter will be enough to increase take-up. This is founded on the premise that low take-up is just an issue of awareness, but we know that the enrollment process takes a lot more than awareness raising. Many of the people we talk to know that SNAP exists—they just can’t imagine going through all the paperwork and enrollment procedures to access the benefits they are eligible for. This is especially the case for the SNAP program, which has one of the more archaic enrollment processes.
The real game changer is whether we can demonstrate long-term outcomes on health. Amy and Matt believe that this particular evaluation is not sufficiently powered to detect those effects.
People are really thinking about investments in social services to decrease healthcare costs. To put it bluntly, that’s where the money is: healthcare is an area where the government is spending so much money. We think this is a really significant opportunity to generate definitive evidence on whether social services can prevent future health costs, rather than just having a hypothesis that social services might be helpful.
That’s really what we’re striving for, and we’re willing to go down the rabbit hole to figure that out.
I really like working with the MIT team. I would explain to any other nonprofit considering a randomized evaluation what a big deal an RCT is. It’s still a big challenge to get a nonprofit to go down that rabbit hole to answer really tough questions. With a big emphasis on rapid testing, people often don’t want to wait several years to see longer-term outcomes. Staff at nonprofits often transition within three years, so it can be hard to keep the same people working for the duration of the evaluation.
I don’t say this as a critique but as advice to J-PAL folks, especially when thinking about partners other than government: build transparency about the level of commitment that is required going into an evaluation. I was not aware of how much this would entail when BDT signed on.
In the United States, enrollment in social safety net programs is not automatic, and many programs experience low take-up: many individuals who are eligible fail to enroll. Researchers studied the impact on enrollment of providing outreach and assistance to households that are likely eligible for the Supplemental Nutrition Assistance Program (SNAP), previously known as food stamps. They found that informational mailings nearly doubled SNAP enrollment, while informational mailings plus application assistance tripled it, suggesting that both a lack of information and the effort required to apply pose barriers to SNAP take-up.
Policy issue
The Supplemental Nutrition Assistance Program (SNAP), previously known as food stamps, provides a benefit that can be spent on food to eligible, low-income households in the United States. In 2013, nearly one in seven households received SNAP.1 Enrollment in SNAP is not automatic: individuals must apply and demonstrate their eligibility in order to receive benefits.
There are a number of potential reasons why individuals who are eligible for SNAP or other social safety programs might not enroll. Individuals may not know that they are eligible to receive benefits. They may also be deterred by the time and effort required to apply for benefits, such as filling out application forms and providing documentation of their eligibility. Can informational mailings about SNAP eligibility and individualized application assistance increase SNAP enrollment? And which types of eligible individuals respond to these interventions?
Context of the evaluation
Many social programs in the United States feature incomplete take-up; for example, Currie (2006) documents take-up rates ranging from a low of 10 to 20 percent for the State Children’s Health Insurance Program in the late 1990s and 60 to 90 percent for cash welfare (TANF).2 There is also substantial variation in take-up rates across eligible populations, and take-up of SNAP benefits is disproportionately low among the elderly: in 2013, only 41 percent of those eligible enrolled in SNAP.3 Benefits Data Trust (BDT) is a national not-for-profit organization based in Philadelphia that designs innovative solutions to generate better economic, health and social outcomes for individuals and their larger communities. BDT’s direct service programs provide targeted outreach and person-centered application assistance to individuals who are likely eligible for benefits and services, including elderly households likely eligible for SNAP. Elderly households in Pennsylvania could qualify for SNAP in one of three ways:
- have gross income below 200 percent of the Federal Poverty Level;
- have gross income above 200 percent of the Federal Poverty Level but net income below 100 percent of the Federal Poverty Level and resources below $3,250; or
- be categorically eligible, meaning all members of the household receive or are authorized to receive a qualifying benefit such as Temporary Assistance for Needy Families (TANF).
Details of the intervention
Researchers conducted a randomized evaluation to measure the impact of various interventions on take-up of SNAP. Using application and enrollment data for other public benefits, BDT identified households who were likely to meet the income requirements for SNAP. Researchers randomly assigned 31,188 likely eligible households to one of three groups:
- Information Only: One-third received a letter informing them that they were potentially eligible for SNAP and providing contact information for the Pennsylvania Department of Human Services, the state agency that processes SNAP applications. This intervention group included four (randomly assigned) sub-interventions that varied the design and wording of the letter and whether a follow-up postcard was sent to households that did not call within eight weeks.
- Information Plus Assistance: One-third were mailed a similar letter, which informed them of their potential eligibility and provided contact information for BDT’s in-house call center. For those who called with an interest in applying for benefits, BDT helped screen the household for potential SNAP eligibility and level of benefits. If the caller wanted, BDT also helped them assemble the necessary documentation, submitted the application electronically on their behalf, and assisted with any follow-up questions from the state. Households that did not call within eight weeks received a postcard containing the same information as the letter. This group includes two (randomly assigned) sub-interventions with variations in the design and wording of the letter.
- Comparison group: One-third received no outreach or intervention from BDT.
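The three-arm assignment described above can be sketched in a few lines. The total sample size is from the study; the seed, household identifiers, and arm labels are illustrative assumptions, not the researchers' actual procedure.

```python
import random

def assign_arms(n_households, seed=0):
    """Randomly split household indices into three equal-sized study arms.

    Illustrative sketch: real evaluations typically randomize within strata
    and track households by administrative identifiers rather than indices.
    """
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    ids = list(range(n_households))
    rng.shuffle(ids)
    third = n_households // 3
    return {
        "information_only": ids[:third],
        "information_plus_assistance": ids[third:2 * third],
        "comparison": ids[2 * third:],
    }

arms = assign_arms(31188)  # the study's sample of likely eligible households
```

Because 31,188 divides evenly by three, each arm receives exactly 10,396 households, matching the one-third split described above.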
Results and policy lessons
Researchers measured the impact of the Information Only and Information Plus Assistance interventions on the number of households that applied to and the number that ultimately enrolled in SNAP within nine months. The Information Only intervention increased enrollment by 5 percentage points from a baseline of 6 percent in the comparison group (an 83 percent increase). The Information Plus Assistance intervention increased enrollment by 12 percentage points (a 200 percent increase relative to the comparison group). These results suggest that both the lack of information and the time and effort required to complete and submit an application pose barriers to enrollment. Among the sub-interventions, the follow-up postcards had a significant impact whereas changing the exact design and wording of the letter did not. Sending a reminder postcard in the Information Only group increased SNAP enrollment by an additional 20 percent relative to those who were only sent the first letter. Both the Information Only and the Information Plus Assistance interventions increased applications proportionally to the increase in enrollment; success rates were similar (about 75 percent) across both intervention arms and the comparison arm.
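The percent increases reported above follow directly from the percentage-point effects and the comparison-group baseline. A quick arithmetic check:

```python
def percent_increase(effect_pp, baseline_pp):
    """Convert a percentage-point treatment effect into a percent increase
    relative to the comparison-group baseline."""
    return 100 * effect_pp / baseline_pp

# Information Only: +5 pp on a 6 percent baseline
info_only = percent_increase(5, 6)   # roughly 83 percent, i.e. nearly double
# Information Plus Assistance: +12 pp on the same baseline
info_plus = percent_increase(12, 6)  # 200 percent, i.e. triple the baseline
```

This is why the summary describes the mailings as "nearly doubling" enrollment (6 to 11 percent) and mailings plus assistance as "tripling" it (6 to 18 percent).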
Researchers also studied the characteristics of the individuals who enrolled in SNAP as a result of the interventions. These enrollee characteristics were similar in both intervention arms, but differed from those who enrolled in the status quo comparison group. Individuals who enrolled because of the intervention were—relative to eligible individuals who enrolled in the comparison arm—older, more likely to be white, and more likely to speak English as their primary language. On average, the entire studied population had a lower income than the general population and was more likely to have chronic diseases. Individuals who enrolled because of the interventions had fewer measured chronic diseases prior to the intervention, and they also received lower monthly benefits than enrollees in the comparison group. Since the monthly benefits are lower for individuals with more resources by design of the progressive SNAP benefits formula, this suggests that enrollees in the intervention groups had higher net resources prior to the intervention than individuals who enrolled under the status quo. Nonetheless, the $1,300 per year in SNAP benefits received (on average) by those newly enrolled outweighed the estimated cost of the intervention ($20-$60 per household enrolled) and the processing costs to the state (approximately $240 per application).4
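The cost-benefit comparison in the paragraph above can be checked with simple arithmetic. The dollar figures are from the study; netting one-time costs against a single year of benefits is an illustrative simplification.

```python
annual_benefits = 1300        # average SNAP benefits per newly enrolled household, per year
outreach_cost_range = (20, 60)  # estimated intervention cost per household enrolled
processing_cost = 240         # approximate state processing cost per application

# Even in the worst case (high end of the outreach cost range), total
# one-time costs fall well below one year of benefits received.
worst_case_cost = outreach_cost_range[1] + processing_cost  # 300
net_first_year = annual_benefits - worst_case_cost          # 1000
```

Even under the most conservative cost assumption, a newly enrolled household's first year of benefits exceeds the combined outreach and processing costs by roughly $1,000.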
Researchers developed a behavioral model allowing for misperceptions of the safety net program, and calibrated the model with the experimental results. The calibration results suggest that both interventions are a cost-effective way to redistribute to low-income households relative to other safety net programs.
Loveless, Tracy A. 2015. “Supplemental Nutrition Assistance Program (SNAP) Receipt for Households: 2000–2013.” American Community Survey Brief No. 13-08. Washington, DC: US Census Bureau.
Currie, Janet. 2006. “The Take-up of Social Benefits.” In Public Policy and the Income Distribution, edited by Alan Auerbach, David Card, and John Quigley, 80–148. New York: Russell Sage.
Eslami, Esa. 2015. “Trends in Supplemental Nutrition Assistance Program Participation Rates: Fiscal Year 2010 to Fiscal Year 2013.” Washington, DC: U.S. Department of Agriculture, Food and Nutrition Service.
Isaacs, Julia. 2008. “The Costs of Benefit Delivery in the Food Stamp Program.” USDA Contractor and Cooperator Report No. 39. https://www.brookings.edu/wp-content/uploads/2016/06/03_food_stamp_isaacs.pdf. Last accessed June 29, 2017.