Reflecting on a decade of impactful research at J-PAL North America: Informing crucial policy decisions at scale


In part five of J-PAL North America’s ten-year anniversary blog series, we dive into informing policy at scale and look at the bodies of evidence on high-dosage tutoring, sectoral employment training, and summer youth employment programs to distill key lessons on how to bridge the gap from research to policy to action.

Over the past ten years, more than 35 million people in North America have been impacted by the scale-up of programs evaluated by researchers in J-PAL North America’s network. Rigorous research is a vital component of developing policies that do a better job of alleviating poverty in North America. But even the highest quality study doesn’t transform policy on its own.

J-PAL North America has harnessed our unique position as a bridge between researchers and policymakers to catalyze action and evidence-based policymaking by summarizing and synthesizing key evaluation takeaways, communicating findings widely and creatively, connecting decision-makers to the research and researchers, and identifying opportunities for new learning to further inform policy. Below, we share lessons learned from this work.

Sharing generalizable insights from multiple studies is a particularly powerful approach

Bringing applicable research to policy conversations often means looking across multiple high-quality studies to understand what impacts we can have confidence will apply in new settings. Synthesizing takeaways from multiple randomized evaluations can help inform policy in at least two ways. 

First, when critical windows of opportunity or questions in need of urgent answers arise in settings where a study hasn’t previously been conducted, bringing generalizable lessons to the table allows policymakers to apply and adapt findings to their contexts with confidence.  

For example, the Covid-19 pandemic only heightened the importance of a vital question already facing education leaders: How do you best support the millions of students who are behind grade level in North America? Seeing the need for credible insights that could apply across contexts, in 2020 J-PAL North America published an Evidence Review summarizing a meta-analysis of 96 randomized evaluations of tutoring programs and identified tutoring as one of the most effective tools available to educators for accelerating learning. 

These findings enabled conversations with policymakers about effective models and key principles that could be tailored to their districts and schools at a time when adaptability and effective solutions were both sorely needed. As a result, this evidence informed efforts to accelerate learning at both state and national levels, as discussed in part four of this blog series. 

Second, combining evidence from multiple evaluations can fuel policy impact when previous research has formed a muddled or uncertain picture. Layering complementary findings from different studies together can clarify insights that enable policymakers to rally around an effective, evidence-based approach.

For example, job training is often thought to be a common-sense approach for empowering people to find more and better-paying work, especially among workers who face barriers to employment such as not having a college degree. However, randomized and non-randomized research from as early as the 1990s suggested the benefits of traditional job training programs were inconsistent, and programs that initially increased participants’ earnings often saw their effects fade over time. 

Over the last two decades, sectoral employment programs, which train participants for employment in specific industries with strong labor demand and opportunities for career growth, have emerged as an alternative model. Randomized evaluations in the late 2000s and 2010s demonstrated the potential of such programs to generate persistent increases in earnings and employment and to help participants move into higher-quality jobs. These findings are summarized in a paper by former J-PAL North America co-scientific director Lawrence Katz (Harvard University) and co-authors and in a subsequent J-PAL North America Evidence Review.

As rigorous research first demonstrated and then reinforced the ability of sectoral programs to improve outcomes for workers, federal policy evolved alongside the evidence. The 2014 Workforce Innovation and Opportunity Act in part pushed for more investment in evidence-based programs by state and regional workforce boards, and competitive job training grant programs run by the Obama administration referenced research on sectoral training models. The 2022 Economic Report of the President discusses sectoral employment training, citing Katz’s work and several randomized evaluations, and calls for more investment in similar programs. And in 2022 the Good Jobs Challenge awarded $500 million to 32 regional workforce partnerships focusing on sectoral training, many of which exhibit some or all of the program components that J-PAL’s evidence review highlighted as most impactful.

Policy impact is iterative and collaborative

Understanding what works best to alleviate poverty and improve lives is not static. Adapting, implementing, and scaling evidence-based programs almost always involves encountering questions and making choices, ranging from tweaking a previous program model to wondering how a program can best serve a new population. Those questions and decisions give rise to new research questions. To continue to learn and improve, it’s important to remain rooted in curiosity and sustain the research-policy cycle.

For example, rigorous research by J-PAL affiliated researchers in partnership with city agencies in Boston, Chicago, New York, and Philadelphia has demonstrated the benefits of Summer Youth Employment Programs (SYEPs) across multiple studies, consistently reducing youth involvement in the criminal legal system. However, they often do not affect future employment rates of participating youth, and while they’ve shown potential to improve educational and youth development outcomes, the evidence is mixed.

As the city agencies and academic researchers continued to collaborate, new ideas arose. In New York, the team piloted a new program component where youth received letters of recommendation from their summer jobs supervisors, as noted in part three of this blog series. In Boston, disruptions to the normal SYEP model during the Covid-19 pandemic led to explorations of how digital programming could utilize the same underlying mechanisms that drove the success of SYEPs. 

Both teams continue to explore innovations together. Notably, this cycle of collaborative learning and iterative policy change hinged on the strength of the relationships between the researchers and government agencies and the dovetailing expertise of both. 

Dissemination matters—the research won’t share itself

Even when dedicated researcher-practitioner partnerships exist, effective dissemination campaigns are vital to getting information into the hands of other policymakers who can use it.

Synthesis publications, such as the tutoring evidence review, sectoral employment evidence review, and SYEP evidence review mentioned above, are one technique we’ve used to make research insights more accessible for policymakers with limited time and many priorities. J-PAL North America has also published more than 200 evaluation summaries sharing the motivation, study design, findings, and policy lessons from randomized evaluations in North America. These policy publications and evaluation summaries garner more than 25,000 downloads annually. 

We’ve also seen the power of op-eds, written by both J-PAL affiliated researchers and J-PAL North America staff, to point public and policymaker attention toward research findings that speak to acute moments on topics ranging from race-neutral medical school admissions to the social safety net to long-term homelessness.

As one example, in the summer of 2020, Sara Heller (University of Michigan) and Judd Kessler (University of Pennsylvania) authored an op-ed in the New York Times calling on New York City Mayor Bill de Blasio to restore that summer’s youth jobs program, which the city had decided to cancel. Highlighting their evaluations showing the benefits of summer jobs for youth in New York City, Heller and Kessler argued it was vital to find a way to continue the program in the pandemic summer because it would support youth economically, keep youth busy, and save lives by decreasing the incidence of violent crime. The city later instituted a virtual-only version of its program.

Throughout our ten years, we have learned that catalyzing evidence-based policy requires distilling the key lessons that matter most from rigorous studies, communicating those insights through a variety of channels, harnessing the varied strengths of researchers, policymakers, and support organizations like J-PAL, and staying curious enough and motivated enough to flow through the cycle again and again. We look forward to continuing to work with policymakers and our research network to act as a bridge from research to policy action at scale, with the ultimate goal of reducing poverty and improving lives in North America.

In J-PAL North America’s ten-year anniversary blog series, we reflect on some of the most impactful randomized evaluations and bodies of research that our organization has supported over the past decade. We also celebrate the tremendous contributions of our researcher network and the policymakers and practitioners who have made this research possible. Part one kicks off the series with reflections from our scientific leadership. Part two explores the role of study design and implementation. Part three dives into effective collaboration between researchers and practitioners. Part four discusses how credible evidence can identify effective strategies to reduce poverty, regardless of the impact estimate. 
