Building research partnerships to address wildfire risk in Jackson County, Oregon
In 2021, Bob Horton was serving as a fire chief in Oregon’s Jackson County. His district had just lived through the 2020 Almeda Fire, which destroyed 2,600 homes and was the most destructive wildfire in Oregon’s recorded history. As community members came together to rebuild, Bob Horton strategized how to better equip households to be resilient to wildfire and prevent future catastrophes. He turned to research to inform his decision-making and reached out to J-PAL North America for support.
Through J-PAL North America’s Evaluation Incubator for state and local government agencies, Bob attended J-PAL’s Evaluating Social Programs course and received funding, technical assistance from J-PAL staff, and connections to researchers in the J-PAL network. With this support, Bob and his research partners Judson Boomhower (University of California, San Diego) and Patrick Baylis (University of British Columbia) received funding for a randomized evaluation and research management support (RMS) from J-PAL to evaluate a program encouraging households to take up wildfire prevention practices. We spoke with Bob Horton, Judson Boomhower, and Patrick Baylis to learn more about their partnership.
Bob, why was Jackson County Fire District 3 interested in evaluating an intervention to reduce wildfire risk in Oregon?
We spend a lot of money on mitigating wildfire risk—both as a fire district and as a nation. I wanted to make sure our investments were strategic and meaningful. I was confident that entering into a project with rigorous evaluation would ensure this was the case. Todd Pugatch, a professor of mine at Oregon State University and researcher in J-PAL’s network, knew I was interested in wildfire risk and that I worked in local government. He connected me to J-PAL North America’s Evaluation Incubator opportunity that he had learned about through a newsletter.
We had a lot of encouragement, excitement, and enthusiasm around growing our fire district’s capacity for evaluation through partnership with J-PAL. We simply don’t have the administrative capacity, knowledge, skills, and abilities to do something like this on our own, which is a shame. I think local government and special districts should have teams to do these kinds of evaluations, but we don't, so we're grateful that J-PAL offers support.
Judd and Patrick, what drew you to take on this project as academic economists?
It was immediately clear that this was a great opportunity. We would have the chance to work with expert practitioners who were committed to careful evaluation, and we would get to shed light on first-order economic and policy questions around adaptation to growing climate risk. It was great to have J-PAL building the bridge between us as academic researchers and Chief Horton and Jackson County Fire District 3 as practitioners.
Bob, what was needed to design the intervention and prepare for evaluation?
I didn’t have a good idea of how much work it is to go from an evaluation concept to implementation. A lot of our time was spent thinking about what was actually feasible. We could dream about all these different designs or interventions we would want to do, but at the end of the day, they needed to be ideas that we could practically implement.
I was at an advantage because, as the chief executive of the Fire District, I had the support of both the board of directors I reported to and my team. My team probably didn't like all the unusual ideas I brought forward; however, we found ways to get what we needed done. Getting buy-in from higher-ups or other partners can become a barrier. In our case, this project was entirely within the scope of our fire district, and we had resources invested in wildfire risk reduction already, which made moving this work forward a lot easier.
What role did J-PAL’s technical assistance and research management support play?
Bob: J-PAL’s support was flat-out instrumental—it was as if the staff on our project were part of our organization. It was like instantaneous capacity growth.
The staff took the time to understand our community, our organization, and the risks for this work to go sideways. They encouraged us to think about all of that ahead of time, before we got to the actual intervention design. Because we don’t have experience running evaluations at a large scale, we simply needed the thought partnership that J-PAL provided. Then, when Judd and Patrick came in, they helped us to think about what had been done in similar projects. In partnership with them, we refined, refined, and refined, until we ultimately had interventions to test.
The partnership we’ve had with Judd and Patrick has been incredible. I was unsure at the front end that any researchers would even want to be a part of something like this. J-PAL connected us with two absolute gems for this kind of work. They’ve been very understanding and were willing to open up our partnership for a qualitative researcher at Oregon State University to also help us. Throughout it all, nothing got in the way of putting the members of our community first.
Judd and Patrick: Designing and implementing a field experiment is a big lift. It involved developing outreach materials, securing appropriate research approvals, and building working relationships with our partners in Oregon. J-PAL's RMS team was instrumental in helping us make continuous forward progress on all of these fronts. Their support was also essential for helping us think clearly about the tradeoffs between what is possible and what would be perfect (for example, by setting up meeting agendas, taking notes, and generally keeping us focused on key objectives). It also expanded the scope of the possible by helping us overcome challenges to implementation (for example, experimenting with other mailer designs to increase participation).
Finally, working alongside Bob and his team has been a real pleasure. Their deep understanding of the Jackson County District 3 community and the wildfire risk challenges they face has helped sharpen our thinking about the broader key economic tradeoffs rooted in public policy-making around natural disaster risk. We are extremely grateful to J-PAL and the RMS team for making such a great partnership possible.
Where do you see this work going from here?
Judd and Patrick: We are really excited about what we are learning in the current pilot experiment. We have already had lots of good conversations with our fire district partners about follow-up questions and ideas. Stay tuned!
Bob: We’re excited to think about how to scale up and look more broadly than Jackson County. Many states and counties are learning that they’re more wildfire-prone than they thought. Many more are learning that in the next thirty years they will become wildfire-prone as a result of a changing climate. With this in mind, answering these questions is critical to build a knowledge base focused on promoting communities resilient to wildfire. We hope that putting more evaluation and evidence behind these questions will help us learn more about how we can be prepared.
This piece is the seventh in an ongoing series highlighting research partnerships with state and local governments fostered through J-PAL North America’s Evaluation Incubator. The fifth & sixth pieces feature New York City’s evaluation of their summer youth employment program.
Mireille Jacobson (University of Southern California), Weston Merrick (Minnesota Management and Budget or MMB), and Adam Sacarny (Columbia University) sit down with J-PAL staff to discuss their recent paper in Health Affairs. The paper reports on the results of their randomized evaluation assessing how various letters affected physicians’ use of Minnesota’s prescription monitoring program (PMP). The PMP is an electronic database that tracks controlled substance prescriptions in the state. They discuss the results, broader implications, and future opportunities for evaluation.
This evaluation was awarded funding and technical support through J-PAL North America’s State and Local Evaluation Incubator and received Research Management Support. In part one of this blog series, published in the summer of 2021, Weston and Adam discussed their research partnership and the development of the study.
What were the key outcomes or results of the study?
Adam: The first main outcome was whether the clinicians in the study searched the prescription monitoring program (PMP). On this outcome, we see pretty substantial and durable effects. Search rates rose by about four percentage points from the most effective letters, and that effect lasted at least eight months.
The second outcome was whether the clinicians changed their co-prescribing of opioids with two other classes of medications, benzodiazepines and gabapentinoids. We focus on co-prescribing because these drug classes have interactions with opioids that can increase the risk of overdose. We don't detect any effects there, which probably reflects that prescribing is a lot harder to change than PMP use.
Mireille: We also saw that some clinicians who received letters created new PMP accounts because they didn’t previously have one, even though both state law and best clinical practice called for them to be searching the PMP. So these letters brought a hard-to-reach population into using the PMP who may not have otherwise engaged with it.
What are the broader implications of the results?
Adam: Every state has a PMP or is setting one up, but the effects of these programs in recent meta-analyses look a bit disappointing. One reason might be that many clinicians don't use them. What our study shows is that simple letters can move the needle on clinician engagement with PMPs. As a result, these letters could be of interest to other PMPs or even healthcare organizations that want to get their clinicians to use the PMP.
In addition, while changing prescribing with letters seems to be more difficult, getting clinicians to use the PMP could still be useful because it could help clinicians become better informed about their prescribing. They then have a reason to inform patients about the risks that they might face from taking these prescriptions, like overdose, and tell them about naloxone, the overdose reversal medication.
How are these results informing future policy choices?
Weston: The Minnesota Board of Pharmacy was an extremely engaged partner in this work. They care a lot about making sure that the PMP is an effective system for providers to use. They used the results of this study to create and inform automated messages sent through the PMP, applying the behavioral lessons from this study.
The Board of Pharmacy is also very interested in looking at further interventions to increase the likelihood that folks are using the PMP and using it correctly to prescribe in a safe way.
More generally, we've started to realize that these nudge messages are about improving the customer experience of accessing the government. There's a reason why these prescribers aren’t using the PMP—maybe they don't know about it, maybe they don't know how to sign up, maybe they forgot their login, whatever it may be. This simple letter was able to prompt them to better use a service that was available to them. This is just one example of how we can continue working to improve access to, and use of, government services across the state.
Are you considering future collaborations in this partnership?
Adam: We’re certainly trying to build on these successes and use the groundwork already laid to launch new projects. I’m hoping we can work on interventions that not only try to stop risky prescribing but try to promote beneficial care. Weston and I and our collaborators have been talking and trying to think about both sides of the issue—not only reducing bad things but also increasing good things—and how nudge-type interventions can be useful there.
Mireille: Yes—and all of these strategies are fundamentally about trying to mitigate the harms of the opioid epidemic, from every angle possible, whether through better prescribing or better access to treatment.
Weston, more generally, how has this partnership impacted the ways you and your team think about the role of evidence?
Weston: This project was a really important example of a research partnership that worked really well. We were able to leverage outside resources and expertise to identify a meaningful policy question, use our existing data to answer that question, change policy, and then publish results in a journal so that places outside of Minnesota can learn from our experience. Having that kind of example allows us to go back to state leaders and say “this is why we should do an evaluation, and this is what you’ll get back,” which makes it much easier to start future projects that can meaningfully impact the health and well-being of Minnesotans.
Any last thoughts?
Weston: I'll just end by saying that we're so grateful to the Board of Pharmacy and PMP staff for embarking on this project with us. They've been a really wonderful partner, and we’re hoping to work with them more on future projects.
Adam: I would second that—we are really grateful that the PMP was willing to devote their time to this rigorous evaluation.
Mireille: I also want to emphasize that it was invaluable to have active collaborators, like Weston and Ian Williamson, who work in the government. They were able to help manage the implementation process given their knowledge of organizational constraints but, more importantly, were able to help design and implement a project that is relevant not only for academia but also for policy.
This piece is part of an ongoing series highlighting research partnerships with state and local government agencies fostered through J-PAL North America’s State and Local Evaluation Incubator. The second & third pieces feature Shasta County's evaluation of text message reminders to reduce failure to appear in court.
J-PAL North America talked with Project Manager Shawn Watts of Shasta County Superior Court, who shared the Court’s takeaways from the process of designing a randomized evaluation through the State and Local Evaluation Incubator.
In 2019, California’s Shasta County Superior Court applied to J-PAL North America’s State and Local Evaluation Incubator (then “Innovation Competition”). They sought to design a randomized evaluation to test strategies to reduce the likelihood that those awaiting court processing in the community fail to appear (FTA) for their arraignment. FTA can be costly for those summoned to court. Even for minor offenses, an FTA can lead to additional fines, and in some cases, an arrest warrant, which can have serious long-term effects on an individual’s record.
Through their partnership with J-PAL North America, the Court connected with researchers Emily Owens and CarlyWill Sloan to conduct a randomized evaluation testing the effectiveness of text reminders on reducing FTA among both the housed and unhoused population living in Shasta County. We talked with Project Manager Shawn Watts of Shasta County Superior Court, who shared the Court’s takeaways from the process of designing a randomized evaluation through the State and Local Evaluation Incubator.
What made Shasta County interested in J-PAL North America’s Evaluation Incubator? What ultimately made you decide to apply?
The research we did on FTA revealed that there were few strategies available to reduce it. At the same time, the number of people experiencing homelessness was increasing across the country, including in Shasta County. One of the only effective strategies was reminder notifications, including telephone calls, postcards, and text messages, which were proven through randomized evaluations to reduce FTA rates. However, most studies were conducted in urban areas. We were interested in determining if reminder systems were also effective in a semi-rural setting such as Shasta County, especially among those experiencing homelessness.
J-PAL North America’s mission and their use of randomized evaluations to understand the effectiveness of a strategy seemed to be just what we needed. Additionally, the grant funding from J-PAL made such a rigorous evaluation possible despite our own limited budget.
Can you talk about the role that J-PAL North America’s Technical Assistance (TA) played in launching this study, and any ways that TA may have contributed to its success?
J-PAL North America’s technical assistance played a crucial role for us from the beginning of the project. Staff provided assistance with preparation to apply for J-PAL North America’s State and Local Innovation Initiative Request for Proposals and were always available to answer questions. They were supportive and served as good project coordinators, always keeping track of the progress of our work and keeping us on schedule.
Prior to this project, the Court had no experience with randomized evaluations. J-PAL North America funded attendance at a one-week course at MIT for our project manager, which allowed us to gain an understanding of the basics of good, rigorous research. Our understanding of that process helped us to be successful when working with Emily Owens, our research partner from the University of California, Irvine, with whom we connected through J-PAL North America.
What did you learn from the TA process and this randomized evaluation? What advice would you have for a governmental body looking to evaluate one of its programs?
We learned a lot from J-PAL and the TA process. First, we learned that there are resources available, even to small organizations like our own, to help identify solutions to issues facing our communities or improve our processes. Before this project, we had no idea that funding and high-caliber training were available to small, local government jurisdictions.
We also learned the value of randomized evaluations. Most governmental agencies, especially city or county agencies, do not have resources to conduct research. Consequently, these agencies come up with what they think are solutions to their issues, but in reality, good evidence doesn’t exist to show that what they are doing is making an impact. By running a randomized evaluation, an agency can learn whether what it is doing is truly effective.
What are some features that you would consider critical for a government agency to have in place prior to engaging with J-PAL North America through the Evaluation Incubator process?
A specific individual who can interact with J-PAL and is knowledgeable about most of the process of the agency is helpful to the success of the Evaluation Incubator process. It is also helpful if that individual has the ability to interact with all levels of the organization and local partners who also may be involved in the project.
Automated data that the agency may collect is helpful to the success of any project. I would suspect most research that is done relies on the availability of easily queried data.
A strong drive to solve a problem is also helpful to keep the agency engaged in the process and the research on track.
What are some ways that you see government organizations as uniquely positioned to generate and use evidence?
Governmental agencies touch every aspect of our lives and are charged with serving the public. They have access to, and collect, data that is not readily available to the private sector. Governmental agencies should maximize the use of that data for the public good. In our case, we have the responsibility to make access to the Court as easy as possible for the public, and to use our data to determine how to help our constituents, such as by sending court reminders through texts.
This piece is the first of a two-part blog series highlighting our research partnership in Shasta County. The second focuses on the researcher-practitioner partnership cultivated between Shasta County Superior Court and J-PAL affiliate Emily Owens.
This piece is also part of an ongoing series highlighting research partnerships with state and local government agencies fostered through J-PAL North America’s State and Local Evaluation Incubator. The first piece features Minnesota Management and Budget and the Minnesota Board of Pharmacy’s evaluation of the state’s prescription monitoring program (PMP).
In the first of a two-part blog series, Judd Kessler (University of Pennsylvania), Sara Heller (University of Michigan), and Julia Breitman (New York City Department of Youth and Community Development) discuss their research partnership to evaluate summer youth employment programs and the development of their research question.
Summer Youth Employment Programs (SYEPs) are municipal programs that provide qualifying youth and young adults ages 14 to 24, often from households with low incomes, with a paid, part-time job and related supplemental services during the summer months. Rigorous research, like that showcased in J-PAL North America’s Evidence Review, shows that participating in an SYEP can have many benefits for youth both during and beyond the summer.
As the result of a long-standing partnership, in 2017 J-PAL-affiliated researchers Judd Kessler (University of Pennsylvania) and Sara Heller (University of Michigan) worked with the Department of Youth and Community Development (DYCD) in New York City (NYC) to evaluate a component of the NYC SYEP that provided letters of recommendation to youth participants.
We interviewed Judd, Sara, and Julia Breitman, Assistant Commissioner of Youth Workforce Development at DYCD, to shed light on their partnership. In part one of this blog series, Julia, Judd, and Sara discuss their partnership and the development of their research question.
What motivated DYCD to evaluate New York’s SYEP?
Julia: The Summer Youth Employment Program is an institution in NYC, serving countless New Yorkers since the program’s inception in 1963. While SYEP has always been a foundational experience for NYC youth and is the city’s largest workforce program, we wanted to see if there was an economic impact to this program. SYEP had never been formally evaluated, and while we instinctively knew that the program changed lives, we were looking for a formal evaluation partner. Judd came in at the perfect time for us. The project initially started by looking at the economic impact of the program, followed by questions about public safety. Now, we can say, without a doubt, how many lives were saved through participation in the program, and we have the data to back it up. It was a really interesting partnership that led us to change the focus of the program, the groups of young people we focus our resources on, and how services are delivered.
Why were you interested in working with J-PAL North America to support this evaluation, and what drew you to partner with DYCD?
Judd: My partnership with DYCD started when I first became an assistant professor at the University of Pennsylvania. The first project we did together was using the existing randomization in the lottery for spots in the NYC SYEP, evaluating the effect of the program on youth outcomes. DYCD was a phenomenal partner in that project—they were eager to learn what the impact of the program was and not afraid of doing research that might uncover limitations of the program. We found that SYEP had tremendously positive outcomes in terms of incarceration and mortality: SYEP was keeping youth out of prison and keeping them alive. What was surprising was that the program did not have positive impacts on youth participants' future labor market outcomes. That was a question that interested me as a researcher and a finding that troubled DYCD. They wanted the program to help youth with future job prospects as well, which is what led to the project that J-PAL North America funded. We built a letter of recommendation software, so supervisors could easily create letters of recommendation for randomly selected youth. Given our interest in this project and how well it fits with J-PAL North America’s mission, we thought J-PAL would be interested as well.
What about this partnership stood out to you?
Sara: DYCD is a really extraordinary partner. They are a public agency that is genuinely interested in better understanding the impact of their programs and improving what they do. They took the lessons of Judd’s initial SYEP study to heart, and spent a lot of time thinking carefully about program changes that could amplify the strengths and address the weaknesses of what they were doing. Working with an agency like that is an amazing privilege. A group of staff with a real interest in learning and improving is perhaps the single most important thing researchers need to accomplish policy-relevant research. The staff’s commitment to learning and self-reflection leads to great communication, allows us to understand the relevant institutional details, and helps us improve our ideas.
Can you share how your partnership with Sara and Judd began?
Julia: We were working with Judd first, when the team was looking at the economic impact of the program. Sara joined Judd to look at the criminal justice aspects of the program. The initial study showed that the economic impact is great during the summer, but the long-term economic impact is not as strong as we could have hoped for. This was understandable, given that it is a six-week summer program. The evaluation showed that the program does what it always intended to do: it keeps kids safe and off the streets, keeps them motivated, and allows them to explore careers and meet mentors they would otherwise not have been able to. It especially helps young people already involved in the justice system or at risk of becoming so.
How did the team (of researchers and DYCD) arrive at a research question that was of mutual interest and feasible to assess through a randomized evaluation?
Sara: Both Judd and I have done prior studies of SYEPs, and we both had a very basic question come out of that work: why doesn’t the training and work experience that SYEPs offer improve future labor market outcomes for participants? Often, we just have to relegate these kinds of follow-up questions to the discussion section of our initial papers. But in this case, the existing relationship with DYCD meant we could approach them with the question. At first, it didn’t seem totally plausible to them that letters of recommendation—a few pieces of paper—would do anything. But they trusted us enough to help us find out. In the end, they were excited to understand why letters of recommendation seem to matter and for whom. DYCD is now working to incorporate what we learned about the value of skill signals into their programming so that future young people can benefit from the study’s findings.
Judd: DYCD recognized the value of randomization both in generating research results that can help improve the program and in fairly allocating a scarce resource. Supervisor time is a scarce resource: many supervisors oversee dozens of youth, and asking them to write letters is time consuming. The fair way of allocating this resource is through randomization, so it was appropriate to conduct a randomized evaluation in this context.