Ideas to Implementation: Engaging the community to evaluate an extended stay detoxification program in Pierce County, Washington
In the fourth part of J-PAL North America’s Ideas to Implementation blog series, Pierce County Human Services (PCHS), one of our 2024 LEVER Evaluation Incubator partners, reflects on how they incorporated community engagement in their randomized evaluation. Through community engagement practices, PCHS gained a better understanding of the different needs and perspectives of relevant stakeholders, identified potential challenges, and uncovered opportunities for ongoing collaboration during the implementation of a randomized evaluation of their extended stay detoxification program.
To start off, please tell us about the program PCHS is planning to evaluate and its evaluation goals.
The program we’re evaluating is aimed at improving the detoxification process for individuals with opioid use disorders in Washington State. Currently, individuals are approved to stay for up to five days in detox facilities. However, this duration has not been re-evaluated in years, despite drastic changes in the types of opioids in use, including the high prevalence of fentanyl.
Direct service providers in the detox field believe that extending the detox period could significantly benefit those seeking treatment. By allowing for a stay of up to ten days, individuals would have more time to stabilize, manage withdrawal symptoms, and prepare for the next steps in their recovery journey. Another key advantage of this extension is that it gives individuals more time since their last episode of use, better positioning them to transition to rehabilitation after their detox stay. Our goal is to assess whether extending the days in detox improves health outcomes for those undergoing opioid detoxification.
We anticipate that a longer detox period will lead to better health outcomes, higher rates of successful transition to rehab, and ultimately, more sustainable recovery. By re-evaluating and potentially extending the detox duration, we aim to provide the best possible support for individuals undergoing treatment, helping them to build a solid foundation for their recovery journey.
Before starting this evaluation, were you familiar with the concept and framework of community-engaged research?
Pierce County was already familiar with the randomized evaluations that J-PAL North America focuses on, as we previously collaborated on an evaluation of an eviction prevention project. However, participating in several sessions led by J-PAL staff, along with consultant support, helped us become more familiar with effective strategies for fostering community engagement as we designed our Fentanyl Detox program evaluation. We believe that any opportunity to use data to support our investments in the community is invaluable. Additionally, we are committed to including the community in projects that will have an impact on them, ensuring their voices are heard and their unique needs are met.
How did the team incorporate community engagement into the Fentanyl Detox program evaluation design?
By engaging with individuals with lived experience of substance use, we identified their most pressing needs, which were taken into consideration in the planning process. Specifically, we shared information about the opioid settlement dollars coming into our community at recovery meetings and nonprofit organizations—places where those with lived experiences often go to seek services. We also provided time for them to voice their perspectives on what resources would have helped them during their recovery process.
As transparency is a crucial component of community engagement, we brought the community along with us through meetings at various nonprofits that host group sessions. During these sessions, we provided a brief overview of the program and then facilitated brainstorming discussions, writing participant feedback on large wall stickies. We wanted to maintain that momentum of meaningful engagement and incorporate their valuable insights into actionable steps for the Fentanyl Detox Program as we moved forward.
In addition, we worked closely with our Behavioral Health Advisory Board, which is composed of treatment providers, individuals with lived experiences, and subject matter experts, to develop an opioid spending plan that addresses the most pressing needs of the community. We believe that this transparency strategy not only built trust but also ensured that the community was included and valued in the process.
How has your perspective regarding including community engagement in research design changed after going through this process?
Initially, including the community was just an overarching goal. However, as we delved deeper, we realized the immense value in defining who the community was for this specific project. By refining our understanding of the community connected to a detox facility, we adapted our engagement strategies accordingly.
We are still working on what community engagement will look like in practice, but it will include people with lived experience, subject matter experts, and community members alike.
Given that the detox facility has been physically located in the community for a long time, extending the stay won't significantly impact the surrounding neighbors in the traditional sense. However, it will impact the community of providers and individuals attending rehab. In this instance, the important community members to bring along in the process have been the agencies working closely with folks in rehab and the patients in rehab themselves. Hopefully, in the long run, it will positively affect the broader community by contributing to a reduction in overdoses.
Do you see any other potential applications for community engagement in evaluation design beyond the work you’ve done so far?
We believe there is an opportunity to share more with the people of Pierce County about the process of designing evaluations and how it can help direct where our limited funding goes. By evaluating our contracts through an outcome-oriented lens and assigning clear metrics to each one, we can inform our community how their money is being spent, the benefits they are gaining from these investments, and the specific measurable outcomes achieved.
As we expand our community engagement efforts, we are now informing community members through various communication channels—such as social media, email, and our website, Open Pierce County—where they can access dashboards and metrics to stay updated about projects. Our communications team's efforts ensure accuracy and accessibility in information sharing to drive greater resident participation, reduce stigma, and foster a more informed and engaged community. Involving them in this journey ensures they understand the importance of these projects and feel a sense of ownership and involvement.
What advice do you have for other state and local agencies who may be interested in exploring community engagement in their programs?
The key takeaway is to start early by defining who your community is and what it would look like to involve them in the programs. It's essential to decide where in the project there are opportunities for community input and how that feedback will be incorporated. Finally, building and maintaining strong relationships with community members over time is crucial for meaningful engagement and long-term success.
Pierce County Human Services (PCHS) discusses the evaluation considerations for its mobile medication for opioid use disorder (mobile MOUD) program.
In J-PAL North America’s Ideas to Implementation blog series, our 2024 LEVER Evaluation Incubator partners share their experience on the different steps in developing a randomized evaluation of their innovative programs. In part one of the series, we sat down with Pierce County Human Services (PCHS) to discuss evaluation considerations for its mobile medication for opioid use disorder (mobile MOUD) program. Through an engaging and comprehensive session led by J-PAL staff, PCHS developed a Theory of Change to gain an in-depth understanding of the program and facilitate planning for a randomized evaluation.
A Theory of Change is a conceptual framework that maps the causal links between the program being implemented and its final outcome(s). It helps clarify each step a program must go through, and the assumptions that must hold at each step, before the program can have its intended impact on the target population. A Theory of Change forms the starting point for developing a randomized evaluation by identifying the outcomes that need to be assessed to determine whether a program is operating as expected, and it can also be used to gather further insights into a program’s structure and goals.
To start off, please tell us about the program you are planning to evaluate and your evaluation goals.
Sarah: We are planning to evaluate our mobile medication for opioid use disorder (mobile MOUD) program. After talking to treatment providers and individuals seeking services, we identified a need for more accessible treatment options for people in rural areas of the county, rather than services centralized in Tacoma. With this evaluation, we want to see which groups and services are most impacted by having a mobile MOUD program. If there is a large impact, it will help inform future decisions on how to make MOUD services more accessible for rural communities, such as opening a brick-and-mortar treatment access point.
What was the process like for developing your Theory of Change? Was this concept new to you?
Anika: There were varying levels of expertise with a Theory of Change on our team, so having a session led by J-PAL staff that took us through the essential components of the Theory of Change was very helpful in setting up a base understanding for everyone to build on.
Sarah: I had heard of a Theory of Change, but was unsure of how it applied to research. The session with J-PAL staff was very in-depth—we dug into the program, allowing us to ask ourselves questions like ‘What is this project about?’ and ‘What does this program actually look like for the people we are trying to reach?’ We broke our program down to a granular level, giving us an opportunity to think about things we hadn’t thought of before, such as how this program might affect fixed-site MOUD services or how we can establish this program to reduce the stigma associated with accessing these services.
Were there any parts of the Theory of Change that were helpful to think through or that surprised you?
Trevor: It was helpful to think through the threats and assumptions for each step of our Theory of Change. Oftentimes, we build out our plan for an evaluation or program and it can be easy to forget about the parts which might not work as intended. Taking the time to think about those potential issues was particularly helpful, as that is not usually at the top of the list when discussing a program.
Sarah: I agree with Trevor that the threats were important to think through. We tend to be optimistic about our programs, so thinking more deeply about the threats allows us to preemptively prepare for them, fostering a stronger, more robust program.
I would also add that working through our Theory of Change allowed us to home in on which aspects of our program were fixed and which were flexible. For example, we had always thought of using vans to deliver the MOUD program, but when discussing the inputs of our Theory of Change, we realized this input is flexible and the van could be an RV instead. Finding those areas of flexibility and rigidity will also help us down the line when thinking about finding a provider for this program.
Do you see any other use cases for the Theory of Change now that you have developed it?
Anika: Developing our Theory of Change makes me want to implement this in all of our projects, not just the ones where we embed an evaluation. This session allowed our team members to build new skills, and they can now apply this deliberate and thoughtful process in other areas of work. It can be hard to do, given resource constraints, but this practice can be very valuable in gaining a deeper understanding of our programs.
As we are in the process of finding a provider for the mobile MOUD services, our Theory of Change will help us write our Request for Proposals and make sure we are including the most important requirements for our providers.
And since we thought about the threats to our program and evaluation, when these questions come up from leadership, we can show that we have thought about this and have a plan in place to mitigate these threats.
Margo: Currently in Pierce County, mobile treatment programs have a lot of interest and buzz. If we can show that we are being thoughtful about our program by using a Theory of Change—that we are not just jumping into it—it makes our work, and the evidence we do find, more credible and useful to inform future iterations of the program.
J-PAL staff, the Youth Development Department (YDD) of the City of Los Angeles, and the University of Southern California (USC) explore how they established their research collaboration for evaluating the Student Engagement, Exploration, and Development in STEM (SEEDS) program.
In J-PAL North America’s Ideas to Implementation blog series, our 2024 LEVER Evaluation Incubator partners share their experiences on the different steps in developing a randomized evaluation of their innovative programs. In part two of the series, J-PAL staff sat down with the Youth Development Department (YDD) of the City of Los Angeles and the University of Southern California (USC) to explore how they established their research collaboration for evaluating the Student Engagement, Exploration, and Development in STEM (SEEDS) program as part of the Evaluation Incubator.
To start off, please tell us about the program you are planning to evaluate and your evaluation goals.
The Student Engagement, Exploration, and Development in STEM (SEEDS) program is a culturally responsive, intergenerational mentorship initiative with a game-based learning component, launched in 2022 by Dr. Darnell Cole (USC) and Dr. Christopher Newman of California State University, Fullerton (CSUF) in partnership with the Youth Development Department (YDD) of the City of Los Angeles. The program serves racially minoritized college students, who participate through an internship in which they mentor middle school students in a local college preparation program. The college students are in turn mentored by academic professionals from similar backgrounds (i.e., race, ethnicity, gender, socioeconomic class). Each SEEDS session includes an online game-based learning component in which students engage with STEM-based games, with the aim of promoting racial equity for Black, Indigenous, and People of Color (BIPOC) middle school and college students in Los Angeles. Our evaluation examines whether participation in the SEEDS mentorship program enhances STEM identity, community cultural wealth, degree attainment, and STEM career aspirations among college students. This asset-based program¹ provides culturally responsive mentorship to address the systemic barriers facing BIPOC middle school and college participants in a metropolitan setting. By addressing these barriers, SEEDS aims to advance educational, racial, and socio-economic equity in STEM pathways.
What factors influenced the University of Southern California’s (USC) decision to collaborate with the City of Los Angeles for this evaluation?
When we launched SEEDS, we saw a unique opportunity to partner with the City’s then newly established YDD in several important areas: leveraging resources, coordinating research, and strengthening research-practice-community collaborations. Notably, the City was crucial in leveraging local funding to incentivize college students to participate in SEEDS. It provided paid internship opportunities for college students, facilitated partnerships with community-based organizations to hire, train, and compensate the students who mentor Neighborhood Academic Initiative (NAI) middle school students, hired additional college students who receive professional development, and provided administrative support to the SEEDS program.
Who are some of the important stakeholders in this research partnership, and what considerations were taken in deciding roles and responsibilities?
The SEEDS project, led by Dr. Cole and Dr. Newman, follows a research design that is both systematic and meaningful. With the help of USC researchers including Dr. Ting-Han Chang, Dr. Mabel E. Hernandez, Dr. Milie Majumder, and Dr. Tr’Vel Lyons, the team has built strong stakeholder relationships since the project's launch. In addition to YDD, key partners include local community-based organizations, the Neighborhood Academic Initiative (NAI) at USC, STEM professionals who mentor college students, and college students who, in turn, mentor middle school participants.
Shared goals between the research team, YDD, and community-based organizations have played a crucial role in the program’s success, allowing the research team to take the lead in implementation and research while receiving guidance and endorsement from these key partners. The City of Los Angeles has intentionally advocated for and allocated funds to support the SEEDS program by creating paid internship opportunities for the program’s college student participants, managed through its partnered community organizations. These organizations hire, train, and compensate students who mentor NAI middle school students. They are dedicated to supporting youth development in the City of LA for those who come from underrepresented communities and often face barriers in their career development. The research team has thoughtfully engaged with these community organizations to advance the goals of the SEEDS program, including recruiting interns who meet the organizations' criteria and aligning with their mission to invest in underserved youth in LA. The research team continues to foster connections with STEM professionals nationwide to mentor college students and provide career guidance.
How did the team decide on the key research questions, and how will these questions support program developments?
The research questions for this evaluation originated from a National Society of Black Physicists (NSBP) pre-conference workshop in 2017. The workshop explored strategies to increase African American participation in physics and astronomy and foster collaboration between high-producing and aspirational programs. Over the next few years, we refined our focus to the following research question: What is the impact of the SEEDS program, an intergenerational STEM mentorship program, on supporting minoritized college students’ STEM identity, college (and advanced) degree attainment, and economic mobility beyond college? The research team eventually connected with the City of Los Angeles officials through our school networks in 2021 and pitched the idea of SEEDS to continue supporting youth career development across the city.
By incorporating both quasi-experimental designs and a randomized evaluation, the study can generate findings that help inform practices and policies supporting university-government and agency-community organization partnerships in providing culturally responsive mentorship, intergenerational mentorship, and professional development opportunities to BIPOC college students across the city. The findings could also help scale the program to other cities focused on youth development and career preparation, especially for BIPOC, first-generation, and/or low-income students in urban settings.
What are some of the key challenges you’ve encountered so far in navigating different priorities, timelines, and institutional structures, and how have you addressed them?
One key challenge is aligning the SEEDS program’s timeline with various stakeholders, including the city, community organizations, NAI, the Los Angeles Unified School District (LAUSD), and participating colleges, each with different schedules. In addressing this challenge, the research team has been very mindful and communicative with all the stakeholders to find workable schedules that accommodate everyone and meet the SEEDS program’s implementation and research purposes. For YDD, the primary challenge has been identifying and leveraging funds to support the wages of the college student participants, since the city is undergoing some financial challenges as a whole. Nonetheless, YDD remains committed to advancing the program and leveraging any funds available to support the program participants in future cohorts of the program.
What are your hopes for the impact of this research once completed? How do you envision the findings being used to inform policy and program decisions in the City of Los Angeles?
YDD has conducted research in the areas of Positive Youth Development and Youth Program Evaluation. The implementation and evaluation of the SEEDS program have allowed the city to put parts of that research into practice. For example, YDD has established partnerships to develop tools, training, and other resources for government employees who mentor youth across city departments, which builds on the value of the mentoring approach in SEEDS. Overall, lessons from SEEDS implementation and evaluation will inform future collaborations with YDD and academic institutions to pilot and assess youth programs. By using data-driven insights and strengthening an evidence-building culture in Los Angeles, YDD aims to inform policy and program decisions, ensuring that resources are directed toward initiatives with measurable impact. Ultimately, this research will shape a more coordinated and effective approach to improve youth services and outcomes across Los Angeles.
1 Yosso, T. J. (2005). Whose culture has capital? A critical race theory discussion of community cultural wealth. Race Ethnicity and Education, 8(1), 69–91.
The County of San Diego’s Office of Evaluation, Performance, and Analytics (OEPA), Planning and Development Services (PDS) teams, J-PAL staff, and UCSD researchers discuss their efforts to explore innovative evaluation possibilities aimed at strengthening the implementation of the Climate Action Plan and turning strategic ideas into effective climate solutions.
In J-PAL North America’s Ideas to Implementation blog series, our 2024 LEVER Evaluation Incubator partners share their experience on the different steps in developing a randomized evaluation of their innovative programs. Through the Evaluation Incubator, the County of San Diego’s Office of Evaluation, Performance, and Analytics (OEPA) and Planning and Development Services (PDS) teams worked together with J-PAL staff and researchers from the University of California, San Diego (UCSD) to scope potential randomized evaluations to strengthen implementation of the County’s extensive Climate Action Plan. We spoke with Ariel Hamburger, PDS Planning Manager, and Courtney Hall, OEPA Principal Data and Research Analyst, to learn more about their experience in this process.
The goal of the scoping exercise was to highlight areas where randomized evaluation can strengthen San Diego County’s Climate Action Plan implementation. The scoping process involved a systematic review of select Climate Action Plan actions to understand available evidence, evaluation feasibility, and implementation priorities. The OEPA and PDS teams brought a mix of subject matter and technical expertise to the Evaluation Incubator. PDS led the development of the Climate Action Plan and is one of six departments that make up the County’s Land Use and Environment Group. OEPA works across County departments to integrate evidence-building capabilities to inform policy, budget, and operational decisions.
Tell us a little about the goals of the LEVER Evaluation Incubator project. Why did San Diego County initially apply for the program?
Courtney: This is the second LEVER Evaluation Incubator that the OEPA team has participated in. We found the Evaluation Incubator's support for our evaluation of a rental subsidies program to be extremely helpful, and we were interested in collaborating with J-PAL North America again to think more deliberately about opportunities for evaluation within the Climate Action Plan. It was also an opportunity to work more formally with the PDS team and build our relationship.
What were your primary questions about how to set a learning agenda for the Climate Action Plan at the start of the scoping process?
Courtney: OEPA is committed to building a culture of evaluation across County operations. Our main goal was to utilize J-PAL’s technical expertise, UCSD’s academic research expertise, and PDS’s expertise in Climate Action Plan implementation to think strategically about areas that would benefit most from evaluation.
Ariel: The actions within the County’s Climate Action Plan are based on guidance from ICLEI (Local Governments for Sustainability), which identifies actions that are known to reduce greenhouse gas emissions, so we came in with a clear idea of policy areas to prioritize. The piece that was less known was how randomized evaluations can help build out those actions. Climate Action Plan actions really run the gamut and involve program development, policy design, education, and more. Being able to think about evaluation for some of these specific actions is helpful from both a process and program evaluation perspective so we can improve implementation. Building this relationship with the OEPA team has been very valuable.
What were the most valuable aspects of conducting the Climate Action Plan scoping exercise?
Ariel: PDS did a lot of scoping work when we originally developed the Climate Action Plan. One of the first things we do when developing a new program is look at evidence-based or practice-based research. We also talk to fellow jurisdictions who have run similar programs to the ones we are thinking of developing. Bringing different perspectives to this process through the Evaluation Incubator was very valuable. We approach things in terms of what is possible and feasible for our jurisdiction. Bringing additional brainpower to look at the academic literature on different programs has been really helpful. It makes our research more comprehensive.
Courtney: OEPA came into the Evaluation Incubator with a lot of evaluation expertise, but not a lot of content knowledge about climate actions. This was a useful process for us to gain a sense of research in the climate space. We are also nerds for spreadsheets, so going through the literature in an organized and systematic way with research tied back to each Climate Action Plan action has been a valuable way to identify evidence gaps. This will hopefully enable us to generate evidence to strengthen the implementation of our Climate Action Plan and help other jurisdictions in the climate field as well. As the Evaluation Incubator comes to a close, having additional content knowledge on climate programming will help us support PDS and continue to engage in evaluation thinking.
Was there anything that surprised you about the process?
Ariel: Having an in-person training with J-PAL, UCSD, PDS, and OEPA teams was unexpected and an awesome bonus. It helped solidify the relationship between PDS and OEPA that Courtney mentioned. I don’t think anything necessarily surprised me in terms of research findings from the scoping exercise, but there were other unintended benefits. For example, Gordon McCord (UCSD), a researcher in the J-PAL network, connected us with another UCSD professor, and we intend to work with them to get additional student brainpower on some other Climate Action Plan actions.
Courtney: Yes, the in-person training was a nice add-on and way of spreading the learning we were doing as a smaller team to other folks within the County. Hopefully this will have a bigger impact beyond this project as we continue to explore evaluation possibilities within the Climate Action Plan. The theory of change workshop during the training was particularly helpful. For instance, mapping out the theory of change (as part of the scoping process) for the Sustainable Operations in Land Stewardship (SOILS) program helped us understand program elements so we can identify what’s reasonable and feasible for randomized evaluation. Ultimately, this process will help us design an evaluation that yields results without disrupting the intent of the program. Having different perspectives and stakeholders in the room also helped illuminate some of the nuances of the SOILS program I had not really understood based on higher-level conversations. As we are designing a randomized evaluation, we can hopefully avoid making assumptions that hinder evaluation planning.
How are you planning to use this information going forward?
Ariel: The root cause analysis, which is a component of the theory of change framework, was also very helpful for us. Team members don’t always have time to think about why people have not made a particular behavior change and what the impetus would be to really motivate a targeted change in behavior. Taking the time to do that sort of work will definitely be something we do moving forward.
Courtney: Working through the Climate Action Plan in such detail to identify opportunities for evaluation will also help us think about evaluation opportunities for years to come. It is nice to think ahead strategically instead of addressing things as they come up. It's a real long-term benefit.
Ariel: That’s a really good point. Because we took a more systematic approach upfront and scoped Climate Action Plan actions ahead of time, we won't have to piecemeal things together later.
What advice would you give to other jurisdictions that are interested in developing learning agendas or building evaluation plans for their own Climate Action Plans or programs?
Courtney: I would advise other jurisdictions to form cross-sector partnerships as soon as possible. We had the luxury of being able to consider the Climate Action Plan as a whole right after it was adopted by the board (in September 2024). That is not necessarily the case for every jurisdiction, but having all the different stakeholders (researchers, San Diego County staff and evaluators) in the same room was a huge benefit and made the scoping process more effective. Even if there were just a subset of these groups in the room, it would not have been as useful as having everyone thinking about this together—each group has unique knowledge that is essential to bring to the conversation. We are hoping this process will really increase the likelihood of conducting successful, meaningful, and useful evaluations.
Ariel: I would add that, as much as possible, jurisdictions should involve the community that is going to be affected by the action they are evaluating and hear from them about what they think is important to evaluate. For instance, with the SOILS project, it would have been ideal to have some agriculture folks at the table representing that perspective. So if people have a longer timeline or those relationships already in place, I think that is a really important piece.