Caitlin Tulloch, J-PAL ‘13, on innovation in cost-effectiveness analyses and building effective policy partnerships

Caitlin Tulloch in Ghana in 2012, where she supported the cost-effectiveness analysis of a large-scale evaluation of Teaching at the Right Level.
Photo: Hannah Ratcliffe

The Alumni Spotlight series highlights J-PAL alumni who are making an impact across industries and around the world. To nominate a J-PAL alum to be featured in a future Alumni Spotlight, please fill out this form.

Caitlin Tulloch joined J-PAL Global in 2009, where she played a central role in building the infrastructure for our policy efforts, establishing our approach to cost-effectiveness analysis, and building partnerships with policymakers. Now the associate director of Best Use of Resources at the International Rescue Committee (IRC), her work remains centered on improving cost-effectiveness analyses. She joined us to talk about her time at J-PAL and her work at the IRC.

Could you describe your background and what drew you to the field of development?

I am an economist by training. When I try to explain how I came to this field, I usually think about my dad, who has an MBA and is a small business owner. He started teaching me about things like compound interest when I was ten years old. My mom is a primary school teacher and has been a political activist for much of her life. When you take the average of those two, you get an economist who works in development!

I studied political science and economics in college, and I knew I wanted to focus on poverty alleviation. I ended up working for Emily Oster, now an economist at Brown University, who was doing development and health research at the time. That was my introduction to the field.

Once you joined J-PAL, you were on the policy team for about four years. Could you tell us a bit about what your day-to-day work looked like?

I was hired to work on cost-effectiveness analyses (CEAs), a method for determining what programs are likely to provide the greatest value for money in a given context. This was shortly after the policy group was formed as a distinct entity within J-PAL, and Rachel Glennerster, the Global Executive Director at the time, really wanted to place an emphasis on cost-effectiveness analysis. I had done some cost-benefit analysis in my undergrad as part of an environmental economics course, so I jumped in. Aside from some work on the J-PAL website and writing many, many evaluation summaries, that was the crux of my work.

Could you tell us a bit about any projects or achievements during your tenure that you were particularly proud of?

There are two that I would call out. The first is the cost-effectiveness studies. We released a methodology paper in 2012, which catalogued all the problems that we had run up against trying to do these analyses and the best solutions we could figure out.

The paper included analysis on about fifteen different education programs and their cost-effectiveness. We were, in some sense, doing primary data collection as best we could to figure out how much it cost to run the programs, and were then thinking through what it meant to be comparing dramatically different programs in terms of a common outcome. The data set later expanded into a comprehensive policy bulletin on getting children into school.
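The common-outcome comparison described here can be sketched in a few lines. This is purely illustrative, with made-up program names and figures rather than J-PAL's actual data or methodology: the idea is simply to express dramatically different programs in terms of the same outcome per dollar spent.

```python
# Hypothetical figures: program -> (total cost in USD, additional years
# of schooling gained). These numbers are invented for illustration.
programs = {
    "information_campaign": (5_000, 200.0),
    "merit_scholarships": (80_000, 640.0),
}

def effect_per_100_usd(cost_usd, effect):
    """Common-outcome metric: units of effect bought by each $100 spent."""
    return effect / cost_usd * 100

# Cost-effectiveness ratio for each program, on the same scale.
ratios = {name: effect_per_100_usd(cost, effect)
          for name, (cost, effect) in programs.items()}

# The program buying the most of the common outcome per $100.
best = max(ratios, key=ratios.get)
```

Putting every program on a single "effect per $100" scale is what makes otherwise incomparable interventions rankable, which is the crux of the analyses described above.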

I’m also particularly proud of the work that I did in the Dominican Republic. In the time I was at J-PAL, studies were emerging in Latin America, including one from the Dominican Republic, showing that giving students information on the returns to education was a very cost-effective way of reducing dropouts. Then, while at a conference for J-PAL, I met the Education Officer for USAID in the Dominican Republic. I had just presented evidence from the Dominican Republic study, and it turned out he was interested in moving a cost-effective program like that forward.

Over two years of conversation, that initial interest evolved into the policy pilot. We began to build relationships with local civil society organizations, private foundations, and the Dominican Republic’s Ministry of Education. Our goal was to develop a scalable version of the intervention, test it rigorously, and then, if it worked, scale it. That took a huge amount of relationship building, but by the time I left for grad school, the program was being implemented in about 400 schools, with plans for a nationwide scale-up after further evaluation.

Could you tell us about some of the challenges of or lessons from developing the pilot in the Dominican Republic?

When we first began conversations in the Dominican Republic, the program didn't really align with the priorities of the Ministry of Education at the time, and they weren't convinced by the mechanism. They also didn't know what it would look like to scale this up. We thus had two tasks before us: achieving a shared understanding of what would make such a program effective, and then helping to solve the practical questions about how it could be implemented.

What served us well was the strong leadership from J-PAL Latin America and the Caribbean, who knew much more about the local context, and from a local education foundation that had a good relationship with the Dominican government. Along with researchers in the Dominican Republic such as Daniel Morales (IDEICE), they were essential in helping us understand the policy dialogue and what it would take to get the government more involved.

That was a really important lesson: One can't just come in and say we have an intervention that is proven to be cost-effective for these outcomes and expect enthusiasm. When doing evidence-based policy work, you need the base of evidence, but you also need the time, space, skills, and relationships to figure out how the evidence fits into the priorities of people locally.

Another important lesson from the experience is that it’s crucial to get your partners invested in the results. To this end, we conducted an evaluation in partnership with the government, including running many surveys through the Census Department and involving the Vice Ministry for Education Evaluation in the study design. From this experience, I recommend thinking ahead of time about who is supposed to use the evidence you're going to produce, and then bringing them in at the inception to collaborate on the design of the research. Then they can tell you directly what questions they need answered, and collaborate in finding the answers.

You were there at the founding of the policy group at J-PAL. Could you tell us about what it was like to be a part of building out the office in that way?

It was around my second or third year at J-PAL when we shifted from producing the public goods and necessary infrastructure for evidence-based policymaking conversations, to really having the conversations with policymakers. Over the course of a year and a half, I and many others wrote about 200 evaluation summaries. We built a new website and came up with a tracking system for projects, so that when someone needed evidence on a topic we could click a button and get summaries of all the studies in our evidence base.

By 2011 or 2012, we had the evidence base in better shape, and we were ready to build the relationships, trust, and understanding of what people needed so we could funnel information to the right places.

A cartoon drawing of Caitlin saying "There are rich data sets just sitting unused and unaccessed in big NGOs"
Caitlin's passion for cost-effectiveness analyses was immortalized in a cartoon during the ELRHA R2HC conference in 2019.

Right now, you're the associate director for the Best Use of Resources at the IRC. Can you tell us about what drew you to the IRC?

While working at J-PAL, I came to IRC to give presentations about how to think of their existing impact evaluations in terms of impacts per cost. In 2015, they launched a five-year strategy that prioritized “best use of resources”, which is another way of saying cost-effectiveness, and I joined them soon after. In my initial role, I worked in-house with the rest of the IRC research team on cost-effectiveness analyses run concurrently with programs (rather than ex post). More broadly, they gave us the mandate to think about what it would take to improve the cost-effectiveness of the IRC as a whole.

What has been the most interesting project you’ve worked on at IRC?

When you think about how evidence is generated in cost-effectiveness studies, you get very precise estimates of the costs and effects of a program in one very specific context. Normally, I can't observe the same program running in the same country ten times through ten cost-effectiveness studies. But within IRC, for example, we have run ten latrine construction programs in Ethiopia in the same two-year period, and I can get monitoring and finance data on all of them. Having this type of data is important because it allows you to look at how the costs of the same program vary in different contexts or with slightly different program designs.

The most interesting project I have been a part of has been unpacking the question of why program costs can vary so much for the same intervention. Using the large amounts of data available from working within an implementing agency, I was able to publish a paper last year on how production functions provide a better framing than “external validity” for examining how costs vary. The next step is to put data into this framework and test hypotheses about what most determines why costs vary. That's going to have a huge influence on the cost-effectiveness measure.
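As a purely illustrative sketch of the kind of within-agency comparison described here (hypothetical sites and figures, not IRC data), one can compute the unit cost of the same intervention at each site and then summarize how much those unit costs disperse across contexts:

```python
from statistics import mean, stdev

# Invented numbers: site -> (total program cost in USD, latrines built).
site_costs = {
    "site_a": (12_000, 100),
    "site_b": (30_000, 200),
    "site_c": (9_000, 90),
}

# Cost per latrine at each site.
unit_costs = {site: cost / units for site, (cost, units) in site_costs.items()}

# Coefficient of variation: dispersion of unit costs relative to their mean,
# a simple summary of how much the "same" program's costs vary by context.
cv = stdev(unit_costs.values()) / mean(unit_costs.values())
```

Monitoring and finance data from many runs of one program make this kind of dispersion measurable at all, which is what a single external cost-effectiveness study cannot provide.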

How has your work or that of the IRC changed in response to Covid-19?

IRC works in 35 countries around the world. As an organization serving vulnerable people, where direct service delivery is the majority of what IRC does, Covid-19 has been a challenge. The main way it has impacted my work is that we have backed off on trying to do the kind of analyses I was just mentioning. Our leadership has been very clear, and rightly so, that our priority is continuity of life-saving services. I’ve been trying to step back and give our country teams the space they need to do whatever they need to be doing.

We have explored how we can support that. For example, we pulled data from the Ebola responses in 2014 and 2017-18 to estimate the costs of supplying different types of facilities with the personal protective equipment they will need. I think that's useful, but the big picture is that this is when our field staff, who are doing amazing work, should shine. Our role is to get them what they need.
