Michael Eddy, J-PAL ‘10, on using evidence to support high-impact organizations
The Alumni Spotlight series highlights J-PAL alumni who are making an impact across industries and around the world. To nominate a J-PAL alum to be featured in a future Alumni Spotlight, please fill out this form.
We caught up with Michael Eddy, formerly of the research teams at J-PAL South Asia and J-PAL Latin America and the Caribbean (LAC). Michael supported several seminal J-PAL health-related evaluations while a research assistant in Udaipur, India, and then traveled to Chile to play a central role in establishing J-PAL’s nascent Latin America office. After an exciting early career in results-based financing and social impact investing, Michael advises major foundations, aid organizations, and social enterprises on evidence use. He reflects on his lessons learned at J-PAL and beyond in this latest Alumni Spotlight.
What first inspired you to build a career in development and public policy?
I see development as a justice issue—every life has equal value. And yet not every life has equal opportunities. Where I grew up, I didn't have to worry about water being polluted or breathing air that slowly brought on asthma and chronic disease. Higher education was an expectation, not an exception. The lottery of where we were born ought not to determine our outcomes in life. This has shaped my perspective and led me to focus on expanding opportunities for those who otherwise may not have them.
Early on, I was striving for a toolkit to contribute in a concrete way to addressing development challenges. I was an undergraduate around the time when microcredit was receiving a lot of attention. Some people were promoting it as a way to eradicate poverty.
I was working in a refugee camp where NGOs, driven by donor interest, were offering microcredit to refugees. Yet these refugees lacked the secure property rights and access to markets needed to pay back these loans. Did the loans do more harm than good? This inspired me to write my undergraduate thesis on the lack of evidence around microcredit’s effect on poverty and to propose solutions for filling the evidence gap, rather than just critiquing it.
You started off at J-PAL South Asia as a research assistant, and then you became a research manager at J-PAL LAC. Could you tell us a bit about these roles?
I joined J-PAL right after college and started working on a number of health and education projects in the city of Udaipur in Rajasthan, India. I was very fortunate to be working under Esther Duflo and Abhijit Banerjee, who were wonderful at helping me to build a rigorous analytical toolkit, while also having the opportunity to get into the weeds on how to run a field operation. I worked on a number of RCTs in India, including the endline surveys associated with the incentives for immunizations evaluation, some teacher and nurse monitoring projects, and an iron fortification project.
After that, I moved to Chile to help set up J-PAL LAC. On my first day on the job, I walked into an empty, furniture-less office at PUC Chile and met Ryan Cooper, the executive director at the time, and Pancho Gallego, who is still a scientific director of J-PAL LAC. I worked with them on introducing J-PAL to Chile and developing our Executive Education course for policymakers and academics, and also on the smaller, yet important, things like building an onboarding manual for research assistants. It was an honor to be part of that journey and to work with those phenomenal people. I’m so excited to see how far the J-PAL LAC team has come since those early days.
Much of your career has focused on results-based financing and investing for social impact, leading you to co-found Instiglio. What drew you to these areas of work?
When I was in graduate school, I grew increasingly frustrated by this huge gap between the evidence base and what was actually being implemented in policy. For example, the immunization evaluation in Udaipur I mentioned earlier showed that very small incentives can have an outsized impact on immunization rates. And yet, despite the fact that over 10 million children a year miss out on life-saving vaccines, evidence was barely being used outside of research contexts.
While we generate more evidence, we also need to be using the evidence we already have. In the private market, successful companies generally scale because there’s an incentive to deploy capital to companies that can generate profits. But in the public sector, there’s no similar mechanism to ensure that money seeking outcomes gets to programs that can generate outcomes. As a result, markets for social outcomes are incredibly inefficient.
That motivated me to start thinking about this challenge at a more systemic level—not just generating new evidence around a specific program, but about the systems that fund (or don’t fund) programs in general. When you look at this, you realize that most funding is against a rigid set of inputs or activities. What if funding were directly tied to outcomes on people’s lives?
Avnish Gungadurdoss, Michael Belinsky, and I co-founded Instiglio to improve the effectiveness of social services by tying funding directly to measurable results. Instiglio made its name by launching the first impact bond in a developing country. Impact bonds are a new tool in the impact investing toolkit and are one way that some impact investors are improving their measurement of impact.
But Instiglio does far more than impact bonds—it works on a broad portfolio of results-based financing instruments with governments, the World Bank, and large donors. When you tie funding to measurable results, suddenly there’s a return on investing in the type of operational research, performance management, and use of rigorous evidence that is necessary to deliver outcomes at scale.
I learned a lot in launching Instiglio, mostly by making a lot of mistakes and coming back from them. It was tremendously intellectually challenging, but also challenging in terms of the soft skills needed for building a team and an organization. I’m incredibly thankful to Instiglio’s early team, which had the patience to bear through those first few years.
What skills or experiences gained from working at J-PAL have you found useful in your own career?
I've always been at the intersection of research and policy. The core skill set of understanding a counterfactual, knowing how to read a piece of evidence, and understanding its strengths and weaknesses has been critical.
But counterfactual thinking is also a mindset. Even if you’re not always able to measure the counterfactual, the training in counterfactual reasoning I got at J-PAL has been broadly helpful in a wide variety of decision-making environments. Counterfactuals are everywhere!
In addition, employing the scientific method of testing out ideas and iterating in a data-driven way has been a common thread throughout my career. We tried to do that in small ways at Instiglio, experimenting with different products and services.
At the Global Innovation Fund, we invested in both nonprofits and for-profit ventures that benefit the poor. Experimentation can look a bit different in for-profits, but you still need to think rigorously about what will drive both profitability and social impact.
What advice do you have for development professionals interested in converting their own ideas and passions into a social venture?
Starting a new venture is not an easy endeavor. It’s important to find a problem that motivates you on a very personal level, because you're going to have to draw on that motivation to get through a lot of ups, downs, and curves along the way as you iterate toward the right product-market fit.
It’s also a matter of surrounding yourself with people who know more than you, who can hold you accountable but can also be a support network as you go through the rollercoaster of building a new venture.
I've been really fortunate to draw on phenomenal peer networks both at J-PAL and beyond that helped me through these challenges. I still keep in touch with many of the people from my original research assistant cohort and the broader J-PAL alumni community. The friends and colleagues I’ve met at J-PAL have been truly wonderful resources throughout my career.
Caitlin Tulloch joined J-PAL Global in 2009, where she played a central role in building the infrastructure for our policy efforts, establishing our approach to cost-effectiveness analysis, and building partnerships with policymakers. Now the associate director of Best Use of Resources at the International Rescue Committee (IRC), her work remains centered on improving cost-effectiveness analyses. She joined us to talk about her time at J-PAL and her work at the IRC.
Could you describe your background and what drew you to the field of development?
I am an economist by training. When I try to explain how I came to this field, I usually think about my dad, who has an MBA and is a small business owner. He started teaching me about things like compound interest when I was ten years old. My mom is a primary school teacher and has been a political activist for much of her life. When you take the average of those two, you get an economist who works in development!
I studied political science and economics in college, and I knew I wanted to focus on poverty alleviation. I ended up working for Emily Oster, now an economist at Brown University, who was doing development and health research at the time. That was my introduction to the field.
Once you joined J-PAL, you were on the policy team for about four years. Could you tell us a bit about what your day-to-day work looked like?
I was hired to work on cost-effectiveness analyses (CEAs), a method for determining what programs are likely to provide the greatest value for money in a given context. This was shortly after the policy group was formed as a distinct entity within J-PAL, and Rachel Glennerster, the Global Executive Director at the time, really wanted to place an emphasis on cost-effectiveness analysis. I had done some cost-benefit analysis in my undergrad as part of an environmental economics course, so I jumped in. Aside from some work on the J-PAL website and writing many, many evaluation summaries, that was the crux of my work.
Could you tell us a bit about any projects or achievements during your tenure that you were particularly proud of?
There are two that I would call out. The first is the cost-effectiveness studies. We released a methodology paper in 2012, which catalogued all the problems that we had run up against trying to do these analyses and the best solutions we could figure out.
The paper included analysis on about fifteen different education programs and their cost-effectiveness. We were, in some sense, doing primary data collection as best we could to figure out how much it cost to run the programs, and were then thinking through what it meant to be comparing dramatically different programs in terms of a common outcome. The data set later expanded into a comprehensive policy bulletin on getting children into school.
I’m also particularly proud of the work that I did in the Dominican Republic. In the time I was at J-PAL, studies were emerging in Latin America, including one from the Dominican Republic, showing that giving students information on the returns to education was a very cost-effective way of reducing dropouts. Then, while at a conference for J-PAL, I met the Education Officer for USAID in the Dominican Republic. I had just presented evidence from the Dominican Republic study, and it turned out he was interested in moving a cost-effective program like that forward.
Over two years of conversation, that initial interest evolved into the policy pilot. We began to build relationships with local civil society organizations, private foundations, and the Dominican Republic’s Ministry of Education. Our goal was to develop a scalable version of the intervention, test it rigorously, and then if it worked, scale it. That took a huge amount of relationship building, but by the time I left for grad school, the program was being implemented in about 400 schools, with plans for a nationwide scale up after undergoing further evaluation.
Could you tell us about some of the challenges of or lessons from developing the pilot in the Dominican Republic?
When we first began conversations in the Dominican Republic, the program didn't really align with the priorities of the Ministry of Education at the time, and they weren't convinced by the mechanism. They also didn't know what it would look like to scale this up. We thus had two tasks before us: achieving a shared understanding of what would make such a program effective, and then helping to solve the practical questions about how it could be implemented.
What served us well was the strong leadership from J-PAL Latin America and the Caribbean, who knew much more about the local context, and from a local education foundation that had a good relationship with the Dominican government. They were essential in helping us understand the policy dialogue and what it would take to get the government more involved, along with researchers in the DR like Daniel Morales (IDEICE) and others.
That was a really important lesson: One can't just come in and say we have an intervention that is proven to be cost-effective for these outcomes and expect enthusiasm. When doing evidence-based policy work, you need the base of evidence, but you also need the time, space, skills, and relationships to figure out how the evidence fits into the priorities of people locally.
Another important lesson from the experience is that it’s crucial to get your partners invested in the results. To this end, we conducted an evaluation in partnership with the government, including running many surveys through the Census Department and involving the Vice Ministry for Education Evaluation in the study design. From this experience, I recommend thinking ahead of time about who is supposed to use the evidence you're going to produce, and then bringing them in at the inception to collaborate on the design of the research. Then they can tell you directly what questions they need answered to be convinced, and collaborate in finding the answers.
You were there at the founding of the policy group at J-PAL. Could you tell us about what it was like to be a part of building out the office in that way?
It was around my second or third year at J-PAL when we shifted from producing the public goods and necessary infrastructure for evidence-based policymaking conversations, to really having the conversations with policymakers. Over the course of a year and a half, I and many others wrote about 200 evaluation summaries. We built a new website and came up with a tracking system for projects, so that when someone needed evidence on a topic we could click a button and get summaries of all the studies in our evidence base.
By 2011 or 2012, we had the evidence base in better shape and we were ready to build the relationships, trust, and understanding of what people needed to funnel information to the right places.
Right now, you're the associate director for the Best Use of Resources at the IRC. Can you tell us about what drew you to the IRC?
While working at J-PAL, I came to the IRC to give presentations about how to think of their existing impact evaluations in terms of impacts per cost. In 2015, they launched a five-year strategy that prioritized “best use of resources” (another way of saying cost-effectiveness), and I joined them soon after. In my initial role, I worked in-house with the rest of the IRC research team on concurrent (rather than ex-post) cost-effectiveness analyses. More broadly, we were given the mandate to think about what it would take to improve the cost-effectiveness of the IRC as a whole.
What has been the most interesting project you’ve worked on at IRC?
When you think about how evidence is generated in cost-effectiveness studies, you get very precise estimates of the costs and effects of a program in one very specific context. Normally, I can't observe the same program running in the same country ten times through ten cost-effectiveness studies. But within IRC, for example, we have run ten latrine construction programs in Ethiopia in the same two-year period, and I can get monitoring and finance data on all of them. Having this type of data is important because it allows you to look at how the costs of the same program vary in different contexts or with slightly different program designs.
The most interesting project I have been a part of has been unpacking the question of why program costs can vary so much for the same intervention. Using the large amounts of data available from working within an implementing agency, I was able to publish a paper last year on how production functions provide a better framing than “external validity” for examining how costs vary. Let's throw some data into this framework and test some hypotheses to find out the most important things determining why costs vary. That's going to have a huge influence on the cost-effectiveness measure.
How has your work or that of the IRC changed in response to Covid-19?
IRC works in 35 countries around the world. As an organization serving vulnerable people, where direct service delivery is the majority of what IRC does, Covid-19 has been a challenge. The main way it has impacted my work is that we have backed off on trying to do the kind of analyses I was just mentioning. Our leadership has been very clear, and rightly so, that our priority is continuity of life-saving services. I’ve been trying to step back and give our country teams the space they need to do whatever they need to be doing.
We have explored how we can support that. For example, we pulled data from the Ebola responses in 2014 and 2017–18 to estimate the costs of equipping different types of facilities with the personal protective equipment they would need. I think that's useful, but the big picture is that this is when our field staff, who are doing amazing work, should shine. Our role is to get them what they need.
Patrya Pratama joined J-PAL Southeast Asia as a senior research associate in 2014, where he laid the groundwork for an intervention aimed at increasing enrollment in public health insurance among informal workers, and trained policymakers in program evaluation fundamentals. After working as a policy advisor in both national and local government in Indonesia, he founded and now leads the education-focused INSPIRASI Foundation. He joined us to reflect on lessons learned over the course of his public policy career and chat about his work at INSPIRASI.
What drew you to the field of public policy?
It began with my very first job at Gerakan Indonesia Mengajar (Indonesia Teaching Movement), which is essentially like the Indonesian version of Teach for America. I was placed in a village of East Kalimantan province on the island of Borneo—a pretty remote area—where I taught for a little over a year.
At the school where I worked, teachers were in short supply, the infrastructure wasn't great, parental engagement with student learning was minimal, and local leadership seemingly couldn't care less about education. That experience grew my interest in public policy: I knew that those conditions must have stemmed from decisions made at the local government level, the municipal level, or even the central government level.
I didn’t go into policy immediately after that experience. I took a job as a communications consultant for the European Union office in Indonesia, but my work for them turned out to be closely tied to their development programs. After that, I decided to pursue my master's in public policy.
You worked at J-PAL as a senior research associate on a public health intervention for informal workers. Can you tell us about your role?
I worked with [J-PAL SEA Co-Scientific Directors] Rema Hanna and Ben Olken on the pre-pilot of an evaluation of an intervention aimed at increasing voluntary enrollment in Indonesia’s national health insurance program among informal workers.
That was my very first time doing the design work, implementation planning, and then actual field work with the enumerators themselves. It was kind of overwhelming at first. I quickly learned that it’s one thing to design everything on paper, looking at what has worked in the past and what the principal investigators (PIs) think might work, but it's completely different when you’re on the ground and see what's actually working and what the beneficiaries think of the intervention. And then you have to serve as the bridge between what’s happening on the ground and the PIs, helping them to effectively communicate learnings to our government partners.
This communication element is really crucial: more often than not, implementation is the policymaker’s blind spot. They may have a concept and a plan, but they don't really know what's actually happening on the ground. At the same time, if you communicate about the intervention and evaluation in ways that are too complicated or that miss the big points, it doesn't really mean much.
Research roles at J-PAL often have a component of training others in government and NGOs in impact evaluation methodology. What are some interesting takeaways from your experience leading evaluation trainings?
Through doing these trainings, I learned that what tends to be of most value for government participants is simply providing them with the tools for understanding how to think about a program’s effectiveness. This was often as simple as thinking about input activities, output, and then impact. Many of them have heard of this, but have never really looked at their programs in that way before. The training is a good opportunity for them to look at their work from a more critical perspective.
That’s why in my trainings we talked a bit about randomized evaluations, so they have a general idea, but the majority of the training is about basic evaluation. What is interesting for us is that after the training, participants are often left with more questions than answers because now they question a lot of the components of the program that they're running.
As the founder and executive director of the INSPIRASI Foundation, could you tell us a bit about your organization and what your motivation was for launching it?
The INSPIRASI Foundation is a nonprofit focusing on school leadership—specifically school principals in Indonesia. We’re trying to find an alternative model of effective professional development programming for principals across Indonesia.
In the majority of Indonesian schools, principals are not well equipped to lead learning in their own schools. We call this the learning crisis: where kids are in school, but they don't really learn. One reason behind the learning crisis may be that teachers don't teach well, but another may be that the institution itself is not really geared towards learning. We are testing out interventions that can help principals better guide learning in their schools.
I founded INSPIRASI two and a half years ago, and we now have a team of eight. We are already in the second pilot of our program, working with 83 school leaders in Indonesia. We’re hoping to wrap up the pilot next year, and then decide how we could scale it up with the government.
I’d say we are somewhat unique in Indonesia’s education nonprofit space because we focus on system-level change: not only do we pilot our intervention on the ground, but we also make sure the local structures and related government agencies are able to adopt the program and develop the capacity to manage it sustainably. That means the full picture of the results may only emerge over the long term. We don’t focus on reach so much as on whether, at the end of the program, student learning really improved, and whether the local agency has enough capacity to take over after our direct involvement ends.
It sounds like every day is a learning experience. How have your experiences working in government or at J-PAL helped you with launching this foundation?
Every role in my past has come into play now. For example, we want the government to be involved in the planning part from the earliest stages, because they need to have a sense of ownership of the program. This is especially important in the Indonesian context, where education is very decentralized, and the success of a program really falls into the hands of whether the local government has an interest in its success. My experience working in government has helped me to identify and engage the right people within the government.
My time at J-PAL taught me to pay attention to the nitty gritty of the implementation side. For example, education interventions often involve some sort of training, and trainers have a tendency to do whatever they can to help the trainees perform well, which often goes outside the initial design of the intervention. For the sake of properly evaluating the impact of the program, I’ve learned to really ensure that trainers stick to the designed training.
Having worked across a wide spectrum of policy roles, from research to politics to nonprofit administration, what advice would you give to those who are interested in pursuing a career in public policy?
At the end of the day, we’re in the development sector, so what matters is whether or not the thing you’re doing has an impact on the ground—are you actually helping the people you’re trying to help? It sounds pretty simple, but in my role now, for example, if you lose your focus on that objective it can be easy to be swayed by other factors.
One such factor is, of course, the political implications of your evaluations. For example, if the program doesn't work and you’re afraid of angering your government partner, it can be tempting not to voice that concern. But however you do it, the message that change is needed must be shared. You just have to figure out a way of getting it across.
We also ought to keep trying to connect the many different stakeholders in public policy: government, researchers, political groups, nonprofits, funding/donor organizations, and the public or beneficiaries themselves. These groups often think and operate in silos, so those who pursue careers in public policy need to be the connector.
Finally, I would say keep monitoring and evaluation top of mind to continue learning and improving our program or policy for our beneficiaries. It is indeed easier said than done, but definitely worth it.
In the next installment of our Alumni Spotlight series, we speak with Tanya Sethi, a former senior policy associate at J-PAL South Asia. Tanya joined J-PAL South Asia’s policy team after receiving a master’s degree in public policy from the Paris School of Economics. At J-PAL South Asia, she played a key role in an early partnership with the Government of Tamil Nadu and helped lay the conceptual groundwork for important program scale-ups. Now a senior policy analyst at AidData, she reflects on her path to international development, the challenges associated with bridging the gap between evidence and policy, and the dynamic nature of policy research at AidData.
Tell us a bit about your background. What drew you to the field of development?
While I was interested in development issues throughout my higher education, it was my first job as an assistant editor for the Economic and Political Weekly, an Indian social sciences journal, that exposed me to a wide range of development policy issues. Some of the most exciting debates that were happening in India at that point in time—from how to measure poverty to assessing the impact of flagship government programs—were happening on the pages of this particular journal. So even though my academic training was in economics, through this journal I also had the opportunity to look at issues from historical, sociological, and political perspectives. The possibility of seeing issues from these multiple perspectives and arriving at solutions further advanced my interest in the field of development.
That was one of the reasons I decided to pursue a second master's degree in public policy. I wanted to go broader than economics and look at development issues through a multi-disciplinary lens. My master’s program strengthened my quantitative skills, specifically in the area of impact evaluation, and got me excited about the applied side of economics. While my program was about developing a rigorous approach to impact evaluation, it also made me aware that not every question or context lends itself to the same kind of rigor.
How did you first come across J-PAL, and what were some of your main responsibilities on the policy team?
I first read about J-PAL when I was working at the Economic and Political Weekly in Mumbai, but it was only during my master's program that I developed a good understanding of the organization and its methods. It’s one thing to read about impact evaluations in various settings and contexts, and another to see how these play out in reality when you're working with an implementing partner, such as an NGO or a government who you're hoping will take action based on the results of these evaluations. I wanted to witness first-hand how evaluations enter the realm of policy, which is what drew me to apply to J-PAL.
At the time that I joined, the policy team at J-PAL South Asia was focused on documenting some of the programs that had shown impact in the region, including Targeting the Ultra-Poor and Teaching at the Right Level. These approaches had been tested in different contexts with slight tweaks in implementation to fit the contextual realities—my job was, in part, to document what these tweaks looked like in the hope of creating a more generalizable approach that could then be scaled up in different contexts.
At that time, J-PAL had also just launched a five-year partnership with the state of Tamil Nadu. The policy team was quite nascent then, and this partnership was geared towards generating demand for impact evaluations across multiple government departments. The unique aspect was that we used the most pressing challenges of various line ministries as the starting point to jointly develop solutions in collaboration with J-PAL researchers.
I led the partnership with Tamil Nadu’s health department, which involved serving as a bridge between government officials and J-PAL affiliates who were interested in partnering with governments to test interventions addressing health problems. This was challenging because the incentive structures on the two sides could be quite different. Government officials tend to care most about cost-effectiveness and the potential to scale up an intervention, while academics are often driven by testing innovative approaches, the results of which have a higher likelihood of being published in prestigious journals. In those cases, it was important to listen and to appreciate the constraints and political realities that government officials faced. I enjoyed playing that liaising role and bringing the two sides together to flesh out interventions that would be mutually agreeable and testable.
I also learned that it’s important to engage at different levels of government, whether the state, local, or district level, because their understanding of bottlenecks for certain issues may be very different, and it is very important—though often challenging—to reconcile these perspectives.
You moved on to AidData, a research lab at the College of William and Mary, after J-PAL. Tell us a little bit about what you’re working on now.
I have been with AidData for a little over four years now, where I work as a Senior Policy Analyst on the policy team. In this capacity, I’ve consulted for organizations such as the US Department of State, USAID, the German Ministry for Economic Cooperation and Development, and the Hewlett and Gates Foundations on issues of aid effectiveness, foreign policy, development finance, and evidence use. I play the dual role of leading research and translating its insights into policy recommendations that can inform these organizations’ strategies.
I am currently leading the design of a survey to capture the perceptions of policymakers and practitioners in 140 low- and middle-income countries regarding donor performance. In an upcoming study in collaboration with the German Institute for Development Evaluation (DEval), we look at whether donor adherence to internationally recognized aid effectiveness principles (such as not tying aid, or aligning aid with recipient country budgets) makes donors appear more helpful in the eyes of their in-country counterparts.
What advice would you give to those who are interested in exploring a career at the intersection of research and policy?
What I've realized over time is that we often have a simple, linear model of how research influences policy: I'm going to produce amazing research, present it to a government official or an organization, and they're going to understand it and use it to introduce or change policies. In my view, credible research is a necessary (though insufficient) starting point, but the pathway to actually influencing policy can be quite messy. My advice, especially to those early in their careers who are ambitious about creating change, is to have a lot of patience, since there are no quick wins in this space.
Three lessons that I’ve learned in my career thus far: (1) translating research into policy requires a deep understanding of political realities and how to navigate them, (2) listening to the other side and being flexible in changing your approach can go a long way in ensuring take-up of results, and (3) whether working with a government or another organization, it pays to build relationships and trust at various levels to ensure continuity of engagement over the long term.