Meghan Mahoney, J-PAL ‘16, is building a career in impact evaluation. What has she learned along the way?
The Alumni Spotlight series highlights J-PAL alumni who are making an impact across industries and around the world. To nominate a J-PAL alum to be featured in a future Alumni Spotlight, please fill out this form.
We are excited to kick off our Alumni Spotlight series with Meghan Mahoney, formerly a Policy Manager at J-PAL Global. After a four-year stint at J-PAL, she left in 2016 to become the Evaluation Director at Educate!, an organization that works to transform secondary education in Africa so that young people are empowered to take leadership initiative, create small businesses, and improve their livelihoods. She joins us to talk about lessons she learned while at J-PAL and to share advice for succeeding at J-PAL and beyond.
Could you tell us a little bit about your background, and how that led to your decision to join J-PAL?
My first job after college was with an economic consulting firm in D.C. working on international trade policy research. During my time there, I grew to understand that while inclusive trade policy is an important tool for poverty alleviation, it was sometimes difficult to trace work at that policy level to short-term improvements in the lives of individuals and households. But after researching various other policy options, it wasn’t clear what the answer was. I was surprised that there was still a dearth of rigorous evidence about which social policies achieved their goal of reducing poverty. This led me to Esther Duflo and Abhijit Banerjee’s work to rigorously test social programs and use that evidence to inform future social policy. It motivated me to pursue graduate studies in development economics and program evaluation, so that I could develop the research and analysis skills to design and test effective social programs, and the writing skills to communicate the findings. When I was looking for my next step, it seemed only natural that I should join J-PAL, and particularly the group within J-PAL charged with making the results of these evaluations accessible to policymakers.
Tell us a little about your time at J-PAL. What was your role, and what was your favorite part about your work?
I started as a Senior Policy Associate in the J-PAL Global Policy Group. I was in that role for two years before becoming a Policy Manager, a position I held for two years. In those four years, I worked in the Youth and Labor Markets sector, served on the Cost Effectiveness Analysis team, and was the liaison to J-PAL’s regional office in Latin America and the Caribbean.
My favorite part of my time at J-PAL was working with program sector chairs to synthesize the literature in the sector, attempting to identify trends and distill lessons from randomized evaluations done in a number of different contexts about what works to improve youth labor market participation. I enjoyed packaging findings into practical policy recommendations that could help organizations like Educate! figure out how best to tackle pressing social issues.
What is a key skill you can learn while at J-PAL that you can apply in future roles?
While quantitative skills were important, especially for my role on the cost effectiveness analysis team, on a day-to-day basis I depended a great deal on project management skills. Being able to see what steps must be taken in a project from beginning to end, as well as anticipating where the bottlenecks would be and how to navigate around them, allowed me to address issues before they came up.
Communication skills are also incredibly important to success. J-PAL works with a wide variety of stakeholders, all with varying levels of technical expertise, and thus the ability to communicate across backgrounds and contexts is essential. Writing skills were a huge part of this on the policy team, where summarizing complex research and studies for a non-technical audience in a clear and concise manner is the bread and butter of what the group does.
Could you tell us a little about your role now?
I am the Director of Monitoring and Evaluation at an organization called Educate!. Educate! works in East Africa, in particular Uganda, Rwanda, and Kenya. Our mission is to help youth develop the skills they need to succeed in today’s economy. We tackle youth unemployment by partnering with schools and governments to reform what they teach and how they teach it, so that students in Africa have the skills to start businesses, get jobs, and drive development in their communities.
As the Director of Monitoring and Evaluation, I oversee the development and execution of Educate!’s monitoring and evaluation strategy. Based on the impact that we would like to have, I work to articulate the organization’s theory of change, and create a plan to test and validate that theory of change through both continuous performance monitoring and rigorous research. This means that I collaborate with colleagues across various design and program implementation functions to determine what data they need to do their jobs better, and figure out how to get it to them.
In retrospect, are there any experiences or lessons learned from your time at J-PAL that you think are applicable in your role today?
We are thinking a lot about systems integration as a pathway to sustainable impact. In order to design a program that governments and their partners can execute, we need to design monitoring systems that can be integrated into their day-to-day work. This is something that J-PAL has made important investments in through the regional offices’ policy work and through the Innovation in Government Initiative (formerly GPI). In order for us to be successful, we need to think about our monitoring and evaluation systems not only as a way for us to get the right information, but as a way for the system actors that we partner with to also get that information continuously. I think the work J-PAL has done and is doing to set up collaborations with governments and think about how to do this effectively is really important.
My experience piecing together evidence from different sectors and contexts has been really helpful in my role today. There is a lot of evidence out there, thanks to the great work that has been done by J-PAL affiliates, IPA, and large international organizations such as the World Bank. But you can’t always draw direct parallels between the available evidence and the specific context that we work in. This means that I often have to look at similar programs from other contexts or sectors and think about whether or not we can use that evidence in Educate!’s work. My time at J-PAL really helped me hone that skill.
What advice would you give to those who are interested in pursuing a career in impact evaluation and evidence-based policymaking?
Don’t have tunnel vision. Having a solid grounding in economic methods and theory is very important, but don’t discount the importance of developing a thorough understanding of other types of evaluations beyond RCTs. While J-PAL and its affiliates have a comparative advantage in running randomized evaluations, as a member of the organization, you also need to know what other M&E strategies are out there and why and when to use them. If you’re planning on launching a career in impact evaluation, this knowledge will serve you well. It will give you credibility when working with partners, and ensure that you’re helping those who truly need evidence to make better decisions by getting them the information they need. I think this is particularly helpful when you interact frequently with policymakers, program designers, and implementers.
And if you work at J-PAL, enjoy your time there! J-PAL is a special place to work, with some of the smartest people I’ve ever had the pleasure of working with. And organizationally, its commitment to expanding the evidence base, giving evidence a seat at the table, and understanding the broader lessons from the literature, is what makes J-PAL important. Even after working there, I still turn to J-PAL’s evaluation database and policy publications for information about what has been tested before and how we can use that to improve our programming.