Strengthening randomized evaluations with qualitative research: Baby’s First Years mothers' experiences
In part five of our blog series on incorporating qualitative research into randomized evaluations, we feature the perspectives of researchers conducting the Baby’s First Years (BFY) study. BFY is a J-PAL-funded research project assessing the impact of poverty reduction on family life and on infants’ and toddlers’ cognitive, emotional, and brain development. In this post, the researchers reflect on the value of qualitative research both in providing a deeper understanding of each participant’s background and context and in painting a fuller picture of mothers’ experiences in the study.
Randomized evaluations are great for telling us the answers to the questions we ask; they are less useful for helping us answer questions we didn’t know to ask.
As researchers from the BFY study, we recognized that survey and debit card transaction data would show how mothers spent their monthly cash gifts, but not why they did so—that is, what meaning and motivation lay behind spending decisions and what power dynamics or negotiations within families may surround these decisions. Therefore, we pursued a multi-method approach to understanding the impact of monthly unconditional cash transfers distributed to families with low incomes for the first several years of their focal child’s life.
In 2018 and 2019, the BFY study recruited 1,000 mothers in four US cities whose household incomes were below the federal poverty line when they gave birth. Mothers agreed to be randomly assigned to receive a large (US$333) or small (US$20) cash gift on a debit card each month until their child was around four years old.
Our qualitative companion study, BFY: Mothers’ Voices, invited a stratified random sample of eighty mothers to participate in repeated, in-depth qualitative interviews each year for the duration of the study. Both BFY and BFY: Mothers’ Voices are currently in the field, surveying and interviewing mothers by phone to find out how they and their now-three-year-old children are doing.
Our qualitative study offers two distinct contributions beyond typical survey data and related methods for understanding the impacts of the cash transfers. First, the qualitative approach starts from a position that we, as researchers, do not necessarily know all the important questions to ask or the response categories that make sense. So, while in BFY: Mothers’ Voices we use an interview guide to think through topics and ways to word questions, we also ask open-ended and follow-up questions to allow mothers to lead us where they think it is important to go. If we ask about the joys of motherhood, and a mom shares a story about her children playing together, or she starts talking about a recent stressful event, each answer is equally welcome and insightful.
Qualitative research opens the door to a fuller appraisal and exploration of mothers’ experiences that consider their dreams and fears in ways that are not pre-supposed to fit into validated scales. There are some things we simply cannot make sense of if we only ask our own questions, provide our own answer options, and, therefore, stay rooted in our own perspectives and experiences.
Nina, a 26-year-old, Black, high-gift-group mother of four from New Orleans, described using the first installment of the BFY money shortly after she gave birth (as is standard, we use pseudonyms to protect mothers’ identities). “I could have just cried because it was a total relief. Because first of all we went in the hospital flat broke. We was flat broke in the hospital.” You can hear not just Nina’s relief but also her joy: “We got food, a lot of food. We put food in the house. I even went and got the kids a gift. That's how happy I was. I was like, ‘Let me get the children something.’ So, I even got them a toy at the store. I got some cleaning supplies to make sure it was really sanitary for when I brought [my daughter] home.” Multiple mothers described wanting to bring their babies home from the hospital to a clean house—these are moments rich with the symbolism of a fresh start.
Our open-ended interviews give us a chance to capture cash uses we didn’t anticipate. Because money is a distinct intervention that can be used in a nearly infinite number of ways, it’s simply not possible to ask about them all on a survey. As a simple example, the BFY survey queried mothers about purchasing common goods for infants, such as highchairs and car seats. But, as with Nina, in the interviews we learned about purchases mothers made intentionally to benefit their children—such as cleaning supplies—that we did not know to ask about as child-related items. This changes our conceptualization of what counts as a “child-related purchase.”
The qualitative approach also provides a rich understanding of the background and context for each person. This contextualized approach may be particularly important in studying a topic like unconditional cash transfers, since mothers’ financial allocation decisions are likely embedded in the complex context of their childhood experiences, parenting values, family material circumstances, dreams for their children’s futures, and conceptualization of the cash transfer and what they see as its “proper” use.
When we center this complex context in our analyses, we may understand socioeconomic behavior differently. Following the typical frameworks of economics, buying a fast-food kid’s meal when you’re on a tight budget might make little sense, since you don’t have the money to spare and the burger and treat will not be an investment in developing your child’s human capital. To a mom who is scraping by, the skip in her child’s step and the smile on her child’s face as they leave the restaurant may be great reasons for such a financial allocation decision. And her motivation for doing so could be found in her experiences as a child, either relishing similar memories of her own or wishing to give her children more happy moments than what dominated her upbringing. We could learn how the deprivation parents want their children to avoid may extend beyond the food, clothing, and shelter parameters that often constitute material hardship in research to include social and emotional experiences as well.
Scholars’ existing measures of financial disadvantage don’t typically capture the small, daily expenses that can also be central to helping parents feel like they are doing right by their children. The qualitative work in BFY allows our team to think differently about how the cash gift affects allocations that support children—expanding the important but narrow expenditure types captured in the survey to include popsicles that offer a toddler joy or cleaning supplies that help a mom feel she is providing a safe place for her baby. These may turn out to be important ingredients in child investment, with ripple effects for children’s development.
Part one of this blog series highlights the value of incorporating qualitative methods into randomized evaluations and outlines specific tips for researchers. Part two talks about how qualitative research helped motivate and shape the central question and hypothesis for a study on racial concordance between physicians and patients. Part three looks at how the Creating Moves to Opportunity randomized evaluation embedded qualitative research methods into its study design. Part four discusses how qualitative research helped the Oregon Health Insurance Experiment research team make sense of some of the study’s results.
For the final part of our blog series on incorporating qualitative research into randomized evaluations, we spoke with Associate Professor of Social Work and co-author of the Oregon Health Insurance experiment, Heidi Allen, about how in-depth interviews with study participants helped the research team interpret some of the study’s results.
In 2008, Oregon held a lottery to select an additional group of low-income, uninsured adults into its Medicaid program. Around 90,000 applied for 10,000 openings, providing an opportunity for J-PAL North America Scientific Co-Chair Amy Finkelstein, J-PAL affiliated professor Kate Baicker, Allen, and coauthors to conduct a randomized evaluation to understand the impact of providing health insurance to the uninsured.
The study found that participants who received Medicaid coverage experienced increased health care use, reduced financial strain, and improved mental health and self-reported health. To better understand the causal mechanisms behind these quantitative results, a staff of interviewers led by Allen conducted 120 structured interviews with study participants who recently gained Medicaid coverage.
The interviews were especially useful in helping researchers understand the quantitative results that were more unexpected. For example, many in the broader health policy community believed that providing health insurance to low-income individuals would result in a decrease in emergency department use. However, the results of the study showed that Medicaid coverage actually increased the use of health-care services across the board, including the emergency department. Through the interviews with participants, researchers started to uncover some of the potential factors contributing to the sustained increase in emergency department use.
“There were several things driving this increase. First, we learned that there were many individuals who should have been seeking care but were not. Some of these individuals described themselves as healthy despite having serious, chronic health problems, or were generally hesitant to seek care due to cost concerns. Receiving health insurance removed a financial access barrier that may have prevented them from going to the emergency department in the past. On the other hand, we also started hearing how some primary care providers actually told their patients to go to the emergency department. One study participant shared that her primary care provider directed her to the ER since her blood sugar was dangerously high. All these stories helped us understand why we saw an increase in emergency department use.”
According to Allen, another finding that was surprising was that many study participants believed they had emergency-only coverage. Initially, this made little sense to the researchers since the coverage study participants received was a comprehensive health insurance plan. The researchers decided to probe a little deeper to understand the root cause of the confusion.
“We learned that one of the very first things many people did when they got their insurance card was call a dentist and say, ‘I just got the Oregon Health Plan, can I make an appointment?’ And the receptionist at the dental clinic would tell them that the Oregon Health Plan was emergency-only coverage, which is the case for dental care. But hearing this led many study participants to believe that their insurance only covered emergency care for health services too. I don't know that this drove any large effects in the study, but it was really interesting to hear that there was an unexpected kind of administrative complexity that I wouldn't have guessed prior to actually talking to people.”
Study participants also reported significant improvements in health. However, these findings were not coupled with clinically significant improvements in objective health measures like hypertension, diabetes, obesity, and behaviors like smoking. This posed another puzzle for researchers, but some potential explanations were found in participants’ stories.
“Initially, we thought when participants received health insurance, they’d gain access to a provider who could diagnose their health issues, and with this diagnostic information, they could start treatment that led to improved health outcomes. The interviews helped us understand that the pathway of receiving health insurance to improved health wasn’t as simple as we had thought. Having a good patient-provider relationship is a key factor in whether a patient’s health improves. For many participants, it took seeing multiple providers before they found someone they really liked working with. Once participants established a good relationship with a provider, they often prioritized what they were going to work on. And they might have more immediate needs like a broken ankle that needs surgery, so they may not address their diabetes or their obesity or smoking first.”
For Allen, conducting this qualitative follow-up research was critically important to making meaning out of the quantitative results, especially the findings that were surprising. These conversations with patients helped researchers understand where Medicaid was working for people and where people ran into barriers.
“The Oregon study constantly drives me to think about how we can make Medicaid work better for people, how we can improve the places where we lose efficiency and effectiveness in this causal chain from getting coverage to having positive health outcomes. There are probably multiple points where we could make slight modifications and see meaningful improvements. I think what's been helpful about our qualitative work is that it provides a pathway to think through real people's experience, rather than just partisan expectations of the experience. The interviews also drove home the humanity behind the numbers which isn’t to be taken lightly.”
Part one of this five-part blog series highlights the value of incorporating qualitative methods into randomized evaluations and outlines specific tips for researchers. Part two talks about how qualitative research helped motivate and shape the central question and hypothesis for a study on racial concordance between physicians and patients. Part three looks at how the Creating Moves to Opportunity randomized evaluation embedded qualitative research methods into its study design. Part five highlights the value of qualitative research in providing a deeper understanding of mothers' experiences in the Baby's First Years study.
In part three of our qualitative research blog series on incorporating qualitative research into randomized evaluations, we learn more about how researchers conducting the Creating Moves to Opportunity (CMTO) project embedded qualitative research methods into their study and what factors made conducting high-quality, interdisciplinary research feasible.
Since 2018, MDRC, J-PAL affiliates Peter Bergman, Raj Chetty, Nathaniel Hendren, Lawrence Katz, and Christopher Palmer, along with sociologist and qualitative research expert Stefanie DeLuca, have been conducting a randomized evaluation of Creating Moves to Opportunity, a housing mobility program in Seattle and King County, Washington. The study, conducted with approximately 1,300 families, aims to understand CMTO’s impact on helping families move to neighborhoods with lower rates of poverty and more opportunities for upward income mobility.
In Phase One of the study, qualified low-income families with at least one child under fifteen were drawn from the Housing Choice Vouchers waitlist and offered an opportunity to enroll in the study. Participants were then randomly selected to receive CMTO services—including customized search and landlord engagement assistance from family and housing navigators and short-term financial assistance—or receive the housing authorities’ standard services.
Results from Phase One of the study found that CMTO-participating families were more likely to move to higher-opportunity neighborhoods than families who only received standard services. To help interpret these results and understand participating families’ experiences, the research team conducted qualitative analyses to complement the quantitative results. Led by DeLuca, a team of research staff carried out in-depth interviews with 161 participating families (from both the treatment and control groups) using an approach that borrowed some elements typically used in quantitative research to ensure the collection of high-quality, representative data.
“Using the program administrative data, we pulled a stratified random sample of families participating in CMTO to interview and achieved an 80 percent response rate. Those are two relatively unusual things to do in interview studies, but it helped ensure that the particular patterns that emerged from the data were representative of the families in the program.”
According to DeLuca, having a research team with the capacity and flexibility to follow up and be on-site when necessary was critical to the success of the qualitative research component of CMTO.
“Achieving an 80 percent response rate took more than just well-spaced regular phone calls. It required a research team with the capacity to be on-site and commit to really connecting with the study participants, especially with the door-knocking component of recruitment. It makes a huge difference for people to see you as interested in their stories, not as telemarketers. With interviews taking two to four hours, being on-site also meant we could conduct follow-up interviews in case the interviewer wasn’t able to get through everything the first time. It also allowed us to collect ethnographic observations of the neighborhood and see the rhythm of the household, meet family members and friends—even run errands with families when needed.
We also used pairs when possible for interviews and had team members memorize the interview guide. With one team member conducting the interview without looking at a script and another paying attention to anything that got missed and noting that at the end, we had the benefit of an organic, more natural conversation alongside systematic data collection. While team members trained coders and designed codebooks, the bulk of the coding was also done by people who weren't interviewers, so the data wasn’t coded through the lens of assumptions and preconceived notions that someone present at the interview might have. Finally, we had several reliability procedures in place, including having the interviews coded by at least two different people, and a third person to note inconsistencies to be resolved.”
Participant interview findings helped to highlight the aspects of the program that led families to move to higher-opportunity neighborhoods.
“Through the interviews, we were able to better understand what was happening from the participants’ point of view and what mechanisms were at play. Previously, many believed that interventions focused on providing more information and monetary resources would encourage families to move to neighborhoods with more opportunities for upward income mobility. But what we see from CMTO is that those two types of resources are not sufficient to explain the success of this program. What came through so clearly in the narratives of the participants was how important it was to have that support from the housing navigators who could boost confidence and provide customized assistance based on the specific needs of the families.”
Ultimately, for DeLuca, incorporating qualitative research into randomized evaluations is about providing opportunities to check assumptions and to see a bigger picture of what might be driving the impact of a program or policy.
“The joining of disciplines like economics and sociology for randomized evaluations can provide an opportunity to see something you otherwise wouldn’t see. Sociologists are often trained to think about barriers to social mobility and wellbeing and tend to focus a bit less on the decision-making process of individuals. In contrast, economists tend to emphasize decision-making quite a bit and might give less attention to the context in which the decisions are being made. By bringing the two together, researchers can leverage the theories and tools of each discipline and, hopefully, conduct a more policy-relevant and consequential study.”
Part one of this five-part blog series highlights the value of incorporating qualitative methods into randomized evaluations and outlines specific tips for researchers. Part two talks about how qualitative research helped motivate and shape the central question and hypothesis for a study on racial concordance between physicians and patients. Part four discusses how qualitative research helped the Oregon Health Insurance Experiment research team make sense of some of the study’s results. Part five highlights the value of qualitative research in providing a deeper understanding of mothers' experiences in the Baby's First Years study.
J-PAL North America reflects on using qualitative research methods in randomized evaluations and summarizes a few practical tips for those interested in integrating a qualitative approach into their studies.
Randomized evaluations allow researchers to measure the impact of programs and policies on a range of outcomes. Using this approach in North America, J-PAL researchers have recently examined a wide range of topics, including the effects of Medicaid on rates of health care utilization and the impact of a housing mobility program on the likelihood of families moving to lower-poverty neighborhoods.
But what mechanisms are driving the effects of these programs and policies? How did the context, design, and implementation of the program or policy influence the result? If replicated in a different context, will the program have the same effects? Is the study asking the right question?
Researchers can often collect quantitative data and design evaluations to shed light on these types of questions, but there’s always more to learn. Qualitative methods, such as direct observation, in-depth interviews, and focus groups, allow researchers to dive into these questions by examining participants’ beliefs, attitudes, experiences, and perspectives. Data gleaned from these methods can help researchers gain insight into potential mechanisms or barriers, generate new hypotheses and questions, and understand the stories behind the quantitative results.
For decades, social science scholars within anthropology, sociology, and psychology have employed qualitative methods. In recent years, many researchers within the traditionally quantitative field of economics have also incorporated qualitative methods into their studies and built teams with qualitative expertise to strengthen their research.
From our conversations with several researchers who've conducted and relied on qualitative research methods as part of J-PAL-supported randomized evaluations, we've summarized a few practical tips for those interested in integrating a qualitative approach into their studies:
- While developing your randomized evaluation, don't discount questions that can be best addressed through qualitative methods. These questions may challenge certain assumptions or shed light on mechanisms, contexts, or outcomes that quantitative methods may not fully capture. For example, researchers may want to gain insight into the experience of staff implementing a particular program to identify the challenges and barriers they faced, understand their perception of the program’s successes or shortcomings, and identify potential obstacles to longer-term implementation or scale-up. While this may be difficult to assess in a survey, focus groups and qualitative interviews could provide valuable insights.
- Account for qualitative research in study proposals and budgets. Qualitative research can require a high time commitment and can benefit from the support of specialized team members.
- Cultivate relationships with implementing partners. Forming a strong relationship with implementing partners is one key component to a successful and policy-relevant study and can help build a foundation for conducting qualitative research. Implementing organizations interact closely with study participants and often play instrumental roles in shaping the design and implementation of randomized evaluations. They are also well-placed to help researchers determine the best approaches to carrying out the qualitative parts of a study.
- Diversify your research team. Consider building a research team of individuals from different disciplines. Scholars of psychology, anthropology, sociology, and social work often have extensive experience with qualitative methods and bring valued perspectives that economists may be missing.
This blog series highlights examples of J-PAL research teams using qualitative research methods to inform and strengthen the design, implementation, and analysis of their randomized evaluations. For part two of the series, we interviewed Professor of Public Policy and US Health Care Delivery Initiative Co-Chair Dr. Marcella Alsan about how qualitative research helped motivate and shape the central question and hypothesis for a study on racial concordance between physicians and patients. In part three, we spoke with Professor of Sociology & Social Policy Stefanie DeLuca about how the Creating Moves to Opportunity randomized evaluation, a study she co-led, embedded qualitative research methods into its study design. Part four features a conversation with Associate Professor of Social Work and Oregon Health Insurance Experiment co-author, Heidi Allen, on how qualitative research helped the research team make sense of some of the study’s results. The series concludes with part five, where we spoke with researchers from the Baby's First Years study about the value of qualitative research in providing a deeper understanding of mothers' experiences.