Designing intake and consent processes in health care contexts

Authors: Laura Ruiz
Contributors: Jesse Gubb, Noreen Giga
In collaboration with the MIT Roybal Center

Overview

When conducting randomized evaluations of interventions with human subjects, researchers must consider how to design intake and consent processes for study participants. Participants must be well informed about a study in order to give informed consent, and this part of the enrollment process ultimately determines the composition of the study sample. Health care interventions present additional challenges compared to other areas of human subjects research, given the complexities of the relationship between patients and health professionals, concerns about disrupting engagement in care, and the potentially vulnerable position of patients, particularly if they are enrolled while receiving or seeking care.


This resource details challenges in designing the intake and consent process in health care contexts and highlights how those challenges were addressed in two studies conducted by J-PAL North America: Health Care Hotspotting in the United States and the Impact of a Nurse Home Visiting Program on Maternal and Early Childhood Outcomes in the United States. These case studies show the importance of communicating with implementing partners, deciding how to approach program participants, supporting enrollment, providing training, and creating context-specific solutions. For a general overview of the intake and consent process beyond the health care context, please see the resource on defining intake and consent processes.

Health Care Hotspotting

Description of the study

The Camden Coalition of Healthcare Providers and J-PAL affiliated researchers collaborated on a randomized evaluation of the Camden Core Model, a care management program serving “super-utilizers”: individuals with very high use of the health care system and complex medical and social needs, who account for a disproportionately large share of health care costs. The program provides intensive, time-limited clinical and social assistance to patients in the months after hospital discharge, with the goal of improving health and reducing hospital use among some of the least healthy and most vulnerable adults. Assistance includes coordinating follow-up care, managing medication, connecting patients to social services, and coaching on disease-specific self-care. The study evaluated the ability of the Camden Core Model to reduce future hospitalizations among super-utilizers compared to usual care.

Patients were enrolled in the study while hospitalized. Prior to enrollment, patients were first identified as potentially eligible by the Camden Coalition’s triage team, which reviewed electronic medical records of admitted patients daily to see if they met inclusion criteria. Camden Coalition recruiters then approached potentially eligible patients at the bedside, confirmed eligibility, administered the informed consent process and a baseline survey, and revealed random assignment. Potential participants were given a paper copy of the consent form, available in English and Spanish, and were given time to ask questions after the information had been provided to them. Enrollment ran from June 2, 2014 through September 13, 2017.

Enrollment at a glance:

  • Enrollment period: 3.3 years (June 2, 2014 to September 13, 2017)
  • Sample size: 800 hospitalized patients with medically and socially complex conditions, all with at least one additional hospitalization in the preceding 6 months
  • Subjects contacted: 1,442 eligible patients were identified, many of whom declined to participate or could not be reached prior to discharge
  • Who enrolled: Camden Coalition recruitment staff (distinct from direct providers of health care and social services)
  • Where enrolled: Hospital, while subject was admitted
  • When enrolled: Enrolled prior to random assignment; randomization revealed immediately after 
  • Level of risk: Minimal risk
  • Enrollment compensation: $20 gift card

Which activities required consent?

Participants provided informed consent for study participation, for researchers to access data, and to receive the intervention if assigned to the intervention group. The consent process included informing participants about the randomized study design, the details of the Camden Core Model program and the care they would receive, and how their data would be used. Participants could only receive services from the Camden Coalition if they agreed to be part of the study, but they were informed of their right to seek alternative care regardless of randomization.

The consent process focused on consent to permit the use and disclosure of health care data, known as protected health information (PHI). As explained in the resource on data access under HIPAA, accessing PHI for research often requires individual authorization from patients, and individual authorization can smooth negotiations with data providers even when not required. The process for individual authorization under HIPAA can be combined with informed consent for research. In this case, the study relied on administrative data from several sources: hospital discharge data provided by Camden-area hospitals and the state of New Jersey, Medicaid claims data, social service data provided by the state, and mortality data provided by the federal government. The risk of accidental data disclosure was the study’s primary risk.

The consent process provided details so patients could make informed decisions about data disclosure. Patients were informed that the researchers would collect identifiers (name, address, social security number, and medical ID numbers) and use them to link participants to administrative data. Patients were informed about the types of outcomes considered, the types of data sources, the legal environment and HIPAA protections afforded to participants, and how data would be stored and protected. Plans for using administrative data were not all finalized at the time of enrollment, so the consent language was kept broad to allow for linkages to additional datasets. The consent form also made clear that the data existed whether or not patients participated in the study; consent was sought so that researchers could access the existing data to measure the impact of the Camden Core Model.
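
One common way to operationalize the kind of data protection described above is to keep identifiers in a restricted crosswalk, separate from the de-identified records used for analysis and linkage. The sketch below illustrates that general practice only; the file names, fields, and values are hypothetical and are not drawn from the study’s actual data pipeline.

```python
# A minimal sketch (assumed design, not the study's actual pipeline) of keeping
# identifiers separate from analysis data: identifiers live only in a
# restricted-access crosswalk keyed by study ID, while analysts work with
# de-identified records linked to administrative outcomes by study ID.
import csv

crosswalk = [  # restricted-access file: identifiers plus study ID only
    {"study_id": "ID-0001", "name": "Jane Smith", "ssn": "123-45-6789"},
]
analysis_rows = [  # de-identified file used for analysis (hypothetical outcome field)
    {"study_id": "ID-0001", "assignment": "intervention", "readmit_180d": 1},
]

with open("crosswalk_restricted.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["study_id", "name", "ssn"])
    writer.writeheader()
    writer.writerows(crosswalk)

with open("analysis_deidentified.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["study_id", "assignment", "readmit_180d"])
    writer.writeheader()
    writer.writerows(analysis_rows)
```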

In addition to informed consent for health data disclosure, researchers also obtained informed consent for participating in the study and receiving the intervention. Participants were informed about the parameters of the study (including the goals of the research, the fact that participation was voluntary and would not affect treatment at the hospital, and the probability of being selected to receive the intervention) as well as details of the program (including the composition of the care team and the goals and activities of the program, such as conducting home visits and scheduling and accompanying patients to medical appointments). Because the intervention was not created by the researchers, was unchanged for the purposes of the study, and came with no additional risks beyond the risk of data disclosure, the bulk of the consent process focused on data. Participants assigned to the intervention group completed an additional consent form in order to receive the Camden Core Model program.

The research team did not pursue a waiver of informed consent, even though the study’s reliance on secondary data might have qualified it for one. Waivers are not permitted when it is practicable to seek consent. Because the Camden Core Model already had an in-person intake process for program participants, seeking consent was practicable, and the research team therefore sought it.

Seeking informed consent also generated additional benefits. It gave all potential participants the opportunity to understand the research study and make an informed decision about participation. Consenting prior to randomization also improved statistical power, because those who declined to participate were excluded from the study entirely rather than diluting the intervention effect by lowering take-up. As noted, informed consent may also have helped researchers gain access to data even when a HIPAA waiver or a data use agreement (in the case of a limited data set) might have been technically sufficient.

Who was asked for consent? 

All participants, in both intervention and comparison groups, were required to give consent to participate in the study. Initial eligibility checks of electronic medical records ensured that all potential participants approached by recruiters already met most inclusion criteria. The broader triage population whose records were examined did not give informed consent because this review was already part of program implementation and was not unique to the study.

Potential participants who were cognitively impaired, did not meet eligibility requirements, or could not communicate with the recruiter when approached about the program and the study were deemed ineligible to participate, and the consent process did not take place.

The study included vulnerable participants who were chronically ill, currently hospitalized, and likely economically disadvantaged. Beyond the target population for the intervention, however, specific vulnerable groups governed by human subjects regulations (such as pregnant women, children, or prisoners) were either explicitly excluded (in the case of children) or recruited only incidentally.

How was program enrollment modified for the study? 

The Camden Core Model was already in place before the study started. Prior to the study, Camden Coalition staff identified eligible patients using real-time data from their Health Information Exchange (HIE), a database covering four Camden hospital systems, and then approached patients in the hospital to explain the program and invite them to enroll. Because of program capacity constraints, only a small fraction of the eligible population could be offered the program, and participant prioritization was ad hoc prior to the beginning of the study.

The study used inclusion and exclusion criteria and a triage and bedside enrollment process similar to those the program already had in place. For randomization, researchers created a randomization list with study IDs and random assignments prior to enrollment. Triage staff, who identified eligible patients using the HIE, assigned a study ID to potential participants without knowing the intervention or comparison assignments. Informed consent and the baseline survey took place at the bedside, and recruiters revealed random assignments at the end of the enrollment process. The Camden Coalition hired additional full-time recruiters to meet the scale of the study and strictly separated the functions of its data team, triage staff, and recruiters in order to preserve the integrity of random assignment.
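
To make the sequencing concrete, the sketch below shows one way a pre-generated randomization list of this kind could be constructed, with assignments held separately from the study IDs handed to triage staff. The ID format, file names, seed, and list length are illustrative assumptions, not the study’s actual procedure.

```python
# A minimal sketch (illustrative, not the study's actual code) of pre-generating
# a randomization list: every study ID receives an assignment before enrollment,
# and triage staff only ever see the IDs, not the assignments.
import csv
import random

random.seed(42)  # hypothetical seed so the list can be reproduced and audited
n_ids = 1000     # assumed list length, set larger than the target sample

# Balanced 1:1 allocation, shuffled so the order of assignments is unpredictable
assignments = ["intervention", "comparison"] * (n_ids // 2)
random.shuffle(assignments)

rows = [{"study_id": f"ID-{i + 1:04d}", "assignment": arm}  # hypothetical ID format
        for i, arm in enumerate(assignments)]

# The full list with assignments stays with the data team only.
with open("assignments_private.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["study_id", "assignment"])
    writer.writeheader()
    writer.writerows(rows)

# Triage staff receive only the study IDs, to assign to patients in order.
with open("study_ids_for_triage.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["study_id"])
    writer.writeheader()
    writer.writerows([{"study_id": r["study_id"]} for r in rows])
```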

When did random assignment occur and why?

Recruiters revealed random assignments immediately after the informed consent process and the baseline survey were completed. Randomizing after consent was important for two distinct reasons: bias and power. It ensured that participation rates were balanced between intervention and comparison groups, preventing bias from individuals self-selecting into the study at different rates based on their assignment. It also preserved statistical power: had the study team randomized prior to seeking consent, some individuals offered the program might have declined to participate in the study, lowering take-up and diluting the measured effect, whereas randomizing after consent allowed researchers to exclude those who declined from the study entirely. This approach, however, places a burden on enrollment staff, who must tell participants assigned to the comparison group that they cannot access the intervention. It is therefore important for researchers to discuss these issues early with the implementing partner and explore ways to conduct randomization after recruitment that the partner is comfortable with.
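
The power argument can be illustrated with rough, assumed numbers (these are not figures from the study): when randomization precedes consent, the detectable effect is the program’s effect diluted by the take-up rate, and the required sample size grows roughly with the inverse square of that rate.

```python
# Illustrative arithmetic with assumed numbers (not taken from the study):
# if randomization happens before consent and only a fraction of those assigned
# to the intervention actually take it up, the measurable intent-to-treat effect
# shrinks by that fraction and the required sample size grows roughly with
# 1 / take_up ** 2.
effect_on_participants = 0.10  # assumed effect of the program among participants
take_up = 0.75                 # assumed share who would consent after assignment

diluted_itt_effect = effect_on_participants * take_up
sample_size_inflation = 1 / take_up ** 2

print(f"Diluted intent-to-treat effect: {diluted_itt_effect:.3f}")
print(f"Approximate sample size inflation: {sample_size_inflation:.2f}x")
```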

Who conducted enrollment? 

The study team used enrollment specialists employed by Camden Coalition who, while separate from other program staff, were sensitive to participant concerns about research participation. Some specialists were newly hired while others were redeployed from other positions within the Camden Coalition. As much as possible, the Camden Coalition hired specialists from the community where the study took place to allow for greater cultural and demographic alignment with participants. To ensure staffing across the full study period, however, they were flexible in this criterion. 

Enrollment specialists approached potential participants to describe the program and seek consent prior to random assignment. Recruiters were bilingual and could communicate with patients in either English or Spanish. They were trained to approach eligible patients in a timely manner and in a standardized way following study protocols. They were also trained to introduce themselves and ask patients how they were feeling before discussing the intervention, in order to assess whether patients had the cognitive capacity to listen to and understand the information being provided and to give consent. Camden Coalition staff led the development of this training, building off how enrollment was conducted prior to the RCT.

Some patients were wary of participating in research in general, and both patients and recruitment specialists often felt discouraged when patients were assigned to the comparison group. By limiting enrollment to a small number of recruitment specialists, researchers and the Camden Coalition were able to provide them with specialized support and training. The Camden Coalition provided funding for therapy to support staff mental health. The recruitment specialists also supported each other and developed best practices, including language to introduce the study without promising services, methods of preventing undue influence, and ways to support disappointed patients. To help maintain momentum and morale throughout the study period, the team celebrated enrollment milestones with a larger group of Camden Coalition staff, which helped illustrate that enrollment was part of a broader organizational goal.

How were risks and benefits described?

The IRB deemed this study minimal risk. The only risk highlighted during enrollment was the loss or misuse of protected health information. The study team noted that this risk existed for both intervention and comparison groups and that they were taking steps to mitigate it. Benefits were similarly modest, and as noted above, recruiters were careful to avoid over-promising potential benefits to participants. The benefits included potentially useful information derived from the results of the study and receipt of the program (which was described as potentially improving interactions with health care and social service systems). Although not considered a research best practice, the $20 compensation for completing the survey was also listed as a benefit.

How were participants compensated?

Participants who completed the baseline survey (administered after consent and prior to randomization) were given a $20 gift card for their time. Recruiters informed participants that the survey would take approximately 30 minutes.

Randomized Evaluation of the Nurse-Family Partnership

Description of the study

J-PAL affiliated researchers partnered with the South Carolina Department of Health and Human Services (DHHS), other research collaborators, and other partners to conduct a randomized evaluation of the Nurse-Family Partnership (NFP) program. This program pairs low-income first-time mothers with a personal nurse who provides at-home visits from early pregnancy through the child’s second birthday. The goal of the program is to support families during the transition to parenthood and through early childhood in order to improve their health and wellbeing. The program includes up to 40 home visits (15 prenatal visits, 8 postpartum visits up to 60 days after delivery, and 17 visits during the child’s first two years), with services available in Spanish and English. An expansion of the program in South Carolina to nearly double its reach presented an opportunity to evaluate the program and measure its impact on adverse birth outcomes like low birth weight, child development, maternal life changes through family planning, and other outcomes, further described in the study protocol.

Different channels were used to identify potential participants, including direct referral through local health care providers, schools, and Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) agencies; direct referrals from the Medicaid eligibility database to NFP; referrals by friends or family members; knowledge of the program through digital and printed advertisements; and identification of participants by an outreach team hired for this purpose during the study period. After identification, NFP nurses who were also direct service providers visited potential participants at their homes or a private location of their choice, assessed their eligibility, and, if they were eligible, conducted the informed consent process. After obtaining informed consent, the nurses administered the baseline survey, after which the participant was compensated with a $25 gift card for their time and received a referral list of other programs and services in the area. Immediately afterward, participants were randomized into the intervention or comparison group using the survey software SurveyCTO. Two-thirds of participants were randomly allocated to the intervention group and one-third to the comparison group. Enrollment ran from April 1, 2016 to March 17, 2020.

Enrollment at a glance:

  • Enrollment period: 4 years (April 1, 2016 to March 17, 2020)
  • Sample size: 5,655 Medicaid-eligible, nulliparous pregnant individuals at less than 28 weeks’ gestation
  • Subjects contacted: 12,189 eligible individuals were invited to participate
  • Who enrolled: Nurse home visitors (direct service providers)
  • Where enrolled: Participant’s home or location of preference
  • When enrolled: Enrolled prior to random assignment; on the spot randomization
  • Level of risk: Minimal risk
  • Enrollment compensation: $25 gift card

Which activities required consent?

Similar to Health Care Hotspotting, the NFP study protocol required consent from participants to be part of the evaluation, as well as consent to use their personally identifiable information to link to administrative data for up to 30 years. These data included Medicaid claims data, hospital discharge records, and vital statistics, as well as a broad range of linked data covering social services, education, mental health, criminal justice, and more. These data allowed the researchers to gather information on the health and well-being of mothers and children. For those assigned to the intervention group, program participation involved a separate consent process.

Who was asked for consent?

All participants, in both intervention and comparison groups, were required to give consent to participate in the study. Potential participants were first-time pregnant people who were at less than 28 weeks’ gestation, were income-eligible for Medicaid during their pregnancy, were older than 15, and lived in a catchment area served by NFP nurses. Participants consented for themselves and for their children, as stated in the informed consent process.

The NFP program focused on enrolling people in early pregnancy to ensure the home visits covered the prenatal period. People were excluded from the study if they were incarcerated or living in a lockdown facility, or if they were under the age of 15. Program services were provided in English and Spanish, and additional translation services were available for participants who spoke other languages. To be eligible for the study, people also needed enough language fluency to be able to benefit therapeutically from the program.

To identify children born during the study, researchers probabilistically matched mothers to births recorded in vital records, using social security number, birth date, name, and Medicaid ID.
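
As a rough illustration of how such probabilistic matching can work, the sketch below scores candidate pairs by summing weights over identifiers that agree and links pairs whose score clears a threshold. The weights, threshold, and field names are assumptions for illustration, not the study’s actual linkage algorithm.

```python
# A simplified sketch of probabilistic record linkage (illustrative weights and
# threshold; not the study's actual linkage procedure). Each identifier that
# agrees adds to a match score, and pairs above the threshold are linked.
FIELD_WEIGHTS = {          # assumed weights: rarer identifiers score higher
    "ssn": 5.0,
    "medicaid_id": 4.0,
    "birth_date": 2.0,
    "last_name": 1.5,
    "first_name": 1.0,
}
MATCH_THRESHOLD = 6.0      # assumed cutoff for declaring a match

def match_score(study_record: dict, vital_record: dict) -> float:
    """Sum weights over identifiers that are present in both records and agree."""
    score = 0.0
    for field, weight in FIELD_WEIGHTS.items():
        a, b = study_record.get(field), vital_record.get(field)
        if a and b and str(a).strip().lower() == str(b).strip().lower():
            score += weight
    return score

# Hypothetical example records: agreement on SSN, last name, and birth date
# clears the assumed threshold, so the pair would be linked.
mother = {"ssn": "123-45-6789", "last_name": "Smith", "birth_date": "1995-04-02"}
birth = {"ssn": "123-45-6789", "last_name": "Smith", "birth_date": "1995-04-02",
         "first_name": "Jane"}
print(match_score(mother, birth) >= MATCH_THRESHOLD)  # True for this pair
```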

How was program enrollment modified for the study? 

NFP had a decentralized method of identifying participants for its program prior to the study. Once patients were identified, nurses would enroll them as part of the first home visit. Under this system, the program served about 600 women per year.

Alongside the randomized evaluation, the NFP program was scaled up to serve an average of 1,200 people per year, roughly double its pre-study caseload. Extra personnel were hired to conduct outreach to newly eligible participants.

Although the randomized evaluation coincided with an expansion of the program and the hiring of new staff, including an outreach team, NFP leadership requested that the nurse home visitors themselves conduct study enrollment. Nurses therefore performed on-the-spot randomization and informed participants of their study group. NFP and local implementing partners believed their nurses were better equipped than external surveyors to work with the study population, as nurses who deliver the program are well trained in working with low-income, first-time mothers from the local communities. They also believed that shifting the recruitment model to a centralized process with enrollment specialists was infeasible given the scale of the study and the program.

The study’s informed consent process was incorporated into the pre-existing program recruitment and consent process, so participants received both program and study information at the same time. This allowed the study team to ensure participants also received clear information about the program.

When did random assignment occur and why?

Random assignment was conducted on the spot, after participants consented to the study and completed the baseline survey. The survey software, SurveyCTO, automatically assigned participants to either the intervention or comparison group. Two-thirds of the sample were randomly allocated to the intervention group and one-third to the comparison group.
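
The logic of an on-the-spot 2:1 draw is simple; the sketch below mirrors it in Python purely for illustration. The study itself relied on SurveyCTO to perform the assignment inside the survey form rather than on custom code like this.

```python
# A minimal sketch of on-the-spot 2:1 assignment logic (illustrative only; the
# study used SurveyCTO's assignment within the survey form, not this code).
import random

def assign_group(p_intervention: float = 2 / 3) -> str:
    """Return 'intervention' with probability ~2/3, otherwise 'comparison'."""
    return "intervention" if random.random() < p_intervention else "comparison"

# Example: the draw happens immediately after consent and the baseline survey.
print(assign_group())
```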

As in Health Care Hotspotting, consent prior to randomization guaranteed balance between intervention and comparison groups, as both would have received the same information and would be equally likely to consent to be part of the study. On-the-spot randomization also maximized statistical power, as only those who consented to participate were randomized, rather than randomizing first and then approaching participants who might decline to participate. Although withdrawal after randomization could still occur, nurses were able to immediately conduct the first home visit for those assigned to the intervention group, which helped improve take-up of the intervention.

Who conducted enrollment?

Nurse home visitors conducted enrollment and randomization with support from a recruitment support specialist (a research associate) at J-PAL North America. This involved investment from the research team to dedicate sufficient staff capacity to train and support nurses during the four years in which enrollment took place. The research associate was responsible for training nurses on study protocols, providing remote and in-person field support, monitoring survey recordings for quality and compliance checks, managing gift cards and study tablets, and maintaining morale and building relationships with nurses.

Nurses invested in learning how to be survey enumerators in addition to delivering high-quality nursing care. The research team trained all nurses on how to recruit, assess eligibility, deliver informed consent, randomize patients into intervention and comparison groups, and deliver the baseline survey using SurveyCTO software on a tablet. All nurses had to complete this training before they could enroll patients. The research team went to South Carolina before the start of the study and conducted a two-day in-person training to practice obtaining informed consent and using tablets to administer the baseline survey. Yearly refresher trainings were offered.

Communicating comparison group assignment was one of the main challenges for nurses during study enrollment. Nurses navigated this challenge by handing patients the tablet and having them press the randomization button, which helped nurses and patients to remember that assignment to a group was automatically generated and out of the control of both parties.

The research team provided additional resources to mitigate the concern that nurses might not adhere to random assignments. The research team conducted quarterly in-person enrollment and field trainings for nurses and nurse supervisors that highlighted benefits for comparison group participants, including that all comparison group participants benefit from meeting with a caring health care professional who can help with Medicaid enrollment if needed and receive a list of other available services. They also reminded nurses that the evaluation helped to expand services to more patients than would otherwise be served.

The research team’s recruitment support specialist operated a phone line that nurses could call for emotional and technical support, coordinated in-person and web-based trainings for new nurses, sent encouragement to the nurses, and monitored fidelity to the evaluation design by conducting audio checks on the delivery of informed consent and the baseline survey. The support specialist also troubleshot any issues with the tablets in real time and helped to resolve any data input errors.

How were risks and benefits described?

The primary risk of participating in the study was the loss or misuse of protected health information. The benefits included potentially useful information derived from the results of the study, and receipt of the intervention, which could improve the health and wellbeing of participants and their child.

How were participants compensated?

After completing the baseline survey, all study participants were compensated with a $25 Visa gift card for the time it took them to complete the survey.

 

Acknowledgements: Thanks to Amy Finkelstein, Catherine Darrow, Jesse Gubb, Margaret McConnell, and Noreen Giga for their thoughtful contributions. Amanda Buechele copy-edited this document. Creation of this resource was supported by the National Institute on Aging of the National Institutes of Health under Award Number P30AG064190. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

J-PAL’s Research Resources

J-PAL’s Research Resources provide additional information on topics discussed in this resource, including the resource on defining intake and consent processes.

Health Care Hotspotting

Nurse-Family Partnership

Camden Coalition of Healthcare Providers. “Camden Core Model.” Camden Coalition of Healthcare Providers. December 2, 2021. Accessed February 6, 2023. https://camdenhealth.org/care-interventions/camden-core-model/.

Camden Coalition of Healthcare Providers. “The Camden Core Model – Patient Selection and Triage Methodology.” Camden Coalition of Healthcare Providers. Accessed February 6, 2023. https://camdenhealth.org/wp-content/uploads/2019/11/Care-Management_Triage_11022018_v5.pdf.

Finkelstein, Amy, Annetta Zhou, Sarah Taubman, and Joseph Doyle. “Health Care Hotspotting — a Randomized, Controlled Trial.” New England Journal of Medicine 382, no. 2 (2020): 152–62. https://doi.org/10.1056/nejmsa1906848. 

Finkelstein, Amy, Annetta Zhou, Sarah Taubman, and Joseph Doyle. “Supplementary Appendix. Healthcare Hotspotting – A Randomized Controlled Trial.” New England Journal of Medicine 382, no. 2 (2020): 152–62. https://www.nejm.org/doi/suppl/10.1056/NEJMsa1906848/suppl_file/nejmsa1906848_appendix.pdf.

Finkelstein, Amy, Annetta Zhou, Sarah Taubman, and Joseph Doyle. “Study Protocol. Healthcare Hotspotting – A Randomized Controlled Trial.” New England Journal of Medicine 382, no. 2 (2020): 152–62. https://www.nejm.org/doi/suppl/10.1056/NEJMsa1906848/suppl_file/nejmsa1906848_protocol.pdf.

Harvard T.H. Chan School of Public Health. “Partners.” South Carolina Nurse-Family Partnership Study Website. Accessed February 6, 2023. https://www.hsph.harvard.edu/sc-nfp-study/partners/. 

Harvard T.H. Chan School of Public Health. “Pay for Success.” South Carolina Nurse-Family Partnership Study, July 1, 2022. https://www.hsph.harvard.edu/sc-nfp-study/about-the-study-about-the-study/pay-for-success/. 

Harvard T.H. Chan School of Public Health. “People.” South Carolina Nurse-Family Partnership Study Website. Accessed February 6, 2023. https://www.hsph.harvard.edu/sc-nfp-study/people/. 

McConnell, Margaret A., Slawa Rokicki, Samuel Ayers, Farah Allouch, Nicolas Perreault, Rebecca A. Gourevitch, Michelle W. Martin, et al. “Effect of an Intensive Nurse Home Visiting Program on Adverse Birth Outcomes in a Medicaid-Eligible Population.” JAMA 328, no. 1 (2022): 27. https://doi.org/10.1001/jama.2022.9703.

McConnell, Margaret A., R. Annetta Zhou, Michelle W. Martin, Rebecca A. Gourevitch, Maria Steenland, Mary Ann Bates, Chloe Zera, Michele Hacker, Alyna Chien, and Katherine Baicker. “Protocol for a Randomized Controlled Trial Evaluating the Impact of the Nurse-Family Partnership’s Home Visiting Program in South Carolina on Maternal and Child Health Outcomes.” Trials 21, no. 1 (2020). https://doi.org/10.1186/s13063-020-04916-9.

The Abdul Latif Jameel Poverty Action Lab. “Health Care Hotspotting in the United States.” The Abdul Latif Jameel Poverty Action Lab (J-PAL). Accessed February 6, 2023. https://www.povertyactionlab.org/evaluation/health-care-hotspotting-united-states.

The Abdul Latif Jameel Poverty Action Lab. “The Impact of a Nurse Home Visiting Program on Maternal and Early Childhood Outcomes in the United States.” The Abdul Latif Jameel Poverty Action Lab (J-PAL). Accessed December 22, 2022. https://www.povertyactionlab.org/evaluation/impact-nurse-home-visiting-program-maternal-and-early-childhood-outcomes-united-states.

South Carolina Healthy Connections. “Fact Sheet: South Carolina Nurse-Family Partnership Pay for Success Project.” Accessed February 6, 2023. https://socialfinance.org/wp-content/uploads/2016/02/021616-SC-NFP-PFS-Fact-Sheet_vFINAL.pdf.