
Resources for conducting remote surveys

Summary

As with in-person surveys, remote survey work involves considerations at every stage of the project lifecycle. This page summarizes key points regarding remote surveys and, where applicable, lists J-PAL’s related public resources, in which more detailed guidance can be found.

Because its focus is restricted to J-PAL’s resources, this page does not list the full universe of useful tools for conducting remote surveys, including those developed by Innovations for Poverty Action (IPA) and others. IPA’s resources, including the Remote surveying in a pandemic handbook, can be accessed on the Phone survey methods page of their RECOVR hub.

Project planning

Phone survey planning should occur as early in the project lifecycle as possible, as conducting surveys via phone rather than in person has implications for the budget, sampling strategy, and IRB protocol. If the study will use administrative data (e.g., to construct a sampling frame), it is also important to ensure early on that the relevant agreements are in place to obtain access to administrative data with phone numbers.

IRB

IRB approval is needed for a switch from written to verbal consent (and for any consent process regardless of format). Data security procedures must also be documented for the IRB and may differ for phone surveys. Changes to data quality checks, such as using audio recordings instead of spot checks or having a third person listen in on a call, also require IRB approval. More information on adapting consent processes, data security procedures, sampling, and data quality checks can be found below.

Budgeting

On the budgeting front, phone-based surveys may benefit from hiring more supervisory staff, conducting additional training, and supplying enumerators with additional devices as compared to in-person surveys. Typically, phone-based surveys do not require other expenses such as transportation or lodging. See the relevant section of the grant proposals resource and also the budgeting for phone surveys guide for more information. 

  • Staff and staff training: Researchers may want to consider budgeting for additional research manager time to supervise hiring. Enumerators will need training on tracking calls and appointments, conducting surveys via phone, and device management, so it can be helpful to budget extra training time.
  • Survey productivity: While phone-based surveys should be relatively short (no more than 30 minutes), enumerators may not be able to complete a large number of surveys in a given day. Note that the bulk of enumerator time will most likely be spent attempting to reach potential respondents rather than conducting interviews, so it is important to be conservative with productivity estimates. For example, with a 30-minute survey, J-PAL South Asia advises planning for surveyors to complete 4-8 surveys per surveyor-day; a worked calculation based on these figures follows this list.
  • Devices: Researchers should budget for additional devices and software to facilitate phone-based surveys (e.g., headphones, phone recharges, number-masking platforms, data or internet packages, etc.). See J-PAL South Asia’s sample phone survey budget for more information. 
  • Airtime: Payments for airtime—both for surveyors and as an incentive for participants—should be budgeted and tracked carefully. The J-PAL Africa office has used tools such as flickswitch to manage airtime payments.
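
To make these figures concrete, below is a minimal back-of-the-envelope calculation of surveyor-days. The target of 2,000 completed surveys, the team of 20 enumerators, and the 15 percent allowance for refusals and callbacks are hypothetical; only the 4-8 completed surveys per surveyor-day figure comes from the guidance above. This is a planning sketch, not a recommended productivity target.

```python
# Back-of-the-envelope surveyor-day calculation for a 30-minute phone survey.
# Only the 4-8 completes per surveyor-day range comes from the guidance above;
# the target sample, team size, and non-response buffer are illustrative.

import math

target_completes = 2000          # hypothetical target number of completed surveys
surveys_per_surveyor_day = 5     # conservative end of the 4-8 range
team_size = 20                   # hypothetical number of enumerators
buffer_rate = 0.15               # hypothetical allowance for refusals and callbacks

surveyor_days = math.ceil(target_completes * (1 + buffer_rate) / surveys_per_surveyor_day)
calendar_days = math.ceil(surveyor_days / team_size)

print(f"Surveyor-days needed: {surveyor_days}")                        # 460
print(f"Calendar days with {team_size} enumerators: {calendar_days}")  # 23
```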

Research design

Phone surveys face several logistical challenges that must be accommodated in the research design. First, to keep data quality high and to mitigate issues such as dying phone batteries, surveys should be kept to at most one 30-minute call (or multiple 30-minute calls if needed), meaning that most surveys must be shortened considerably to focus only on key outcomes. Second, unless the study draws from a pre-existing list (such as from a baseline survey), obtaining a list of participants with contact information can be difficult. Finally, as mentioned above, consent processes must be adapted to be conducted over the phone. See the following resources for more information.

Survey design

In addition to the modifications described above, eliminate sensitive questions to the extent possible (and, if this is not possible, establish clear protocols for asking them safely). While coding your survey, consider including the option to save and exit after major modules, so that a respondent who needs to stop partway through can continue later. For more information, see the relevant section of the survey design resource.
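
As an illustration of the save-and-exit idea, the sketch below shows a generic checkpointing pattern in Python with hypothetical module names. Survey platforms such as SurveyCTO support saving incomplete forms through their own form design and settings; this is only a conceptual sketch of resuming from the last completed module, not a description of any platform’s feature.

```python
# Minimal sketch of a save-and-resume pattern for a modular survey script.
# Module names and file paths are hypothetical; the point is simply to
# checkpoint after each major module so an interrupted call can resume later.

import json
from pathlib import Path

MODULES = ["consent", "household_roster", "income", "health"]  # hypothetical modules

def run_module(name: str) -> dict:
    """Placeholder for administering one survey module over the phone."""
    return {f"{name}_completed": True}

def run_survey(respondent_id: str, state_dir: Path = Path("partial_saves")) -> dict:
    state_dir.mkdir(exist_ok=True)
    state_file = state_dir / f"{respondent_id}.json"
    # Load any answers saved during an earlier, interrupted call.
    state = json.loads(state_file.read_text()) if state_file.exists() else {}
    for module in MODULES:
        if module in state:  # already completed in a previous call
            continue
        state[module] = run_module(module)
        # Checkpoint after every major module in case the call drops here.
        state_file.write_text(json.dumps(state, indent=2))
    return state
```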

Intake and consent process

Though the informed consent script used by enumerators should always be as clear and simple as possible, this is particularly important in phone surveys, where it is impossible to use body language to gauge the respondent’s understanding and reaction. The consent script should contain information about the purpose of the call, who is calling (both the organization and the individual), the confidentiality of results, and an estimated duration of the survey (which can be obtained during survey piloting). If feasible, consider collecting electronic signatures instead of verbal consent. If the research team plans to use audio recordings, respondents need to be notified during the consent process. 

Sampling and randomization

If a pre-existing list of potential respondents is unavailable, consider using random digit dialing, snowball sampling, or sample pooling (i.e., using lists from other studies, with the relevant IRB approvals) to construct your sampling frame, though note that all of these methods have implications for the representativeness of the sample.1 With sample pooling, an important additional consideration is overlap with any other recent interventions. To make it easier to keep respondents in the study, collect additional contact information (such as a family member’s or friend’s phone number). See also the randomization resource and the best practices for conducting phone surveys blog post for more information on combining sample frames.
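
The sketch below illustrates two of these approaches: generating a random digit dialing frame and pooling lists from other studies with deduplication. The phone-number prefix, number length, and record fields are hypothetical assumptions; a real frame would need to match the local numbering plan, and pooled lists require the relevant IRB approvals and data use agreements.

```python
# Minimal sketch of two sampling-frame approaches: random digit dialing (RDD)
# and pooling lists from other studies. The prefix, number length, and column
# names are hypothetical placeholders, not a real numbering plan.

import random

def random_digit_dial_frame(n: int, prefix: str = "+2547", length: int = 8,
                            seed: int = 2021) -> list[str]:
    """Generate n unique candidate numbers under a hypothetical mobile prefix."""
    rng = random.Random(seed)
    numbers = set()
    while len(numbers) < n:
        numbers.add(prefix + "".join(str(rng.randint(0, 9)) for _ in range(length)))
    return sorted(numbers)

def pool_sample_frames(*frames: list[dict]) -> list[dict]:
    """Combine lists from multiple studies, dropping duplicate phone numbers."""
    seen, pooled = set(), []
    for frame in frames:
        for record in frame:
            if record["phone"] not in seen:
                seen.add(record["phone"])
                pooled.append(record)
    return pooled

# Example: an RDD frame plus one (hypothetical) pre-existing study list.
rdd = [{"phone": p, "source": "rdd"} for p in random_digit_dial_frame(100)]
study_a = [{"phone": "+254712345678", "source": "study_a"}]
frame = pool_sample_frames(rdd, study_a)
```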

Data collection

Implementing phone surveys involves further departures from in-person practice. As some types of quality checks, such as spot checks or accompaniments, become logistically more challenging, it is important to set up alternative measures such as audio audits and increased coverage of high-frequency checks and back-checks. Respondent tracking takes the form of call scheduling, recording call attempts, and in some cases matching respondents’ spoken language to the appropriate surveyor. Researchers can consider adding an optional, open-ended question, directed at the surveyor, to record anything unusual or worth mentioning about the call (e.g., call quality issues, concerns about the questions, etc.).

In some instances, such as the 2020 Covid-19 lockdown, it may be impossible or impractical to conduct in-person training of surveyors or to provide surveyors with project devices for making calls. In these cases, creative approaches to surveyor hiring and training and additional data security measures to protect participants’ information are needed. Below, we summarize some key adaptations specific to remote surveys. Further information can be found in the linked resources. 

Data quality checks

While the principles behind data quality checks remain the same for remote surveys, there are some logistical differences. Below are some additional considerations when implementing remote surveys. More information can be found in the remote survey section of the data quality checks resource.

  • Audio audits: Research teams can consider incorporating audio audits into their data quality practices. Note that this both requires IRB approval and must be mentioned to respondents in the consent form. Different data collection devices offer different audio recording capabilities (e.g., some versions of Android operating systems can only record the enumerator, whereas others can record both sides of the conversation). Some platforms, such as SurveyCTO, have automated quality checks on audio data, such as the amount of silence during a call; a minimal sketch of such a silence check appears after this list.
    • While planning audio audits, researchers should carefully consider the following logistical challenges: 1) audio audit media files can be large, extending the time it takes to download data; and 2) in many cases, researchers will need to hire additional staff (and train them) to transcribe and listen to the recordings, increasing costs. 
  • Phone accompaniments: All enumerators should have some portion of their interviews accompanied by a senior member of the research team. This can be accomplished by having the supervisor join the call directly or review the audio recordings ex post (though note that a third person joining the interview requires IRB approval and must be mentioned in the consent form). Construct clear guidelines for enumerators and for the accompanying staff on collecting and responding to the data generated during the accompaniment. See also J-PAL’s quality assurance for CATI guide for more information.
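
As a rough illustration of an automated audio check, the sketch below flags recordings in which an unusually large share of the call is silence. It assumes the Python pydub package (which requires ffmpeg) and a folder of call recordings; the silence threshold and flagging cutoff are arbitrary placeholders rather than recommended values, and platforms such as SurveyCTO provide their own built-in checks.

```python
# Rough sketch of an audio-audit check: flag recordings dominated by silence.
# Assumes the pydub package (plus ffmpeg) and a folder of .wav call recordings;
# the threshold and cutoff values are illustrative only.

from pathlib import Path
from pydub import AudioSegment
from pydub.silence import detect_silence

def silence_share(recording: Path, silence_thresh_db: int = -40,
                  min_silence_ms: int = 2000) -> float:
    """Return the fraction of the recording made up of long silent stretches."""
    audio = AudioSegment.from_file(recording)
    silent_ranges = detect_silence(audio, min_silence_len=min_silence_ms,
                                   silence_thresh=silence_thresh_db)
    silent_ms = sum(end - start for start, end in silent_ranges)
    return silent_ms / len(audio)  # len(audio) is the duration in milliseconds

def flag_recordings(folder: Path, cutoff: float = 0.5) -> list[tuple[str, float]]:
    """List recordings where more than `cutoff` of the call is silence."""
    flagged = []
    for recording in sorted(folder.glob("*.wav")):
        share = silence_share(recording)
        if share > cutoff:
            flagged.append((recording.name, round(share, 2)))
    return flagged
```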

Survey logistics

The decentralized nature of remote surveys can pose logistical challenges not faced in in-person surveys while alleviating others. Some of the key adaptations to survey logistics are summarized below, and are discussed in detail in the survey logistics resource.

  • Device management: Phone-based surveys often require enumerators to have additional devices and tools relative to in-person surveys—examples include headphones, number-masking platforms, tablets, etc. The increased number of devices, combined with the fact that research teams may be decentralized, requires adaptations to the device tracking sheets. For more information, see the budgeting for phone surveys guide.
  • Respondent tracking: Respondent tracking protocols need to capture information about each call attempt the surveyors make, including the day of the week, time of day, who answered the call, and which language they spoke; a minimal sketch of such a call log appears after this list. If an incentive is paid for taking the survey, it should also be carefully tracked; this may involve creating a workflow for confirming that respondents received the incentive and for resolving issues. Research teams should additionally establish a workflow for responding to the respondent tracking data (e.g., targeting a different time of day, or matching surveyors to different households based on the language of the person who answered the phone). As with in-person surveys, researchers should also provide enumerators with clear guidelines for replacing respondents.
  • Field team management: With limited in-person interaction, surveyor supervision and management are even more critical. Debrief sessions, which should be held often (even daily), may need to be conducted virtually. Researchers should use these sessions to discuss any challenges faced by the surveying team and any quality-related concerns. Additionally, researchers should consider conducting daily phone check-ins with a subset of surveyors, to both reinforce the message that quality and productivity are being monitored and give surveyors a chance to privately relay concerns or ask questions.
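
Below is a minimal sketch of what a call-attempt log and a simple follow-up rule might look like. The field names, the morning/afternoon windows, and the suggestion rule are illustrative assumptions; a real protocol would also cover incentives, language matching, and the team’s respondent replacement rules.

```python
# Minimal sketch of a call-attempt log and a rule for scheduling the next attempt.
# Field names and the "try a different time of day" rule are illustrative only.

from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class CallAttempt:
    respondent_id: str
    timestamp: datetime
    answered_by: str            # e.g., "respondent", "household member", "no answer"
    language: Optional[str]     # language spoken by whoever answered, if anyone
    outcome: str                # e.g., "completed", "callback requested", "no answer"

def suggest_next_window(attempts: list[CallAttempt]) -> str:
    """Suggest a morning or afternoon window not yet tried for this respondent."""
    tried = {"morning" if a.timestamp.hour < 12 else "afternoon" for a in attempts}
    for window in ("morning", "afternoon"):
        if window not in tried:
            return window
    return "any"  # both windows tried; escalate per the team's replacement rules

log = [CallAttempt("HH-001", datetime(2021, 5, 3, 10, 15), "no answer", None, "no answer")]
print(suggest_next_window(log))  # "afternoon"
```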

Surveyor hiring and training

Test enumerator applicants for additional technical and soft skills vital to phone-based surveys, including but not limited to: ability to hold a conversation while entering data on another device, basic Google Drive or Dropbox knowledge, and ability to handle distractions during a phone call. For more information, including how to conduct surveyor training virtually, see J-PAL South Asia’s Remote Field Staff Training guide and the remote survey training portion of the surveyor hiring and training resource. 

Data security

Surveyors may need additional equipment to ensure data is secure and confidential—examples include noise-canceling headphones, 4G adaptors, SIM cards, etc. Any audio recordings of consent or surveys need to be encrypted; as a result, enumerators should be trained on the encryption method and on how to manage the audio files. If enumerators will be using personal devices to conduct interviews, additional precautions such as call masking must be taken to protect both enumerator and respondent privacy. Note that if enumerators are conducting interviews from their homes or another decentralized location, the risk of data loss (e.g., losing a device) is higher.
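
As one illustration of encrypting recordings at rest, the sketch below uses the Python cryptography package’s Fernet recipe to replace a plaintext audio file with an encrypted copy. This is a generic sketch, not a required method: teams should follow their IRB-approved data security plan, use their survey platform’s built-in encryption where available, and manage keys separately from the data.

```python
# Illustrative sketch of encrypting call recordings at rest on an enumerator's
# device, using the `cryptography` package's Fernet recipe. File names are
# hypothetical; real projects should follow their IRB-approved security plan.

from pathlib import Path
from cryptography.fernet import Fernet

def encrypt_recording(path: Path, key: bytes) -> Path:
    """Write an encrypted copy of the recording and delete the plaintext file."""
    fernet = Fernet(key)
    encrypted_path = path.with_suffix(path.suffix + ".enc")
    encrypted_path.write_bytes(fernet.encrypt(path.read_bytes()))
    path.unlink()  # remove the unencrypted original from the device
    return encrypted_path

# The key would be generated once by the research team and stored securely,
# never alongside the recordings themselves.
key = Fernet.generate_key()
# encrypted = encrypt_recording(Path("HH-001_consent.wav"), key)
```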

Last updated: May 2021

These resources are a collaborative effort. If you notice a bug or have a suggestion for additional content, please fill out this form.

1. See also page 20 of IPA’s Remote surveying in a pandemic handbook for more on random digit dialing, Atkinson & Flint (2001) for more on snowball sampling, and the World Bank’s Mobile phone surveys for understanding COVID-19 impacts blog for more on sampling and representativeness.
    Additional Resources
    1. Atkinson, Rowland and John Flint. 2001. “Accessing Hidden and Hard-to-Reach Populations: Snowball research strategies.” Social Research Update 33. https://sru.soc.surrey.ac.uk/SRU33.html

    2. Himelein, Kristen, Stephanie Eckman, Charles Lau, and David McKenzie. “Mobile Phone Surveys for Understanding COVID-19 Impacts: Part I Sampling and Mode.” Development Impact (Blog), April 7, 2020. https://blogs.worldbank.org/impactevaluations/mobile-phone-surveys-understanding-covid-19-impacts-part-i-sampling-and-mode. Last accessed April 23, 2021.

    3. Hughes, Sarah, and Kristen Velyvis. “Tips to quickly switch from Face-to-Face to Home-Based telephone interviewing.” Mathematica (Blog), April 1, 2020. https://www.mathematica.org/commentary/tips-to-quickly-switch-from-face-to-face-to-home-based-telephone-interviewing. Last accessed April 23, 2021.

    4. IPA’s Remote surveying in a pandemic handbook

    5. IPA’s Phone survey methods guide

    6. J-PAL’s Best practices: Remote field staff training

    7. J-PAL’s Budgeting for phone surveys during the Covid-19 Outbreak

    8. J-PAL’s Covid-19 Pilots and Surveys

    9. J-PAL’s Quality assurance best practices for CATI

    10. J-PAL’s Transitioning to CATI: Checklists and Resources

    11. J-PAL’s Using smartphones to trace mobility during regional lockdowns in Indonesia

    12. SurveyCTO’s Conducting remote and decentralized data collection with SurveyCTO

    Technical resources
    1. Appliqato: Automatic call recording

    2. IPA’s SurveyCTO code for embedding Youtube videos for remote enumerator training

    3. IPA’s SurveyCTO Templates:

      • Mobilizer survey: Contains the mobilizer form, used to generate a survey sample and assign respondents to receive a phone survey, and the phone survey form, which contains the interview.
      • Multiple attempts: Contains code allowing enumerators to submit one form daily, with all attempted interviews, rather than one form per attempt. This limits the total number of submissions to SurveyCTO.
    4. J-PAL South Asia’s SurveyCTO Exotel plug-in, which enables Exotel calls to be made or SMSes to be sent from within a SurveyCTO form

    5. J-PAL South Asia’s Quick guide to using Exotel

    6. SurveyCTO code for scheduling appointments (Mansa Saxena)

    7. SurveyCTO’s 7 ways to conduct a Covid-19 phone survey like IPA contains seven case studies for conducting phone interviews, including case studies on random digit dialing and remote training 

    8. SurveyCTO’s Resources for transitioning projects to telephone surveying (CATI)

    9. SurveyCTO’s webinar: Discover how J-PAL and IPA use SurveyCTO plug-ins for phone surveys and more
