Our library of practical resources is intended for researchers and research staff undertaking randomized evaluations, those teaching the technique to others, and anyone interested in how randomized evaluations are conducted.
Incorporating lessons learned through our own experience and through guidance from researchers and research organizations, we provide practical advice for designing, implementing, and communicating about evaluations. These resources are a collaborative effort. We credit the authors of all the resources we post here, and link to their original work wherever possible.
Please reach out to us at [email protected] or fill out this form with questions or feedback.
Introduction to Randomized Evaluations
A non-technical overview and step-by-step introduction for those who are new to randomized evaluations, as well as case studies and other teaching resources.
Before Starting a Project
Tips on successful field management and implementation partnerships for researchers who are new to fieldwork.
Project Planning
Highlights include an annotated checklist for designing an informed consent process, detailed advice on grant proposals and budgeting, and suggested proactive measures to help ensure ethical principles are followed in research design and implementation.
Research Design
Data Collection and Access
This section contains guidance specific to working with surveyors or survey companies, information about administrative data collection, and guidance applicable to all modes of data collection, such as data security, data quality, and grant management.
Processing and Analysis
All the steps in a research project after the data has been collected or assembled, from data cleaning to communicating results.
Define intake and consent process
Far from a simple administrative step, decisions about a study’s intake and consent process are critical to its success. This process can affect statistical power, bias, and the validity...
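As a rough illustration of the point about statistical power (a minimal sketch, not drawn from the resource itself), the Python snippet below shows how consent refusals at intake shrink the enrolled sample and inflate the minimum detectable effect of an individually randomized trial. The sample size approached and the consent rates are hypothetical, and the calculation assumes the standard two-sided minimum detectable effect formula with equal allocation.

```python
# Illustrative sketch: fewer consenting participants -> larger minimum
# detectable effect (MDE), i.e., lower statistical power for a given effect.
from scipy.stats import norm

def mde(n, sigma=1.0, p_treat=0.5, alpha=0.05, power=0.80):
    """MDE (in outcome units) for n enrolled units, individual randomization."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return z * sigma * (1.0 / (p_treat * (1 - p_treat) * n)) ** 0.5

approached = 2000  # hypothetical number of people approached at intake
for consent_rate in (1.0, 0.8, 0.6):
    n = int(approached * consent_rate)
    print(f"consent rate {consent_rate:.0%}: n = {n}, MDE = {mde(n):.3f} SD")
```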
Data security procedures for researchers
This document provides a primer on basic data security themes and context on elements of data security that are particularly relevant for randomized evaluations using individual-level...
Design and iterate implementation strategy
Implementing partners and researchers should work closely together during the study design phase of a randomized evaluation to create a feasible implementation strategy. This resource is intended to...
Evaluating technology-based interventions
This resource provides guidance for evaluations that use technology as a key part of the intervention being tested. Examples of such interventions might include automated alerts embedded into an...
Grant proposals
The first step in embarking on a research project is often writing the grant proposal to fund it. Beyond securing funding for your project, the main purpose of the grant proposal is to lay out your...
Trial registration
This resource provides guidance for researchers wishing to register their study in a public trial registry. We list common social science registries and registration policies of common funders and...
Pre-analysis plans
A pre-analysis plan (PAP) describes how researchers plan to analyze the data from a randomized evaluation. It is distinct from the concept of pre-registration, which in economics is the act of...
Introduction to measurement and indicators
The goal of measurement is to get reliable data with which to answer research questions and assess theories of change. Inaccurate measurement can lead to unreliable data, from which it is difficult to...
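As a small illustration of why inaccurate measurement makes data unreliable (a sketch with made-up numbers, not part of the linked resource), the simulation below shows classical measurement error in an explanatory variable attenuating the estimated relationship toward zero by the reliability ratio var(x) / (var(x) + var(error)).

```python
# Illustrative sketch: noisier measurement of x biases the estimated slope
# toward zero, even though the true relationship is unchanged.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)               # true indicator
y = 2.0 * x + rng.normal(size=n)     # outcome with true slope 2.0

for noise_sd in (0.0, 0.5, 1.0):     # increasing measurement error in x
    x_obs = x + rng.normal(scale=noise_sd, size=n)
    slope = np.cov(x_obs, y)[0, 1] / np.var(x_obs)
    print(f"measurement error sd {noise_sd}: estimated slope = {slope:.2f}")
```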