Research Transparency and Reproducibility
J-PAL has been a leader in making research more transparent for over a decade, developing a registry for randomized evaluations and publishing data from research studies conducted by our affiliates. In 2008, J-PAL was one of the first organizations to have researchers publish their data on the Harvard Dataverse, a repository for published scientific data. In 2009, we began work on a hypothesis registry, a precursor to the American Economic Association's (AEA) registry for randomized controlled trials, where researchers can publish their study designs.
J-PAL works closely with other organizations that promote research transparency, including the Berkeley Initiative for Transparency in the Social Sciences (BITSS), Center for Open Science, Innovations for Poverty Action (IPA), and International Initiative for Impact Evaluation (3ie), among others.
Our five core activities are:
We run the American Economic Association’s (AEA) registry for randomized controlled trials.
Launched in 2013, the AEA RCT Registry is the central database of randomized experiments in economics. Researchers register the study designs of their evaluations and update their entries with post-trial information. By making available a more comprehensive list of randomized experiments, the registry helps researchers understand the extent of publication bias in economics, the bias that occurs because positive research results are more likely to be published. Registration is required by six of the AEA journals for all field experiments submitted for publication, and all J-PAL-funded studies are required to pre-register.
J-PAL worked with the AEA to complete the design and implementation of the registry, and has continued to develop, support, and maintain the registry.
We publish research data and make it accessible for reuse and replication.
J-PAL staff support our affiliated researchers to publish study data so that it can be used by other researchers for further exploration and analysis. We improve the quantity and quality of published research data by cleaning data and code, writing clarifying documentation, ensuring the protection of human subjects and study participants, and publishing data in trusted digital repositories. Together with our partner, Innovations for Poverty Action, we have built a new hub for data in field experiments in economics and public policy.
All research funded by J-PAL through our initiatives is subject to a data and code availability policy that requires researchers to prepare their data for publication within 18 months of the end of data collection.
We increase reproducibility by running checks for computational reproducibility and replications of original research before publishing.
J-PAL increases the reproducibility and verifiability of research studies by ensuring that the results of a study can be independently reproduced before the study or its research data are published. At the request of our affiliated researchers, we check the computational reproducibility of any research study, confirming that the results reported in the paper match those obtained by running the data and code on a third-party computer.
We also ran a pilot project from 2017 to 2019 that provided graduate students with fellowships to run full code replications of our affiliates’ studies. We tested the feasibility of a system that could make third-party code replications part of the analysis and publication process. To read more about the project, our fellowships, and some findings from the project, please see our OSF page.
We train and educate the next generation of researchers in transparent research practices.
J-PAL trains the next generation of researchers in transparent research practices through internal staff trainings and massive open online courses (MOOCs). J-PAL and IPA host three joint research staff trainings annually for full-time research assistants, field managers, and other research staff who work on randomized evaluations. These trainings provide research staff with the theoretical and technical foundations for designing and coordinating high-quality field experiments.
We have also developed three specific training modules related to our research transparency work that were incorporated into our online course, Designing and Running Randomized Evaluations.
We develop and promote resources to make research more transparent.
J-PAL maintains a comprehensive page of resources for transparency and reproducibility.
To see more about J-PAL’s research transparency work, check out our blog articles below:
- A new hub for data in randomized evaluations. Badani, Hasina, Pam Kingpetcharat, Karl Rubio, and James Turitto. J-PAL Blog, September 2019.
- Improving research transparency through easier, faster access to studies in the AEA RCT Registry. Turitto, James, and Keesler Welch. J-PAL Blog, August 2019.
- Why researchers should publish their data. Rubio, Karl. J-PAL Blog, March 2019.
- Pre-results review at the Journal of Development Economics: Taking transparency in the discipline to the next level. Welch, Keesler. J-PAL Blog, September 2018.
- Addressing the challenges of publication bias with RCT registration. Turitto, James, and Keesler Welch. J-PAL Blog, February 2018.
- Replication support for J-PAL evaluations now underway. J-PAL Blog, February 2017.
- Video: Announcement of J-PAL's three-year Research Transparency grant from the Alfred P. Sloan Foundation and the Laura and John Arnold Foundation. October 2016.
If you have any questions related to J-PAL's Research Transparency and Reproducibility work, please email [email protected].