Research Transparency and Reproducibility
For over a decade, J-PAL has been a leader in making research more transparent. In 2008, J-PAL was one of the first organizations to have researchers publish their data on the Harvard Dataverse, a repository for published scientific data. In 2009, we developed a hypothesis registry, the precursor to the American Economic Association's (AEA) registry for randomized controlled trials, where researchers can publish their study designs.
J-PAL works closely with other organizations that promote research transparency, including the Berkeley Initiative for Transparency in the Social Sciences (BITSS), Center for Open Science, Innovations for Poverty Action (IPA), and International Initiative for Impact Evaluation (3ie), among others.
Our five core activities include:
We run the American Economic Association’s (AEA) registry for randomized controlled trials.
The AEA RCT Registry is the central database of randomized experiments in economics. Researchers register the design of their evaluations and update their entries after the study has concluded. Though registration is still relatively new in the social sciences, registries have the potential to reveal the true extent of the file drawer problem and publication bias, in which papers with significant results are more likely to be published. Six AEA journals require registration for all field experiments submitted for publication. All J-PAL funded studies are required to register, ideally before the start of the trial intervention.
J-PAL worked with the AEA to design and implement the registry in 2013, and continues to develop, support, and maintain the registry.
We publish research data and make it accessible for reuse and replication.
J-PAL staff support our affiliated researchers to publish study data so that it can be re-used by other researchers, policy partners, students, and the broader research community for further exploration and analysis. The benefits of data sharing include generating insights across multiple studies through meta-analysis and enabling the replication and confirmation of published results. J-PAL improves the quantity and quality of published research data by cleaning data and code, writing clarifying documentation, ensuring the protection of human subjects and study participants, and publishing data in trusted digital repositories. Together with our partner, Innovations for Poverty Action, we have built a hub for data in field experiments in economics and public policy.
All research funded by J-PAL through our initiatives is subject to a data publication policy that requires researchers to prepare their data for publication within three years of the end of data collection.
We increase reproducibility by running replication checks before researchers publish their studies.
J-PAL increases the reproducibility and verifiability of research studies by ensuring that a study's results can be independently replicated before the study or its data is published. At the request of our affiliated researchers, we verify the computational reproducibility of a research study: that is, we confirm whether the results reported in a paper match those obtained by running the researchers' data and code on a third-party computer.
From 2017 to 2019, we ran a pilot project conducting full code replications of our affiliates' studies. The pilot tested the feasibility of making third-party code replications a standard part of researchers' analysis and publication process. To read more about the project and our fellowships, please see our OSF page.
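As a simplified illustration (not J-PAL's actual tooling), the core of a computational reproducibility check is re-running a study's analysis script and confirming that the regenerated outputs match the originally published ones, for example by comparing file checksums. The function and file names below are hypothetical:

```python
import hashlib
import subprocess
import sys
from pathlib import Path

def sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def check_reproducibility(analysis_script: str, outputs: list[str]) -> dict:
    """Re-run an analysis script and report which outputs were reproduced.

    Records a checksum of each original output file, re-runs the analysis
    (which is expected to regenerate those files), and returns, for each
    output, whether the regenerated file matches the original. A real
    replication would also pin the software environment (package versions,
    random seeds, etc.).
    """
    original = {p: sha256(Path(p)) for p in outputs}
    subprocess.run([sys.executable, analysis_script], check=True)
    return {p: sha256(Path(p)) == original[p] for p in outputs}
```

A third party running this against the published outputs would see `True` for every file whose results were exactly reproduced, making discrepancies easy to localize.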
We train and educate the next generation of researchers in transparent research practices.
J-PAL trains the next generation of researchers in transparent research practices through internal staff trainings and massive open online courses (MOOCs). J-PAL and IPA jointly host three research staff trainings annually, providing the theoretical and technical foundations for designing and coordinating high-quality field experiments: coding best practices, research ethics, data security, an introduction to research transparency, and more. We have also developed three training modules on research transparency that are part of the MITx MicroMasters course Designing and Running Randomized Evaluations.
We develop and promote resources to make research more transparent.
J-PAL maintains a library of practical resources on transparency and reproducibility, among other topics.
To see more about J-PAL’s research transparency work, check out our blog articles below:
- "Unlocking the file drawer" to ensure research results—even null results—are shared; Turitto, James, and Keesler Welch, J-PAL Blog, January 2020.
- Take practical steps to de-identifying and publishing research data with J-PAL’s new guides; Kopper, Sarah, Anja Sautmann, and James Turitto, J-PAL Blog, January 2020.
- A new hub for data in randomized evaluations; Badani, Hasina, Pam Kingpetcharat, Karl Rubio, and James Turitto, J-PAL Blog, September 2019.
- Improving research transparency through easier, faster access to studies in the AEA RCT Registry; Turitto, James, and Keesler Welch, J-PAL Blog, August 2019.
- Why researchers should publish their data; Rubio, Karl, J-PAL Blog, March 2019.
- Pre-results review at the Journal of Development Economics: Taking transparency in the discipline to the next level; Welch, Keesler, J-PAL Blog, September 2018.
- Addressing the challenges of publication bias with RCT registration; Turitto, James, and Keesler Welch, J-PAL Blog, February 2018.
- Replication support for J-PAL evaluations now underway; February 2017.
- Video: Announcement of J-PAL's three-year Research Transparency grant from the Alfred P. Sloan Foundation and Laura and John Arnold Foundation; October 2016.
If you have any questions related to J-PAL's Research Transparency and Reproducibility work, please email [email protected].