Reproducibility and Replication

Kickoff Workshop for the Center for Reproducible Science (CRS) at the University of Zurich.

Registration is closed.

About CRS

The objective of the Center for Reproducible Science (CRS) is to improve the overall reproducibility of empirical scientific research at UZH and to promote original research on reproducibility and on methodology related to reproducibility. One of the main added values of the CRS is that UZH researchers working on methodological aspects of research get together to discuss these challenges, keep each other updated on advances in fields that typically do not communicate intensively with each other, and, ideally, propose common solutions. Moreover, UZH researchers who are invested in replication or reproducibility efforts can connect with the methodologists of the CRS through training activities, direct collaboration, or simply publications. As a result, the overall quality and reproducibility of empirical scientific research at UZH will improve.

This strategic kick-off workshop aims to define the optimal set-up of the CRS's activities. As a working meeting it brings together international leaders in reproducibility and open-science initiatives and will allow us to benefit from lessons learned at already-established initiatives.

All UZH researchers are invited to participate in the workshop; please register.

Venue Info

UZH Building

SOC-1-106, Rämistrasse 69

Kickoff Schedule

Information on talks is continuously updated; please check back.

3 Sep, 2018

9:00 - 9:15

Welcome

By Michael Schaepman, Vice President for Research

9:15 - 9:30

Introduction

UZH Center for Reproducible Science (CRS)

By Leonhard Held, CRS Director

9:30 - 10:15

Three Recommendations for Improving the Use of p-Values

Even as the limitations of p-values are becoming more widely appreciated, we anticipate that p-values will continue to be widely reported. In statistical practice, perhaps the single biggest problem with p-values is that they are often misinterpreted in ways that lead to overstating the evidence against the null hypothesis. We recommend three practices, in increasing level of sophistication, that would help safeguard against such misinterpretation: (1) Use 0.005 as the threshold for statistical significance for novel discoveries, and refer to results with 0.005 < p < 0.05 as "statistically suggestive"; (2) When reporting a p-value, also report its corresponding Bayes factor upper bound; and (3) Report prior odds (ideally determined ex ante) against the null hypothesis and the posterior odds implied by multiplying the prior odds by the Bayes factor upper bound. The paper I am presenting is joint work with Jim Berger (Duke University).

By Daniel J. Benjamin, University of Southern California
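
A quick illustration of recommendations (2) and (3) (an editorial addition, not part of the abstract): the Bayes factor upper bound can be computed directly from the p-value via the well-known Sellke-Bayarri-Berger bound BFB = 1/(-e·p·ln p), valid for p < 1/e. A minimal Python sketch:

```python
import math

def bayes_factor_bound(p):
    """Upper bound on the Bayes factor against the null hypothesis,
    valid for 0 < p < 1/e (Sellke-Bayarri-Berger bound)."""
    if not 0 < p < 1 / math.e:
        raise ValueError("bound is only valid for 0 < p < 1/e")
    return 1 / (-math.e * p * math.log(p))

# Recommendation (2): report the bound alongside the p-value.
p = 0.005
bfb = bayes_factor_bound(p)        # ~13.9: at most ~14:1 evidence against H0

# Recommendation (3): posterior odds = prior odds x Bayes factor upper bound.
prior_odds = 1 / 10                # illustrative ex-ante odds against the null
posterior_odds = prior_odds * bfb  # ~1.4: much weaker than "1 in 200" suggests
print(f"p = {p}: BFB <= {bfb:.1f}, posterior odds <= {posterior_odds:.2f}")
```

At p = 0.005 the bound is roughly 14, so even a result at the proposed stricter threshold carries at most modest evidence when the prior odds against the null are low.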

Coffee

10:15 - 10:45

Coffee break

10:45 - 11:30

Replications and prediction markets in the social sciences

Why are there so many false positive results in the published scientific literature? And what is the actual share of false positives in different literatures? We will discuss these questions, see what prediction markets might add to our understanding of the reproducibility of science, and consider what we can do to increase reproducibility.

By Anna Dreber, Stockholm School of Economics

11:30 - 12:00

Discussion and recommendations

By Rainer Winkelmann, UZH

Lunch

12:00 - 13:00

Standing lunch

13:00 - 13:45

Psychology: from crisis to change

Psychology is in a crisis: prominent research findings fail to replicate, errors in the reporting of significance tests are common, the use of questionable research practices (QRPs) in the collection and analysis of data and in the reporting of results appears to be widespread, and some major fraud cases have come to light. This crisis has resulted in a call for more transparency and openness to improve psychological science. Various initiatives have been proposed and implemented: data and materials are becoming more easily available to other researchers, for example via the Open Science Framework, and transparent behavior is rewarded with badges. Another way to guarantee the confirmatory nature of a study is preregistration, that is, registering the hypothesis, study design, and data-analysis plan prior to data collection. Currently more than 100 journals offer the possibility to submit registered reports, which are peer reviewed before data collection and published regardless of the results. In this talk I will present some of the research that investigates the effectiveness of these solutions.

By Marjan Bakker, Tilburg University

13:45 - 14:15

Discussion and recommendations

By Marco Steenbergen, Carolin Strobl, UZH

14:15 - 15:00

Human behavioral challenges in reproducibility

Reproducibility and replicability are a major focus of both the scientific and statistical communities. Concerns about rates of reproducibility, replicability, and false discoveries have risen from a technical scientific issue to a broad societal discussion about the value of research. We all agree that reproducibility and replicability are fundamental to the scientific process. But what exactly do they mean? And what are the real barriers to the implementation of reproducible and replicable research? Is it the lack of software or infrastructure? Is it the lack of good statistical procedures? In this talk I will argue that perhaps the key challenge is our poor understanding of human-data interaction and of how to encourage people to use already existing tools and procedures to improve their research.

By Jeff Leek, Johns Hopkins University (presenting remotely)

Coffee

15:00 - 15:30

Coffee break

15:30 - 16:15

Data science: What it takes to reach reproducibility

Data science analyses usually involve a large number of software tools, reference data sets, and pipelines used to produce the results. While the process sounds trivial, reproducing it (let alone by other researchers) is often a burden, as many pieces of the puzzle are missing from the reported methodology. The result is a lack of transparency and trust and, most importantly, a loss of resources that would be better invested in replicability than in reproducibility.

In my talk, I will discuss the available technologies and their respective roles in a systematic reproducibility guideline for data science. I will emphasise the use of containerization technology and cloud computing as a main piece of the reproducibility puzzle and will present a live containerization example with the collaboration of the audience.

By Walid Gharib, Swiss Institute of Bioinformatics
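
The live example is of course not reproduced here, but one small piece of the puzzle the abstract describes (pinning down the exact software environment a result depends on) can be illustrated with a short, hypothetical Python sketch that writes an environment manifest next to the analysis outputs; a container image fixes the same information at the operating-system level:

```python
import json
import platform
import sys
from importlib import metadata

def environment_manifest():
    """Snapshot the software environment so a result can later be re-run
    under (approximately) the same conditions."""
    return {
        "python": sys.version,
        "platform": platform.platform(),
        "packages": {
            dist.metadata["Name"]: dist.version
            for dist in metadata.distributions()
        },
    }

if __name__ == "__main__":
    # Store the manifest alongside the analysis results.
    with open("environment.json", "w") as fh:
        json.dump(environment_manifest(), fh, indent=2, sort_keys=True)
```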

16:15 - 16:45

Discussion and recommendations

By Mark Robinson, UZH

4 Sep, 2018

9:00 - 9:45

The Berlin Institute of Health QUEST Center for transforming biomedical research

QUEST – Quality | Ethics | Open Science | Translation – aims to overcome the roadblocks in translational medicine and to increase the value and impact of biomedical research by maximizing the quality, reproducibility, generalizability, and validity of BIH research and beyond. We are creating awareness of the need to rethink biomedical research and to initiate a culture change in academic biomedicine. Our programs and initiatives 1) foster quality assurance by promoting compliance of preclinical as well as clinical research with standards and guidelines on design, conduct, analysis, and reporting; 2) develop and implement training and teaching resources on experimental and study design, methods to reduce bias, new modes of publishing, the digital footprint of academics, and open science; 3) improve the accessibility and transparency of BIH research and its results through open access and open data; 4) identify opportunities for improving research practice and obtain evidence for the impact of its activities through ‘research on research’; 5) develop new incentive systems in research, e.g. by selecting appropriate novel indicators and metrics for the assessment of the research performance of researchers and institutions; 6) assist in the implementation and evaluation of these novel systems; 7) foster research for and with the public to enhance public outreach and public involvement in BIH research; and 8) develop and implement innovative, science-based guidelines and training modules for the quality of research and the protection of humans and animals. Last but not least, QUEST 9) acts as an advisor to stakeholders in biomedicine, from funders to politics.

By Ulrich Dirnagl, Berlin Institute of Health

9:45 - 10:30

The Robust Research Initiative Oxford – Changing culture from the bottom up

How do we respond to the reproducibility crisis in academic research? That was the question that brought us together, a group of mostly early career researchers from the medical and social sciences. We decided that we had heard enough talks about the reproducibility crisis and wanted to do something about it. We started the Robust Research Initiative last year with the aim of promoting open and reproducible research in our research environments at the University of Oxford. I will give a brief overview of our current and planned activities, from implementing educational projects to providing strategic solutions at higher levels and interacting with (inter)national networks. But in the end, we will only be successful if we manage to bring about much-needed cultural change. I will offer some answers to the questions of what needs to change and how we can instigate change from the perspective of early career researchers.

By Verena Heise, University of Oxford

Coffee

10:30 - 11:00

Coffee break

11:00 - 11:45

Biomedical research: Is failed replication always a bad thing?

Failed replication in biomedical research may occur if the originator study was falsely positive (by chance, or because the experimental design placed the study at risk of bias); if our understanding of the literature is polluted by publication bias; or in the presence of some unknown (latent) independent variable which influences the phenomenon under study. By definition, this last explanation promises the discovery of a previously unknown facet of the process being studied. Approaches to replication studies might therefore focus on the parallel tasks of increasing the probability that published research is true, and then exploring the possible causes of unexplained conflicting results.

By Malcolm Macleod, University of Edinburgh
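
As a back-of-the-envelope complement (an editorial illustration, not from the talk): the probability that a published positive finding is true can be computed from study power, the significance level, and the prior probability that tested hypotheses are true, in the spirit of Ioannidis' well-known analysis:

```python
def prob_positive_is_true(power, alpha, prior):
    """Probability that a statistically significant finding reflects a true
    effect, given study power, significance level alpha, and the prior
    probability that a tested hypothesis is true."""
    true_positives = power * prior
    false_positives = alpha * (1 - prior)
    return true_positives / (true_positives + false_positives)

# Well-powered tests of plausible hypotheses are usually right...
print(prob_positive_is_true(power=0.8, alpha=0.05, prior=0.5))  # ~0.94
# ...but low power plus a low prior makes failed replication the norm.
print(prob_positive_is_true(power=0.2, alpha=0.05, prior=0.1))  # ~0.31
```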

11:45 - 12:15

Discussion and recommendations

By Lukas Sommer, UZH

Lunch

12:15 - 13:15

Standing lunch

13:15 - 14:00

Treating NHST addiction among the research community

Since its emergence around 90 years ago, the use of null hypothesis significance testing (NHST) has reached epidemic proportions in the research community. Its well-known symptoms include confusion, illusory beliefs, and mood swings triggered by whether or not use of NHST leads to “significance”. I argue that such addiction will not be cured by increasing the dosage, nor by switching to powerful substitutes. Instead, I will describe an alternative strategy based on a combination of education and the use of a new technique that can be used alongside NHST. The strategy allows those affected to see how NHST causes harm, and encourages them to become fully functioning members of the research community.

By Robert Matthews, Aston University

14:00 - 14:45

Interactive input from the audience

By Eva Furrer, Leonhard Held, UZH

14:45 - 15:00

Discussion, recommendations and wrap-up

By Leonhard Held, UZH

Coffee

15:00 - 15:30

Coffee break

Afterwards

CRS Steering Committee meeting


Our Speakers

Michael Schaepman

Daniel Benjamin

Anna Dreber

Marjan Bakker

Jeff Leek

Walid Gharib

Ulrich Dirnagl

Verena Heise

Malcolm Macleod

Robert Matthews

Steering Committee

Sara Fabrikant

Leonhard Held

Mark Robinson

Gerhard Rogler

Lukas Sommer

Marco Steenbergen

Carolin Strobl

Rainer Winkelmann

Workshop Sponsors

Registration is closed.

Contact Info

Address

UZH CRS, EBPI, Hirschengraben 84

Web

www.crs.uzh.ch