The Meta-Research Center is Looking for Four PhD Candidates!

We currently have vacancies for four exciting PhD projects in our group:

  1. Two positions in the project “Examining variation in causal effects in psychology”: https://tiu.nu/21562

  2. One position in the project “How trustworthy are simulation studies? A meta-research perspective”: https://tiu.nu/21530

  3. One position in the project “From degrees of freedom to robustness: strengthening the evidence base for psychological interventions”: https://tiu.nu/21539

Robbie van Aert Wins a Veni Grant for his Research on Meta-Analysis, Preregistration, and Replication

We are very happy to announce that Robbie van Aert was awarded an NWO Veni grant (€280,000) for his research entitled “Empowering meta-analysis by taking advantage of preregistered and replication studies”.


Below you can find a short description of the proposed research:


An important threat to the validity of meta-analyses is publication bias. Replication and preregistered studies are deemed less susceptible to publication bias. I will develop a novel meta-analysis methodology that optimally synthesizes conventional studies with replication and preregistered studies and corrects for publication bias. This new methodology yields more accurate conclusions in meta-analyses.

Young eScientist Award to Improve "statcheck"

The Netherlands eScience Center awarded our team member Michèle Nuijten and our colleague from social psychology Willem Sleegers the Young eScientist Award for their proposal to improve statcheck. The prize consists of €50,000 worth of expertise from the eScience Center, which will be used to expand statcheck’s search algorithm with more advanced techniques such as Natural Language Processing.

‘We are […] very excited to collaborate with the eScience Center to improve statcheck’s searching algorithm to make it ‘smarter’ in recognizing statistical results so that it can also spot errors in other scientific fields. We are confident that by collaborating with the eScience Center, we can expand statcheck to improve scientific quality on an even larger scale.’
— Michèle & Willem

NWO Veni Grant for the 4-Step Robustness Check

We are happy to announce that a €250,000 NWO Veni Grant was awarded to Michèle Nuijten for her proposed 4-Step Robustness Check.

She describes the project on her website:

To check the robustness of a study, you could replicate it in a new sample. However, in my 4-Step Robustness Check, you first verify whether the reported numbers in the original study are correct. If they're not, they are not interpretable, and you can't compare them to the results of your replication.

Specifically, I advise researchers to do the following:

  1. Check if there are visible errors in the reported numbers, for example by running a paper through my spellchecker for statistics, statcheck (see the short sketch after this list)

  2. Reanalyze the data following the original strategy to see if this leads to the same numbers

  3. Check if the result is robust to alternative analytical choices

  4. Perform a replication study in a new sample
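
For step 1, a minimal sketch of what a statcheck run looks like in R (statcheck is on CRAN; the PDF file name below is hypothetical):

  # install.packages("statcheck")
  library(statcheck)

  # A consistent result and a hypothetical transposition typo:
  statcheck(c("t(28) = 2.20, p = .036",
              "t(28) = 2.20, p = .063"))
  # statcheck recomputes each p-value from the reported test statistic
  # and degrees of freedom; the second result is flagged as inconsistent.

  # Whole papers can be scanned too (hypothetical file name):
  # checkPDF("original_study.pdf")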

The 4-Step Robustness Check can be used to efficiently assess the robustness of results.

This 4-step procedure provides an efficient framework for assessing whether a study's findings are robust. Note that the first steps take far less time than a full replication and might already be enough to conclude that a result is not robust.

The proposed framework can also be used as an efficient checklist for researchers to improve robustness of their own results:

  1. Check the internal consistency of your reported results

  2. Share your data and analysis scripts to facilitate reanalysis

  3. Conduct and report your own sensitivity analyses

  4. Write detailed methods sections and share materials to facilitate replication

Ultimately, I aim to create interactive, pragmatic, and evidence-based methods to improve and assess robustness, applicable to psychology and other fields.

I would like to wholeheartedly thank my colleagues, reviewers, and committee members for their time, feedback, and valuable insights. I’m looking forward to the next three years!

Sustainable Science Symposium

The Sustainable Science Symposium will take place on 15 April at the LocHal in Tilburg. Registration is open and costs 15 euros. The organizers also have an open call for ‘dogfood’ discussion sessions and virtual posters, both further detailed below.

This symposium aims to go beyond recurring topics in open science and stimulate dialogue on issues such as academic culture, intellectual property & copyright of research output, scholarly communication, infrastructural governance, and how to venture beyond the traditional sphere to connect to marginalized communities.

The symposium will be kicked off with talks by Sam Moore, Chris Meyns, Iris van Rooij, and Anasuya Sengupta. These will be followed by 30-minute ‘dogfood’ sessions, for which we invite concrete discussion questions on open and sustainable science. Don’t worry if you’re not familiar with dogfood sessions; our submission form will guide you in adapting your question!

We also invite submissions for our virtual poster exhibition, which we’ve set up as an experiment in alternative + remote conferencing formats. Submissions can also include slide decks or even artwork! VR headsets will be provided for local attendees to explore the virtual world.

Join us for a day of radical and uncompromising thought about the sustainability of science in all its shapes and forms! More information, registration, and submissions at sustainablesci.com.

This event is organized in conjunction with Chris Hartgerink’s PhD defense on “Contributions Towards Understanding and Building Sustainable Science” on April 17.

Leonie Van Grootel Wins the Thomas C. Chalmers Award


In 2018, the winner of the best short oral presentation award was Leonie Van Grootel, for ‘Using Bayesian information for matching qualitative and quantitative sources in a mixed studies review’.

Read more on Cochrane’s blog.

Michèle Wins the Tilburg University Dissertation Prize

We are excited to announce that Michèle has won the Tilburg University Dissertation Prize with her PhD Thesis “Research on Research: A Meta-Scientific Study of Problems and Solutions in Psychological Science”.

In her thank-you speech, Michèle emphasized the importance of interdisciplinarity in solving replicability problems: “I think researchers from different fields should be open to learn the best practices from other fields.”

She ended by thanking Tilburg University for this prize, also because “it shows me that Tilburg University thinks it is good to be critical about the scientific system, and that open science is an important step forward.”

The full thesis can be found online at https://psyarxiv.com/qtk7e.


Letter in the Chronicle: The Open Science Movement is Cooperative, not Destructive

Together with a group of colleagues, Michèle wrote a letter to the editor of the Chronicle in reply to an earlier article that presented the open science movement as “burning things to the ground”. Michèle and her colleagues disagreed: they argued that they mainly see cooperative, constructive, and pragmatic initiatives to improve the state of psychological science.

Read the full letter here.


Chartier, C. R., Kline, M. E., McCarthy, R. J., Nuijten, M. B., Dunleavy, D. J., & Ledgerwood, A. A cooperative revolution in psychology. The Chronicle of Higher Education. Retrieved from https://www.chronicle.com/blogs/letters/a-cooperative-revolution-in-psychology/.

From Replication to Reproducible Knowledge: A Hybrid Method to Combine Evidence Across Studies

A recent publication by Robbie van Aert and Marcel van Assen in Behavior Research Methods has been featured on the Psychonomic Society's blog by Stephan Lewandowsky.

The blog post outlines the issue of combining the results of a replication study with those of an originally published experiment, and describes van Aert and van Assen's hybrid meta-analysis technique that addresses this issue.
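
To give a flavor of the approach, below is a strongly simplified, z-test-based sketch (my own illustration, not the authors' implementation; all inputs are invented). It exploits the idea that, at the true effect size, the original study's p-value conditional on its significance and the replication's ordinary p-value are both uniformly distributed, and estimates the effect as the value at which the sum of these (conditional) p-values equals its expectation of 1:

  # Simplified hybrid estimator for one significant original study and
  # one replication, assuming normal estimates and a positive effect
  # tested right-sided at the two-tailed alpha level.
  hybrid_sketch <- function(y_orig, se_orig, y_rep, se_rep, alpha = .05) {
    zcrit <- qnorm(1 - alpha / 2)
    q_sum <- function(theta) {
      # original: probability of exceeding the observed estimate,
      # conditional on having been statistically significant
      q_o <- pnorm(y_orig, theta, se_orig, lower.tail = FALSE) /
        pnorm(zcrit * se_orig, theta, se_orig, lower.tail = FALSE)
      # replication: ordinary one-sided p-value, no selection assumed
      q_r <- pnorm(y_rep, theta, se_rep, lower.tail = FALSE)
      q_o + q_r
    }
    # solve q_sum(theta) = 1 for theta
    uniroot(function(th) q_sum(th) - 1, interval = c(-2, 2))$root
  }

  # Invented numbers: a significant original and a smaller replication
  hybrid_sketch(y_orig = 0.45, se_orig = 0.15, y_rep = 0.15, se_rep = 0.10)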

Read more

New Paper by Robbie van Aert

Title: Multistep estimators of the between‐study variance: The relationship with the Paule‐Mandel estimator

Authors: Robbie C. M. van Aert & Dan Jackson

Published in: Statistics in Medicine

Abstract

A wide variety of estimators of the between‐study variance are available in random‐effects meta‐analysis. Many, but not all, of these estimators are based on the method of moments. The DerSimonian‐Laird estimator is widely used in applications, but the Paule‐Mandel estimator is an alternative that is now recommended. Recently, DerSimonian and Kacker have developed two‐step moment‐based estimators of the between‐study variance. We extend these two‐step estimators so that multiple (more than two) steps are used. We establish the surprising result that the multistep estimator tends towards the Paule‐Mandel estimator as the number of steps becomes large. Hence, the iterative scheme underlying our new multistep estimator provides a hitherto unknown relationship between two‐step estimators and the Paule‐Mandel estimator. Our analysis suggests that two‐step estimators are not necessarily distinct estimators in their own right; instead, they are quantities that are closely related to the usual iterative scheme that is used to calculate the Paule‐Mandel estimate. The relationship that we establish between the multistep and Paule‐Mandel estimator is another justification for the use of the latter estimator. Two‐step and multistep estimators are perhaps best conceptualized as approximate Paule‐Mandel estimators.
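
The paper's main result is easy to see numerically. Below is a small sketch (my own illustration, not code from the paper) using the metafor package and its bundled BCG vaccine dataset: iterating the generic method-of-moments update a few times gives the multistep estimator, and iterating it many times approaches metafor's Paule-Mandel estimate:

  library(metafor)

  # Log risk ratios and sampling variances from the BCG vaccine trials
  dat <- escalc(measure = "RR", ai = tpos, bi = tneg,
                ci = cpos, di = cneg, data = dat.bcg)

  # Method-of-moments update with weights 1 / (vi + tau2); starting from
  # tau2 = 0, one step gives DerSimonian-Laird, two steps a two-step
  # estimator, and many steps approach the Paule-Mandel fixed point.
  multistep_tau2 <- function(yi, vi, steps, tau2 = 0) {
    for (s in seq_len(steps)) {
      wi   <- 1 / (vi + tau2)
      mu   <- sum(wi * yi) / sum(wi)
      Qw   <- sum(wi * (yi - mu)^2)
      num  <- Qw - (sum(wi * vi) - sum(wi^2 * vi) / sum(wi))
      den  <- sum(wi) - sum(wi^2) / sum(wi)
      tau2 <- max(0, num / den)
    }
    tau2
  }

  multistep_tau2(dat$yi, dat$vi, steps = 2)    # two-step estimator
  multistep_tau2(dat$yi, dat$vi, steps = 50)   # multistep, many steps
  rma(yi, vi, data = dat, method = "PM")$tau2  # Paule-Mandel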

DOI: https://doi.org/10.1002/sim.7665

"statcheck" in the Guardian's Weekly Science Podcast


This week the Guardian's Science Weekly podcast focuses on statistical malpractice and fraud in science. Michèle talks about the role of statcheck in detecting statistical inconsistencies and discusses the causes and implications of seemingly innocent rounding errors. The podcast also offers fascinating insights from consultant anaesthetist John Carlisle on the detection of data fabrication, and from Royal Statistical Society president David Spiegelhalter on the dangers of statistical malpractice.

Awarded a Campbell Methods Grant

Campbell-Collaboration-logo.png

We are honored to announce that Michèle Nuijten was awarded a $20,000 methods grant from the Campbell Collaboration, together with meta-analysis expert Joshua R. Polanin. They were awarded the grant for the project “Verifying the Accuracy of Statistical Significance Testing in Campbell Collaboration Systematic Reviews Through the Use of the R Package statcheck”. The grant is part of the Campbell Collaboration's program to support innovative methods development in order to improve the quality of systematic reviews. For more information about the grant and the three other recipients, see their website here.

New Paper on “Bayesian evaluation of effect size after replicating an original study”


Title: Bayesian evaluation of effect size after replicating an original study

Authors: Robbie C. M. van Aert & Marcel A. L. M. van Assen

Published in: PLOS One

Abstract 

The vast majority of published results in the literature is statistically significant, which raises concerns about their reliability. The Reproducibility Project: Psychology (RPP) and the Experimental Economics Replication Project (EE-RP) both replicated a large number of published studies in psychology and economics. The original study and replication were both statistically significant in 36.1% of cases in RPP and 68.8% in EE-RP, suggesting many null effects among the replicated studies.

However, evidence in favor of the null hypothesis cannot be examined with null hypothesis significance testing. We developed a Bayesian meta-analysis method called snapshot hybrid that is easy to use and understand and quantifies the amount of evidence in favor of a zero, small, medium and large effect. The method computes posterior model probabilities for a zero, small, medium, and large effect and adjusts for publication bias by taking into account that the original study is statistically significant.

We first analytically approximate the method's performance, and demonstrate the necessity to control for the original study's significance to enable the accumulation of evidence for a true zero effect. We then applied the method to the data of RPP and EE-RP, showing that the underlying effect sizes of the included studies in EE-RP are generally larger than in RPP, but that the sample sizes of especially the included studies in RPP are often too small to draw definite conclusions about the true effect size. We also illustrate how snapshot hybrid can be used to determine the required sample size of the replication, akin to power analysis in null hypothesis significance testing, and present an easy-to-use web application (https://rvanaert.shinyapps.io/snapshot/) and R code for applying the method.
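
For intuition only, here is a stripped-down sketch of the snapshot logic (my own simplification, not the authors' implementation, which is available via the web application above). With equal prior probabilities, each candidate effect's posterior probability is proportional to the replication's likelihood times the original's likelihood renormalized to the significant region; the candidate effect sizes and all inputs below are illustrative:

  # Posterior model probabilities for four candidate effects, given a
  # statistically significant (right-sided) original study and a
  # replication. yo/so and yr/sr are estimates and standard errors.
  snapshot_sketch <- function(yo, so, yr, sr, alpha = .05,
                              effects = c(zero = 0, small = .1,
                                          medium = .3, large = .5)) {
    zcrit <- qnorm(1 - alpha / 2)
    lik <- sapply(effects, function(theta) {
      # original: density renormalized to the region where it could
      # have been published, adjusting for publication bias
      lo <- dnorm(yo, theta, so) /
        pnorm(zcrit * so, theta, so, lower.tail = FALSE)
      lr <- dnorm(yr, theta, sr)  # replication: no selection assumed
      lo * lr
    })
    lik / sum(lik)  # equal-prior posterior model probabilities
  }

  # Invented numbers: most evidence for a zero-to-small effect
  round(snapshot_sketch(yo = 0.40, so = 0.15, yr = 0.05, sr = 0.08), 3)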

DOI: https://doi.org/10.1371/journal.pone.0175302

ERC Consolidator Grant for Jelte Wicherts


Jelte Wicherts has been awarded a prestigious €2 million Consolidator Grant from the European Research Council (ERC). With this funding, the meta-research group will be expanded with two postdocs and two PhD students.

The project, entitled “IMPROVE: Innovative Methods for Psychology: Reproducible, Open, Valid, and Efficient”, will start in the second half of 2017.

Postprint "Who Believes in the Storybook Image of the Scientist?"

A new manuscript by the Meta-Research group, with Coosje Veldkamp as first author, is in press at Accountability in Research. The final paper will be available Open Access, but in the meantime you can find the abstract below and the postprint on PsyArXiv. Abstract:

Do lay people and scientists themselves recognize that scientists are human and therefore prone to human fallibilities such as error, bias, and even dishonesty? In a series of three experimental studies and one correlational study (total N = 3,278) we found that the ‘storybook image of the scientist’ is pervasive: American lay people and scientists from over 60 countries attributed considerably more objectivity, rationality, open-mindedness, intelligence, integrity, and communality to scientists than other highly-educated people. Moreover, scientists perceived even larger differences than lay people did. Some groups of scientists also differentiated between different categories of scientists: established scientists attributed higher levels of the scientific traits to established scientists than to early-career scientists and PhD students, and higher levels to PhD students than to early-career scientists. Female scientists attributed considerably higher levels of the scientific traits to female scientists than to male scientists. A strong belief in the storybook image and the (human) tendency to attribute higher levels of desirable traits to people in one’s own group than to people in other groups may decrease scientists’ willingness to adopt recently proposed practices to reduce error, bias and dishonesty in science.

Leamer-Rosenthal Prize for statcheck

Michèle Nuijten and Sacha Epskamp are two of the nine winners of the 2016 Leamer-Rosenthal Prize for Open Social Science for their work on statcheck. The prize is an initiative of the Berkeley Initiative for Transparency in the Social Sciences (BITSS) and comes with $10,000. They will receive their award at the 2016 BITSS annual meeting, along with seven other researchers and educators.

Read more here.