Got questions about our project? You can contact us at repliCATSemail@example.com.
However, here’s a list of frequently asked questions that might help.
- What is “replication” as defined for this project?
Replication, along with many related terms like reproducibility, is contested. That is, these terms have multiple meanings.
For this project, our working definition of a direct replication is a replication that follows the methods of the original study with a high degree of similarity, varying aspects only where there is high confidence that they are not relevant to the research claim. The aim of a direct replication is to improve confidence in the reliability and validity of an experimental finding by starting to account for things such as sampling error, measurement artefacts, and questionable research practices.
- Does repliCATS stand for something?
Yes. The “CATS” in repliCATS is an acronym for Collaborative Assessment for Trustworthy Science.
- Who is part of your research team?
We are an interdisciplinary research team based predominantly at the University of Melbourne. You can meet the research team here.
- What are the aims of the repliCATS project?
We are developing and testing methods to elicit accurate predictions about the likely replicability of published research claims in the social sciences. As you may be aware, some large-scale, crowdsourced replication projects have alerted us to the possibility that replication success rates may be lower than we once thought. Our project will assist with the development of efficient methods for critically evaluating the evidence base of social science research.
- What is the IDEA protocol?
The IDEA protocol is a structured protocol for eliciting expert judgements, based on the Delphi process. IDEA stands for Investigate, Discuss, Estimate, Aggregate. Applying the IDEA protocol involves recruiting a diverse group of experts to answer questions with probabilistic or quantitative responses:
- Investigate: experts first investigate the questions and clarify the meanings of terms, reducing variation caused by linguistic ambiguity. Each expert then provides a private, individual estimate using a 3- or 4-step method (highest, lowest, best guess).
- Discuss: the group's private estimates are revealed, so group members can see how their estimates sit in relation to others'. The group discusses the results, shares information, and cross-examines reasoning and evidence.
- Estimate: group members individually provide a second and final private estimate.
- Aggregate: these second-round estimates are then combined using mathematical aggregation.
The strength of the IDEA protocol in eliciting predictions of the likely replicability of research claims lies in the stepped, structured nature of the approach. The feedback and discussion components of the IDEA protocol both function to reduce overconfidence in estimates, which is a known limitation of expert elicitation methods. The discussion component also allows experts to account for private information which could substantially alter the likely replicability assessment of a research claim.
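To make the Estimate and Aggregate steps concrete, here is a minimal sketch in Python. It assumes each expert submits a three-step (lowest, best guess, highest) estimate of a claim's replication probability, and it combines the second-round estimates with a simple unweighted mean. The mean is only one of many possible mathematical aggregation rules; the data and the aggregation rule shown here are illustrative assumptions, not the project's actual method.

```python
from statistics import mean

def aggregate_estimates(estimates):
    """Combine second-round (lowest, best, highest) three-step estimates
    by averaging each component across experts (unweighted mean)."""
    lows, bests, highs = zip(*estimates)
    return (mean(lows), mean(bests), mean(highs))

# Hypothetical second-round estimates from three experts, after discussion:
round_two = [
    (0.30, 0.50, 0.70),
    (0.40, 0.55, 0.75),
    (0.35, 0.45, 0.65),
]

low, best, high = aggregate_estimates(round_two)
print(f"group estimate: {best:.2f} (interval {low:.2f}-{high:.2f})")
# prints: group estimate: 0.50 (interval 0.35-0.70)
```

Asking for lowest and highest plausible values alongside a best guess gives each expert an uncertainty interval, which the feedback and discussion rounds can then narrow or widen before the final aggregation.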
- Can I participate in this project?
Yes! We hope to crowdsource expert judgements from a diverse range of participants in the following broad disciplines:
- political science
- public administration
- marketing
If you are interested in participating, express interest using this form so we can contact you when we are ready to begin the next phase of the project.
- If I participate, what’s in it for me?
Your participation will help us to refine methods for predicting the replicability of social and behavioural science claims. Any data we collect could drastically change the way we think about published research evidence. For individual participants, it also provides the opportunity to develop your skills through peer interaction and to become a more critical consumer of the research literature.
- How are the 3,000 research claims chosen?
The Center for Open Science (USA) is selecting the 3,000 research claims. These claims will be drawn from the following journals:
- Law and Human Behavior
- Journal of Consumer Research
- Journal of Marketing
- Journal of Marketing Research
- Journal of Organizational Behavior
- Journal of the Academy of Marketing Science
- Organizational Behavior and Human Decision Processes
- American Economic Journal: Applied Economics
- American Economic Review
- Experimental Economics
- Journal of Finance
- Journal of Financial Economics
- Journal of Labor Economics
- Quarterly Journal of Economics
- Review of Financial Studies
- American Political Science Review
- British Journal of Political Science
- Comparative Political Studies
- Journal of Conflict Resolution
- Journal of Experimental Political Science
- Journal of Political Economy
- World Development
- World Politics
- American Educational Research Journal
- Computers and Education
- Contemporary Educational Psychology
- Educational Researcher
- Exceptional Children
- Journal of Educational Psychology
- Learning and Instruction
- Child Development
- Clinical Psychological Science
- European Journal of Personality
- Evolution and Human Behavior
- Journal of Applied Psychology
- Journal of Consulting and Clinical Psychology
- Journal of Environmental Psychology
- Journal of Experimental Psychology: General
- Journal of Experimental Social Psychology
- Journal of Personality and Social Psychology
- Psychological Science
- Health Psychology
- Psychological Medicine
- Social Science and Medicine
- Journal of Public Administration Research and Theory
- Public Administration Review
- Academy of Management Journal
- Journal of Business Research
- Journal of Management
- Leadership Quarterly
- Management Science
- Organization Science
- American Journal of Sociology
- American Sociological Review
- European Sociological Review
- Journal of Marriage and Family
- Social Forces
- How can I get more information?
You can follow us on Twitter, @replicats.
Or, you can send us an email at repliCATSfirstname.lastname@example.org.