Felix Singleton Thorn

Affiliations: Interdisciplinary Meta-Research Group, The University of Melbourne

Poster title: Statistical power and effect sizes in psychology are decreasing over time

Abstract: This poster presents a meta-analysis of 46 studies assessing the statistical power of psychology research at Cohen’s effect size benchmarks, and an analysis of over 130,000 effect size estimates from over 9,000 articles published in 5 APA journals from 1985 to 2013. The first study shows that the average statistical power of psychology research is extremely low for ‘small’ effects, .23 (95% CI [.17, .29]), somewhat low for ‘medium’ effects, .62 (95% CI [.54, .70]), and only acceptably high for ‘large’ effects, .80 (95% CI [.68, .92]). It also shows that these values have changed little, if at all, over time, with estimated yearly changes of -0.000 (95% CI [-0.003, 0.003]), 0.001 (95% CI [-0.002, 0.004]), and -0.001 (95% CI [-0.002, 0.001]) at the small, medium, and large benchmarks, respectively. However, the results of the second study show that effect sizes reported in published psychology research are becoming smaller over time, suggesting that the average statistical power of psychology research is decreasing.
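For intuition about what power figures like these mean, here is a minimal sketch computing the power of a two-sample t-test at Cohen’s benchmarks; the test type and per-group sample size (n = 50) are illustrative assumptions, not details from the poster:

```python
# Power of a two-sample t-test at Cohen's benchmark effect sizes.
# The per-group n of 50 is an illustrative assumption, not a figure
# reported in the poster.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
    power = analysis.power(effect_size=d, nobs1=50, alpha=0.05)
    print(f"{label} (d = {d}): power = {power:.2f}")
```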

13_15_SingletonThorn_AIMOS_poster.png

Charles T Gray

Affiliations: La Trobe University, Australia

Poster title: Truth, Proof, and Reproducibility: There's no counter-attack for the codeless

Abstract: The task of the modern mathematical scientist has drifted from that of a blackboard rhetorician, where the craft of proof reigned, to a scientific workflow that now more closely resembles that of an experimental scientist. So, what is proof in modern mathematics? And if proof is unattainable in other fields, what is due scientific diligence in a computational experimental environment? How do we measure truth in the context of uncertainty? https://arxiv.org/abs/1907.05947


Jamil Suprihatiningrum

Affiliations: Flinders University

Poster title: Data Accessibility of Students with Disabilities: Challenges for Educational Research in Indonesia

Abstract: Indonesian disability statistics relating to education at the national scale are almost non-existent. As the country's main data agency, the Central Bureau of Statistics (BPS) of Indonesia has difficulty providing access to data on disability issues because such data are inadequate and underreported. These gaps stem from the high levels of shame and stigma attached to students with disabilities, and from vague agreement on the definition of disability, which leads to misleading data collection and reporting. Parents resist acknowledging their children's impairments and tend to lock away that information. Schools, for their part, do not take students with a disability into account during the admission process or in subsequent activities. These data gaps make it very difficult for researchers to identify the real problems, analyse the data, and produce recommendations for stakeholders on disability policy. Bringing together information from multiple data sources is therefore important to provide a comprehensive picture of disability in education.


Lina Aviyanti

Affiliations: Flinders University and Indonesia University of Education (UPI)

Co-presenters: Carol R. Aldous & Penny Van Deur

Poster title: A Quantitative Study of Students' Physics Conceptual Understanding: A Structural Equation Modelling Analysis

Abstract: Many studies have examined the relationships between the various factors that affect students' conceptual understanding, but most of these analyses have relied on univariate or bivariate methods. This study replicates and extends this prior research by combining multiple variables and applying multivariate procedures. Statistical analyses such as the Rasch model and Confirmatory Factor Analysis (CFA) are used for instrument validation. Further, Structural Equation Modelling (SEM) is employed to test a complex model of relationships between constructs. The findings provide empirical evidence regarding multiple factors that predict students' understanding of physics concepts at the tertiary level.
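To illustrate the kind of workflow the abstract describes, here is a minimal SEM sketch in Python using the semopy package; the latent constructs, indicator names, and data file are hypothetical placeholders, not the variables measured in this study:

```python
# Hypothetical CFA + SEM sketch with semopy; the constructs and
# indicators (motivation, prior_knowledge, m1..u3) and the data file
# are illustrative placeholders, not the study's actual variables.
import pandas as pd
import semopy

# Measurement model (the CFA part), then the structural model
# predicting conceptual understanding.
MODEL_DESC = """
motivation =~ m1 + m2 + m3
understanding =~ u1 + u2 + u3
understanding ~ motivation + prior_knowledge
"""

data = pd.read_csv("survey_responses.csv")  # hypothetical data file
model = semopy.Model(MODEL_DESC)
model.fit(data)
print(semopy.calc_stats(model))  # fit indices such as CFI and RMSEA
print(model.inspect())           # parameter estimates
```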


Guy Prochilo

Affiliations: The University of Melbourne

Poster title: Organizational Neuroscience Needs Careful and Consistent Post-publication Peer Review

Abstract: Organizational neuroscience is a field that applies neuroscience methods to the study of workplace behavior. While empirical work in the field has grown, the soundness of this work is rarely evaluated beyond initial peer review. In this review we address this void and systematically evaluate the field’s seminal works and selected secondary works. In doing so, we identify a series of statistical and reporting problems that have been ignored across a decade of research, including inadequate reporting practices, misuse of significance testing, and little consideration of effect size and uncertainty. We propose that scholars adopt a more careful approach to post-publication review and provide recommendations.

11_15_Prochilo_AIMOS_poster.png

Rachael West

Affiliations: The University of Sydney Children's Hospital at Westmead Clinical School & Kids Research, The Children's Hospital at Westmead

Poster title: Identifying biomedical literature containing erroneous nucleotide sequences using a targeted and screening approach

Abstract: Biomedical literature describing erroneous nucleotide sequence reagents could reduce research reproducibility. We therefore developed the semi-automatic fact-checking tool Seek & Blastn (SB) to verify the targeting/non-targeting status of published nucleotide sequences. We identified two corpora for SB analysis using a targeted approach (keywords to identify single gene knockdown (SGK) papers) and a screening approach (papers published in the journal Gene from 2007-2018). SB analysis, supported by manual verification, detected nucleotide sequence errors in 104/174 (60%) SGK papers and 262/933 (28.1%) Gene papers. Incorrect reagents may therefore represent a hidden problem within the biomedical literature.
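As a conceptual sketch of the verification step (not the actual Seek & Blastn pipeline), the following snippet uses Biopython to BLAST a published sequence against the NCBI nucleotide database and check whether its top hits mention the gene the paper claims it targets; the sequence and gene name are hypothetical:

```python
# Conceptual sketch only: check whether a published knockdown reagent's
# top BLAST hits mention its claimed target gene. The sequence and gene
# name are hypothetical; short queries may need tuned BLAST settings.
from Bio.Blast import NCBIWWW, NCBIXML

claimed_gene = "TP53"             # hypothetical claimed target
sequence = "GACTCCAGTGGTAATCTAC"  # hypothetical 19-nt reagent sequence

handle = NCBIWWW.qblast("blastn", "nt", sequence)
record = NCBIXML.read(handle)
top_titles = [aln.title for aln in record.alignments[:5]]
if not any(claimed_gene in title for title in top_titles):
    print(f"Possible error: top hits do not mention {claimed_gene}")
```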

15_15_West_AIMOS_poster.png

Martin Héroux

Affiliation: Neuroscience Research Australia

Poster title: Development of the Quality Output Checklist and Content Assessment (QuOCCA)

Abstract: The Research Quality Committee at NeuRA in Sydney was set up to improve the institute's clinical and non-clinical research and to address problems of poor research reproducibility. We developed the Quality Output Checklist and Content Assessment (QuOCCA): a critical appraisal tool to assess the trustworthiness of published, peer-reviewed research papers. Now in its final form, the QuOCCA has 12 items designed to assess transparency, design and analysis, and reporting practices. Rather than generate a score, the QuOCCA is intended to reveal study strengths and weaknesses across these critical domains and to measure change in practice.

9_15_Heroux_AIMOS_poster_QuOCCA.png

Martin Héroux

Affiliation: Neuroscience Research Australia

Poster title: Poor statistical reporting, inadequate data presentation and spin persist despite editorial advice

Abstract: The Journal of Physiology and British Journal of Pharmacology published an editorial series in 2011 to improve standards of statistical reporting and data analysis. It is not known whether reporting practices changed in response to this advice. We conducted a cross-sectional analysis of reporting practices in a random sample of research papers published before and after the editorial series. Overall, poor statistical reporting, inadequate data presentation and spin were equally present before and after the editorial advice. While the scientific community continues to implement strategies for improving reporting practices, stronger incentives or enforcement mechanisms are likely needed.

8_15_Heroux_AIMOS_poster_journal_audit.png

Julia G Bottesini

Co-presenter: Simine Vazire

Affiliation: University of California, Davis, USA

Poster title: Do participants care if we p-hack their data?

Abstract: Do research participants care if we engage in questionable research practices (QRPs) with their data? Do they have opinions about open science practices? In this proposed registered report, we will investigate research participants' opinions about research practices using samples drawn from the most common psychology participant populations: undergraduates and MTurk workers. We ask them about QRPs (p-hacking, file-drawering, HARKing), open science practices (open materials and methods, open access publishing, replication, data sharing), and fraud. Here, we present our proposed method and results from two pilot studies (Ns = 1,258 and 303).
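To make concrete what one of the surveyed practices involves, here is a minimal simulation (with illustrative parameters, not taken from the study) showing how one form of p-hacking, optional stopping, inflates the false positive rate:

```python
# Optional stopping ("test, peek, collect more data until p < .05")
# inflates the false positive rate well above the nominal 5%.
# All parameters here are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sims, false_positives = 2000, 0

for _ in range(n_sims):
    a = list(rng.normal(size=20))  # two groups with no true effect
    b = list(rng.normal(size=20))
    for _ in range(5):             # peek at the data up to 5 times
        if stats.ttest_ind(a, b).pvalue < 0.05:
            false_positives += 1
            break
        a.extend(rng.normal(size=10))  # add 10 per group and re-test
        b.extend(rng.normal(size=10))

print(f"False positive rate: {false_positives / n_sims:.2f}")  # > 0.05
```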

2_15_Bottesini_AIMOS_poster.png

Daniel G Hamilton

Affiliation: Division of Radiation Oncology and Medical Imaging, Peter MacCallum Cancer Centre, Melbourne, Australia.

Poster title: Don’t believe everything you read - Ethical issues in scientific publishing and the radiation oncology literature.

Abstract: Academic publishing is a necessary part of the communication of scientific research. Retraction of unreliable and unethical publications is also an important part of this process. While it is estimated that only 0.02% of the published literature has been retracted to date, this number is on the rise, even after taking into account the growing number of papers published. This research will present the results of a survey of retracted articles in the fields of radiation oncology and medical physics, as well as of the studies citing these articles following their retraction.

6_15_Hamilton_AIMOS_poster.png

Tom Hardwicke

Affiliation: Meta-Research Innovation Center Berlin (METRIC-B), Charité – Universitätsmedizin Berlin

Poster title: Transdisciplinary estimates for the prevalence of transparency and reproducibility-related research practices

Abstract: Serious concerns about research quality have catalyzed a number of reform initiatives intended to improve adoption of transparency and reproducibility-related research practices, such as sharing of data, materials, protocols, and analysis scripts, disclosure of funding and conflicts of interest, replication, and pre-registration. Meta-research has evaluated the effectiveness of some individual initiatives; however, this may not capture broader trends. Here we collate and present the findings of four observational studies that have sought to estimate the prevalence of transparency and reproducibility-related research practices in psychology, the social sciences, and biomedicine.

7_15_Hardwicke_AIMOS_poster.png

Elise Gould

Affiliation: Interdisciplinary Meta-Research Group, The University of Melbourne

Co-presenters: Fiona Fidler, Hannah Fraser & Libby Rumpff

Poster title: Questionable Research Practices in non-hypothesis testing research: ecological models for conservation decision-making

Abstract: Recent metaresearch studying the prevalence, types, causes and solutions to Questionable Research Practices (QRPs) has almost exclusively focussed on frequentist null hypothesis significance testing (NHST). However, within Conservation Science, modellers routinely conduct non-hypothesis testing research, building models that generate anticipatory predictions to guide policy and decision makers in managing threatened ecosystems and species. We expand the scope of current QRP work by adapting Gelman and Loken's 'Garden of Forking Paths' theory of researcher degrees of freedom to account for model-centric research. We surveyed the literature to construct ‘roadmaps’ identifying the type and location of plausible QRPs within the model development process. There are many opportunities for undisclosed researcher degrees of freedom in ecological modelling. While there are direct analogues between NHST and non-NHST research, there are domain- and context-specific nuances that influence the emergence of QRPs in model-centric research.

4_15_Gould_AIMOS_poster.png

Rebbekah Neale

Affiliation: Misinformation Lab, Deakin University

Co-presenters: Amy Wanschers, Mathew Ling & Emily Kothe

Poster title: How do participants want their data handled: does data sensitivity or data type matter?

Abstract: Open data enables confirmation of results and creates unforeseen research opportunities. However, these benefits are offset by lasting re-identification risks for participants, risks that are also affected by the properties of the data. This study provides insight into 176 participants' views on data sharing after they had completed one of seven studies within the domain of psychological science. Self-reported sensitivity of the information had little impact on how data sharing was received; concern was raised only when data were to be shared for secondary research questions. Provisionally, qualitative versus quantitative design also did not appear to affect attitudes toward sharing.

10_15_Neale_AIMOS_poster.png

Sarah R Schiavone

Affiliation: University of California, Davis

Co-presenter: Simine Vazire

Poster title: Quantifying credibility: Assessing quality, evidence, and claims in social and personality psychology research

Abstract: The credibility of psychology research has been questioned in recent years with increased attention to questionable practices, ‘failures’ to replicate, and the detection of errors, bias, and fraud within the published literature. Quantifying research quality presents a central challenge for metascientists, and without the ability to estimate and assess quality, self-correction is unlikely to occur. This project combines a multi-method approach of webscraping, automation, human coding, and subjective ratings to broadly investigate research quality, strength of evidence, and the calibration of claims within the last 10 years of publications in several social and personality psychology journals.

12_15_Schiavone_AIMOS_poster.png

Aidan G Cashin

Affiliations: Neuroscience Research Australia, Sydney, Australia & Prince of Wales Clinical School, The University of New South Wales, Sydney, Australia

Poster title: Take Up TOP! – A crowd-sourced initiative to improve the transparency and openness of published research

Abstract: Although transparency and openness are embraced as vital requirements for scientific progress, the current publication record does not reflect this. Scientific journals are key stakeholders in the production of transparent and open research, and their actions may facilitate progress in scientific practice. Take Up TOP! is a crowd-sourced project aiming to improve the extent to which journals' author guidelines champion transparent and open scientific practice. This may in turn increase the use of transparent and open research practices by scientists.

3_15_Cashin_AIMOS_poster.png