Scientific Programme

Sports and Exercise Medicine and Health

IS-MH07 - We don’t need to put our house in order, we need to build a better house: Improving research quality and trust in the age of open research

Date: 05.07.2024, Time: 09:30 - 10:45, Lecture room: Clyde Auditorium

Description

It seems foolish to spend limited time and resources on fruitless lines of scientific inquiry when research quality is poor and of little informational value. Yet a history of design issues (e.g. lack of sample size estimation and vague hypotheses), statistical illiteracy (e.g. the misuse of NHST and statistical errors), and transparency problems has contributed to replication concerns in sport and exercise science, and we appear to be doing just that. Much research has discussed the problems in our discipline, but this presentation will focus on solutions for improving research quality across the scientific process. We will first consider the importance of fully developing research questions and appropriately operationalising variables. Second, we will discuss solutions that can be taken at the individual and institutional levels to improve data and statistical literacy. Finally, we will discuss why publication should not be considered the sole successful conclusion of a research project and why formal replications should be incorporated more broadly into the research cycle. There are barriers to improving research quality, and learning new skills takes time; however, the strategic implementation of small changes will be beneficial. Although we will discuss research design, data literacy, and replication separately, the message remains the same: why should we trust research built on poor foundations, and what can we do, individually and collectively, to regain that trust?

Chair(s)

Jennifer Murphy
Technological University Dublin, School of Biological, Health, and Sports Sciences
Ireland
Sabrina Skorski
University of Saarland, Institute of Sports and Preventive Medicine
Germany

Speaker A

Franco M. Impellizzeri
University of Technology of Sydney, Faculty of Health
Australia

ECSS Glasgow 2024: IS-MH07

If you fail to plan, you are planning to fail: The significance of a well-defined research question and optimal methodology

Various phases of the research process involve subjective and arbitrary decisions. When these choices, referred to as researcher degrees of freedom, are made opportunistically, they increase the risk of false positive results and exaggerated effects [1]. At the start of the research process, a vague and ambiguous research question constitutes an initial source of researcher degrees of freedom [2]. Such vagueness facilitates practices like p-hacking and HARKing (hypothesizing after the results are known), with the latter being particularly common, for example, when exploratory studies are "rebranded" as confirmatory during manuscript preparation. Unclear research questions and hypotheses can result from a superficial or suboptimal approach to the research development phase rather than being intentionally introduced to accommodate post-results ad hoc explanations. Vagueness may also result from the absence of well-operationalised concepts within the question and from unclear dependent variables and outcomes. The research project's methodology (design and analysis) is closely linked to the research question; clarity and transparency are therefore essential to ensure a robust research process and research integrity. Preregistration is an important initiative to enhance transparency but should not be mistaken for a 'quality certification' [3]. While it promotes transparency, it does not inherently address issues like vague research questions and methodologies (i.e., authors can register a "bad" study). An alternative approach, registered reports, presents a more robust initiative for improving the research design phase [4]. With registered reports, authors submit their research design before conducting the study, enabling errors to be corrected beforehand. Furthermore, the requirement for a detailed research plan encourages authors to be more specific and unambiguous.
Considering and implementing the estimand framework recommended by the International Council for Harmonisation (ICH E9(R1)) [5] in intervention studies would also be valuable in sports science and medicine. This presentation aims to elucidate why dedicating sufficient time to formulating a research question and selecting appropriate methods is the crucial first step towards successful, relevant, robust and replicable studies. [1] https://doi.org/10.1177/0956797611417632 [2] https://doi.org/10.3389/fpsyg.2016.01832 [3] https://doi.org/10.31234/osf.io/jbh4w [4] https://doi.org/10.1016/j.cortex.2012.12.016 [5] https://www.ema.europa.eu/en/ich-e9-statistical-principles-clinical-trials-scientific-guideline
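The false-positive inflation that researcher degrees of freedom produce can be illustrated with a short Monte Carlo sketch (a minimal illustration, not material from the presentation; sample sizes, the number of outcomes, and the simulation settings are all assumed). When no true effect exists, pre-specifying a single outcome keeps the error rate near the nominal 5%, while measuring five outcomes and reporting whichever is "significant" inflates it substantially:

```python
import math
import random

def two_sample_p(a, b):
    """Approximate two-sided p-value for a difference in means
    (z-test; adequate for the moderately large samples used below)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    z = abs(ma - mb) / math.sqrt(va / na + vb / nb)
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

def simulate(n_sims=4000, n=50, n_outcomes=5, alpha=0.05, seed=1):
    """Return (pre-specified, opportunistic) false positive rates
    when both groups are drawn from the same distribution."""
    rng = random.Random(seed)
    naive = flexible = 0
    for _ in range(n_sims):
        ps = []
        for _ in range(n_outcomes):
            a = [rng.gauss(0, 1) for _ in range(n)]
            b = [rng.gauss(0, 1) for _ in range(n)]
            ps.append(two_sample_p(a, b))
        naive += ps[0] < alpha       # single pre-specified outcome
        flexible += min(ps) < alpha  # report the "best" of five outcomes
    return naive / n_sims, flexible / n_sims

if __name__ == "__main__":
    pre_specified, opportunistic = simulate()
    print(f"pre-specified outcome:  {pre_specified:.3f}")  # ~0.05
    print(f"best of five outcomes:  {opportunistic:.3f}")  # ~0.23
```

With five independent outcomes the expected inflated rate is roughly 1 − 0.95⁵ ≈ 23%, which is why registered reports, by locking the outcome in before data collection, close off this degree of freedom.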


Speaker B

Grant Abt
The University of Hull, Sport, Health and Rehabilitation Sciences
United Kingdom


“I’m not that great at stats” and other cautionary tales: Why data and statistical literacy are vital

In the 1820s more than 80% of the world was illiterate [1], but now most people can read and write. Yet in the 2020s we face a new kind of illiteracy: data illiteracy. Data literacy has broadly been defined as the ability to ask and answer real-world questions from data, including the abilities to select, clean, analyse, visualise, critique, and interpret data [2]. Data literacy is increasingly important in a digital world [3] and should be considered a 'life skill' [2]. However, fewer than 20% of university graduates feel their degree prepared them 'very well' for the data skills needed for employment [4], including in academia and research. Most academics and researchers would probably agree that formal statistical inference methods are required if we are to avoid 'fooling ourselves' [5]. For example, after Basic and Applied Social Psychology banned null hypothesis significance testing in 2015 [6], researchers often overstated their conclusions beyond what the data supported [7]. If we couple data illiteracy with the 'surprisingly high' prevalence of questionable research practices [8], the perverse incentives that often drive those behaviours [9], and a replication crisis [10], then yes, Houston, we clearly have a problem. While most degree programmes include research methods and statistics classes, recent evidence on teaching statistics in psychology [11] and in our own discipline [12] suggests that we focus too much on hypothesis testing and not enough on conceptual understanding or topics such as open research, confidence intervals, and replication. Academics and researchers also need to ensure their own data and statistical literacy are adequate to conduct high-quality research, yet based on the problems outlined, this may not be the case [13]. The rapid rise of artificial intelligence also needs to be considered, particularly how these technologies might not only help us learn statistical concepts [14] but do the statistics for us [15].
This presentation will address these issues and suggest a range of possible solutions that can be taken at the individual, discipline, and institutional levels. We all need to take data and statistical literacy seriously if we are to ensure high-quality research outputs and a future workforce with the data skills required for employment. [1] https://ourworldindata.org/literacy [2] https://doi.org/10.15353/joci.v12i3.3275 [3] https://royalsociety.org/blog/2022/06/envision-adrian-smith/ [4] https://www.gov.uk/government/statistical-data-sets/ad-hoc-statistical-analysis-202021-quarter-2 [5] https://calteches.library.caltech.edu/51/2/CargoCult.htm [6] https://doi.org/10.1080/01973533.2015.1012991 [7] https://doi.org/10.1080/00031305.2018.1537892 [8] https://doi.org/10.1177/0956797611430953 [9] https://doi.org/10.1123/kr.2022-0039 [10] https://doi.org/10.1098/rsos.220946 [11] https://doi.org/10.31234/osf.io/gp9aj [12] https://osf.io/vh7dw [13] https://doi.org/10.1002/tesq.128 [14] https://doi.org/10.30935/cedtech/13152 [15] https://doi.org/10.1145/3570220
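The estimation-focused reporting advocated above (effect sizes and confidence intervals alongside, not instead of, hypothesis tests) can be sketched briefly. This is a hypothetical illustration, not material from the presentation; the function names and the training data are invented:

```python
import math
import statistics

def mean_diff_ci(a, b, z=1.96):
    """Difference in means with an approximate 95% confidence interval
    (normal approximation; z=1.96 for 95% coverage)."""
    diff = statistics.mean(a) - statistics.mean(b)
    se = math.sqrt(statistics.variance(a) / len(a) +
                   statistics.variance(b) / len(b))
    return diff, diff - z * se, diff + z * se

def cohens_d(a, b):
    """Standardised mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled = math.sqrt(((na - 1) * statistics.variance(a) +
                        (nb - 1) * statistics.variance(b)) / (na + nb - 2))
    return (statistics.mean(a) - statistics.mean(b)) / pooled

if __name__ == "__main__":
    # Hypothetical sprint-time improvements (s) for two training groups.
    group_a = [0.12, 0.18, 0.07, 0.21, 0.15, 0.10, 0.19, 0.14]
    group_b = [0.05, 0.11, 0.02, 0.09, 0.13, 0.04, 0.08, 0.06]
    diff, lo, hi = mean_diff_ci(group_a, group_b)
    print(f"mean difference: {diff:.3f} s, 95% CI [{lo:.3f}, {hi:.3f}]")
    print(f"Cohen's d: {cohens_d(group_a, group_b):.2f}")
```

Reporting the interval and the standardised effect forces a statement about magnitude and precision, which is exactly the conceptual understanding the abstract argues our statistics teaching currently underweights.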


Speaker C

Jennifer Murphy
Technological University Dublin, School of Biological, Health, and Sports Sciences
Ireland


My research got published, so what’s the big deal? The need for formal replication in sports science

Publication is the pinnacle of success for any research project, but the methodological quality of that research should be the priority for robust knowledge gain. The scientific process has its foundations in replicability and transparency, and requires a verification process that eliminates redundant paths after non-replicability (1). Yet formal replication of published sports science research is undervalued. Concerns about replicability are high: 75% of surveyed researchers believe there is a replication crisis in the field, and 42% believe it is a significant crisis (2). Low rates of data sharing (4.3%) and code sharing (<1%) (3) are worrisome, as a clear association has been found between authors' reluctance to share their data and an increased prevalence of statistical errors (4). Evidence of publication bias also likely contributes to replication concerns (5); the field's focus on significant effects and its high proportion of supported hypotheses (81%) (6), given average statistical power, should be a major cause for unease. In the absence of replication research, many published findings will remain unchallenged. This presentation will outline the different types of replication and highlight the conceptual and practical challenges of running replication studies, based on the first large replication project in the field (7). Initial outcomes of this replication attempt will also be discussed, in addition to a perspective on why formal replications should be incorporated more broadly into the research cycle (e.g. in PhD programmes). 1. Errington TM, Mathur M, Soderberg CK, Denis A, Perfito N, Iorns E, et al. Investigating the replicability of preclinical cancer biology. Elife. 2021;10:1–30. 2. Murphy J, Mesquida C, Warne JP. A Survey on the Attitudes Towards and Perception of Reproducibility and Replicability in Sports and Exercise Science. Communications in Kinesiology. 2023;1(5). 3. Borg DN, Bon JJ, Sainani KL, Baguley BJ, Tierney NJ, Drovandi C. Comment on: 'Moving sport and exercise science forward: a call for the adoption of more transparent research practices'. Sports Medicine. 2020;50(8):1551–3. 4. Wicherts JM, Bakker M, Molenaar D. Willingness to share research data is related to the strength of the evidence and the quality of reporting of statistical results. PLoS One. 2011;6(11):1–7. 5. Mesquida C, Murphy J, Lakens D, Warne J. Replication concerns in sports and exercise science: a narrative review of selected methodological issues in the field. R Soc Open Sci. 2022;9(220946). 6. Twomey R, Yingling VR, Warne JP, Scheider C, McCrum C, Atkins WC, et al. The nature of our literature: A registered report on the positive result rate and reporting practices in kinesiology. Communications in Kinesiology. 2021;1(3):1–14. 7. Murphy J, Mesquida C, Caldwell AR, Earp BD, Warne JP. Proposal of a Selection Protocol for Replication of Studies in Sports and Exercise Science. Sports Medicine. 2023;53:281–91.
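The tension between an 81% positive result rate and the field's average statistical power can be made concrete with a back-of-the-envelope calculation (the power and prior values below are assumptions for illustration, not figures from the presentation). In an unbiased literature, the expected share of significant results is p_true × power + (1 − p_true) × α, which stays well below 81% under plausible assumptions:

```python
def expected_positive_rate(power, p_true, alpha=0.05):
    """Expected share of studies reporting a significant effect in an
    unbiased literature: true effects detected with probability `power`,
    null effects 'detected' with probability `alpha`."""
    return p_true * power + (1 - p_true) * alpha

if __name__ == "__main__":
    # Assumed scenarios: modest vs good power, half vs most hypotheses true.
    for power in (0.4, 0.8):
        for p_true in (0.5, 0.9):
            rate = expected_positive_rate(power, p_true)
            print(f"power={power:.1f}, true hypotheses={p_true:.0%}: "
                  f"expected positive rate={rate:.0%}")
```

Even the most generous scenario here (80% power, 90% of hypotheses true) yields an expected rate of about 73%, below the 81% observed, which is consistent with the publication bias the abstract identifies as a cause for unease.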