Conference Agenda

Overview and details of the sessions of this conference. Select a date or location to show only sessions held on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
Session
S1: Neutral comparison studies in methodological research
Time:
Monday, 04/Sept/2023:
11:00am - 12:40pm

Session Chair: Sarah Friedrich
Session Chair: Anne-Laure Boulesteix
Discussant/Panelist: Ruth Keogh
Discussant/Panelist: Tim Morris
Location: Lecture Room U1.111 (hybrid)


Presentations
11:00am - 11:20am

Pitfalls and Potentials in Simulation Studies

Samuel Pawel, Lucas Kook, Kelly Reeve

University of Zurich, Switzerland

Comparative simulation studies are workhorse tools for benchmarking statistical methods. As with other empirical studies, the success of simulation studies hinges on the quality of their design, execution and reporting. If not conducted carefully and transparently, their conclusions may be misleading. In this paper we discuss various questionable research practices which may impact the validity of simulation studies, some of which cannot be detected or prevented by the current publication process in statistics journals. To illustrate our point, we invent a novel prediction method with no expected performance gain and benchmark it in a pre-registered comparative simulation study. We show how easy it is to make the method appear superior to well-established competitor methods if questionable research practices are employed. Finally, we provide concrete suggestions for researchers, reviewers and other academic stakeholders for improving the methodological quality of comparative simulation studies, such as pre-registering simulation protocols, incentivizing neutral simulation studies, and sharing code and data.



11:20am - 11:40am

Against the “one method fits all data sets” philosophy for comparison studies in methodological research

Carolin Strobl1, Friedrich Leisch2

1Universität Zürich, Switzerland; 2Universität für Bodenkultur Wien, Austria

Many methodological comparison studies aim at identifying a single or a few “best performing” methods over a certain range of data sets. In this presentation we take a different viewpoint by asking whether identifying the best performing method is the research question we should be striving for in the first place. We will argue that this research question implies assumptions which we do not consider warranted in methodological research, that a different research question would be more informative, and we will show how this alternative research question can be fruitfully investigated.



11:40am - 12:00pm

Phases of methodological research in biostatistics: Building the evidence base for new methods

Georg Heinze1, Anne-Laure Boulesteix2, Michael Kammer1, Tim Morris3, Ian White3

1Center for Medical Data Science, Medical University of Vienna, Austria; 2Institute for Medical Information Processing, Biometry and Epidemiology, Ludwig-Maximilians University of Munich, Germany; 3MRC Clinical Trials Unit, UCL, London, UK

Although new biostatistical methods are published at a very high rate, many of these developments are not trustworthy enough to be adopted by the scientific community. We propose a framework for thinking about how a piece of methodological work contributes to the evidence base for a method. Analogous to the well-known phases of clinical research in drug development, we propose four phases of methodological research. These phases cover (I) proposing a new methodological idea while providing, for example, logical reasoning or proofs, (II) providing empirical evidence, first in a narrow target setting, then (III) in an extended range of settings and for various outcomes, accompanied by appropriate application examples, and (IV) investigations that establish a method as sufficiently well understood to know when it is preferred over others and when it is not; that is, its pitfalls. We suggest basic definitions of the four phases to provoke thought and discussion rather than to devise an unambiguous classification of studies into phases. Too many methodological developments finish before phases III/IV, but we give two examples with references. Our concept rebalances the emphasis toward studies in phases III and IV, that is, carefully planned method comparison studies and studies that explore the empirical properties of existing methods in a wider range of problems. All authors of this paper are members of the international STRATOS Initiative (STRengthening Analytical Thinking for Observational Studies). The proposed framework aims at refining the notion of evidence in methodological research that is central to STRATOS’ efforts.



Conference: CEN 2023
Conference Software: ConfTool Pro 2.6.151+TC
© 2001–2024 by Dr. H. Weinreich, Hamburg, Germany