Conference Agenda

Overview and details of the sessions of this conference.
Session Overview
Session
S8: Statistics in Practice 1
Time:
Monday, 04/Sept/2023:
2:00pm - 3:40pm

Session Chair: Willi Sauerbrei
Session Chair: Georg Heinze
Location: Lecture Room U1.111 (hybrid)


Presentations
2:00pm - 2:20pm

Simulation studies as a tool to assess and compare the properties of statistical methods – an overview

Tim Morris, Brennan C. Kahan

MRC Clinical Trials Unit at UCL

Simulation studies are a key tool for studying the properties of statistical methods. As such, they contribute to the evidence base that supports the choice of methods in practice. This workshop will provide an overview of simulation studies for those who might use them. The workshop will be split into four sessions, providing a whistle-stop tour of the planning, coding, analysis and reporting of simulation studies.

1 Planning

Simulation studies need to be carefully planned. This session will outline the ADEMP structure for planning simulation studies, which involves defining Aims, Data-generating mechanisms, Estimands, Methods of analysis and Performance measures. We will discuss issues and subtleties that should be considered for each of the steps, and introduce standard terminology.
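The ADEMP elements described above can be written down as a structured plan before any simulation code is run. The following sketch is purely illustrative (the study, distributions and names are hypothetical, not taken from the workshop):

```python
# Hypothetical ADEMP plan for a small simulation study comparing two
# estimators of a population mean; all names and choices are illustrative.
ademp_plan = {
    "aims": "Compare bias and coverage of the sample mean vs. a 10% trimmed mean",
    "data_generating_mechanisms": [
        {"dist": "normal", "n": 50, "mu": 0.0, "sd": 1.0},
        {"dist": "lognormal", "n": 50, "mu": 0.0, "sd": 1.0},
    ],
    "estimands": {"theta": "population mean"},
    "methods": ["sample_mean", "trimmed_mean_10pct"],
    "performance_measures": ["bias", "empirical_SE", "coverage"],
    "n_sim": 2000,  # number of repetitions
}

# Print the plan, one ADEMP element per line
for key, value in ademp_plan.items():
    print(key, "->", value)
```

Writing the plan as data rather than prose has the side benefit that the same object can later drive the simulation loop and appear verbatim in the write-up.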

2 Coding

All simulation studies require coding. This can be extremely simple but is frequently complex, and it is easy to make mistakes. In this session, we will cover how to write ‘defensive’ code for a simulation study that reduces the chance of making mistakes that produce misleading results. As such, we will focus on the concepts and ideas but not actual pieces of code.
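Although the session deliberately stays at the level of concepts rather than code, a minimal sketch of what 'defensive' simulation code can mean in practice might look like the following (the estimator and checks are illustrative assumptions, not the workshop's material): a reproducible seed per repetition, sanity checks on every result, and failures recorded rather than allowed to silently corrupt or abort the study.

```python
# Illustrative 'defensive' simulation loop: per-repetition seeds for
# reproducibility, assertions as sanity checks, and errors logged per
# repetition instead of crashing the whole study.
import random
import statistics

def one_rep(rep, n=50):
    rng = random.Random(1000 + rep)           # reproducible seed per repetition
    sample = [rng.gauss(0.0, 1.0) for _ in range(n)]
    est = statistics.mean(sample)
    assert len(sample) == n                   # defensive check: sample size
    assert -10 < est < 10                     # defensive check: plausible estimate
    return {"rep": rep, "estimate": est}

results, errors = [], []
for rep in range(100):
    try:
        results.append(one_rep(rep))
    except Exception as exc:                  # record the failure, keep going
        errors.append({"rep": rep, "error": repr(exc)})

print(len(results), "successful reps,", len(errors), "errors")
```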

3 Analysis

Once a simulation study has been run, it needs to be analysed. Analysis should begin by checking the data (e.g. for missing values). This session will focus on performance measures: the metrics by which we judge methods. We will define some common performance measures and describe how they are estimated, emphasising the importance of simulation uncertainty (Monte Carlo error). The session will end by considering how to choose the number of repetitions for a simulation study.
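As a rough sketch of the kind of analysis described above, the snippet below estimates bias and empirical standard error from a set of repetition-level estimates, together with the Monte Carlo standard error of the bias (the data-generating mechanism here is invented for demonstration and is not part of the workshop):

```python
# Illustrative performance measures with Monte Carlo standard errors.
# Estimand: the true mean theta; estimator: the sample mean.
import math
import random
import statistics

random.seed(42)
theta = 0.0                                   # true value of the estimand
n_sim, n = 2000, 50                           # repetitions, sample size per rep

estimates = []
for _ in range(n_sim):
    sample = [random.gauss(theta, 1.0) for _ in range(n)]
    estimates.append(statistics.mean(sample))

bias = statistics.mean(estimates) - theta     # mean estimate minus truth
emp_se = statistics.stdev(estimates)          # empirical SE of the estimator
mc_se_bias = emp_se / math.sqrt(n_sim)        # Monte Carlo SE of the bias

print(f"bias = {bias:.4f} (Monte Carlo SE {mc_se_bias:.4f})")
print(f"empirical SE = {emp_se:.4f}")
```

Reporting the Monte Carlo SE alongside each performance measure, as advocated in reference [1] below, makes clear how much of an apparent difference between methods could be simulation noise; it also drives the choice of n_sim, since halving the Monte Carlo SE requires four times as many repetitions.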

4 Writing up and reporting

Not every simulation study is intended for publication. However, if a simulation study is to be published and its results trusted, it must be clearly understood by others. This session will describe some principles for tabular and graphical displays of simulation results and consider some examples from the literature. We will make suggestions for what should be reported and, of course, advocate open sharing of code.

Some key publications

[1] T. P. Morris, I. R. White, and M. J. Crowther, “Using simulation studies to evaluate statistical methods,” Statistics in Medicine, vol. 38, no. 11, pp. 2074–2102, 2019. doi: 10.1002/sim.8086.

[2] A.-L. Boulesteix, H. Binder, M. Abrahamowicz, and W. Sauerbrei, for the Simulation Panel of the STRATOS Initiative, “On the necessity and design of studies comparing statistical methods,” Biometrical Journal, vol. 60, no. 1, pp. 216–218, Nov. 2017. doi: 10.1002/bimj.201700129. [Online]. Available: https://doi.org/10.1002/bimj.201700129.

[3] G. Heinze, A.-L. Boulesteix, M. Kammer, T. P. Morris, and I. R. White, “Phases of methodological research in biostatistics—building the evidence base for new methods,” Biometrical Journal, 2200222, 2023. doi: 10.1002/bimj.202200222. [Online]. Available: https://doi.org/10.1002/bimj.202200222.

[4] I. R. White, T. M. Pham, M. Quartagno, and T. P. Morris, “How to check a simulation study,” 2023. doi: 10.31219/osf.io/cbr72. [Online]. Available: https://doi.org/10.31219/osf.io/cbr72.



Conference: CEN 2023
Conference Software: ConfTool Pro 2.6.149+TC
© 2001–2024 by Dr. H. Weinreich, Hamburg, Germany