Location:
on-site: Mensa-Room 0001
digital: MS Teams (link shared via e-mail)
Sebastian Brandstäter (UniBw M): Sensitivity Analysis for Biomechanical Models
Uncertainty plays a crucial role in active biomechanical systems, primarily due to their inherently high natural variability. This variability should be considered in any biomechanical modelling endeavour, but particularly when the models are used in clinical decision-making. Most notably, biomechanical models incorporate numerous parameters whose values are known only imprecisely; this uncertainty propagates through the models and ultimately results in uncertain model predictions. Global sensitivity analysis aims to analyse and quantify how much of the prediction uncertainty can be attributed to the individual model input parameters and their interactions.
The talk will introduce global sensitivity analysis methods applicable to complex, large-scale computational models such as finite-element models of active biomechanical systems. A focus will be placed on computational aspects, such as making the computational burden tractable with data-driven surrogate models. The practical utility of these approaches will be demonstrated through examples, including a model of growth and remodelling during the formation of an abdominal aortic aneurysm, drug delivery to a tumour, and the electromechanics of gastric peristalsis.
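As a flavour of the variance-based approach such talks typically build on, the sketch below estimates first-order Sobol' indices with a Saltelli-style pick-and-freeze estimator. The two-parameter toy model is a hypothetical stand-in for an expensive finite-element simulation; in practice one would evaluate a data-driven surrogate instead.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model standing in for an expensive FE simulation;
# inputs are assumed independent and uniform on [0, 1].
def model(x):
    return x[:, 0] + 2.0 * x[:, 1] + 0.5 * x[:, 0] * x[:, 1]

n, d = 100_000, 2
A = rng.uniform(size=(n, d))   # two independent input sample matrices
B = rng.uniform(size=(n, d))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))

S = []
for i in range(d):
    AB = A.copy()
    AB[:, i] = B[:, i]   # "freeze" all inputs except input i
    # Saltelli-style estimator of the first-order index S_i
    S.append(np.mean(fB * (model(AB) - fA)) / var_y)
    print(f"S_{i} ~ {S[i]:.2f}")
```

Here the second input dominates because of its larger coefficient; the small gap between the sum of the first-order indices and one reflects the weak interaction term.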
Ruben Horn (HSU): Energy Efficiency of Molecular(-Continuum) Simulations
Energy consumption is an ever-growing concern in high performance computing (HPC) including simulation studies.
Compared to computationally very expensive molecular dynamics (MD) simulations, coupled scenarios of MD and continuum simulation domains can be used to achieve a tradeoff between reduced energy consumption and sufficient accuracy of the simulation result. Statistical noise in the MD simulation, which dominates the runtime and thus the energy consumption, can be alleviated in two ways:
Using MultiMD, an ensemble of multiple independent MD simulations is sampled and the results are averaged. This comes at a linear cost in energy and runtime, and a logarithmic gain in the signal-to-noise ratio.
Applying a filtering method such as the Gauss filter yields a fixed noise reduction at negligible overhead. Preliminary results show that this method can achieve results comparable to doubling the number of MD simulations.
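The two noise-reduction routes above can be sketched numerically. The following is a minimal illustration with a synthetic 1D signal standing in for a sampled MD quantity; the noise level and kernel width are invented for demonstration purposes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for a noisy MD flow profile along a channel
x = np.linspace(0.0, 1.0, 200)
truth = np.sin(2 * np.pi * x)          # "true" hydrodynamic signal
noise_std = 0.5

def noisy_sample():
    return truth + rng.normal(0.0, noise_std, size=x.size)

def rmse(signal):
    return np.sqrt(np.mean((signal - truth) ** 2))

# (1) MultiMD-style ensemble averaging over M independent instances:
# the noise standard deviation drops like 1/sqrt(M),
# at an M-fold runtime/energy cost
for M in (1, 4, 16):
    avg = np.mean([noisy_sample() for _ in range(M)], axis=0)
    print(f"M={M:2d}: RMSE ~ {rmse(avg):.3f}")

# (2) Gaussian filtering of a single instance: a fixed noise
# reduction at negligible extra cost
kernel = np.exp(-0.5 * (np.arange(-4, 5) / 1.5) ** 2)
kernel /= kernel.sum()
filtered = np.convolve(noisy_sample(), kernel, mode="same")
print(f"Gauss filter, M=1: RMSE ~ {rmse(filtered):.3f}")
```

With these made-up settings the filtered single instance lands between the M=4 and M=1 ensemble averages, illustrating how filtering can substitute for extra ensemble members.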
For simple scenarios with a homogeneous particle distribution, the energy-error tradeoff can be described by a simple linear regression model, which allows the simulation parameters to be selected such that the statistical error stays below a set threshold.
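Assuming the energy grows linearly in the ensemble size M while the error shrinks like 1/sqrt(M), such a regression model could be sketched as follows. All measurement values here are invented for illustration only.

```python
import numpy as np

# Invented example measurements: ensemble size M, energy-to-solution,
# and statistical error of the sampled quantity (all values hypothetical)
M      = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
energy = np.array([1.1, 2.0, 4.2, 8.1, 16.3])      # roughly linear in M
error  = np.array([0.52, 0.36, 0.26, 0.18, 0.13])  # roughly ~ 1/sqrt(M)

# Linear regression: energy vs. M (linear cost of MultiMD)
e_slope, e_icpt = np.polyfit(M, energy, 1)
# Linear regression: error vs. 1/sqrt(M) (averaging independent samples)
s_slope, s_icpt = np.polyfit(1.0 / np.sqrt(M), error, 1)

# Smallest ensemble size whose predicted error is below a set threshold
# (assumes threshold > s_icpt, i.e. the threshold is attainable)
threshold = 0.2
M_needed = int(np.ceil((s_slope / (threshold - s_icpt)) ** 2))
print(f"need M >= {M_needed}, "
      f"predicted energy ~ {e_slope * M_needed + e_icpt:.1f} units")
```

Inverting the fitted error model in this way is what makes the regression useful for parameter selection: the predicted energy cost of meeting a given error budget falls directly out of the two fits.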
The results for a distribution over a large number of nodes diverge significantly from the model, which appears to be caused by an unaccounted-for communication overhead.
Additionally, several other factors that affect the energy consumption and accuracy of the simulation, such as the HPC platform and the filtering method, are identified but not yet captured by this model.