Quantifying model prediction sensitivity to model-form uncertainty

Teresa Portone, Rebekah D. White, Joseph L. Hart

Published: September 10, 2025

Abstract

Model-form uncertainty (MFU) arising from assumptions made during physics-based model development is widely considered a significant source of uncertainty; however, few approaches exist that can quantify MFU in predictions that extrapolate beyond available data. As a result, it is difficult to know how important MFU is in practice, especially relative to other sources of uncertainty in a model, which makes it hard to prioritize resources and efforts to drive down prediction error. To address these challenges, we present a novel method to quantify the importance of uncertainties associated with model assumptions. We combine parameterized modifications to assumptions (called MFU representations) with grouped variance-based sensitivity analysis to measure the importance of those assumptions. We demonstrate that, in contrast to existing methods addressing MFU, our approach can be applied without access to calibration data. When calibration data are available, however, we demonstrate how they can be used to inform the MFU representation, and how variance-based sensitivity analysis can be meaningfully applied even in the presence of dependence between parameters (a common byproduct of calibration).
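To make the grouped variance-based sensitivity analysis concrete, the following is a minimal Python sketch of a pick-freeze Monte Carlo estimator for a grouped first-order Sobol index, comparing a group of MFU-representation parameters against a group of physics parameters. The toy model, the parameter grouping, and the uniform distributions are illustrative assumptions for this sketch, not the authors' implementation.

    import numpy as np

    # Hypothetical toy model: a physics term plus a parameterized MFU
    # correction. Columns 0-1 are physics parameters; columns 2-3
    # parameterize the MFU representation (an additive modification
    # to a modeling assumption). This model is an assumption for
    # illustration only.
    def model(x):
        physics = np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2
        mfu_correction = x[:, 2] * np.exp(0.3 * x[:, 3])
        return physics + mfu_correction

    def grouped_first_order_sobol(f, group, dim, n=2**14, seed=None):
        """Pick-freeze estimate of the first-order Sobol index of a
        parameter group u: S_u = Cov(f(A), f(C)) / Var(f(A)), where C
        shares the group-u columns with A and draws the remaining
        columns from an independent sample B."""
        rng = np.random.default_rng(seed)
        a = rng.uniform(0.0, 1.0, size=(n, dim))
        b = rng.uniform(0.0, 1.0, size=(n, dim))
        c = b.copy()
        c[:, group] = a[:, group]  # freeze the group's columns
        ya, yc = f(a), f(c)
        return np.cov(ya, yc)[0, 1] / np.var(ya, ddof=1)

    # Importance of the MFU-parameter group vs. the physics-parameter
    # group, each treated as a single factor in the sensitivity analysis.
    s_mfu = grouped_first_order_sobol(model, group=[2, 3], dim=4, seed=0)
    s_phys = grouped_first_order_sobol(model, group=[0, 1], dim=4, seed=0)
    print(f"S_mfu  = {s_mfu:.3f}")
    print(f"S_phys = {s_phys:.3f}")

Because the toy model is additive across the two groups, the two grouped indices sum to roughly one; with between-group interactions, the shortfall of the sum from one measures the interaction contribution. Note that this plain pick-freeze estimator assumes independent inputs; when calibration induces dependence between parameters, a dependence-aware variant is needed, as the paper discusses.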
