Improving Model Prediction Accuracy by Reducing Uncertainty in Model Components, 18-R8169
Principal Investigators
David S. Riha
John M. McFarland
Todd L. Bredbenner
Daniel P. Nicolella
Barron J. Bichon
Don Moravits
Inclusive Dates: 07/01/10 – 07/02/12
Background — Numerical models such as finite element analysis are routinely used to predict the performance of engineered systems. Government and industry now rely on model predictions to make decisions such as when to retire system components, how to extend the life of an aging system, or whether a new design will be safe and available. The validity of many models of existing engineered systems has been assessed through historical data, but this type of validation is not possible for new designs or for designs used in different environments. Validating new models with experiments becomes more difficult and costly as complexity and reliability requirements increase. For example, a highly reliable aircraft engine component is difficult to test to failure under operating conditions precisely because of its high reliability, and it may be cost prohibitive to test an expensive component to failure at all. Other systems are impractical to test, such as the in vivo measurement of performance measures in humans or animals. Valid model predictions become increasingly important as the cost, reliability, and experimental complexity of the engineered system increase. Therefore, effective approaches for model validation are needed to assess and improve model predictions. A general and consistent approach to model validation has not yet been developed for complex problems. Determining the uncertainty in the model predictions is a critical element of the validation process and is a main focus of this research.
Approach — The primary objectives of this program were to:
- Develop a model precision methodology for different types of model uncertainty (e.g., model form, limited data), along with approaches and methods to compute model component uncertainties and their contributions to the total model uncertainty.
- Demonstrate the methodologies and approaches by developing validated finite element models of mouse ulnae.
Accomplishments — Variance decomposition methods as described by Saltelli et al.1 were implemented, extended, and exercised to model different types of uncertainty and to identify their importance to the model prediction uncertainty for the model precision methodology. The approaches were evaluated via a reliability analysis of the deflection of a statically indeterminate beam. This example illustrates the distinction between aleatory and epistemic uncertainty: the aleatory distributions of the model inputs are estimated from limited sample data, which introduces epistemic uncertainty about distribution parameters such as the means and standard deviations. The variance decomposition approach successfully identified a data-rich input as having a negligible contribution to the prediction variance, even though the deterministic model is highly sensitive to that input. Model development and validation for the mouse ulna under the in vivo forearm compressive loading protocol followed the ASME V&V 10 Guide. A V&V plan was developed and followed; the plan identified the modeling assumptions, calibration and validation experiments, and validation metrics, and it is a useful tool for communicating assumptions and uncertainties about both the models and the experiments. Model V&V is typically applied to models of complex systems that seldom have standard test protocols for validation experiments. Based on experience from other validation projects and this research in particular, the uniqueness of validation experiments leads to potential errors in the experiments and additional unknown uncertainties. In many cases, the experiments are not successful, leaving little information for model validation. The uniqueness of validation experiments also has the potential to impact schedules and cost budgets, so the experiments should be carefully planned and executed.
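The variance decomposition approach described above can be sketched with a pick-and-freeze Monte Carlo estimator in the style of Saltelli et al. The following is an illustrative sketch, not the NESSUS/CENTAUR implementation; the two-input linear model and its input distributions are hypothetical stand-ins for the beam example:

```python
import numpy as np

def sobol_indices(model, sampler, n=4096, seed=0):
    """Estimate first-order (S1) and total-effect (ST) Sobol indices
    using a pick-and-freeze Monte Carlo scheme (Saltelli-style sketch)."""
    rng = np.random.default_rng(seed)
    A, B = sampler(rng, n), sampler(rng, n)   # two independent (n, d) samples
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]), ddof=1)
    d = A.shape[1]
    S1, ST = np.empty(d), np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                   # swap only column i
        fABi = model(ABi)
        S1[i] = np.mean(fB * (fABi - fA)) / var        # main effect of input i
        ST[i] = 0.5 * np.mean((fA - fABi) ** 2) / var  # total effect of input i
    return S1, ST

# Hypothetical example: the model is very sensitive to x1 (coefficient 10),
# but x1 is data-rich (small variance), so its variance contribution is small.
def model(X):
    return 10.0 * X[:, 0] + X[:, 1]

def sampler(rng, n):
    return np.column_stack([rng.normal(0.0, 0.01, n),   # x1: well characterized
                            rng.normal(0.0, 1.0, n)])   # x2: large variability

S1, ST = sobol_indices(model, sampler)
```

Here x2 dominates the prediction variance (its index is near 1) while x1 contributes almost nothing despite its large deterministic sensitivity coefficient, mirroring the conclusion drawn from the beam example.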
In addition, regular communication between the model developers and those performing the experiments is essential to successful model validation. Early model predictions can be used to help design the experiments, and the model developers need a thorough understanding of the uncertainties in the experiments. Several challenges were encountered during this research, such as completing all calibration and validation experiments given new experimental protocols, equipment availability, and specimen availability, all of which are real-world issues for model validation. Overall, this research effort developed new tools and methods to support model validation and provided valuable experience in the relationship between experiments and model development. One main outcome was the development of sensitivities that identify the contributions of both aleatory and epistemic uncertainties; this information can be used to guide the allocation of resources toward improving models and/or performing additional experiments. These variance-based sensitivity methods were implemented in the NESSUS® software through CENTAUR™ and are now available to support future V&V efforts. Experience was also gained in the complexity and challenges of obtaining quality experimental data for calibration and validation of complex systems. While a rigorously validated model of the mouse ulna was not achieved, there is greater confidence in the model predictions based on qualitative comparisons of the model and the measurements, along with information about how to improve the model to better match reality.
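One common way to expose both kinds of uncertainty is a nested (double-loop) Monte Carlo analysis: an outer loop samples the epistemic uncertainty in the distribution parameters (here via a bootstrap of the limited sample data), and an inner loop samples the aleatory variability itself. The following is an illustrative sketch under those assumptions, not the NESSUS implementation; the Gaussian input model, bootstrap scheme, and placeholder response are hypothetical choices:

```python
import numpy as np

def double_loop(model, data, n_outer=200, n_inner=2000, seed=0):
    """Nested Monte Carlo sketch: outer loop = epistemic uncertainty in the
    fitted distribution parameters (bootstrap of limited data); inner loop =
    aleatory variability of the input drawn from those parameters."""
    rng = np.random.default_rng(seed)
    out = np.empty(n_outer)
    for k in range(n_outer):
        boot = rng.choice(data, size=len(data), replace=True)  # epistemic resample
        mu, sigma = boot.mean(), boot.std(ddof=1)              # fitted parameters
        x = rng.normal(mu, sigma, n_inner)                     # aleatory sample
        out[k] = model(x).mean()                               # mean response
    return out  # spread across realizations reflects epistemic uncertainty

# Hypothetical use: 15 measurements of an input and a placeholder response model
rng = np.random.default_rng(1)
data = rng.normal(5.0, 0.5, 15)          # limited sample data
response = lambda x: x ** 2              # placeholder response model
means = double_loop(response, data)
epistemic_spread = means.std(ddof=1)     # shrinks as more data are collected
```

The spread of `means` quantifies how much the prediction could shift if more input data were collected, which is exactly the information needed to decide whether to spend resources on additional experiments or on model improvement.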
1. Saltelli, A., S. Tarantola, F. Campolongo, and M. Ratto. Sensitivity Analysis in Practice: A Guide to Assessing Scientific Models, Wiley, New York, NY, 2004.