Treatment of Multiple Failure Modes and Epistemic Uncertainties with the Efficient Global Reliability Analysis Method, 18-R9895
Inclusive Dates: 10/01/08 - Current
Background - Reliability analysis is the process of studying how uncertainties and variations in model inputs (such as material properties, boundary conditions, or loading conditions) affect the resulting outputs or system performance. Simply put, given the variations known to exist in properties, operating conditions, and other factors, the goal is to estimate the probability that a particular system will satisfy some required performance measure; this probability is referred to as the reliability.
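The idea above can be sketched with a brute-force Monte Carlo estimate: sample the uncertain inputs, evaluate a performance (limit-state) function, and count the fraction of samples that satisfy the requirement. The limit state and input distributions below are hypothetical, chosen only to illustrate the concept; real applications would replace them with an expensive engineering simulation.

```python
import numpy as np

def monte_carlo_reliability(limit_state, sample_inputs, n=100_000, seed=0):
    """Estimate reliability = P(limit_state(x) > 0) by random sampling."""
    rng = np.random.default_rng(seed)
    x = sample_inputs(rng, n)
    g = limit_state(x)
    return np.mean(g > 0)

# Hypothetical limit state: margin between yield strength and applied stress.
def limit_state(x):
    strength, stress = x
    return strength - stress

# Hypothetical input variability: material property and loading variation.
def sample_inputs(rng, n):
    strength = rng.normal(40.0, 2.0, n)
    stress = rng.normal(30.0, 3.0, n)
    return strength, stress

r = monte_carlo_reliability(limit_state, sample_inputs)
```

The catch, as the next paragraph notes, is cost: each sample requires a full deterministic analysis, and rare failures demand very many samples.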
Unfortunately, reliability analysis can add considerable complexity and computational expense to traditional deterministic analysis, making it cost-prohibitive for many applications of interest. This is especially true for systems with multiple failure modes. Additionally, uncertainty in the inputs to the reliability computation leads to uncertainty in the estimated reliability, which reduces confidence in the result.
Approach - The first phase of this project aims to simultaneously reduce the computational expense and increase the accuracy of estimating the reliability of large-scale engineering systems. This will be accomplished through several improvements to the recently developed Efficient Global Reliability Analysis (EGRA) method. Specific targets are (a) extremely expensive problems, (b) potentially noisy problems, (c) system-level problems, (d) parallel execution of the underlying deterministic analyses, and (e) inverse reliability analysis.
The second phase is uncertainty quantification. Here the research aims to study, develop, and improve methods for quantifying several types of uncertainty that can impact computed reliability results. The types of uncertainty of interest are (a) uncertainty associated with estimated probability distribution model parameters, (b) uncertainty associated with probability distribution model type, and (c) uncertainty introduced through the use of a Gaussian process surrogate model as part of the EGRA framework.
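Uncertainty source (c) arises because EGRA replaces the expensive limit-state function with a Gaussian process (GP) surrogate, whose predictions carry their own uncertainty. The minimal hand-rolled GP below (squared-exponential kernel, illustrative training points and hyperparameters, not the project's implementation) shows how the predictive standard deviation quantifies that surrogate uncertainty: it is small near training data and grows away from it.

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

def gp_predict(x_train, y_train, x_test, ell=1.0, noise=1e-8):
    """GP regression: predictive mean and standard deviation at x_test."""
    K = rbf(x_train, x_train, ell) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train, ell)
    mean = Ks @ np.linalg.solve(K, y_train)
    # Predictive variance: prior variance minus what the data explains.
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, np.sqrt(np.maximum(var, 0.0))

x_train = np.array([0.0, 1.0, 2.0, 3.0])
y_train = np.sin(x_train)  # stand-in for an expensive limit-state function
mean, std = gp_predict(x_train, y_train, np.array([1.5, 5.0]))
# std is small at x=1.5 (inside the training data) and large at x=5.0.
```

In EGRA this predictive standard deviation also drives adaptive sampling: new expensive evaluations are placed where the surrogate is both uncertain and near the failure boundary.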
Accomplishments - System-Level Reliability Analysis: By modifying a core component of EGRA, a method with an unprecedented combination of accuracy and efficiency has been created. For instance, in one example problem, EGRA provided accuracy equivalent to that of random sampling with 3 million evaluations at an average expense of only about 35 function evaluations. Previous methods remain both more computationally expensive than EGRA and less accurate due to their reliance on simplifying approximations.
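A common device in system-level reliability, and one way to express multiple failure modes in a single analysis, is a composite limit state: for a series system, failure occurs when any mode's margin goes negative, so the system margin is the minimum over modes. The two limit states below are hypothetical, and plain Monte Carlo stands in for the expensive analyses that EGRA would avoid.

```python
import numpy as np

# Hypothetical failure-mode margins for a series system with two
# standard-normal inputs (illustrative only, not the project's problem).
def g_yield(x):   # mode 1: yielding margin
    return 3.0 - x[:, 0]**2 / 4.0 - x[:, 1]

def g_buckle(x):  # mode 2: buckling margin
    return 2.5 + x[:, 0] - x[:, 1]

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=(200_000, 2))

# Series system: the system fails if ANY mode fails, so take the minimum.
g_sys = np.minimum(g_yield(x), g_buckle(x))
reliability = np.mean(g_sys > 0)
```

The minimum operator makes the composite surface non-smooth where the modes intersect, which is precisely the regime in which simplifying approximations in earlier methods lose accuracy.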
Distribution Uncertainty: The probability density functions for the random variables used in reliability analysis are often estimated by fitting probability distribution models to observed test data. Because an infinite amount of test data can never be obtained, there is always some degree of uncertainty associated with the actual underlying probability density functions that generated the observed data. Consequently, there is some amount of uncertainty associated with any reliability estimate that is computed based on the assumed probability distribution models.
This project has developed a rigorous approach, grounded in Bayesian inference, for quantifying the uncertainty in both the distribution model form and its parameters. This approach computes the uncertainty in the reliability by averaging over multiple candidate models according to their relative likelihoods, as indicated by the observed data.
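The averaging idea can be sketched as follows: fit several candidate distribution models to the observed data, weight each by an approximate posterior model probability, and average the resulting failure probabilities. The sketch below uses BIC-based weights as a common simplification of full Bayesian model averaging; the data set, candidate models, and failure threshold are all hypothetical, not the project's actual method or data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.lognormal(mean=3.0, sigma=0.25, size=30)  # hypothetical strength tests
threshold = 12.0                                     # failure if strength < threshold

candidates = {"normal": stats.norm, "lognormal": stats.lognorm}
bics, pofs = {}, {}
for name, dist in candidates.items():
    params = dist.fit(data)                          # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(data, *params))
    bics[name] = len(params) * np.log(len(data)) - 2.0 * loglik
    pofs[name] = dist.cdf(threshold, *params)        # P(strength < threshold)

# BIC-based approximate posterior model probabilities (relative to best model).
b_min = min(bics.values())
weights = {m: np.exp(-0.5 * (bics[m] - b_min)) for m in bics}
total = sum(weights.values())
pof_averaged = sum(weights[m] / total * pofs[m] for m in candidates)
```

Extending this point estimate to a full distribution over the reliability requires sampling the parameter posterior for each model, which is where the computational savings of EGRA, discussed next, become essential.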
It is important to note that while this type of analysis is certainly possible with other reliability analysis methods, such as Monte Carlo sampling, it is EGRA's vast computational savings that finally make it practical to quantify the uncertainty in reliability estimates caused by distribution uncertainty.