Uncertainty Quantification and Analysis Methods for
Complex Problems, 18-9346


Principal Investigators
Luc Huyse
David S. Riha
Ben H. Thacker
Jung S. Kong
John A. Stamatakos
Doug Gute

Inclusive Dates: 08/14/02 - Current

Background - Recent programs of national significance are pushing the development of numerical simulation to new levels. These include, among others, the Nuclear Regulatory Commission program to assess the long-term safety of the nation's first underground high-level radioactive waste repository. An enabling technology common to all these programs is the ability to compute the reliability of complex, large-scale systems with high confidence. However, for many important engineering applications, insufficient data are available to justify the choice of the particular probability density function (PDF) that traditional probabilistic analysis requires. Sometimes the only available data are interval estimates derived from often-conflicting expert opinion. The first objective of this research is to develop new probabilistic methods that fill this need.

In addition, the numerical accuracy of an efficient reliability computation hinges on the ability to correctly locate the most probable failure point (MPP), which is found by solving an optimization problem. For a complex system, this optimization often fails to converge adequately. The second objective is to develop tools that quickly detect such a failure so that the search algorithm can be switched to a more robust, but more computationally intensive, approach.
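To make the MPP search concrete, the sketch below implements the classical Hasofer-Lind/Rackwitz-Fiessler (HL-RF) recursion in Python, one common way such searches are performed. The linear limit-state function g and its gradient are hypothetical stand-ins, and the simple step-norm convergence flag is only a crude proxy for the failure-detection tools this project is developing.

```python
import numpy as np

def hlrf_mpp_search(g, grad_g, n_dim, max_iter=50, tol=1e-6):
    """Search for the most probable failure point (MPP) in standard
    normal space with the HL-RF recursion.  Returns the MPP and a
    convergence flag so the caller can fall back to a more robust
    (but costlier) optimizer when the search fails."""
    u = np.zeros(n_dim)                      # start at the origin
    for _ in range(max_iter):
        grad = grad_g(u)
        # HL-RF step: move to the linearized limit state g(u) = 0
        u_new = (grad @ u - g(u)) * grad / (grad @ grad)
        if np.linalg.norm(u_new - u) < tol:
            return u_new, True               # converged
        u = u_new
    return u, False                          # non-convergence detected

# Hypothetical linear limit state: g(u) = 3 - u1 - u2
g = lambda u: 3.0 - u[0] - u[1]
grad_g = lambda u: np.array([-1.0, -1.0])

mpp, ok = hlrf_mpp_search(g, grad_g, n_dim=2)
if not ok:
    print("MPP search did not converge; switch to a robust method")
beta = np.linalg.norm(mpp)                   # reliability index
```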

Approach - In absence of data or detailed specific knowledge about the random variables, the use of probabilistic methods is characterized by the avoidance of the selection of a specific PDF for a variable. A hierarchical model of a continuous family of PDF's is used instead. This research also demonstrates that Bayesian estimation techniques can successfully be used in applications where only vague interval measurements are available. The classical Bayesian estimation methods are expanded to make use of imprecise interval data. Each expert opinion (interval data) is interpreted as a random interval sample from a parent PDF. Consequently, a partial conflict between experts is automatically accounted for through the likelihood function.
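As a minimal sketch of that likelihood construction (assuming, purely for illustration, a normal parent PDF with known standard deviation and a grid-based posterior over its mean), each expert interval [a, b] contributes P(a <= X <= b | mu) = F(b; mu) - F(a; mu) to the likelihood; the interval values below are hypothetical.

```python
import numpy as np
from scipy import stats

def interval_posterior(intervals, mu_grid, sigma, prior):
    """Grid-based posterior over the mean of a normal parent PDF,
    given interval-valued observations (e.g., expert bounds).
    Each interval [a, b] enters the likelihood as
    P(a <= X <= b | mu) = F(b; mu) - F(a; mu), so partially
    conflicting expert intervals are downweighted, not discarded."""
    log_like = np.zeros_like(mu_grid)
    for a, b in intervals:
        p = (stats.norm.cdf(b, loc=mu_grid, scale=sigma)
             - stats.norm.cdf(a, loc=mu_grid, scale=sigma))
        log_like += np.log(np.maximum(p, 1e-300))  # guard log(0)
    post = prior * np.exp(log_like - log_like.max())
    return post / np.trapz(post, mu_grid)          # normalize

# Hypothetical example: three experts, two in rough agreement
mu_grid = np.linspace(0.0, 10.0, 1001)
prior = stats.norm.pdf(mu_grid, loc=5.0, scale=2.0)  # vague prior
intervals = [(3.0, 5.0), (3.5, 6.0), (6.5, 8.0)]     # expert bounds
posterior = interval_posterior(intervals, mu_grid, 1.0, prior)
```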

Accomplishments - We developed and implemented a Bayesian updating scheme that accounts for the additional uncertainty associated with interval data. Figure 1 illustrates the effect of vagueness in the input data: if the data are extremely vague, there is relatively little difference between the prior and posterior distributions, whereas precise data yield a posterior PDF with considerably less variance.
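The Figure 1 trend can be reproduced qualitatively with the interval_posterior sketch above; the interval widths here are made up solely to contrast vague and precise data.

```python
# Vague intervals barely update the prior; tight ones shrink it
vague = [(1.0, 9.0)] * 3   # nearly uninformative expert bounds
tight = [(4.8, 5.2)] * 3   # precise, mutually consistent bounds
post_vague = interval_posterior(vague, mu_grid, 1.0, prior)
post_tight = interval_posterior(tight, mu_grid, 1.0, prior)
# post_vague stays close to the prior; post_tight has far less
# variance, mirroring the effect illustrated in Figure 1
```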

Once the uncertain PDF is described, a reliability analysis can be performed. The proposed approach has been applied to a few example problems. The result of one such reliability analysis based on uncertain PDFs is given in Figure 2, which shows the reliability as a function of life. Because insufficient data are available to accurately determine the PDF and its parameters, considerable uncertainty is associated with the reliability; the red and green lines represent confidence limits on the results. As more data become available, these confidence limits will narrow and collapse onto the updated reliability curve. The results in Figure 2 were obtained using Monte Carlo simulation. Efficient reliability computation in complex problems typically requires an MPP search; a novel MPP-search failure-detection algorithm has been proposed and is being tested.
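One way to obtain such confidence limits is a two-loop Monte Carlo simulation: an outer loop samples the uncertain PDF parameters (epistemic uncertainty), and an inner loop samples the life variable itself (aleatory uncertainty). The sketch below assumes a hypothetical lognormal life model with illustrative parameter values and 5/95 percentile bands; it is not the project's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def reliability_bands(theta_samples, life_grid, n_inner=20_000):
    """Two-loop Monte Carlo: for each sampled parameter set (outer,
    epistemic loop), estimate the reliability curve R(t) = P(T > t)
    by simulation (inner, aleatory loop), then take percentiles
    across the outer loop to form confidence limits."""
    curves = []
    for mu, sigma in theta_samples:
        life = rng.lognormal(mean=mu, sigma=sigma, size=n_inner)
        curves.append([(life > t).mean() for t in life_grid])
    lo, med, hi = np.percentile(curves, [5, 50, 95], axis=0)
    return lo, med, hi       # lower limit, median, upper limit

# Hypothetical posterior samples of the lognormal parameters
theta_samples = np.column_stack([rng.normal(2.0, 0.10, 200),
                                 rng.normal(0.3, 0.05, 200)])
life_grid = np.linspace(1.0, 20.0, 40)
lo, med, hi = reliability_bands(theta_samples, life_grid)
```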

Figure 1. Impact of Vagueness of Data on Posterior PDF

Figure 2. Impact of Epistemic Uncertainty on Reliability Assessment
