For most history-matching problems, the posterior probability density function (PDF) may have multiple local maxima, making it extremely challenging to quantify the uncertainty of model parameters and production forecasts by conditioning to production data. In this paper, a novel method is proposed to improve the accuracy of a Gaussian mixture model (GMM) approximation of the complex posterior PDF by adding more Gaussian components. Simulation results of all reservoir models generated during the history-matching process, e.g., by the distributed Gauss-Newton (DGN) optimizer, are used as training data points for GMM fitting. The distance between the GMM approximation and the actual posterior PDF is estimated by summing the errors computed at all training data points. This distance is an analytical function of the unknown GMM parameters, such as the covariance matrix and weighting factor of each Gaussian component, and these parameters are determined by minimizing the distance function. A GMM is accepted if the distance is sufficiently small; otherwise, new Gaussian components are added iteratively to further reduce the distance until convergence. Finally, high-quality conditional realizations are generated by sampling from each Gaussian component of the mixture with the appropriate relative probability. The proposed method is first validated on nonlinear toy problems and then applied to a history-matching example. GMM generates better samples at a computational cost comparable to or lower than the other methods we tested. GMM samples yield production forecasts that match production data reasonably well in the history-matching period and are consistent with the production data observed in the blind-test period.
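The fit-then-add-components loop described above can be sketched in a simplified one-dimensional form. This is an illustrative toy, not the paper's DGN-based implementation: the bimodal target density, the squared-error distance, the softmax weight parameterization, and the use of SciPy's least-squares solver are all assumptions made for the sketch.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.stats import norm

def gmm_pdf(x, weights, means, sigmas):
    """Evaluate a 1-D Gaussian mixture density at points x."""
    return sum(w * norm.pdf(x, m, s) for w, m, s in zip(weights, means, sigmas))

def fit_gmm(x_train, p_train, k):
    """Fit a k-component GMM to posterior values p_train at x_train by
    minimizing the summed squared error (the 'distance' in this sketch)."""
    def unpack(theta):
        w = np.exp(theta[:k]); w = w / w.sum()   # softmax: weights sum to 1
        m = theta[k:2 * k]                       # component means
        s = np.exp(theta[2 * k:])                # log-sigma keeps sigma > 0
        return w, m, s

    def residuals(theta):
        w, m, s = unpack(theta)
        return gmm_pdf(x_train, w, m, s) - p_train

    # initial guess: equal weights, evenly spread means, unit sigmas
    theta0 = np.concatenate([np.zeros(k),
                             np.linspace(x_train.min(), x_train.max(), k),
                             np.zeros(k)])
    res = least_squares(residuals, theta0)
    return unpack(res.x), np.sum(res.fun ** 2)

# Toy bimodal density standing in for the complex history-matching posterior.
target = lambda x: 0.6 * norm.pdf(x, -2.0, 0.5) + 0.4 * norm.pdf(x, 2.0, 0.8)
rng = np.random.default_rng(0)
x_train = rng.uniform(-5.0, 5.0, 200)   # stand-in for the DGN training points
p_train = target(x_train)

# Add Gaussian components iteratively until the distance is acceptably small.
tol = 1e-6
for k in range(1, 6):
    (w, m, s), dist = fit_gmm(x_train, p_train, k)
    if dist < tol:
        break

# Sample: pick a component with its relative probability, then draw from it.
comp = rng.choice(len(w), size=1000, p=w)
samples = m[comp] + s[comp] * rng.standard_normal(1000)
```

In this sketch the distance drops sharply once the mixture has enough components to cover every mode of the target, which is the acceptance criterion the abstract describes; the final sampling step draws each realization from one component chosen according to its fitted weight.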