Despite wide improvements in computer simulation packages, many complex simulation models, particularly under uncertainty, can be inefficient to run in terms of time, computation, and resources. To address this challenge, metamodels have been integrated with robust design optimization. In the current paper, a systematic comparative study is conducted to evaluate the performance of three common metamodels, namely polynomial regression, kriging, and radial basis function. The required experiments are designed with different space-filling methods, including orthogonal array design and three forms of Latin hypercube sampling: randomized, maximin, and correlation-based. In addition, the impact of sample size on the performance of the metamodels in robust optimization is investigated. All methods are analyzed on five two-dimensional test problems and one engineering problem, each considered in two forms: expensive (with a small sample size) and semi-expensive (with a large sample size). Uncertainty is assumed in all problems as a source of variability, so each problem is formulated as a robust optimization in the dual response surface class in order to estimate the robust Pareto frontier. The performance of the methods is assessed in terms of accuracy and robustness. Finally, based on the comparison results, a practical guideline is provided to help practitioners select an appropriate combination of metamodel and sampling design method for finding the set of robust optimal points (the estimated Pareto frontier) in simulation–optimization problems under uncertainty.
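For illustration only (not taken from the paper), the following minimal Python sketch pairs one of the compared sampling designs with one of the compared metamodels: a Latin hypercube design is used to train a kriging (Gaussian process) surrogate of a noisy toy simulation, whose predicted mean and standard deviation are the two responses a dual response surface approach would trade off when estimating a robust Pareto frontier. The function `expensive_simulation`, the sample sizes, and the kernel settings are hypothetical placeholders.

```python
# Minimal sketch: Latin hypercube design + kriging metamodel for a noisy 2-D problem.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def expensive_simulation(x, noise=0.05):
    """Placeholder for an expensive simulation under uncertainty (hypothetical)."""
    y = np.sin(3 * x[:, 0]) + np.cos(2 * x[:, 1])
    return y + rng.normal(0.0, noise, size=y.shape)

# Randomized Latin hypercube design with a small ("expensive") sample size.
sampler = qmc.LatinHypercube(d=2, seed=0)
X_train = sampler.random(n=20)
y_train = expensive_simulation(X_train)

# Kriging (Gaussian process) metamodel fitted to the designed experiments.
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-3, normalize_y=True)
gp.fit(X_train, y_train)

# Predicted mean and standard deviation at new points: the two responses a
# dual response surface formulation would trade off for robust optimization.
X_new = sampler.random(n=5)
mean, std = gp.predict(X_new, return_std=True)
print(np.c_[mean, std])
```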