We show how a recently developed multivariate data fitting technique makes it possible to solve a variety of scientific computing problems in filtering, queueing, networks, metamodelling, computational finance, graphics, and more. Because the method uses a generalized multivariate rational model, it can capture linear as well as nonlinear phenomena. The technique refines the basic ideas developed in Salazar et al. (Numer Algorithms 45:375–388, 2007. https://doi.org/10.1007/s11075-007-9077-3) and interpolates interval data. Intervals make it possible to account for the inherent error in measurements and simulations, whilst guaranteeing an upper bound on the tolerated range of uncertainty. The latter is the main difference from a best approximation or least squares technique, which does as well as it can but does not respect an a priori imposed threshold on the approximation error. Compared to a best approximation, the interval interpolant is also relatively easy to compute. In applications where industry standards need to be guaranteed, the interval interpolation technique may therefore be a valuable alternative.
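To make the contrast with least squares concrete, the sketch below illustrates the interval interpolation idea in its simplest univariate form: a rational model p(x)/q(x) must pass through a vertical interval [lo_i, hi_i] at every node x_i, which, for a positive denominator, reduces to a set of linear inequalities in the coefficients. This is only a minimal illustration under assumed choices, not the authors' multivariate algorithm; the synthetic data, the model degrees, and the use of SciPy's linprog as a feasibility solver are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical interval data: at each node x_i the value is only known
# to lie inside [lo_i, hi_i] (measurement/simulation uncertainty).
x = np.linspace(0.0, 1.0, 11)
f = 1.0 / (1.0 + 5.0 * x**2)   # underlying (unknown) function, for illustration
lo = f - 0.02                  # lower interval bounds
hi = f + 0.02                  # upper interval bounds

# Rational model r(x) = p(x)/q(x), numerator degree n and denominator degree m.
n, m = 2, 2
num_unknowns = (n + 1) + (m + 1)           # coefficients a_0..a_n and b_0..b_m

Vp = np.vander(x, n + 1, increasing=True)  # rows [1, x_i, x_i^2, ...]
Vq = np.vander(x, m + 1, increasing=True)

# Interval conditions lo_i <= p(x_i)/q(x_i) <= hi_i, with q(x_i) > 0, become
# linear inequalities:  p(x_i) - hi_i*q(x_i) <= 0  and  lo_i*q(x_i) - p(x_i) <= 0.
A_ub = np.vstack([
    np.hstack([ Vp, -hi[:, None] * Vq]),
    np.hstack([-Vp,  lo[:, None] * Vq]),
])
b_ub = np.zeros(2 * len(x))

# Denominator positivity at the nodes: q(x_i) >= 1 fixes the scale and
# avoids the trivial all-zero solution of the homogeneous system.
A_ub = np.vstack([A_ub, np.hstack([np.zeros((len(x), n + 1)), -Vq])])
b_ub = np.concatenate([b_ub, -np.ones(len(x))])

# Pure feasibility problem: any coefficient vector satisfying the inequalities
# yields a rational function whose graph stays inside every interval.
res = linprog(c=np.zeros(num_unknowns), A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * num_unknowns, method="highs")

a, b = res.x[:n + 1], res.x[n + 1:]
r = (Vp @ a) / (Vq @ b)
print("intervals respected:", np.all((lo <= r + 1e-9) & (r <= hi + 1e-9)))
```

Unlike a least squares fit, which minimizes an aggregate residual without any per-point guarantee, every feasible solution of this system respects the prescribed uncertainty bound at each node by construction.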
               