This work presents a novel approach for the accurate estimation of multiple time delays from the frequency response of a distributed system. The proposed approach is based on a powerful and flexible machine learning technique, namely the least-squares support vector machine (LS-SVM). LS-SVM regression is used to construct a metamodel of the transfer function describing a generic linear time-invariant system in a delayed-rational form. Specifically, after some manipulation, the LS-SVM model precisely identifies the dominant propagation delays of the original system. The essential steps and critical criteria of the delay-identification procedure are discussed in detail throughout the paper. Once the system delays have been identified, the rational part of the metamodel expansion is obtained by means of a progressive application of the conventional vector fitting algorithm. Numerical examples are presented to illustrate the feasibility and performance of the proposed technique and to compare it with state-of-the-art methods. The results clearly highlight the capability of the proposed approach to identify the dominant delays in distributed systems, thus allowing compact delayed-rational models to be constructed.
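To make the regression step concrete, the following Python sketch shows a minimal LS-SVM regression applied to complex-valued frequency-response samples. It is an illustrative assumption, not the paper's implementation: the Gaussian RBF kernel, the hyperparameter values, and the test transfer function are chosen only for demonstration, and the paper's delay-extraction manipulation and the subsequent vector fitting stage are not reproduced here.

```python
import numpy as np

def lssvm_fit(X, y, gamma=1e4, sigma=0.5):
    """Train an LS-SVM regressor by solving its dual linear system.

    X : (N, d) real inputs (e.g. angular frequencies)
    y : (N,) targets, possibly complex (e.g. transfer-function samples)
    gamma : regularization constant, sigma : RBF kernel width (assumed values)
    """
    N = X.shape[0]
    # Gaussian RBF kernel matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    # LS-SVM dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    A = np.zeros((N + 1, N + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(N) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    return alpha, b, (X, sigma)

def lssvm_predict(Xq, alpha, b, model):
    """Evaluate the trained LS-SVM metamodel at query points Xq."""
    Xtr, sigma = model
    d2 = np.sum((Xq[:, None, :] - Xtr[None, :, :]) ** 2, axis=-1)
    Kq = np.exp(-d2 / (2.0 * sigma ** 2))
    return Kq @ alpha + b

if __name__ == "__main__":
    # Hypothetical example: fit samples of a delayed first-order response
    # H(jw) = exp(-jw*tau) / (1 + jw/w0)
    w = np.linspace(0.0, 10.0, 200).reshape(-1, 1)
    tau, w0 = 0.5, 2.0
    H = np.exp(-1j * w[:, 0] * tau) / (1.0 + 1j * w[:, 0] / w0)
    alpha, b, model = lssvm_fit(w, H)
    H_hat = lssvm_predict(w, alpha, b, model)
    print("max fit error:", np.max(np.abs(H - H_hat)))
```

In this formulation, training reduces to a single (N+1)-by-(N+1) linear solve over the sampled frequencies, which is what makes LS-SVM regression attractive for building metamodels of moderately sized frequency sweeps before any delay identification or rational fitting is attempted.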