Entropy measures are widely applied to quantify the complexity of dynamical systems in diverse fields. However, the practical application of entropy methods is challenging, due to the variety of entropy measures and estimators and the complexity of real-world time series, including nonstationarities and long-range correlations (LRC). We conduct a systematic study on the performance, bias, and limitations of three basic measures (entropy, conditional entropy, information storage) and three traditionally used estimators (linear, kernel, nearest neighbor). We investigate the dependence of entropy measures on estimator- and process-specific parameters, and we show the effects of three types of nonstationarities due to artifacts (trends, spikes, local variance change) in simulations of stochastic autoregressive processes. We also analyze the impact of LRC on the theoretical and estimated values of entropy measures. Finally, we apply entropy methods to heart rate variability data from subjects in different physiological states and clinical conditions. We find that entropy measures can differentiate only specific types of changes in cardiac dynamics and that appropriate preprocessing is vital for correct estimation and interpretation. Demonstrating the limitations of entropy methods and shedding light on how to mitigate bias and provide correct interpretations of results, this work can serve as a comprehensive reference for the application of entropy methods and the evaluation of existing studies.
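As a concrete illustration of the kind of pipeline the abstract describes, the sketch below simulates a stochastic autoregressive (AR(1)) process and estimates its conditional entropy with a simple Heaviside-kernel estimator, one of the estimator families compared in the study. This is a minimal, hypothetical example rather than the authors' code; the AR coefficient phi, kernel radius r, and series length are illustrative choices.

```python
"""Minimal sketch: conditional entropy of an AR(1) process via a
Heaviside-kernel (correlation-sum) estimator. Illustrative only."""
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(n, phi=0.8):
    """Simulate a zero-mean AR(1) process x_t = phi * x_{t-1} + eps_t."""
    x = np.zeros(n)
    eps = rng.standard_normal(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t]
    return x

def kernel_entropy(patterns, r):
    """Step-kernel entropy estimate: minus the average log-fraction of
    patterns lying within Chebyshev distance r (O(n^2) memory)."""
    dists = np.max(np.abs(patterns[:, None, :] - patterns[None, :, :]), axis=-1)
    counts = np.mean(dists <= r, axis=1)  # self-match included, so counts > 0
    return -np.mean(np.log(counts))

def conditional_entropy(x, r):
    """CE = H(x_t, x_{t-1}) - H(x_{t-1}), both via the kernel estimator."""
    x = (x - x.mean()) / x.std()                 # normalize so r is in units of SD
    joint = np.column_stack([x[1:], x[:-1]])     # (x_t, x_{t-1}) patterns
    past = x[:-1, None]                          # x_{t-1} patterns
    return kernel_entropy(joint, r) - kernel_entropy(past, r)

x = simulate_ar1(1000, phi=0.8)
print("Estimated conditional entropy (r = 0.2 SD):", conditional_entropy(x, r=0.2))
```

For a Gaussian AR(1) process the true conditional entropy is 0.5·ln(2πe·σ_ε²) (with σ_ε² = 1 − φ² after normalization to unit variance); the step-kernel estimate deviates from this value by a radius-dependent bias, which is the kind of estimator- and parameter-dependent bias the study quantifies.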
               