Shannon developed the idea of entropy in 1948 as a measure of the uncertainty associated with a random variable X. The extropy function, a dual complement of entropy, is one of the key modern results building on Shannon's work. To develop the inferential aspects of the extropy function, this paper proposes a non-parametric kernel-type estimator as a new method of measuring uncertainty, where the observations exhibit α-mixing dependence. Asymptotic properties of the estimator are proved under appropriate regularity conditions. For comparison, a simple non-parametric estimator is also proposed, and the performance of the two estimators is investigated through a Monte Carlo simulation study based on mean-squared error and through two real-life data sets.
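For context, the extropy of a random variable X with density f is J(X) = -(1/2) ∫ f(x)² dx, which equals -(1/2) E[f(X)]. The sketch below is a minimal plug-in illustration of a kernel-type extropy estimate: it replaces f by a Gaussian kernel density estimate evaluated at the sample points. It is an assumed, simplified construction for i.i.d. data and is not the estimator studied in the paper (which handles α-mixing dependence); the bandwidth rule and the omission of leave-one-out averaging are likewise assumptions made for brevity.

```python
import numpy as np

def kernel_extropy(sample, bandwidth=None):
    """Illustrative plug-in kernel estimate of extropy J(X) = -(1/2) * E[f(X)].

    Not the paper's estimator: f is replaced by a Gaussian-kernel density
    estimate evaluated at the sample points (leave-one-out averaging omitted).
    """
    x = np.asarray(sample, dtype=float)
    n = x.size
    if bandwidth is None:
        # Silverman's rule-of-thumb bandwidth (an assumed default choice).
        bandwidth = 1.06 * x.std(ddof=1) * n ** (-1 / 5)
    # Pairwise Gaussian kernel arguments (x_i - x_j) / h.
    u = (x[:, None] - x[None, :]) / bandwidth
    # Kernel density estimate f_hat evaluated at each sample point.
    kde_at_points = np.exp(-0.5 * u ** 2).sum(axis=1) / (n * bandwidth * np.sqrt(2 * np.pi))
    # Plug-in extropy: J_hat = -(1/2) * (1/n) * sum of f_hat(X_i).
    return -0.5 * kde_at_points.mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.exponential(scale=1.0, size=500)  # hypothetical sample from Exp(1)
    print(kernel_extropy(data))  # true extropy of Exp(1) is -1/4
```

In a Monte Carlo study of the kind the abstract describes, such an estimate would be recomputed over many replicated samples and its mean-squared error compared against that of a simpler non-parametric competitor.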