Abstract Divergence measures are statistical tools designed to distinguish between the information provided by the distribution functions f(x) and g(x). Over a history of more than a century, the magnitude of divergence has been quantified in a variety of ways, for example through Shannon entropy and other mathematical functions. In the present study, we briefly review the Lin–Wong divergence measure and compare it with other information measures, such as the Kullback–Leibler and Bhattacharyya divergences, Shannon entropy, and Fisher information, on Type I censored data. In addition, we obtain several inequalities relating the Lin–Wong distance to these divergences under the Type I censoring scheme. Finally, we identify a number of ordering properties of the Lin–Wong distance measure based on stochastic ordering, likelihood ratio ordering, and hazard rate ordering.
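As a rough illustration of the quantities the abstract compares (not the paper's own derivations), the sketch below numerically evaluates the Kullback–Leibler divergence alongside the Lin–Wong divergence, taking the latter in its commonly cited form D_LW(f, g) = ∫ f(x) log(2 f(x) / (f(x) + g(x))) dx, i.e. the KL divergence of f from the equal-weight mixture (f + g)/2. The two exponential densities and the integration bounds are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy import integrate, stats

# Illustrative densities (assumed for this sketch, not from the paper):
f = stats.expon(scale=1.0).pdf   # f(x): Exp with rate 1
g = stats.expon(scale=2.0).pdf   # g(x): Exp with rate 1/2

def kl_divergence(f, g, lo=1e-10, hi=50.0):
    """KL(f || g) = integral of f log(f/g), by numerical quadrature."""
    val, _ = integrate.quad(lambda x: f(x) * np.log(f(x) / g(x)), lo, hi)
    return val

def lin_wong_divergence(f, g, lo=1e-10, hi=50.0):
    """D_LW(f, g) = KL(f || (f+g)/2); note it is bounded above by log 2."""
    val, _ = integrate.quad(
        lambda x: f(x) * np.log(2.0 * f(x) / (f(x) + g(x))), lo, hi)
    return val

print(f"KL(f || g) = {kl_divergence(f, g):.4f}")
print(f"D_LW(f, g) = {lin_wong_divergence(f, g):.4f}")  # always <= log 2
```

One property visible in this form is that, unlike the Kullback–Leibler divergence, D_LW is bounded above by log 2, which is part of what makes bounding inequalities between the two measures a natural question.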