To overcome the drawbacks of Shannon's entropy, the concepts of cumulative residual and cumulative past entropy have been proposed in the information-theoretic literature. Furthermore, the Shannon entropy has been generalized in a number of different ways by many researchers; one important extension is the Kerridge inaccuracy measure. In the present communication, we study the cumulative residual and cumulative past inaccuracy measures, which extend the corresponding cumulative entropies. Several properties, including monotonicity and bounds, are obtained for left-, right-, and doubly truncated random variables.
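For orientation, the following is a minimal LaTeX sketch of the standard forms of these measures as they appear in the literature for nonnegative random variables; the notation (f, g for the densities of X and Y, F, G for their distribution functions, and \bar F, \bar G for the corresponding survival functions) is assumed here rather than taken from the abstract.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Kerridge inaccuracy between the true density f and the assessed density g,
% and its cumulative analogues based on survival and distribution functions.
\begin{align}
  K(f,g) &= -\int_{0}^{\infty} f(x)\,\log g(x)\,\mathrm{d}x
    && \text{(Kerridge inaccuracy measure)} \\
  \mathrm{CRI}(F,G) &= -\int_{0}^{\infty} \bar{F}(x)\,\log \bar{G}(x)\,\mathrm{d}x
    && \text{(cumulative residual inaccuracy)} \\
  \mathrm{CPI}(F,G) &= -\int_{0}^{\infty} F(x)\,\log G(x)\,\mathrm{d}x
    && \text{(cumulative past inaccuracy)}
\end{align}
% When G = F, the last two expressions reduce to the cumulative residual
% entropy and the cumulative past entropy, respectively.
\end{document}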