Information Distinguishability with Application to Analysis of Failure Data
Ehsan S. Soofi, Nader Ebrahimi and Mohamed Habibullah
Journal of the American Statistical Association
Vol. 90, No. 430 (Jun., 1995), pp. 657-668
Stable URL: http://www.jstor.org/stable/2291079
Page Count: 12
In maximum entropy (ME) modeling, the information discrepancy between two distributions is measured by their entropy difference. In discrimination information statistics, the discrepancy between two distributions is measured by the Kullback-Leibler function (i.e., relative entropy or cross-entropy). This article presents an equivalence between Kullback-Leibler functions and entropy differences involving an ME distribution. Based on this equivalence, the concept of information distinguishability (ID) is introduced as a unifying framework for the two methods of measuring information discrepancy between distributions. Applications of ID as diagnostics for examining the robustness of parametric procedures and the sensitivity of nonparametric statistics across parametric families of distributions are proposed. The equivalence result facilitates estimation of Kullback-Leibler functions in terms of entropy estimates. Application of ID to modeling failure data brings a new dimension into entropy estimation: entropy estimation based on the hazard function. ID statistics for modeling lifetime distributions with increasing failure rates are studied. Two illustrative examples are analyzed.
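The equivalence underlying ID can be sketched in a few lines. The following is a minimal derivation in LaTeX, assuming the standard ME setup with moment constraints; the notation is illustrative and not necessarily the paper's.

% Let f* be the maximum entropy density under the moment constraints
% E[T_j(X)] = theta_j, j = 1, ..., m, so that
% log f*(x) = lambda_0 + sum_j lambda_j T_j(x).
\[
  K(f : f^{*}) = \int f \log \frac{f}{f^{*}} \, dx
               = -H(f) - \int f \log f^{*} \, dx .
\]
% Since log f* is linear in the T_j and f satisfies the same moment
% constraints as f*, the remaining integral equals -H(f*), hence
\[
  K(f : f^{*}) = H(f^{*}) - H(f) \; \ge \; 0 .
\]
% The Kullback-Leibler discrepancy from f to the ME model thus reduces
% to an entropy difference, which is what makes Kullback-Leibler
% functions estimable via entropy estimates.

This identity holds for any f satisfying the same moment constraints as the ME distribution, which is what lets the two measures of information discrepancy be used interchangeably.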