Predictive Inference, Sufficiency, Entropy and an Asymptotic Likelihood Principle
Wallace E. Larimore
Biometrika, Vol. 70, No. 1 (Apr., 1983), pp. 175-181
Stable URL: http://www.jstor.org/stable/2335955
Page Count: 7
The objective of inferring stochastic models from a set of data is to obtain the best description, by using a probability model, of the statistical behaviour of future samples of the process. A conceptual repeated sampling experiment is considered for evaluating a predictive distribution used to describe such future observations; this leads to an asymptotic likelihood principle. Considerations of likelihood and sufficiency lead to the use of entropy, or the Kullback-Leibler information, as the natural measure of how well a predictive distribution approximates the actual distribution in repeated samples. This gives a small-sample justification for the use of entropy in evaluating parameter estimation as well as model order and structure determination procedures.
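The Kullback-Leibler measure referred to in the abstract can be made concrete with a small simulation. The following Python sketch is purely illustrative and not taken from the paper: it assumes a Gaussian process and a simple plug-in predictive rule (maximum likelihood estimates substituted into a normal density), then estimates the expected KL information of that predictive distribution over repeated samples, in the spirit of the conceptual repeated sampling experiment described above. All parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_normal(mu0, s0, mu1, s1):
    """Closed-form KL divergence KL(N(mu0, s0^2) || N(mu1, s1^2)), in nats."""
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1) ** 2) / (2 * s1**2) - 0.5

# Actual (data-generating) distribution of the process.
mu_true, sigma_true = 0.0, 1.0

n_obs, n_reps = 20, 5000   # sample size per experiment; number of repeated samples
kl_values = []
for _ in range(n_reps):
    x = rng.normal(mu_true, sigma_true, size=n_obs)
    # Plug-in predictive distribution: N(mu_hat, sigma_hat^2) with ML estimates.
    mu_hat, sigma_hat = x.mean(), x.std()   # np.std default ddof=0 is the MLE
    kl_values.append(kl_normal(mu_true, sigma_true, mu_hat, sigma_hat))

# Average KL information over repeated samples: the entropy-based measure of
# how well the predictive rule approximates the actual distribution. For a
# plug-in rule with p parameters this is roughly p/(2*n) for moderate n.
print(f"mean KL over repeated samples: {np.mean(kl_values):.4f}")
print(f"rough approximation p/(2n):    {2 / (2 * n_obs):.4f}")
```

Averaging the KL information over the repeated samples, rather than inspecting a single fit, is what ties this measure to the evaluation of estimation and model selection procedures discussed in the paper.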
Biometrika © 1983 Biometrika Trust