A Generalization of Ornstein's $\bar d$ Distance with Applications to Information Theory

Robert M. Gray, David L. Neuhoff and Paul C. Shields
The Annals of Probability
Vol. 3, No. 2 (Apr., 1975), pp. 315-328
Stable URL: http://www.jstor.org/stable/2959395
Page Count: 14

Abstract

Ornstein's $\bar{d}$ distance between finite alphabet discrete-time random processes is generalized in a natural way to discrete-time random processes having separable metric spaces for alphabets. As an application, several new results are obtained on the information theoretic problem of source coding with a fidelity criterion (information transmission at rates below capacity) when the source statistics are inaccurately or incompletely known. Two examples of evaluation and bounding of the process distance are presented: (i) the $\bar{d}$ distance between two binary Bernoulli shifts, and (ii) the process distance between two stationary Gaussian time series with an alphabet metric $|x - y|$.
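As background for the abstract (a sketch using standard definitions, not text reproduced from the paper; the symbols $\mu$, $\nu$, $p_n$, and $\rho$ below are our own notation), the $n$th-order $\bar{d}$ distance between two stationary finite-alphabet processes with distributions $\mu$ and $\nu$ is obtained by minimizing the per-letter Hamming distortion over all joinings of $n$-blocks:

$$\bar{d}_n(\mu, \nu) = \min_{p_n} \frac{1}{n} \sum_{i=1}^{n} \Pr(X_i \neq Y_i), \qquad \bar{d}(\mu, \nu) = \lim_{n \to \infty} \bar{d}_n(\mu, \nu),$$

where the minimum is over joint distributions $p_n$ of $(X_1^n, Y_1^n)$ whose marginals are the $n$-dimensional distributions of $\mu$ and $\nu$. The generalization referred to in the abstract replaces the Hamming indicator by a metric $\rho$ on a separable metric alphabet:

$$\bar{\rho}_n(\mu, \nu) = \inf_{p_n} \frac{1}{n} \sum_{i=1}^{n} E\,\rho(X_i, Y_i).$$

For example (i), two binary Bernoulli shifts with parameters $p$ and $q$: any joining satisfies $\Pr(X_i \neq Y_i) \ge |p - q|$ coordinate by coordinate, and the i.i.d. product of optimal single-letter couplings achieves this bound, so $\bar{d} = |p - q|$.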
