Mutual Information, Metric Entropy and Cumulative Relative Entropy Risk

David Haussler and Manfred Opper
The Annals of Statistics
Vol. 25, No. 6 (Dec., 1997), pp. 2451-2492
Stable URL: http://www.jstor.org/stable/2959041
Page Count: 42

Abstract

Assume $\{P_\theta: \theta \in \Theta\}$ is a set of probability distributions with a common dominating measure on a complete separable metric space $Y$. A state $\theta^\ast \in \Theta$ is chosen by Nature. A statistician obtains $n$ independent observations $Y_1,\ldots,Y_n$ from $Y$ distributed according to $P_{\theta^\ast}$. For each time $t$ between 1 and $n$, based on the observations $Y_1,\ldots,Y_{t-1}$, the statistician produces an estimated distribution $\hat{P}_t$ for $P_{\theta^\ast}$ and suffers a loss $L(P_{\theta^\ast}, \hat{P}_t)$. The cumulative risk for the statistician is the average total loss up to time $n$. Of special interest in information theory, data compression, mathematical finance, computational learning theory and statistical mechanics is the special case when the loss $L(P_{\theta^\ast}, \hat{P}_t)$ is the relative entropy between the true distribution $P_{\theta^\ast}$ and the estimated distribution $\hat{P}_t$. Here the cumulative Bayes risk from time 1 to $n$ is the mutual information between the random parameter $\Theta^\ast$ and the observations $Y_1,\ldots,Y_n$. New bounds on this mutual information are given in terms of the Laplace transform of the Hellinger distance between pairs of distributions indexed by parameters in $\Theta$. From these, bounds on the cumulative minimax risk are given in terms of the metric entropy of $\Theta$ with respect to the Hellinger distance. The assumptions required for these bounds are very general and do not depend on the choice of the dominating measure. They apply to both finite- and infinite-dimensional $\Theta$. They apply in some cases where $Y$ is infinite-dimensional, in some cases where $Y$ is not compact, in some cases where the distributions are not smooth and in some parametric cases where asymptotic normality of the posterior distribution fails.
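
For orientation, the central quantities in the abstract can be written out explicitly. The following is a sketch based on the abstract and standard definitions; the exact normalizations (for instance, in the Hellinger distance) follow common conventions and may differ from those used in the paper itself.

The loss at time $t$ is the relative entropy (Kullback-Leibler divergence) from the estimate to the true distribution,
$$ L(P_{\theta^\ast}, \hat{P}_t) = D(P_{\theta^\ast} \,\|\, \hat{P}_t) = \int_Y \log \frac{dP_{\theta^\ast}}{d\hat{P}_t}\, dP_{\theta^\ast}, $$
and the cumulative risk is the expected total loss $\mathbb{E} \sum_{t=1}^n L(P_{\theta^\ast}, \hat{P}_t)$. Under a prior on $\Theta$, with the Bayes-optimal estimates given by the predictive distributions $\hat{P}_t(\cdot) = P(Y_t \in \cdot \mid Y_1,\ldots,Y_{t-1})$, the chain rule for mutual information yields the identity stated in the abstract,
$$ \sum_{t=1}^n \mathbb{E}\, D(P_{\Theta^\ast} \,\|\, \hat{P}_t) = \sum_{t=1}^n I(\Theta^\ast; Y_t \mid Y_1,\ldots,Y_{t-1}) = I(\Theta^\ast; Y_1,\ldots,Y_n). $$
The Hellinger distance between two distributions with densities $p_\theta, p_{\theta'}$ relative to the dominating measure $\nu$ is
$$ d_H(P_\theta, P_{\theta'}) = \Big( \int_Y \big( \sqrt{p_\theta} - \sqrt{p_{\theta'}} \big)^2 \, d\nu \Big)^{1/2}, $$
and the metric entropy of $\Theta$ at scale $\epsilon$ is $\log N(\epsilon)$, where $N(\epsilon)$ is the smallest number of balls of Hellinger radius $\epsilon$ needed to cover $\Theta$.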
