An Exact Algorithm for Maximum Entropy Sampling

Chun-Wa Ko, Jon Lee and Maurice Queyranne
Operations Research
Vol. 43, No. 4 (Jul. - Aug., 1995), pp. 684-691
Published by: INFORMS
Stable URL: http://www.jstor.org/stable/171694
Page Count: 8

Abstract

We study the experimental design problem of selecting a most informative subset, having prespecified size, from a set of correlated random variables. The problem arises in many applied domains, such as meteorology, environmental statistics, and statistical geology. In these applications, observations can be collected at different locations, and possibly, at different times. Information is measured by "entropy." In the Gaussian case, the problem is recast as that of maximizing the determinant of the covariance matrix of the chosen subset. We demonstrate that this problem is NP-hard. We establish an upper bound for the entropy, based on the eigenvalue interlacing property, and we incorporate this bound in a branch-and-bound algorithm for the exact solution of the problem. We present computational results for estimated covariance matrices that correspond to sets of environmental monitoring stations in the United States.
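
In the Gaussian case described above, the problem reduces to choosing the size-s principal submatrix of the covariance matrix with maximum log-determinant (the Gaussian entropy is an increasing affine function of this quantity). The following is a minimal illustrative sketch, not the authors' implementation, of that search: a branch-and-bound over index subsets, pruned with an eigenvalue interlacing (spectral) upper bound of the kind the abstract mentions. The matrix C, the subset size s, and the function names spectral_bound and max_entropy_subset are assumptions for illustration; C is assumed to be a symmetric positive definite numpy array.

```python
# Illustrative sketch of Gaussian maximum entropy sampling: choose a size-s
# subset S of indices maximizing log det C[S, S], using branch-and-bound with
# an eigenvalue interlacing upper bound. Not the authors' algorithm; a toy
# version following the outline in the abstract.
import numpy as np


def spectral_bound(C, indices, s):
    """Upper bound on log det over all size-s principal submatrices of
    C[indices, indices]. By Cauchy interlacing, the k-th largest eigenvalue
    of any principal submatrix is at most the k-th largest eigenvalue of the
    containing matrix, so the s largest eigenvalues bound the determinant."""
    sub = C[np.ix_(indices, indices)]
    eig = np.sort(np.linalg.eigvalsh(sub))[::-1]    # eigenvalues, descending
    return float(np.sum(np.log(eig[:s])))


def max_entropy_subset(C, s):
    """Return (best log-determinant, best index subset) over all size-s
    principal submatrices of the covariance matrix C."""
    n = C.shape[0]
    best_val, best_set = -np.inf, None

    def recurse(chosen, candidates):
        nonlocal best_val, best_set
        if len(chosen) == s:                        # complete subset: evaluate
            sign, logdet = np.linalg.slogdet(C[np.ix_(chosen, chosen)])
            if sign > 0 and logdet > best_val:
                best_val, best_set = logdet, list(chosen)
            return
        if len(chosen) + len(candidates) < s:       # not enough indices left
            return
        # Prune: the bound relaxes the requirement that `chosen` be included,
        # so it is a valid upper bound for every completion at this node.
        if spectral_bound(C, chosen + candidates, s) <= best_val:
            return
        i, rest = candidates[0], candidates[1:]
        recurse(chosen + [i], rest)                 # branch: include index i
        recurse(chosen, rest)                       # branch: exclude index i

    recurse([], list(range(n)))
    return best_val, best_set


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((8, 8))
    C = A @ A.T + 8 * np.eye(8)                     # a well-conditioned SPD matrix
    val, subset = max_entropy_subset(C, 3)
    print("log det:", val, "subset:", subset)
```

This sketch only mirrors the structure suggested by the abstract (interlacing-based upper bound inside a branch-and-bound search); the paper's exact algorithm presumably uses sharper bounds and branching rules than the naive include/exclude scheme shown here.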
