Constrained Maximum-Entropy Sampling
Vol. 46, No. 5 (Sep. - Oct., 1998), pp. 655-664
Published by: INFORMS
Stable URL: http://www.jstor.org/stable/223009
Page Count: 10
Topics: Entropy, Eigenvalues, Integers, Covariance, Matrices, Linear programming, Random variables, Budget constraints, Algorithms, Statistics
A fundamental experimental design problem is to select a most informative subset, having prespecified size, from a set of correlated random variables. Instances of this problem arise in many applied domains such as meteorology, environmental statistics, and statistical geology. In these applications, observations can be collected at different locations and, possibly, at different times. Information is measured by "entropy." Practical situations have further restrictions on the design space. For example, budgetary limits, geographical considerations, as well as legislative and political considerations may restrict the design space in a complicated manner. Using techniques of linear algebra, combinatorial optimization, and convex optimization, we develop upper and lower bounds on the optimal value for the Gaussian case. We describe how these bounds can be integrated into a branch-and-bound algorithm for the exact solution of these design problems. Finally, we describe how we have implemented this algorithm, and we present computational results for estimated covariance matrices corresponding to sets of environmental monitoring stations in the Ohio Valley of the United States.
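To make the design problem concrete: for jointly Gaussian variables, the entropy of a subset is, up to additive and multiplicative constants, the log-determinant of the corresponding principal submatrix of the covariance matrix. The sketch below is a minimal brute-force illustration of this objective, not the authors' branch-and-bound method; the optional per-variable costs and budget are a hypothetical stand-in for the side constraints (e.g., budgetary limits) discussed above.

```python
import itertools
import math

import numpy as np


def gaussian_entropy(cov):
    """Differential entropy of a Gaussian with covariance `cov`:
    0.5 * (k * (1 + log(2*pi)) + log det(cov))."""
    k = cov.shape[0]
    _sign, logdet = np.linalg.slogdet(cov)
    return 0.5 * (k * (1.0 + math.log(2.0 * math.pi)) + logdet)


def max_entropy_subset(cov, s, costs=None, budget=None):
    """Exhaustively search all size-s subsets of variables for the one
    maximizing Gaussian entropy, optionally subject to a budget constraint
    (a hypothetical example of a restricted design space)."""
    n = cov.shape[0]
    best, best_val = None, -math.inf
    for subset in itertools.combinations(range(n), s):
        if costs is not None and budget is not None:
            if sum(costs[i] for i in subset) > budget:
                continue  # subset violates the side constraint
        sub_cov = cov[np.ix_(subset, subset)]
        val = gaussian_entropy(sub_cov)
        if val > best_val:
            best, best_val = subset, val
    return best, best_val
```

For a diagonal covariance the determinant of a principal submatrix is just the product of the selected variances, so the unconstrained optimum picks the largest variances; the exhaustive search above is exponential in n, which is precisely why bounds and branch-and-bound are needed for realistic instances.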
Operations Research © 1998 INFORMS