Why Least Squares and Maximum Entropy? An Axiomatic Approach to Inference for Linear Inverse Problems

Imre Csiszár
The Annals of Statistics
Vol. 19, No. 4 (Dec., 1991), pp. 2032-2066
Stable URL: http://www.jstor.org/stable/2241918
Page Count: 35

Abstract

An attempt is made to determine the logically consistent rules for selecting a vector from any feasible set defined by linear constraints, when either all n-vectors or those with positive components or the probability vectors are permissible. Some basic postulates are satisfied if and only if the selection rule is to minimize a certain function which, if a "prior guess" is available, is a measure of distance from the prior guess. Two further natural postulates restrict the permissible distances to the author's f-divergences and Bregman's divergences, respectively. As corollaries, axiomatic characterizations of the methods of least squares and minimum discrimination information are arrived at. Alternatively, the latter are also characterized by a postulate of composition consistency. As a special case, a derivation of the method of maximum entropy from a small set of natural axioms is obtained.
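As a concrete illustration of the kind of selection rule the abstract characterizes, the sketch below computes the maximum-entropy probability vector in a feasible set defined by a single linear constraint. This is a standard worked example (Jaynes's die with a prescribed mean), not taken from the paper: the entropy maximizer has exponential form p_i ∝ exp(λ·i), and the dual variable λ is found here by plain bisection, since the constrained mean is increasing in λ.

```python
import numpy as np

# Illustrative example (assumed, not from the paper): select a probability
# vector on {1,...,6} subject to the linear constraint E[X] = 4.5 by
# maximizing entropy. The maximizer has the form p_i ∝ exp(lam * i).
vals = np.arange(1, 7, dtype=float)
target = 4.5

def maxent_dist(lam):
    """Exponential-family candidate for a given dual variable lam."""
    w = np.exp(lam * vals)
    return w / w.sum()

# The mean under maxent_dist(lam) is strictly increasing in lam,
# so bisection on lam recovers the constrained solution.
lo, hi = -10.0, 10.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if maxent_dist(mid) @ vals < target:
        lo = mid
    else:
        hi = mid

p = maxent_dist(0.5 * (lo + hi))
print("p =", p)
print("mean =", p @ vals)
```

With a uniform prior guess this coincides with minimizing the I-divergence (minimum discrimination information) from that prior over the feasible set, which is the special case the abstract refers to.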
