
Game Theory, Maximum Entropy, Minimum Discrepancy and Robust Bayesian Decision Theory

Peter D. Grünwald and A. Philip Dawid
The Annals of Statistics
Vol. 32, No. 4 (Aug., 2004), pp. 1367-1433
Stable URL: http://www.jstor.org/stable/3448538
Page Count: 67

Abstract

We describe and develop a close relationship between two problems that have customarily been regarded as distinct: that of maximizing entropy, and that of minimizing worst-case expected loss. Using a formulation grounded in the equilibrium theory of zero-sum games between Decision Maker and Nature, these two problems are shown to be dual to each other, the solution to each providing that to the other. Although Topsøe described this connection for the Shannon entropy over 20 years ago, it does not appear to be widely known even in that important special case. We here generalize this theory to apply to arbitrary decision problems and loss functions. We indicate how an appropriate generalized definition of entropy can be associated with such a problem, and we show that, subject to certain regularity conditions, the above-mentioned duality continues to apply in this extended context. This simultaneously provides a possible rationale for maximizing entropy and a tool for finding robust Bayes acts. We also describe the essential identity between the problem of maximizing entropy and that of minimizing a related discrepancy or divergence between distributions. This leads to an extension, to arbitrary discrepancies, of a well-known minimax theorem for the case of Kullback-Leibler divergence (the "redundancy-capacity theorem" of information theory). For the important case of families of distributions having certain mean values specified, we develop simple sufficient conditions and methods for identifying the desired solutions. We use this theory to introduce a new concept of "generalized exponential family" linked to the specific decision problem under consideration, and we demonstrate that this shares many of the properties of standard exponential families. Finally, we show that the existence of an equilibrium in our game can be rephrased in terms of a "Pythagorean property" of the related divergence, thus generalizing previously announced results for Kullback-Leibler and Bregman divergences.
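The mean-constrained case admits a small numerical sketch (the outcome set, target mean, and helper names below are our own illustration, not taken from the paper). For Shannon entropy and logarithmic loss, the maximum-entropy distribution subject to a fixed mean is exponential in the constrained statistic, and its log loss is an "equalizer": every distribution satisfying the same mean constraint incurs the same expected loss against it, which is exactly the entropy attained. This is the duality the abstract describes, in its simplest instance.

```python
import math

def maxent_mean(outcomes, mu, lo=-50.0, hi=50.0, iters=200):
    """Max-entropy distribution on `outcomes` with mean mu.

    The solution has the exponential-family form p_i ∝ exp(lam * x_i);
    we find lam by bisection, since the mean is increasing in lam.
    """
    def dist(lam):
        w = [math.exp(lam * x) for x in outcomes]
        z = sum(w)
        return [wi / z for wi in w]

    def mean(lam):
        return sum(pi * x for pi, x in zip(dist(lam), outcomes))

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean(mid) < mu:
            lo = mid
        else:
            hi = mid
    return dist((lo + hi) / 2)

outcomes = [0, 1, 2, 3]
p = maxent_mean(outcomes, 1.2)
H = -sum(pi * math.log(pi) for pi in p)   # entropy of the maxent distribution

# Any q with the same mean incurs the same expected log loss against p,
# because -log p(x) is affine in x; this is the robust-Bayes "equalizer" property.
q = [0.4, 0.2, 0.2, 0.2]                  # also has mean 1.2
loss = -sum(qi * math.log(pi) for qi, pi in zip(q, p))
print(abs(loss - H) < 1e-6)               # → True
```

Because the maximum-entropy act's loss is constant over the constraint set, it is simultaneously the worst-case-optimal (robust Bayes) act, which is the game-theoretic duality in miniature.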
