# Approximate Entropy for Testing Randomness

Andrew L. Rukhin
Journal of Applied Probability
Vol. 37, No. 1 (Mar., 2000), pp. 88-100
Stable URL: http://www.jstor.org/stable/3215661
Page Count: 13

## Abstract

This paper arose from interest in assessing the quality of random number generators. The problem of testing the randomness of a string of binary bits produced by such a generator has gained importance with the wide use of public key cryptography and the need for secure encryption algorithms. All such algorithms are based on a generator of (pseudo-)random numbers, and the testing of such generators for randomness has become crucial for the communications industry, where digital signatures and key management are vital for information processing.

The concept of approximate entropy was introduced in a series of papers by S. Pincus and co-authors. The corresponding statistic is designed to measure the degree of randomness of observed sequences. It is based on incremental contrasts of empirical entropies computed from the frequencies of different patterns in the sequence. Sequences with large approximate entropy must have substantial fluctuation, or irregularity; conversely, small values of this characteristic imply strong regularity, or lack of randomness, in a sequence. Pincus and Kalman (1997) evaluated approximate entropies for binary and decimal expansions of $e$, $\pi$, $\sqrt{2}$ and $\sqrt{3}$, with the surprising conclusion that the expansion of $\sqrt{3}$ demonstrated much less irregularity than that of $\pi$.

Tractable small-sample distributions are hardly available, and testing randomness is based, as a rule, on fairly long strings. Therefore, to obtain rigorous statistical tests of randomness based on the approximate entropy statistic, one needs its limiting distribution under the randomness assumption. Until now this distribution remained unknown and was thought to be difficult to obtain. To derive the limiting distribution of approximate entropy we modify its definition. It is shown that approximate entropy, as well as its modified version, converges in distribution to a $\chi^2$ random variable.

The P-values of approximate entropy test statistics for the binary expansions of $e$, $\pi$ and $\sqrt{3}$ are plotted. Although some of these values for the digits of $\sqrt{3}$ are small, they do not provide enough statistical significance against the randomness hypothesis.
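To make the construction concrete, here is a minimal Python sketch of the quantities the abstract describes: the empirical entropy $\Phi(m)$ built from frequencies of overlapping $m$-bit patterns, the approximate entropy as the incremental contrast $\Phi(m) - \Phi(m+1)$, and a statistic of the form $2n[\log 2 - \mathrm{ApEn}(m)]$, which is asymptotically $\chi^2$ under the randomness hypothesis. The wrap-around (circular) pattern counting and the function names are choices of this sketch, not the paper's exact modified definition; the precise construction and degrees of freedom are given in the paper.

```python
from math import log

def _phi(bits: str, m: int) -> float:
    # Phi(m): sum of p * log(p) over the empirical frequencies of
    # m-bit patterns, counted over all n overlapping windows with
    # wrap-around (a common circular variant; an assumption here).
    n = len(bits)
    if m == 0:
        return 0.0
    extended = bits + bits[:m - 1]  # wrap around so there are n windows
    counts: dict[str, int] = {}
    for i in range(n):
        pattern = extended[i:i + m]
        counts[pattern] = counts.get(pattern, 0) + 1
    return sum((c / n) * log(c / n) for c in counts.values())

def approximate_entropy(bits: str, m: int) -> float:
    # ApEn(m) = Phi(m) - Phi(m+1): close to log 2 for an irregular
    # binary sequence, close to 0 for a strongly regular one.
    return _phi(bits, m) - _phi(bits, m + 1)

def apen_statistic(bits: str, m: int) -> float:
    # 2n [log 2 - ApEn(m)]: small for irregular sequences, large for
    # regular ones; asymptotically chi-squared under randomness.
    n = len(bits)
    return 2.0 * n * (log(2.0) - approximate_entropy(bits, m))
```

For example, the perfectly regular string `"0101...01"` has approximate entropy 0 at block length 1 (only the patterns `01` and `10` ever occur), so its statistic attains the maximal value $2n\log 2$ and randomness would be rejected.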
