
# A Generalization of the One-Sided Two-Sample Kolmogorov-Smirnov Statistic for Evaluating Diagnostic Tests

Mitchell H. Gail and Sylvan B. Green
Biometrics
Vol. 32, No. 3 (Sep., 1976), pp. 561-570
DOI: 10.2307/2529745
Stable URL: http://www.jstor.org/stable/2529745
Page Count: 10

## Abstract

Suppose a continuous diagnostic measurement is used to classify patients, and suppose $E_1$ false negative errors and $E_2$ false positive errors result. The quantities $E_1$ and $E_2$, and the total number of misclassifications, $L = E_1 + E_2$, depend on the choice of cut-off value. We have determined the null distribution of $\min L$, where minimization is over all possible cut-off values. The statistic $\min L$ can be used as a quick one-sided two-sample test, and it is also useful for evaluating publications which present only a $2 \times 2$ table of false positives, false negatives, true positives and true negatives. In such cases, one can use $\min L$ to assess the usefulness of the diagnostic measurement, even if one suspects that the authors chose the particular cut-off value which minimized $L$ after looking at the data. We extend these results to a more general weighted loss $L = \nu E_1 + \mu E_2$, where $\nu$ and $\mu$ are positive integers, and we show that $\min L$ is a generalization of the one-sided two-sample Kolmogorov-Smirnov statistic and, indeed, exactly equivalent to that statistic for appropriate choices of $\nu$ and $\mu$.
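The abstract's statistic can be illustrated with a brute-force search over cut-off values. The sketch below is not the authors' algorithm, only a minimal illustration under the assumption that patients with measurements above the cut-off are classified as positive; the function name and samples are hypothetical.

```python
def min_weighted_loss(diseased, healthy, nu=1, mu=1):
    """Minimize L = nu*E1 + mu*E2 over all cut-off values.

    Hypothetical illustration: values above the cut-off are
    classified positive, so E1 (false negatives) counts diseased
    patients at or below the cut-off, and E2 (false positives)
    counts healthy patients above it.
    """
    observed = sorted(set(diseased) | set(healthy))
    # Candidate cut-offs: each observed value, plus one below all
    # observations (so that "classify everyone positive" is allowed).
    candidates = [observed[0] - 1] + observed
    best = None
    for c in candidates:
        e1 = sum(1 for x in diseased if x <= c)  # false negatives
        e2 = sum(1 for x in healthy if x > c)    # false positives
        loss = nu * e1 + mu * e2
        if best is None or loss < best:
            best = loss
    return best
```

For perfectly separated samples the minimum loss is zero, while overlap forces at least one weighted misclassification at every cut-off; the null distribution derived in the paper refers to this minimized quantity.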
