Automated Surveillance of Clostridium difficile Infections Using BioSense
Objective. To determine the feasibility of using electronic laboratory and admission‐discharge‐transfer data from BioSense, a national automated surveillance system, to apply new modified Clostridium difficile infection (CDI) surveillance definitions and calculate overall and facility‐specific rates of disease.
Design. Retrospective, multicenter cohort study.
Setting. Thirty‐four hospitals sending inpatient, emergency department, and/or outpatient data to BioSense.
Methods. Laboratory codes and text‐parsing methods were used to extract C. difficile–positive toxin assay results from laboratory data sent to BioSense during the period from January 1, 2007, through June 30, 2008; these were merged with administrative records to determine whether cases were community associated or healthcare onset, as well as patient‐day data for rate calculations. A patient was classified as having hospital‐onset CDI if he or she had a C. difficile toxin–positive result on a stool sample collected 3 or more days after admission and community‐onset CDI if the specimen was collected less than 3 days after admission or the patient was not hospitalized.
Results. A total of 4,585 patients from 34 hospitals in 12 states had C. difficile–positive assay results. More than half (53.0%) of the cases were community‐onset, and 30.8% of these occurred in patients who were recently hospitalized. The overall rate of healthcare‐onset CDI was 7.8 cases per 10,000 patient‐days, with a range among facilities of 1.5–27.8 cases per 10,000 patient‐days.
Conclusions. Electronic laboratory data sent to the BioSense surveillance system were successfully used to produce CDI rates comparable to those of other studies, demonstrating the feasibility of using electronic laboratory data to track a disease of public health importance.
Clostridium difficile is a gram‐positive, anaerobic, spore‐forming bacillus that causes a wide spectrum of disease ranging from simple diarrhea to toxic megacolon or even death. The incidence and severity of disease due to this organism have increased.1‐8 Moreover, the population affected has changed—it is now infecting healthy individuals outside the healthcare setting.9,10 The change in the epidemiological characteristics of C. difficile infection (CDI) has prompted a more systematic approach, including the development of standardized case definitions to monitor and track disease.11 These definitions use hospital admission and discharge dates and previous healthcare exposures to categorize the location of onset of CDI. The nature of these definitions makes CDI an ideal disease to track using electronic laboratory data.
Electronic data are increasingly being used in health care and provide a rich source of information for population‐based surveillance.12‐14 BioSense is a national automated surveillance system operated by the Centers for Disease Control and Prevention (CDC) that receives, analyzes, and visualizes electronic health data for public health use.15,16 The objective of this study was to determine the feasibility of using electronic laboratory and admission‐discharge‐transfer data from BioSense to apply new CDI surveillance definitions and calculate overall and facility rates of disease.
As of June 2008, approximately 550 acute care hospitals were transmitting chief complaint and/or diagnosis data to BioSense. Data from most of the hospitals (476 hospitals) are received via state or local syndromic surveillance systems and include only emergency department demographics and chief complaints. However, for 103 hospitals, data are sent directly from the hospital to the CDC and may include additional types (eg, laboratory, pharmacy, or radiological data), including 44 hospitals that send microbiological laboratory results. For some of the hospitals that send data directly to the CDC, a “split feed” also sends a copy of the data to the applicable state health departments.
BioSense data are captured from the interface engine of hospital information systems, transformed into Health Level 7 messages conforming to the standards of the Public Health Information Network, and sent to health departments and the CDC by means of a secure internet connection. At the CDC, the messages are translated and mapped using standard vocabularies.
The data include a patient identifier, which links all data for a given patient across multiple visits to a particular hospital system. Thus, a patient returning to a hospital within the same parent system will be assigned the same patient identifier, making it possible to track patients longitudinally. Hospital personnel, but not the CDC, have access to the database that links the patient identifiers to the patients’ actual identities.
Information for this study came from laboratory and admission‐discharge‐transfer data. Laboratory data included specimen collection date, specimen type, Logical Observation Identifiers Names and Codes (LOINC) to identify the laboratory test, and test results supplied either as free text or as Systematized Nomenclature of Medicine—Clinical Terms (SNOMED CT) codes. Admission‐discharge‐transfer data included the dates of hospital admission and discharge, patient demographic characteristics, and patient setting at the time of discharge (inpatient, emergency department, or outpatient). Unit‐specific location, such as surgical intensive care unit, is not designated in BioSense.
The study period was January 1, 2007, through June 30, 2008. Among the 44 hospitals that sent laboratory data, 40 sent LOINC codes for the test ordered; 5 sent SNOMED CT codes for test results. For coded data, we identified C. difficile–positive toxin assay and culture results by searching for appropriate LOINC and SNOMED CT codes (Table 1). We also looked for a generic “positive” result (SNOMED CT code 10828004) paired with a C. difficile–specific LOINC code. Within our coded data, we found 1 C. difficile–specific SNOMED CT code and 4 LOINC codes. For data from laboratories that sent text reports, we searched for keywords to identify CDI, starting with the broad search term “diff” (Table 1, query 1). We then eliminated negations with the keywords “absent,” “negative,” “not present,” “not detected,” “no clostridium,” “specimen rejected,” “pending,” “cancelled,” and “invalid” (Table 1, query 2; Table 2). We further refined the query using the keywords “clostridium,” “difficile,” and “c.difficile” (Table 1, query 3). We manually checked results at all steps to ensure that our algorithms captured and negated appropriate results. The specimen type field was also used in the final stages to verify that samples were from stool. All text‐parsing code used regular expressions to accommodate variations in word spacing and letter case.
|SNOMED CT codes|
|120953000. Clostridium difficile antibody|
|423590009. Clostridium difficile colitis|
|96001009. Clostridium difficile toxin B|
|404907009. Toxic megacolon due to Clostridium difficile|
|186431008. Clostridium difficile infection|
|121963002. Clostridium difficile antibody assay|
|121897008. Clostridium difficile detection|
|5933001. Clostridium difficile^a|
|255823007. Clostridium difficile enterotoxin A|
|310541005. Clostridium difficile toxin A detected|
|122209009. Clostridium difficile culture|
|72415005. Clostridium difficile assay|
|118114008. Clostridium difficile antigen assay|
|122174009. Clostridium difficile toxin A assay|
|75332002. Clostridium difficile toxin assay|
|413047002. Clostridium difficile toxin detection|
|12671002. Clostridium difficile toxin|
|117963005. Clostridium difficile toxin A AND B assay|
|121964008. Clostridium difficile toxin B assay|
|LOINC codes|
|20761–3. C dif Stl Ql Aggl|
|20762–1. C dif Stl Ql Aerobe Cul|
|34712–0. C dif Stl Ql|
|563–7. C dif XXX Ql Cult|
|562–9. C dif Stl Ql Cult|
|31308–0. C dif Ab Ser‐aCnc|
|9365–8. C dif Ab Titr Ser|
|26697–3. C dif IgA Ser‐aCnc|
|26702–1. C dif IgG Ser‐aCnc|
|26694–0. C dif IgM Ser‐aCnc|
|13957–6. C dif Tox A Stl Ql EIA|
|6359–4. C dif Tox A Stl EIA‐aCnc|
|6360–2. C dif Tox A XXX EIA‐aCnc|
|34468–9. C dif Tox A+B Stl Ql EIA^a|
|34713–8. C dif Tox A+B Stl Ql^a|
|6361–0. C dif Tox A+B Ser EIA‐aCnc|
|6362–8. C dif Tox A+B Stl Ql CT Tiss Cult^a|
|6363–6. C dif Tox A+B Stl EIA‐aCnc|
|6364–4. C dif Tox A+B XXX EIA‐aCnc|
|33947–3. C dif Tox Ab Titr Ser Nt|
|43055–3. C dif Tox Ab Titr Ser|
|10895–1. C dif Tox B Stl Ql^a|
|46131–9. C dif Tox B Stl Ql CT Tiss Cult|
|6365–1. C dif Tox B Stl EIA‐aCnc|
|6366–9. C dif Tox B XXX EIA‐aCnc|
|Reports with negations|
|• C.difficile toxins are absent or below the limit of detection|
|• NEGATIVE FOR C.DIFFICILE TOXINS A AND/OR B FINAL 05/24/2008|
|• ******* MICROBIOLOGY ******* C. DIFFICILE TOXIN A & B EIA @ ACC#:02‐xxxxx COLL D/T:06/18/08 0630 –––––– FINAL REPORT –––––– 18JUN08 CLOSTRIDIUM DIFFICILE TOXIN A & B NOT DETECTED|
|• Clostridium difficile toxin A and/or B not present|
|• No Clostridium difficile toxin detected|
|• C. diff Toxin EIA SPECIMEN DESCRIPTION STOOL COMMENTS NONE TEST RESULT CANCELLED REQUEST CANCELLED. THIS TEST EXCEEDED REPLICA LIMIT. SPECIMEN WILL BE HELD 24 HOURS. CALL LAB AT xxx‐xxx‐xxxx IF NECESSARY. REPORT STATUS FINAL|
|• ******* MICROBIOLOGY ******* C. DIFFICILE TOXIN A & B EIA @ ACC#:xx‐xxxxxx COLL D/T:01/05/07 1040 –––––– FINAL REPORT –––––– 06JAN07 SPECIMEN REJECTED. Testing for C. difficile toxins will only be performed on one specimen within a 24 hour timeframe. Patient account has been credited for this test.|
|Reports with positive results|
|• EIA positive for C.difficile toxin|
|• POSITIVE FOR C.DIFFICILE TOXINS A AND/OR B CALLED TO, READ BACK AND CONFIRMED BY KM 03/07/08 1330 BY CAM FINAL 03/07/2008|
|• C. diff Toxin B SPECIMEN DESCRIPTION STOOL COMMENTS NONE TEST RESULT POSITIVE FOR CLOSTRIDIUM DIFFICILE TOXIN B REPORT STATUS FINAL 03192007|
|• Soft stool: Positive for Clostridium difficile toxin|
|• ******* MICROBIOLOGY ******* C. DIFFICILE TOXIN A & B EIA @ ACC#: COLL D/T:12/31/07 1800 –––––– FINAL REPORT –––––– 02JAN08 CLOSTRIDIUM DIFFICILE TOXIN A & B POSITIVE .END OF REPORT|
|• SP 2020 01 RAPID MICROBIOLOGY TESTS –––––– PROCEDURE: CLOSTRIDIUM DIFF TOXIN A/B @ COLLECTED: 03/18/08 0945 SOURCE: STOOL RECEIVED: 03/18/08 1552 STARTED: 03/18/08 1603 –––––– FINAL REPORT –––––– FINAL REPORT 03/18/08 1954 POSITIVE for C. difficile Toxin A and/or Toxin B @ = CLOS DIFF TXN A|
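The keyword search and negation steps applied to free‐text reports (Table 1, queries 1–3; Table 2) can be sketched as follows. This is an illustrative reconstruction, not the actual BioSense code: the function name is ours, and the patterns are simplified versions of the keyword lists described above.

```python
import re

# Negation and rejection phrases (Table 1, query 2), case-insensitive.
NEGATIONS = re.compile(
    r"absent|negative|not\s+present|not\s+detected|no\s+clostridium|"
    r"specimen\s+rejected|pending|cancell?ed|invalid",
    re.IGNORECASE,
)

# Broad "diff" match (query 1) refined to C. difficile terms (query 3);
# \W{0,2}\s* tolerates variations in spacing and punctuation ("C. diff",
# "c.difficile", "C DIFFICILE", ...).
CDIFF = re.compile(r"c\W{0,2}\s*diff?|clostridium|difficile", re.IGNORECASE)


def is_positive_cdiff_report(text: str) -> bool:
    """Return True if a free-text laboratory report looks like a
    positive C. difficile toxin result (hypothetical helper)."""
    if not CDIFF.search(text):
        return False  # queries 1/3: no C. difficile mention at all
    if NEGATIONS.search(text):
        return False  # query 2: negated, rejected, or unresolved report
    # Remaining reports must still affirm a result.
    lowered = text.lower()
    return "positive" in lowered or "detected" in lowered
```

Applied to the Table 2 examples, the negated reports (“NOT DETECTED,” “SPECIMEN REJECTED,” “CANCELLED”) are filtered out, while “EIA positive for C.difficile toxin” is retained.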
Definitions and Exclusions
We defined a case of CDI as a positive result of a toxin assay or culture for C. difficile for a patient during the study period. If any patient had more than 1 positive laboratory result during the study period, only the first case was analyzed. We further classified cases of CDI by using the date that the specimen was collected as a surrogate for the date of symptom onset. The following modified Clostridium difficile Surveillance Working Group categories were used.11
Hospital‐onset (HO). CDI case in which the specimen that yielded a positive result was collected on or after the fourth day of hospitalization (where the date of admission was day 1).
Community‐onset, hospital‐associated (CO‐HA). CDI case in which the specimen that yielded a positive result was collected from an inpatient before the fourth day of hospitalization or from an outpatient, and the patient had been hospitalized overnight at least once in the same healthcare system no more than 30 days before the date of specimen collection.
Community‐onset, non–hospital‐associated (CO‐NHA). Same as CO‐HA, but the patient had not been admitted to a hospital in the same healthcare system during the previous 30 days.
We did not include the community‐associated and indeterminate categories, because patients' healthcare utilization outside the reporting hospital system could not be determined: previous healthcare exposures in different healthcare systems would not be captured in BioSense and could lead to misclassification. To make this study comparable to others, we excluded ambulatory patients receiving long‐term hemodialysis, who have repeated exposure to health care. We also excluded cases in which inconsistencies in the data prevented distinction between hospital‐ and community‐onset disease.
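The modified Working Group classification above reduces to a simple decision rule on dates. The following sketch uses our own function name and simplified inputs (a single current admission date, and discharge dates of prior overnight stays in the same healthcare system); it is an illustration of the definitions, not the study's implementation.

```python
from datetime import date
from typing import List, Optional


def classify_cdi(collection: date, admission: Optional[date],
                 prior_discharges: List[date]) -> str:
    """Classify a CDI case as HO, CO-HA, or CO-NHA.

    collection       -- specimen collection date (surrogate for symptom onset)
    admission        -- current admission date, or None if not hospitalized
    prior_discharges -- discharge dates of prior overnight stays in the
                        same healthcare system
    """
    # Hospital-onset: specimen collected on or after hospital day 4,
    # where the admission date is day 1 (i.e., >= 3 days after admission).
    if admission is not None and (collection - admission).days >= 3:
        return "HO"
    # Community-onset, hospital-associated: a prior overnight stay ended
    # no more than 30 days before specimen collection.
    if any(0 <= (collection - d).days <= 30 for d in prior_discharges):
        return "CO-HA"
    # Otherwise: community-onset, non-hospital-associated.
    return "CO-NHA"
```

For example, a specimen collected on hospital day 2 from a patient discharged 17 days earlier in the same system would be classified CO‐HA.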
We described cases of CDI according to the patient's age, sex, race and ethnicity, and the setting and categorized disease types using the definitions described above. Because previous studies17 have shown an epidemiologic relationship between HO CDI and CO‐HA CDI, we calculated rates for HO CDI alone and also the combination of the 2 categories (HO CDI and CO‐HA CDI). We also evaluated the association between facility monthly rates of HO CDI and CO‐NHA CDI using the Pearson correlation coefficient and fit a regression line to the data.
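The association between facility monthly rates was assessed with a Pearson correlation coefficient and a fitted regression line. As a self‐contained illustration of those two standard computations (helper names are ours; the study presumably used a statistical package):

```python
from statistics import mean
from typing import Sequence, Tuple


def pearson_r(x: Sequence[float], y: Sequence[float]) -> float:
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5


def fit_line(x: Sequence[float], y: Sequence[float]) -> Tuple[float, float]:
    """Least-squares slope and intercept for the regression of y on x."""
    mx, my = mean(x), mean(y)
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx
```

With monthly HO counts as `y` and CO‐NHA counts as `x`, `pearson_r` yields the r reported in Figure 3 and `fit_line` the plotted regression line.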
We calculated overall and facility‐specific rates of CDI for facilities with C. difficile laboratory results. The denominator consisted of all inpatient‐days from hospitals with cases of CDI. Hospitalizations of excluded patients were eliminated, leaving 614,300 hospitalizations. Discharge dates were considered inaccurate if they resulted in calculated lengths of stay of more than 1 year or of more than 90 days with interrupting emergency department and/or outpatient visits. For the remaining 597,560 hospitalizations (97.3%) that we considered to be accurate, we calculated the median length of stay in days by facility and used this quantity to replace inaccurate or missing length of stay data to arrive at an estimated total.
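The rate calculation and the median‐based replacement of implausible lengths of stay can be sketched as follows. The threshold here is simplified to a single cutoff (the study also flagged stays of more than 90 days with interrupting emergency department or outpatient visits), and the function names are ours.

```python
from statistics import median
from typing import Optional, Sequence


def hospital_cdi_rate(cases: int, patient_days: float) -> float:
    """CDI rate expressed per 10,000 patient-days."""
    return 10_000 * cases / patient_days


def total_patient_days(lengths_of_stay: Sequence[Optional[float]],
                       max_plausible: int = 365) -> float:
    """Sum lengths of stay, substituting the facility median of the
    plausible stays for missing (None) or implausible (> max_plausible
    days) values, as in the denominator estimation described above."""
    plausible = [d for d in lengths_of_stay
                 if d is not None and 0 < d <= max_plausible]
    med = median(plausible)
    return sum(d if d is not None and 0 < d <= max_plausible else med
               for d in lengths_of_stay)
```

As a check against the results below, 2,156 HO cases over 2,755,865 patient‐days gives 7.8 cases per 10,000 patient‐days.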
For facilities that provided final diagnosis data, we examined the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD‐9‐CM) codes associated with the CDI visits. BioSense does not limit the number of ICD‐9‐CM codes accepted from the sending facility. We determined the number of CDI cases that had a specific ICD‐9‐CM code for C. difficile (008.45), and for cases involving inpatients, we used the Wilcoxon rank sum test to compare the median number of days from specimen collection to discharge for cases that did or did not have the appropriate ICD‐9‐CM code.
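The coded‐fraction tally behind this comparison is straightforward; a minimal sketch, with a hypothetical function name and cases represented as lists of ICD‐9‐CM codes (the between‐group comparison of days to discharge would then use a Wilcoxon rank sum test, not shown):

```python
from typing import List, Sequence


def icd9_coded_fraction(cases: Sequence[List[str]],
                        code: str = "008.45") -> float:
    """Fraction of laboratory-confirmed CDI cases whose visit record
    carries the given ICD-9-CM code (008.45 = intestinal infection
    due to Clostridium difficile)."""
    coded = sum(1 for icd9_codes in cases if code in icd9_codes)
    return coded / len(cases)
```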
From January 1, 2007, through June 30, 2008, we found C. difficile–positive laboratory results in 34 of 44 study hospitals located in 12 states. The median size of these 34 hospitals was 275 beds (range, 44–1,039 beds).
After the exclusion of 214 patients with missing or incomplete data, our final sample included 4,585 unique patients with C. difficile–positive laboratory results. Of these, 282 (6.2%) were identified from coded elements from 4 of the hospitals; the remainder were identified from free‐text laboratory result data from the remaining 30 hospitals. The median age was 67 years, and 2,452 (54.2%) of 4,520 patients were female (Table 3). The majority were white (3,301 [73.7%] of 4,478 patients), and 437 (9.7%) of 4,501 were described as of Hispanic or Latino ethnicity. Most of the patients (4,166 [90.9%] of 4,585) were hospitalized at the time of specimen collection or were admitted shortly afterward.
|Age, years, median (range) (n = 4,520)||67.4 (<1–102.6)|
|Female sex (n = 4,520)||2,452 (54.2)|
|Race (n = 4,478)|
|American Indian or Alaska Native||7 (0.2)|
|Native Hawaiian or Pacific Islander||1 (<0.1)|
|Hispanic or Latino ethnicity (n = 4,501)||437 (9.7)|
|Patient setting (n = 4,585)|
|Emergency department||162 (3.5)|
Of the 4,585 cases of CDI, 2,156 (47.0%) were HO cases (Figure 1). The 2,429 community‐onset cases consisted of 748 CO‐HA cases (30.8%) and 1,681 CO‐NHA cases (69.2%). The overall pooled rate of HO CDI was 7.8 cases per 10,000 patient‐days (2,156 cases in 2,755,865 patient‐days), with a range of 1.5–27.8 cases per 10,000 patient‐days among the facilities (Figure 2; Table 4). The combined rate of HO CDI and CO‐HA CDI was 10.5 cases per 10,000 patient‐days (2,904 cases in 2,755,865 patient‐days), with a range of 1.6–27.8 cases per 10,000 patient‐days. There was a significant correlation between the monthly number of HO cases and of CO‐NHA cases (r = 0.86; P < .001) (Figure 3).
|CDI category||BioSense||Kutty et al17||Dubberke et al18||Sohn et al19||Campbell et al20||Chang et al21,a|
|HO|
|Proportion, % of cases||47.0||42.2||…||…||…||…|
|Rate, cases per 10,000 patient‐days||7.8||…||8.9||8.7||6.4–7.9||…|
|Rate in each facility, cases per 10,000 patient‐days, range||1.5–27.8||1.4–16.8||3.9–15.8||…||…||…|
|HO + CO‐HA|
|Rate, cases per 10,000 patient‐days||10.5||…||…||10.5||…||…|
|Rate in each facility, cases per 10,000 patient‐days, range||1.6–27.8||3.0–19.0||…||…||…||…|
|Community onset proportion|
|Proportion, % of cases||53.0||57.8||…||…||…||…|
|Proportion of CO‐HA, % of cases||30.8||15.6||…||…||…||64.0|
The median length of stay of inpatients with CDI, regardless of CDI type, was 9 days (interquartile range, 5–18.5 days). Of the 4,364 patients with CDI from hospitals that sent final diagnosis codes, ICD‐9‐CM codes linked to the CDI visit were available for 3,834 (87.9%). Of these, 2,758 (71.9%) had the ICD‐9‐CM code for CDI (008.45); most (2,671) of these patients were inpatients. The proportion of patients with CDI‐positive laboratory results with a corresponding ICD‐9‐CM code varied substantially by patient setting: 2,671 (76.2%) of 3,505 inpatients, 47 (25.3%) of 186 outpatients, and 40 (28.0%) of 143 patients seen in the emergency department. Inpatients who did not have an ICD‐9‐CM code for C. difficile had dates of specimen collection closer to discharge than did patients with an accompanying C. difficile code (P < .001). The proportion of CDI cases coded varied by hospital from 0% to 100% (Figure 2). In the hospital with 0%, all CDI patients were seen in the emergency department; in the 2 hospitals with 100%, all patients were inpatients.
Previous studies have shown improved completeness and timeliness of disease reporting using electronic laboratory reporting systems.22‐24 To our knowledge, this is one of the first studies25 that used strictly electronic data to track CDI and applied modified updated surveillance definitions from the Clostridium difficile Surveillance Working Group. Similar to recent publications, we found that more than half (53.0%) of cases of CDI were community‐onset, that 30.8% of community‐onset cases occurred in patients who had recently been hospitalized in a facility in the same hospital system, and that the overall rate of HO CDI was 7.8 cases per 10,000 patient‐days. This study demonstrates the feasibility of using electronic laboratory data to track C. difficile, a rapidly evolving pathogen, without adding to the burden on hospital personnel.
Two recent studies used the new recommended surveillance definitions for CDI. Kutty et al17 performed a study in 6 acute care hospitals in North Carolina using manual review of laboratory and medical records. They found that 42.2% of CDI cases were HO CDI, similar to the 47.0% of HO CDI in our study (Table 4). The range of rates among the facilities of HO CDI alone and HO CDI plus CO‐HA CDI, although narrower, was also comparable to that in our study. Dubberke et al18 applied a modified version of the definitions to inpatients in 5 geographically diverse tertiary care centers using electronic hospital databases. Similar to Kutty et al,17 the authors reviewed the medical records of all community‐onset CDI cases to determine the patients' previous healthcare exposures. Their overall rate of HO CDI of 8.9 cases per 10,000 patient‐days was higher than the rate that we report (7.8 cases per 10,000 patient‐days), which is possibly attributable to the tertiary care demographic of the hospitals. Another study19 in tertiary care hospitals found rates similar to those of Dubberke et al.18
Dubberke et al18 concluded that surveillance for HO CDI was sufficient for outbreak detection and that the HO and CO‐HA categories were generally sufficient for surveillance purposes. They found that 72.4% of their community‐onset cases were discovered within the first 30 days after hospital discharge. Chang et al21 reached the same conclusion, finding that 84.5% of the community‐onset cases were CO‐HA CDI. However, the Dubberke et al18 study included only cases detected among inpatients, which potentially affects the proportions of community‐onset CDI. We included all patient settings and found that 5.6% and 3.5% of CDI cases were detected among outpatients and patients who were seen in the emergency department, respectively.
We found a significant correlation between HO CDI and CO‐NHA CDI cases. CO‐NHA cases, which have their onset in the community and are not associated with recent inpatient care at the index hospital, include (1) patients who became symptomatic before or shortly after being transferred from long‐term care facilities or other acute care facilities; (2) patients who developed CDI symptoms while living in the community after recent discharge from an acute care facility other than the index hospital; and (3) patients who were hospitalized recently and developed CDI symptoms while living in the community. In contrast to cases of CO‐HA CDI, in which transmission of C. difficile and/or exposure to antimicrobials may be related to a previous hospitalization, a particular hospital likely has less control over the number of patients admitted or transferred with CO‐NHA CDI. Nonetheless, such patients may be an important source of C. difficile that may be transmitted to other patients, thereby driving the rates of HO CDI. Indeed, the strong correlation between CO‐NHA CDI and HO CDI cases suggests that it will be important to adjust rates of HO CDI according to the rate of CO‐NHA CDI (eg, cases of CO‐NHA CDI per admitted patients) before benchmarking across acute care hospitals.
Ohio made healthcare facility–onset CDI a reportable disease for hospitals and nursing homes in 2006.20 For the 210 hospitals in the state, Campbell et al20 found the rate of HO CDI to be 6.4–7.9 cases per 10,000 patient‐days and estimated the personnel cost of CDI surveillance for that year to be $2,486,000. Our study was unique in that we were able to classify cases of CDI and calculate rates by using existing laboratory and admission‐discharge‐transfer data, with no extra cost or effort to hospital personnel. Although hospitals incur initial costs in capturing electronic data, the data are useful for tracking many diseases other than CDI.
The National Healthcare Safety Network (NHSN) is a voluntary, secure, internet‐based surveillance system managed by the Division of Healthcare Quality Promotion at the CDC.26 One optional module in this system is dedicated to CDI surveillance and allows the user to conduct surveillance by means of either traditional methods with infection control practitioners or the less labor‐intensive laboratory identification method (LabID Event Reporting), which uses definitions similar to those in our study. It is anticipated that the Clinical Document Architecture for electronic laboratory reporting of LabID events in the NHSN will soon become available.
On December 30, 2009, the Department of Health and Human Services proposed regulations describing how healthcare providers and institutions can qualify for incentive Medicare and Medicaid payments through the “meaningful use” of electronic health records.27 This incentive will likely stimulate additional adoption of health information technology infrastructure in all healthcare settings. Along with the proposed rules, a second set of regulations describes standards and certification criteria. Although not yet defined, these standards will facilitate the use of electronic data for surveillance purposes. In our study, substantial effort was required to parse text fields for C. difficile–positive test results. Few hospitals had LOINC‐ or SNOMED‐coded laboratory test and result data, which emphasizes the need for widespread adoption of standard vocabularies to facilitate public health use of electronic data.
Overall, 71.9% of the laboratory‐diagnosed cases of CDI had a corresponding ICD‐9‐CM code for C. difficile, and the proportion varied substantially among hospitals and among patient settings. The proportion was low among outpatients and patients seen in the emergency department, likely as a result of confirmed laboratory results being unavailable until after the visit and never being coded for the patient for that visit or subsequent visits. Among inpatients, 76.2% of the CDI cases had an ICD‐9‐CM code for C. difficile, which is lower than the 84.1% found in a study performed in 1 hospital over a 1‐year period28 but similar to the 78.1% found in a 6‐year study in 5 tertiary care hospitals.29 Our study included a larger sample from 34 hospitals and may be more representative. Similar to Dubberke et al,28 we found that inpatient records that did not have an ICD‐9‐CM code for C. difficile had a specimen collection date closer to the patient's discharge date, which suggests that the laboratory result was unavailable at the time of discharge. The variability among hospitals was at least partially due to differences in the proportions of inpatient, outpatient, and emergency department visits. However, it also raises the question of the reliability and accuracy of coding practices among hospitals. Although administrative databases are more readily available, our data suggest that CDI surveillance based on ICD‐9‐CM codes may undercount laboratory‐confirmed cases, especially among outpatients, and is less timely than laboratory data.
This study was subject to several limitations. Although BioSense received microbiology laboratory data from 44 hospitals during the study period, we found CDI‐positive reports from only 34. In the data from the 10 hospitals missing laboratory‐positive CDI reports, we searched for any CDI‐related test and found none. To investigate, we contacted 2 of the 10 hospitals that did not submit CDI reports. In both, C. difficile toxin assays were not performed in the microbiology section of the laboratory, and thus results were not included in the BioSense data feed. We assume that this was the case in the other 8 hospitals as well. The 34 hospitals that did send CDI results varied widely in rate of HO CDI, which raises the question of whether this variability is due to differences in disease occurrence, differences in completeness of the data sent through the electronic feed, or both. The electronic data that we analyzed were not validated by comparison with hospital records. However, because other multicenter studies show wide variability of CDI rate among facilities17‐19 and because our findings are similar to those of other studies, we believe that the data received are complete. Validation of laboratory data is a project that the BioSense Program hopes to undertake in the future. Finally, for the 2.7% of the inpatient visits with missing or questionable admission or discharge dates, we estimated the length of stay by using the median of the reliable dates, possibly introducing an error in rate calculations.
This study shows the feasibility and, relative to contemporary reports of manually collected rate and case categorization data, the accuracy of automated surveillance using recommended case definitions for a pathogen emerging across a spectrum of healthcare and nonhealthcare settings. Automated surveillance has the potential advantages of being timely and objective without taxing human resources for data collection and interpretation. Expansion of electronic laboratory reporting systems will allow for future automated monitoring of laboratory data related to infectious, chronic, and toxin‐mediated diseases.
We thank all of the BioSense facilities that contribute laboratory and admission‐discharge‐transfer data to the BioSense system.
Financial support. None.
Potential conflicts of interest. All authors report no conflicts of interest relevant to this article.
- 1.McDonald LC, Killgore GE, Thompson A, et al. An epidemic, toxin gene‐variant strain of Clostridium difficile. N Engl J Med 2005;353(23):2433–2441.
- 2.Loo VG, Poirier L, Miller MA, et al. A predominantly clonal multi‐institutional outbreak of Clostridium difficile‐associated diarrhea with high morbidity and mortality. N Engl J Med 2005;353(23):2442–2449.
- 3.Pépin J, Valiquette L, Alary ME, et al. Clostridium difficile‐associated diarrhea in a region of Quebec from 1991 to 2003: a changing pattern of disease severity. CMAJ 2004;171(5):466–472.
- 4.Eaton L. C difficile cases rising, but MRSA rates falling. BMJ 2007;335(7612):177.
- 5.Archibald LK, Banerjee SN, Jarvis WR. Secular trends in hospital‐acquired Clostridium difficile disease in the United States, 1987–2001. J Infect Dis 2004;189(9):1585–1589.
- 6.Dallal RM, Harbrecht BG, Boujoukas AJ, et al. Fulminant Clostridium difficile: an underappreciated and increasing cause of death and complications. Ann Surg 2002;235(3):363–372.
- 7.Muto CA, Pokrywka M, Shutt K, et al. A large outbreak of Clostridium difficile‐associated disease with an unexpected proportion of deaths and colectomies at a teaching hospital following increased fluoroquinolone use. Infect Control Hosp Epidemiol 2005;26(3):273–280.
- 8.McEllistrem MC, Carman RJ, Gerding DN, Genheimer CW, Zheng L. A hospital outbreak of Clostridium difficile disease associated with isolates carrying binary toxin genes. Clin Infect Dis 2005;40(2):265–272.
- 9.Centers for Disease Control and Prevention. Severe Clostridium difficile‐associated disease in populations previously at low risk—four states, 2005. MMWR Morb Mortal Wkly Rep 2005;54(47):1201–1205.
- 10.Dial S, Delaney JA, Barkun AN, Suissa S. Use of gastric acid‐suppressive agents and the risk of community‐acquired Clostridium difficile‐associated disease. JAMA 2005;294(23):2989–2995.
- 11.McDonald LC, Coignard B, Dubberke E, Song X, Horan T, Kutty PK; Ad Hoc Clostridium difficile Surveillance Working Group. Recommendations for surveillance of Clostridium difficile–associated disease. Infect Control Hosp Epidemiol 2007;28(2):140–145.
- 12.Buehler JW. Review of the 2003 National Syndromic Surveillance Conference—lessons learned and questions to be answered. MMWR Morb Mortal Wkly Rep 2004;53(suppl):18–22.
- 13.Mandl KD, Overhage JM, Wagner MM, et al. Implementing syndromic surveillance: a practical guide informed by the early experience. J Am Med Inform Assoc 2004;11(2):141–150.
- 14.Centers for Disease Control and Prevention. Automated detection and reporting of notifiable diseases using electronic medical records versus passive surveillance—Massachusetts, June 2006–July 2007. MMWR 2008;57(14):373–376.
- 15.Loonsk JW. BioSense—a national initiative for early detection and quantification of public health emergencies. MMWR Morb Mortal Wkly Rep 2004;53(suppl):53–55.
- 16.Bradley CA, Rolka H, Walker D, Loonsk J. BioSense: implementation of a National Early Event Detection and Situational Awareness System. MMWR Morb Mortal Wkly Rep 2005;54(suppl):11–19.
- 17.Kutty PK, Benoit SR, Woods CW, et al. Assessment of Clostridium difficile–associated disease surveillance definitions, North Carolina, 2005. Infect Control Hosp Epidemiol 2008;29(3):197–202.
- 18.Dubberke ER, Butler AM, Hota B, et al; Prevention Epicenters Program from the Centers for Disease Control and Prevention. Multicenter study of the impact of community‐onset Clostridium difficile infection on surveillance for C. difficile infection. Infect Control Hosp Epidemiol 2009;30(6):518–525.
- 19.Sohn S, Climo M, Diekema D, et al; Prevention Epicenter Hospitals. Varying rates of Clostridium difficile–associated diarrhea at prevention epicenter hospitals. Infect Control Hosp Epidemiol 2005;26(8):676–679.
- 20.Campbell RJ, Giljahn L, Machesky K, et al. Clostridium difficile infection in Ohio hospitals and nursing homes during 2006. Infect Control Hosp Epidemiol 2009;30(6):526–533.
- 21.Chang HT, Krezolek D, Johnson S, Parada JP, Evans CT, Gerding DN. Onset of symptoms and time to diagnosis of Clostridium difficile–associated disease following discharge from an acute care hospital. Infect Control Hosp Epidemiol 2007;28(8):926–931.
- 22.Effler P, Ching‐Lee M, Bogard A, Ieong MC, Nekomoto T, Jernigan D. Statewide system of electronic notifiable disease reporting from clinical laboratories: comparing automated reporting with conventional methods. JAMA 1999;282(19):1845–1850.
- 23.Panackal AA, M’ikanatha NM, Tsui FC, et al. Automatic electronic laboratory‐based reporting of notifiable infectious diseases at a large health system. Emerg Infect Dis 2002;8(7):685–691.
- 24.Overhage JM, Grannis S, McDonald CJ. A comparison of the completeness and timeliness of automated electronic laboratory reporting and spontaneous reporting of notifiable conditions. Am J Public Health 2008;98(2):344–350.
- 25.Tabak YP, Sievert DM, Zilberberg MD, et al. The epidemiology of initial and recurrent Clostridium difficile infections in US hospitals, 2007–2008. In: Program and abstracts of the 47th Annual Meeting of the Infectious Diseases Society of America. Philadelphia, PA; 2009. Abstract 179.
- 26.C. difficile infection surveillance and C. difficile LabID event reporting training. National Healthcare Safety Network Website. http://www.cdc.gov/nhsn/wc_MDRO_CDAD_ISlabID.html. Accessed March 10, 2010.
- 28.Dubberke ER, Reske KA, McDonald LC, Fraser VJ. ICD‐9 codes and surveillance for Clostridium difficile‐associated disease. Emerg Infect Dis 2006;12(10):1576–1579.
- 29.Dubberke ER, Butler AM, Yokoe DS, et al; Prevention Epicenters Program of the Centers for Disease Control and Prevention. Multicenter study of surveillance for hospital‐onset Clostridium difficile infection by the use of ICD‐9‐CM diagnosis codes. Infect Control Hosp Epidemiol 2010;31(3):262–268.