Abstract

Background. Automated extraction of data from electronic health records has allowed high-quality retrospective analyses of large cohorts.

Objectives. To derive and validate an automated electronic search algorithm to identify surgical patients with a diagnosis of or at high risk for obstructive sleep apnea (OSA).

Methods. From 558 adult patients who underwent surgery from January 1, 2011, through December 31, 2015, we used an initial search algorithm to construct a derivation cohort of 100 subjects with equal numbers of patients at high and low likelihood of having OSA. This algorithm conducted a free-text electronic search of patient diagnoses and interrogated the results of a preoperative checklist that specifically queried patients about a history of OSA and screened for OSA risk using the Flemons criteria. The derivation cohort was manually reviewed to identify patients with or at high risk for OSA, and the results were used to refine the algorithm. The refined algorithm was then validated in the remaining 458 patients (the validation cohort). In each cohort, sensitivity and specificity were calculated against manual chart review.

Results. In the derivation cohort, the automated electronic algorithm achieved a sensitivity of 98.2% and a specificity of 100.0% compared with manual review. In the validation cohort, sensitivity was 100.0% and specificity was 98.4%.

Conclusion. An automated electronic search algorithm was developed that interrogates electronic health records to identify, with a high degree of accuracy, surgical patients with a diagnosis of or at high risk for OSA.

Key words: Flemons criteria, obstructive sleep apnea, search algorithm

Introduction

Background and Importance

Automated extraction of data from electronic health records (EHRs) has become a sophisticated tool that has given investigators new avenues for conducting high-quality retrospective analyses of large patient cohorts. (1) These automated techniques have been used with a high degree of accuracy to determine preoperative prognosticators, such as the Charlson Comorbidity Index, (2) and to identify postoperative complications, such as myocardial infarction, (3) in large cohorts of surgical patients.

Surgical patients with a diagnosis of or at high risk for obstructive sleep apnea (OSA) have an increased risk of postoperative pulmonary complications. (4,5) OSA is frequently undiagnosed, and a substantial proportion of adult surgical patients should be considered at high risk for undiagnosed OSA. (6,7) Because so many cases go undiagnosed, several assessment tools have been developed to screen surgical patients for OSA. (8-10) Investigators have proposed that all surgical patients be screened preoperatively for OSA risk and that this information be used in decision-making algorithms to triage surgical patients to appropriate levels of postoperative monitoring. (11) The current practice at our institution during the preoperative nursing check-in is to screen surgical patients for an OSA diagnosis and to screen those without a diagnosis using the Flemons criteria. (9)

Reliable data identifying surgical patients with a diagnosis of or at high risk for OSA would be useful for investigators conducting outcomes research on large patient cohorts for which manual data extraction is not practical. However, evidence is limited on the derivation and validation of an electronic search technique that identifies these patients and on its effectiveness compared with manual review of EHRs. Herein, we describe the derivation and validation of an automated electronic search algorithm for identifying surgical patients with a diagnosis of or at high risk for OSA.

Objective

The study’s primary objective was to derive and validate an automated electronic search algorithm that identifies which surgical patients should undergo preoperative screening for OSA because of either previous diagnosis or high risk, using an assessment tool. The secondary objective was to calculate sensitivity and specificity values of our electronic search algorithm compared with the reference standard of manual comprehensive EHR review.

Methods

Participants

The Mayo Clinic Institutional Review Board approved this study. Consistent with Minnesota Statute 144.295, the study included only patients who provided authorization for research use of their EHRs. The setting was the Mayo Clinic Hospital – Rochester Campus, Minnesota. Participants were 558 surgical patients who underwent general anesthesia at our institution from January 1, 2011, through December 31, 2015, a subset included in a separate and unrelated retrospective study designed to assess postoperative outcomes related to anesthetic management variables. For the present study, 100 patients were selected with the initial search algorithm to construct the derivation cohort, and the remaining 458 patients constituted the validation cohort.

Manual Data Extraction Strategy

For the present study, manual review of the EHRs was considered the gold standard for identifying patients with a diagnosis of or deemed to be at high risk for OSA. On arrival at our institution's presurgical area, surgical patients complete a preoperative checklist administered by a registered nurse. The checklist asks whether the patient has a history of OSA, with the answer marked yes or no. If the answer is no, the patient is screened for OSA with the Flemons criteria, which categorize patients as having a high or low risk of OSA. (9) The results of this inquiry are recorded in the EHR.
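For illustration, the Flemons screen is based on an adjusted neck circumference. The sketch below uses the commonly cited point values and cutoffs from the published prediction rule; these values, and the function names, are assumptions for illustration and may differ from the institution's exact implementation.

```python
def flemons_adjusted_neck_circumference(neck_cm: float,
                                        hypertension: bool,
                                        habitual_snoring: bool,
                                        nocturnal_gasping: bool) -> float:
    """Adjusted neck circumference per the commonly cited Flemons rule:
    neck circumference (cm) plus 4 for hypertension, 3 for habitual
    snoring, and 3 for reported nocturnal gasping or choking."""
    score = neck_cm
    if hypertension:
        score += 4
    if habitual_snoring:
        score += 3
    if nocturnal_gasping:
        score += 3
    return score

def flemons_risk_category(score: float) -> str:
    """Commonly cited cutoffs: <43 low, 43-48 intermediate, >48 high probability."""
    if score < 43:
        return "low"
    if score > 48:
        return "high"
    return "intermediate"
```

For example, a patient with a 42-cm neck, hypertension, and habitual snoring would score 49 and fall in the high-probability category under these assumed cutoffs.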

For this study, the EHRs of the patient cohort were manually reviewed by the lead author (O.O.O.). The review consisted of evaluating the EHR antecedent to the date of surgery, including past medical history and diagnosis sections of all clinical notes, as well as information obtained during the nursing preoperative checklist.

Automated Electronic Data Extraction Strategy

Our institution uses the United Data Platform, a clinical data warehouse that obtains, consolidates, and standardizes all clinical data collected within the institution (eg, demographic information, diagnoses, clinical notes, hospital flow sheets). The platform can be interrogated to extract clinical data through a Web-based query-building tool called Advanced Cohort Explorer (ACE). Using Boolean logic, researchers can use the ACE system to develop distinct text search strategies of the United Data Platform and identify pertinent clinical data, such as specific keywords.

To develop the electronic search query for OSA, we entered synonyms, abbreviations, and medical acronyms associated with OSA into an ACE text query. To make the algorithm more specific, we also developed a comprehensive list of exclusion phrases for patients who did not have OSA, such as "no history of," "denies," "rule out," and "negative for" OSA. To establish a more uniform methodology, we restricted the automated text search to the Diagnosis section of each patient's clinical notes. In addition, we interrogated the nurse-administered preoperative checklist obtained on the day of surgery. Patients who gave an affirmative response (ie, "yes") to the nurse's query about a history of OSA were coded by ACE as having OSA. Patients who denied a history of OSA (ie, a "no" response) subsequently underwent OSA screening with the Flemons criteria. (9) Patients whose Flemons scores indicated a high risk of OSA were also coded by ACE as having OSA. The query results were combined, and a patient who had a diagnosis of OSA, affirmed a history of OSA during the checklist, or had a high Flemons risk was coded as "yes" for OSA.
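The combination logic can be sketched in Python. This is an illustration only, not the actual ACE query: the term and negation lists are abbreviated placeholders (the study's full lists are not reproduced here), and the 30-character negation look-back window and function names are assumptions.

```python
import re

# Hypothetical, abbreviated term lists for illustration.
OSA_TERMS = r"\b(obstructive sleep apnea|osa|sleep[- ]disordered breathing)\b"
NEGATIONS = r"(no history of|denies|rule out|negative for)"

def diagnosis_mentions_osa(diagnosis_text: str) -> bool:
    """True if the Diagnosis text mentions OSA without a nearby preceding negation."""
    text = diagnosis_text.lower()
    for match in re.finditer(OSA_TERMS, text):
        # Look back a short window for a negation cue such as "negative for".
        window = text[max(0, match.start() - 30):match.start()]
        if not re.search(NEGATIONS, window):
            return True
    return False

def code_patient(diagnosis_text: str, checklist_history: str,
                 flemons_high_risk: bool) -> str:
    """Combine the three sources: diagnosis text, checklist answer, Flemons screen."""
    if checklist_history == "yes":
        return "yes"                       # affirmed history of OSA at check-in
    if checklist_history == "no" and flemons_high_risk:
        return "yes"                       # screened as high risk by Flemons criteria
    return "yes" if diagnosis_mentions_osa(diagnosis_text) else "no"
```

A phrase such as "negative for OSA" is skipped by the negation check, while an unqualified mention of OSA in the Diagnosis section, an affirmed history, or a high-risk Flemons screen each code the patient as "yes."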

Subjects were divided into derivation and validation cohorts. The derivation cohort of 100 subjects was constructed with the initial OSA electronic search algorithm to consist of two equal samples of subjects at high or low likelihood of OSA. The derivation cohort then underwent manual review of the medical records to detect the true presence of or risk for OSA, and the initial algorithm was refined to address mismatches between the automated search and the manual review. Using this refined algorithm, we queried the remainder of the study patients as a validation cohort (Figure 1). Disagreements between the automated and manual searches were adjudicated by the senior author (T.N.W.), to whom the search results were masked. These 2 authors (O.O.O. and T.N.W.) were not involved in algorithm design and implementation.

Statistical Analyses

The study subjects were divided into derivation and validation cohorts as described above. For each cohort, the sensitivity and specificity of the final automated electronic search algorithm for identifying OSA were calculated using the manual review of the EHRs as the gold standard. Findings are summarized using point estimates and corresponding 95% exact binomial confidence intervals (CIs). Statistical software (JMP version 10.0; SAS Institute Inc) was used to compare and validate the automated search against the manual review.
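The exact binomial (Clopper-Pearson) interval can be reproduced from the binomial tail probabilities alone. The sketch below is a minimal stdlib illustration, not the JMP procedure used in the study; minor rounding differences from the reported intervals are possible.

```python
from math import comb

def clopper_pearson(x: int, n: int, alpha: float = 0.05):
    """Point estimate and exact binomial (Clopper-Pearson) CI for x successes in n trials.

    The lower bound solves P(X >= x | p) = alpha/2 (a tail that grows with p);
    the upper bound solves P(X <= x | p) = alpha/2 (a tail that shrinks with p).
    Both are found by bisection on p in [0, 1].
    """
    def binom_pmf(k, p):
        return comb(n, k) * p**k * (1 - p)**(n - k)

    def bisect(f, increasing):
        lo, hi = 0.0, 1.0
        for _ in range(60):  # 60 halvings: precision far below reporting needs
            mid = (lo + hi) / 2
            if (f(mid) > 0) == increasing:
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2

    lower = 0.0 if x == 0 else bisect(
        lambda p: sum(binom_pmf(k, p) for k in range(x, n + 1)) - alpha / 2,
        increasing=True)
    upper = 1.0 if x == n else bisect(
        lambda p: sum(binom_pmf(k, p) for k in range(0, x + 1)) - alpha / 2,
        increasing=False)
    return x / n, lower, upper
```

For the derivation-cohort sensitivity (55 of 56 patients detected), `clopper_pearson(55, 56)` yields a point estimate of 98.2% with an interval close to the reported 90.3%-99.9%.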

Results

From the pool of 558 adult surgical patients, a derivation cohort of 100 subjects was selected with the initial automated electronic search algorithm to include equal numbers of patients at high and low likelihood of OSA. The validation cohort consisted of the remaining 458 subjects. The comparison between manual chart review for OSA and the final automated electronic search algorithm is summarized in Table 1. In the derivation cohort, the automated algorithm achieved a sensitivity of 98.2% (95% CI, 90.3%-99.9%) and a specificity of 100.0% (95% CI, 92.0%-100.0%) compared with manual review. In the validation cohort, sensitivity was 100.0% (95% CI, 89.1%-100.0%) and specificity was 98.4% (95% CI, 96.0%-99.3%).

Discussion

This study showed that an automated electronic search algorithm can be developed to interrogate the EHR to identify surgical patients with a diagnosis of or at high risk for OSA in a clinical practice where all surgical patients are screened for OSA. The comparison between the algorithm and the manual chart review found that this automated strategy performs favorably and with a high degree of accuracy. Our results add credence to the use of computerized searches of EHRs of large patient cohorts to extract clinical variables, processes, and outcomes of interest. (2,3,12-14)

The increased perioperative risk that OSA poses for surgical patients is well recognized, (4,5) and the condition has become the topic of clinical management guidelines that call for increased vigilance for affected patients. (11,15) Despite the risk, OSA may present with vague symptoms (eg, increased daytime sleepiness), and population studies have suggested that most (approximately 90%) OSA cases are undiagnosed, (7) including among surgical patients. (6) In response, clinicians have increasingly called for preoperative screening of all surgical patients for OSA. However, the gold standard for diagnosing OSA, overnight polysomnography, is time consuming, expensive, and of limited availability, making it impractical as a screening tool. Various simple assessment tools have therefore been developed to screen patients for OSA. (8-10,15)

These caveats have implications for the development of an automated search algorithm. Because OSA is usually undiagnosed, reliance on billing codes or free-text searches of clinical notes and diagnoses alone is inadequate. Although OSA screening tools perform reasonably well, they are by no means completely accurate; (16) nevertheless, patients identified through this automated algorithm should be considered at high risk for OSA. Further, although an automated search algorithm still requires data entry into the EHR, it can greatly improve the visibility of a diagnosis that, although recorded in text, may be buried within redundant health records and go unnoticed by perioperative providers.

Limitations

This study has the inherent limitations of a retrospective design. Several aspects of our clinical practice may limit the generalizability of an automated search strategy. In our practice, a registered nurse screens surgical patients for OSA risk during the preoperative checklist. However, many practices have not adopted universal screening, (17,18) and in such settings our automated search algorithm would be less accurate. Undoubtedly, some patients bypass the nurse-administered checklist and therefore the OSA screen, but such cases typically occur only in emergencies or when a patient is already intubated and mechanically ventilated. In addition, our practice assesses sleep apnea risk with the Flemons criteria, (9) which raises a question of portability to practices that use an alternative assessment tool, such as STOP-BANG. (8) Because our automated search algorithm relies on assessment tools that screen for OSA, its accuracy is limited by the performance of the screening tool. (16) Furthermore, data entries in EHRs reflect a dynamic process, and OSA status may change with time (eg, after corrective oral surgery or weight loss); these factors must be considered in the final risk assessment. Incomplete data points or inconsistencies with the text search phrases can also limit the applicability of the algorithm, although this limitation likely affects a small number of patients in the database. Lastly, the algorithm was designed for retrospective identification of surgical patients at risk for OSA; a future direction is to develop it for prospective surveillance.

Conclusion

The present study details the development of an automated digital search algorithm that interrogates EHRs to identify not only surgical patients with the diagnosis of OSA but also those at high risk for OSA. These results reflect our clinical practice, where all surgical patients are preoperatively screened for OSA using an assessment tool.

Clinical Relevance Statement

OSA is common, is frequently undiagnosed, and is associated with increased risk of postoperative complications in surgical patients. This study describes the development, derivation, and validation of an automated digital search algorithm that interrogates EHRs to identify with a high degree of accuracy the surgical patients who have or are at high risk for OSA.

References

  1. Hsiao CJ, Hing E, Socey TC, Cai B. Electronic health record systems and intent to apply for meaningful use incentives among office-based physician practices: United States, 2001-2011. National Center for Health Statistics data brief, no. 79. 2011 Nov (revised 2012 Feb 8); 8 p.
  2. Singh B, Singh A, Ahmed A, Wilson GA, Pickering BW, Herasevich V, Gajic O, Li G. Derivation and validation of automated electronic search strategies to extract Charlson comorbidities from electronic medical records. Mayo Clin Proc 2012 Sep;87(9):817-24.
  3. Tien M, Kashyap R, Wilson GA, Hernandez-Torres V, Jacob AK, Schroeder DR, Mantilla CB. Retrospective derivation and validation of an automated electronic search algorithm to identify post operative cardiovascular and thromboembolic complications. Appl Clin Inform 2015 Sep 9;6(3):565-76.
  4. Gali B, Whalen FX, Schroeder DR, Gay PC, Plevak DJ. Identification of patients at risk for postoperative respiratory complications using a preoperative obstructive sleep apnea screening tool and postanesthesia care assessment. Anesthesiology 2009 Apr;110(4):869-77.
  5. Weingarten TN, Herasevich V, McGlinch MC, Beatty NC, Christensen ED, Hannifan SK, Koenig AE, Klanke J, Zhu X, Gali B, Schroeder DR, Sprung J. Predictors of delayed postoperative respiratory depression assessed from naloxone administration. Anesth Analg 2015 Aug;121(2):422-9.
  6. Singh M, Liao P, Kobah S, Wijeysundera DN, Shapiro C, Chung F. Proportion of surgical patients with undiagnosed obstructive sleep apnoea. Br J Anaesth 2013 Apr;110(4):629-36. Epub 2012 Dec 19.
  7. Young T, Evans L, Finn L, Palta M. Estimation of the clinically diagnosed proportion of sleep apnea syndrome in middle-aged men and women. Sleep 1997 Sep;20(9):705-6.
  8. Chung F, Elsaid H. Screening for obstructive sleep apnea before surgery: why is it important? Curr Opin Anaesthesiol 2009 Jun;22(3):405-11.
  9. Flemons WW, Whitelaw WA, Brant R, Remmers JE. Likelihood ratios for a sleep apnea clinical prediction rule. Am J Respir Crit Care Med 1994 Nov;150(5 Pt 1):1279-85.
  10. Netzer NC, Stoohs RA, Netzer CM, Clark K, Strohl KP. Using the Berlin Questionnaire to identify patients at risk for the sleep apnea syndrome. Ann Intern Med 1999 Oct 5;131(7):485-91.
  11. Seet E, Chung F. Obstructive sleep apnea: preoperative assessment. Anesthesiol Clin 2010 Jun;28(2):199-215.
  12. Alsara A, Warner DO, Li G, Herasevich V, Gajic O, Kor DJ. Derivation and validation of automated electronic search strategies to identify pertinent risk factors for postoperative acute lung injury. Mayo Clin Proc 2011 May;86(5):382-8.
  13. Newton KM, Peissig PL, Kho AN, Bielinski SJ, Berg RL, Choudhary V, Basford M, Chute CG, Kullo IJ, Li R, Pacheco JA, Rasmussen LV, Spangler L, Denny JC. Validation of electronic medical record-based phenotyping algorithms: results and lessons learned from the eMERGE network. J Am Med Inform Assoc 2013 Jun;20(e1):e147-54. Epub 2013 Mar 26.
  14. Smischney NJ, Velagapudi VM, Onigkeit JA, Pickering BW, Herasevich V, Kashyap R. Retrospective derivation and validation of a search algorithm to identify emergent endotracheal intubations in the intensive care unit. Appl Clin Inform 2013 Sep 4;4(3):419-27.
  15. American Society of Anesthesiologists Task Force on Perioperative Management of patients with obstructive sleep apnea. Practice guidelines for the perioperative management of patients with obstructive sleep apnea: an updated report by the American Society of Anesthesiologists Task Force on Perioperative Management of patients with obstructive sleep apnea. Anesthesiology 2014 Feb;120(2):268-86.
  16. Weingarten TN, Kor DJ, Gali B, Sprung J. Predicting postoperative pulmonary complications in high-risk populations. Curr Opin Anaesthesiol 2013 Apr;26(2):116-25.
  17. Benumof JL. Mismanagement of obstructive sleep apnea may result in finding these patients dead in bed. Can J Anaesth 2016 Jan;63(1):3-7. Epub 2015 Oct 19.
  18. Cordovani L, Chung F, Germain G, Turner K, Turgeon AF, Hall R, Gay PC, Bryson GL, Choi PT; Canadian Perioperative Anesthesia Clinical Trials Group. Perioperative management of patients with obstructive sleep apnea: a survey of Canadian anesthesiologists. Can J Anaesth 2016 Jan;63(1):16-23. Epub 2015 Oct 19.

Figure 1. Flowchart of Study Cohorts

Table 1. Performance of an automated electronic search algorithm for the detection of obstructive sleep apnea by the interrogation of electronic health records of surgical patients.

                          Derivation cohort                         Validation cohort
Manual review result      Algorithm +   Algorithm -   Total        Algorithm +   Algorithm -   Total
OSA                       55            1             56           80            0             80
No OSA                    0             44            44           6             372           378
Total                     55            45            100          86            372           458
Sensitivity, % (95% CI)   98.2 (90.3-99.9)                         100.0 (89.1-100.0)
Specificity, % (95% CI)   100.0 (92.0-100.0)                       98.4 (96.0-99.3)

OSA, obstructive sleep apnea.
Sensitivity and specificity are presented as point estimates with 95% exact binomial confidence intervals.

Corresponding author:
Toby N Weingarten
Department of Anesthesiology
Mayo Clinic, 200 First Street SW
Rochester, MN 55905
Phone: (507) 255-1612
Fax: (507) 255-6463
E-mail: weingarten.toby@mayo.edu

This work is licensed under a Creative Commons Attribution 4.0 International License.