Sound Information Handling

Error Category | Example Adverse Event
Procedures | Central venous catheter punctures pleura
Preventive Medicine | Inadequate preparation before surgery
Diagnosis | Delay in notifying patient after positive lab test
Drugs | Wrong dose, method, or drug
Drug Errors. The proposed automated system can check prescriptions against patient conditions, known allergies, and relevant physical characteristics. Drug protocols can also be checked for ambiguities. The automated support for error handling can speed up detection of drug errors and help to identify their underlying causes. It can also participate in countermeasures to suppress adverse drug reactions.
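The kind of prescription check described above can be sketched as follows. This is a minimal illustration, not the proposed system itself; the record structures, field names, and formulary values are assumptions for the example, not clinical guidance.

```python
# Sketch of an automated prescription integrity check against patient
# conditions, allergies, and physical characteristics.
from dataclasses import dataclass, field

@dataclass
class Patient:
    allergies: set = field(default_factory=set)
    conditions: set = field(default_factory=set)
    weight_kg: float = 70.0

@dataclass
class Prescription:
    drug: str
    dose_mg: float

# Hypothetical formulary entries: per-drug dose ceilings and
# contraindicated conditions (values are illustrative only).
FORMULARY = {
    "cisplatin": {"max_mg_per_kg": 2.0, "contraindications": {"renal failure"}},
}

def check_prescription(p: Prescription, patient: Patient) -> list:
    """Return a list of warnings; an empty list means no issue was found."""
    entry = FORMULARY.get(p.drug)
    if entry is None:
        return [f"unknown drug: {p.drug}"]
    warnings = []
    if p.drug in patient.allergies:
        warnings.append(f"patient is allergic to {p.drug}")
    overlap = entry["contraindications"] & patient.conditions
    if overlap:
        warnings.append(f"{p.drug} contraindicated for: {', '.join(sorted(overlap))}")
    if p.dose_mg > entry["max_mg_per_kg"] * patient.weight_kg:
        warnings.append(f"dose {p.dose_mg} mg exceeds ceiling for {patient.weight_kg} kg patient")
    return warnings
```

A real system would draw the formulary and patient data from validated clinical sources; the point here is only that each check is a simple, mechanizable predicate.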
Diagnostic Errors. The proposed system can perform integrity checks that flag unlikely diagnoses. It can help doctors fulfill implied commitments by notification of need for additional diagnostic procedures as well as timely and accurate delivery of lab results, for example. The system can also handle diagnostic errors after their detection and suppress further use of resulting misinformation.
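The "timely delivery of lab results" commitment mentioned above is also mechanizable. The sketch below flags results whose patients have not been notified within a deadline; the record fields and the 24-hour limit are assumptions for illustration.

```python
# Sketch of a timeliness check for lab-result notification.
from datetime import datetime, timedelta

def overdue_notifications(results, now, limit=timedelta(hours=24)):
    """Return lab results whose patients were not notified within the limit.

    Each result is assumed to be a dict with a 'resulted_at' timestamp
    and an optional 'notified_at' timestamp (None if never notified).
    """
    return [r for r in results
            if r.get("notified_at") is None and now - r["resulted_at"] > limit]
```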
Preventive Errors. The proposed system can perform integrity checks on planned surgical procedures to ensure that prerequisite activities are performed (e.g., avoidance of solid foods for twelve hours before surgery, availability of needed lab results). Not only can the system prompt for confirmation of prerequisite activities, it can also actively participate in determining the causes of failures after they occur.
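A preoperative prerequisite check of this kind reduces to evaluating a checklist of predicates over the patient's chart. In the sketch below, the chart fields and the specific prerequisites are illustrative assumptions, not clinical policy.

```python
# Sketch of a preoperative prerequisite check.
from datetime import datetime, timedelta

def check_surgical_prerequisites(chart: dict, surgery_time: datetime) -> list:
    """Return the list of unmet prerequisites (empty if the patient is ready)."""
    unmet = []
    last_meal = chart.get("last_solid_food")
    if last_meal is None or surgery_time - last_meal < timedelta(hours=12):
        unmet.append("no solid food for 12 hours before surgery")
    if not chart.get("lab_results_available", False):
        unmet.append("required lab results on file")
    if not chart.get("consent_signed", False):
        unmet.append("signed consent form")
    return unmet
```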
Procedure/Performance Errors. In the short term the proposed automated system can provide support that consists primarily of accurate, timely error-handling. Over a longer term, collected data can reduce further incidence of errors by identifying error-prone procedures that can then be targeted for improvement or replacement.
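Identifying error-prone procedures from collected data can be as simple as aggregating logged error reports. The report structure and threshold below are assumptions for the sketch.

```python
# Sketch of longer-term error tracking: tally logged error reports by
# procedure and flag those that exceed a review threshold.
from collections import Counter

def error_prone_procedures(reports, threshold=5):
    """Return procedures whose logged error count meets the threshold."""
    counts = Counter(r["procedure"] for r in reports)
    return [proc for proc, n in counts.items() if n >= threshold]
```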
The first three error categories (drug, diagnostic, and preventive errors) account for roughly one-half of all serious adverse events. They can be addressed, and largely eliminated, with existing technology, and these changes can begin immediately. We have the opportunity to make a profound difference in both patient suffering and medical costs.
Important as system design is to success, the education and commitment of human beings are even more critical. Table 1 shows what medical practitioners, administrators, and patients need to do. All roles are essential to success.
Soundness Role | Information-Handling Responsibilities | Stakeholder
Qualified User | Provide correct inputs | Patient, doctor, nurse, …
Error Reporter | Detect incorrect outputs | Patient, pharmacist, nurse, …
Error Investigator | Determine causes of errors | Researcher, quality review …
Error-Tracking Administrator | Manage informatic roles; find errors in informatic systems | Hospital administrator, …

Table 1. Roles and Responsibilities
Qualified Users provide credible inputs that are accurate, timely, and highly reliable after integrity validation, possibly by an automated system. For example, although an MD’s license to practice medicine is comprehensive, hospitals increasingly require certification by a specialty review board and evidence of continuing education before granting the doctor admitting privileges at the hospital.
Error Reporters identify and document potential errors. Full error-reporting is essential. If errors are not detected, they can easily propagate and compound one another with increasingly harmful results until finally detected and handled. With proper informatic support, spurious error reports are harmless, imposing only additional processing overhead.
Error Investigators determine the validity of the error reports they receive and provide corrective inputs to mitigate the effects of errors. A finding by an error investigator must have greater credibility than information which is discredited by the error report and subsequent investigation. Because error investigators may invalidate certified inputs, they also are reporters of certification errors.
Error-Tracking Administrators make role assignments, ensuring that users are adequately qualified, that all errors are reported, and that error investigators successfully detect the causes of errors, including errors in the informatic systems themselves.
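The credibility ordering implied by these roles, in which an investigator's finding outranks the certified input it discredits, can be made explicit. The levels and their ranking below are assumptions for illustration.

```python
# Minimal sketch of a credibility ordering over competing claims about
# the same datum: investigated findings outrank corroborated reports,
# which outrank merely certified inputs.
CREDIBILITY = {"uncertified": 0, "certified": 1, "corroborated": 2, "investigated": 3}

def resolve(claims):
    """Given (value, credibility_level) pairs for the same datum,
    keep the claim with the highest credibility."""
    return max(claims, key=lambda c: CREDIBILITY[c[1]])
```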
Shortly after our first model of informatic soundness went to press, Betsy A. Lehman, Health Editor for the Boston Globe, died of a drug overdose at the Dana-Farber Cancer Institute. The events surrounding her untimely death became the subject of a careful investigation and extensive press coverage (e.g., Kong 1995; Knox 1995a; Knox 1995b). The reported events illustrate most of the key requirements from our theory.
1. Stable form. The drug manufacturer's treatment summary specified 4,000 mg in four days in a way that could have meant either 4g each day for four days or 4g total over a four-day treatment cycle. The doctor who ordered the medication misinterpreted the manufacturer's intent. Lack of stable form is a common source of drug-related errors; similar problems include sound-alike names and look-alike containers.
2. Integrity-validation checks. The amount prescribed for Ms. Lehman was inconsistent with what she had received in a previous treatment cycle. This inconsistency was not checked, contrary to Dana-Farber policy. At the time, Dana-Farber had no dosage ceiling for the drug.
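Both missing checks, consistency with the prior treatment cycle and a per-drug dosage ceiling, are easy to mechanize. The function below is a sketch; the drug name, thresholds, and tolerance are illustrative assumptions, not protocol values.

```python
# Sketch of the two dose checks that were absent in this case.
def check_dose(drug: str, ordered_mg: float, prior_cycle_mg: float,
               ceiling_mg: float, tolerance: float = 0.25) -> list:
    """Flag a dose that exceeds a ceiling or departs sharply from history."""
    flags = []
    if ordered_mg > ceiling_mg:
        flags.append(f"{drug}: ordered {ordered_mg} mg exceeds ceiling {ceiling_mg} mg")
    if prior_cycle_mg and abs(ordered_mg - prior_cycle_mg) / prior_cycle_mg > tolerance:
        flags.append(f"{drug}: ordered {ordered_mg} mg differs from prior "
                     f"cycle {prior_cycle_mg} mg by more than {tolerance:.0%}")
    return flags
```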
3. Higher credibility for error investigations. The same drug error occurred in another patient at Dana-Farber at about the same time and was reported by a pharmacist. The error report, investigated by the same doctor who was treating Betsy Lehman, was overridden. Neither the doctor nor the pharmacist consulted the detailed protocol description for the drug, and neither called the pharmaceutical company that had issued the ambiguous treatment protocol summary for clarification.
4. Higher credibility for corroborated data. Two other pharmacists corroborated the original error report. These reports were also dropped in favor of the original erroneous interpretation of the ambiguous treatment summary.
5. Basis for investigation. After the first dose, Betsy Lehman herself reported that something was wrong: her reaction to the chemotherapy was quite different from her previous experience. Her attendants overrode this report. Error reports were not routinely logged, so there was no process by which such information could accumulate and provide the basis for a thorough investigation.
6. Investigation of antecedent causes. Lab results showed an abnormal spike in a metabolite of the administered drug. This did not lead to discovery of the original antecedent error which led to the spike.
7. Propagation of error retractions. Six months later, the same semantic ambiguity in daily versus treatment-cycle doses killed a cancer patient at the University of Chicago Hospital.
Examples similar to the one cited above have driven the development of our methodology and the three primary objectives that it supports:
Correctness | All certified inputs are correct.
Basis | All inputs responsible for a given warranted output can be identified.
Error handling | Incorrect or discredited outputs can be retracted. Further use of incorrect or discredited inputs can be suppressed.
Where do we go from here? As medical informatics specialists we need to act both globally and locally. At the global level we can work to influence the development of standards. The evolving HL7 standard, for example, needs to be subjected to a rigorous analysis of its ability to support informatic soundness.
In our own institutions we should initiate, educate, assess, plan, and implement error-reduction strategies. The health-care industry is presently undergoing a massive automation effort aimed primarily at reducing costs. Estimates of projected total expenditures run as high as $3T (Vendeland, 1995). We need to enhance our own understanding of error-handling and share it throughout our institutions in applications ranging from purchasing decision-making for IS installations to quality review boards.
The task will not be easy, but the rewards will be great and measurable, one life at a time.
Causes of Adverse Events

Cause | Share
Technical errors in surgical procedures | 35%
Inadequate preventive measures | 22%
Diagnostic errors | 14%
Drug-related errors | 9%
Inadequate infrastructure | 2%
Other | 17%

The above table is taken from the work of Leape et al. (1991).
The first objective, correctness, is to maintain correct information in the absence of introduced errors. As illustrated in the following diagram, the notion of correctness is slightly different for different kinds of information:
The basis objective is to be able to justify each input and output on the basis of previously justified rules of acceptability. Thus, not all inputs can be accepted and made use of by the system, as suggested in the following diagram:
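The basis objective can be sketched as a bookkeeping structure: every warranted output records the inputs that justify it, and responsible inputs can be identified transitively when an input is itself a derived output. The structure and identifiers below are illustrative assumptions.

```python
# Sketch of basis tracking: each warranted output records its
# justifying inputs so they can be identified later.
class BasisStore:
    def __init__(self):
        self.basis = {}  # warranted output id -> set of justifying input ids

    def warrant(self, output_id, input_ids):
        """Record an output together with the inputs that justify it."""
        self.basis[output_id] = set(input_ids)

    def inputs_for(self, output_id):
        """Identify all inputs responsible for a warranted output,
        following chains where an input is itself a derived output."""
        seen, stack = set(), [output_id]
        while stack:
            x = stack.pop()
            for i in self.basis.get(x, ()):
                if i not in seen:
                    seen.add(i)
                    stack.append(i)
        return seen
```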
The error-handling objective pertains to the treatment of errors once they have been introduced. As illustrated in the following diagram, errors arise and enter a system, where they may be falsely treated as correct and lead to additional errors that propagate into the system's environment. The objective is to discover such errors, issue revocations for propagated errors, and restore sound information handling by eliminating the errors and all of their erroneous consequences from the system.
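This retraction-and-propagation step can be sketched over a derivation graph: retracting a discredited item suppresses everything derived from it, directly or transitively. The graph representation is an illustrative assumption.

```python
# Sketch of error handling over a derivation graph: retracting a
# discredited input also retracts all of its erroneous consequences.
from collections import defaultdict, deque

class DerivationGraph:
    def __init__(self):
        self.derived_from = defaultdict(set)  # item -> outputs derived from it
        self.retracted = set()

    def derive(self, output, inputs):
        """Record that an output was derived from the given inputs."""
        for i in inputs:
            self.derived_from[i].add(output)

    def retract(self, item):
        """Retract an item and propagate retraction to all consequences."""
        queue = deque([item])
        while queue:
            x = queue.popleft()
            if x in self.retracted:
                continue
            self.retracted.add(x)
            queue.extend(self.derived_from[x])
        return self.retracted
```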
Bates, D.W., et al., July 1995, “Incidence of Adverse Drug Events and Potential Adverse Drug Events,” Journal of the American Medical Association, Vol. 274, No. 1, pp. 29-34.
Clark, D.D., and D.R. Wilson, April 1987, “A Comparison of Commercial and Military Computer Security Policies,” Proceedings of the 1987 Symposium on Security and Privacy, IEEE.
Guttman, J.D., and D.M. Johnson, 1994, “Three Applications of Formal Methods at MITRE,” in FME '94: Industrial Benefits of Formal Methods, edited by M. Naftalin, T. Denvir, and M. Bertran, Springer Lecture Notes in Computer Science, Vol. 873, pp. 55-65.
Knox, R.A., and D. Golden, May 28, 1995, “Dana-Farber turmoil seen,” Boston Sunday Globe, pp. 1, 13.
Knox, R.A., December 26, 1995, “Overdoses still weigh heavy at Dana-Farber,” The Boston Globe, pp. 1, 20.
Kong, D., March 25, 1995, “Safeguards failed at Dana-Farber,” The Boston Globe, pp. 1, 5.
Leape, L.L., et al., February 1991, “The Nature of Adverse Events in Hospitalized Patients: Results of the Harvard Medical Practice Study,” New England Journal of Medicine, Vol. 324, No. 6, pp. 377-384.
Leape, L.L., December 1994, “Error in Medicine,” Journal of the American Medical Association, Vol. 272, No. 23, pp. 1851-1857.
Tarski, A., 1936, “The Concept of Truth in Formalized Languages,” in Logic, Semantics, Metamathematics, translated by J. H. Woodger, Oxford University Press, 1956, pp. 152-278.
Vendeland, A.J., April 10, 1995, “Medical Moon Shot,” Computerworld.
Williams, J. G., and L. J. LaPadula, 1995, “Modeling External Consistency of Automated Systems,” Journal of High Integrity Systems, Vol. 1, No. 3, pp. 249-267.
The author thanks Len LaPadula for many substantive discussions and Dr. Lucian Leape for valuable suggestions.
James Williams has extensive experience in creating and evaluating high-assurance systems. His primary interests are in promoting sound information handling in health care. Dr. Williams has contributed professionally in the area of program verification — developing and testing methodologies for ensuring that computer programs perform their intended functions. He has published many papers on formal modeling, program verification, automated deduction, computer security, and various mathematical topics including logic, topology, category theory, and algebra. A member of Computer Scientists for Social Responsibility, he is an avid programmer in his spare time. He holds a Ph.D. in mathematics from the University of California at Berkeley and did post-graduate work in computer science at the University of Texas at Austin.
† Published in Toward an Electronic Patient Record '96, Vol. 2, pp. 348-355, Medical Records Institute, May 1996.