One way to reduce errors is to learn from them. An error-reporting system is a reasonable first step and a recommendation of the Institute of Medicine’s report, To Err Is Human: Building a Safer Health System. Reporting systems can be mandatory or voluntary. Either way, they need to be standardized so the data can be interpreted across states and regions, making large-scale analysis possible.
States have established mandatory reporting systems for errors, primarily designed to ensure provider accountability. Regulatory agencies investigate serious medical injuries or fatalities and may issue penalties or fines. These cases represent only the most severe instances of medical errors, with most incidents remaining unreported.
Several organizations sponsor voluntary reporting systems for errors. The Joint Commission on Accreditation of Healthcare Organizations (JCAHO) has several initiatives for medical error reporting. The US Pharmacopeia (USP) lets frontline practitioners report medication errors through its Medication Errors Reporting Program (MER) and offers hospitals anonymous reporting through its MedMARx program. The FDA encourages practitioners to report adverse drug reactions and device problems via MedWatch. These are disparate systems, each with its own narrow purpose.
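To make the standardization point concrete, here is a minimal sketch of what a shared report schema might look like; the field names, categories, and the harm-rate summary are hypothetical illustrations, not the actual format of any of the programs above.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical common schema; MER, MedMARx, and MedWatch each use their own
# incompatible formats, which is what blocks cross-system analysis.
@dataclass
class ErrorReport:
    report_date: date
    state: str                       # reporting state, e.g. "TN"
    care_setting: str                # "hospital", "clinic", "pharmacy", ...
    error_type: str                  # "medication", "surgical", "diagnostic", ...
    severity: str                    # "near_miss", "harm", or "death"
    contributing_factors: list[str]  # e.g. ["distraction", "override"]
    narrative: str                   # free-text description

def harm_rate_by_state(reports: list[ErrorReport]) -> dict[str, float]:
    """Share of reports involving harm or death per state: the kind of
    cross-region comparison that a shared schema makes possible."""
    totals: dict[str, tuple[int, int]] = {}
    for r in reports:
        harmed, count = totals.get(r.state, (0, 0))
        totals[r.state] = (harmed + (r.severity in ("harm", "death")), count + 1)
    return {s: h / c for s, (h, c) in totals.items()}
```

With every program emitting records like these, the state-by-state comparison above becomes a few lines of code instead of a data-reconciliation project.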
When Systems Fail: A Nurse’s Nightmare
In late 2017, 75-year-old Charlene Murphey was preparing for discharge from Vanderbilt University Medical Center when a PET scan was ordered for routine surveillance of her prior cancer diagnosis. Because she was anxious about the scan, her physician prescribed the anti-anxiety medication Versed (midazolam) beforehand.
A new nurse, RaDonda Vaught, acting in her capacity as a “help-all” nurse, was tasked with obtaining the medication. She attempted to retrieve it from an automated medication dispensing cabinet but was unable to locate it, so she performed a medication override, a common practice at the time. The override allowed her to obtain vecuronium, a paralytic typically used during surgery, instead of the correct medication.
The patient became unresponsive. While medical staff attempted unsuccessfully to resuscitate Murphey, Vaught recognized her mistake and informed the physicians, explaining that she had been distracted by another patient task while administering the medication. Distractions are a known cause of medication errors. Early investigation revealed that the error was not a matter of malicious intent but a convergence of human error and system failure. Vaught accepted the blame and was fired immediately. The patient’s family was told she died of “natural causes.” The Tennessee Board of Nursing reviewed the case and agreed it was an accident.
Behind Closed Doors: The Concealment of Negligence
In the wake of the patient’s death, other details gradually surfaced. It became apparent that Vanderbilt University Medical Center had made significant efforts to conceal the full extent of the error. Initial reports from within the hospital downplayed the severity of the incident and narrowly framed it as an unfortunate accident rather than a systemic failure. Vanderbilt was rolling out a new electronic health record system at the time, and problems with the automated medication dispensing cabinets required nurses to frequently override the system. Subsequent reviews, including a 56-page Centers for Medicare and Medicaid Services (CMS) report, detailed Vanderbilt’s deficiencies. The report outlined how the hospital’s internal communications and procedures actively minimized the incident for public relations reasons and to avoid regulatory and legal repercussions. The cover-up was especially disturbing because it concealed glaring deficiencies in patient monitoring protocols and the mismanagement of the hospital’s computerized medication dispensing systems. These actions illustrated a broader pattern of hospital negligence.
How the Error Was Exposed
Although internal reports initially described the incident as a mere accident, the truth began to emerge after an anonymous tip prompted federal authorities (CMS) to reopen the investigation. This renewed scrutiny led the Tennessee Board of Nursing to conduct its own review of the case. The publicity resulted in a police investigation, and criminal charges were filed. Hindsight bias shifted the narrative from an isolated error to gross negligence and reckless disregard for patient safety by the nurse. The transformation in emphasis, from systemic (latent) problems in the hospital to individual accountability (active problem) of the nurse, demonstrates exactly what we discussed in Do Medical Errors Happen by Accident or by Design.
The Outcome
In March 2022, after a highly publicized trial that reverberated throughout the healthcare industry, RaDonda Vaught was found guilty of criminally negligent homicide and gross neglect of an impaired adult. At trial, the family advocated for forgiveness and against prison. Rather than the eight-year prison sentence she might have faced, Vaught was sentenced to three years’ probation. While Nurse Vaught was criminally prosecuted, no financial penalties or public disciplinary actions were imposed on the hospital itself because administrators claimed “they didn’t know.”1
Opinion
This case is a tragedy on so many levels. Here are my observations:
Unnecessary Testing - A debate in the oncology community questions the use of routine radiological investigations in cancer survivors.2 Perhaps the patient did not need the procedure at all. We will discuss this in an upcoming essay, Are We Testing Too Much. Excessive testing also appeared in Why So Many Visits for a Simple Condition. Healthcare can be dangerous, and tests should have a clear benefit.
Poorly Structured Organization - Only a non-clinical administrator would assign a newly hired nurse the role of “help all.” She could be sent to many different departments where she would not be familiar with the processes or know how to get support when overwhelmed. This is a systemic latent error, as discussed in Do Medical Errors Happen by Accident or by Design.
Bad Personnel Management - AI scheduling systems are available that align personnel requirements with patient flow, removing the need for a “help all” nurse role and shifting away from a fixed schedule based on a labor budget. Misalignment of personnel increases distractions as the employee tries to do too much. This latent error is poor management; a minimal sketch of demand-based staffing follows below.
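Here is a minimal sketch of the demand-based staffing idea, assuming only an hourly census forecast and a fixed patient-to-nurse ratio; the numbers and the ratio are invented for illustration, and real scheduling systems use far richer models.

```python
import math

# Invented hourly patient-census forecast for one unit.
forecast_census = [12, 14, 18, 22, 25, 24, 20, 16]

PATIENTS_PER_NURSE = 4    # assumed safe staffing ratio
fixed_schedule = [4] * 8  # labor-budget schedule: 4 nurses every hour

def nurses_needed(census: int) -> int:
    """Nurses required to keep the ratio at or below the assumed limit."""
    return math.ceil(census / PATIENTS_PER_NURSE)

for hour, (census, staffed) in enumerate(zip(forecast_census, fixed_schedule)):
    need = nurses_needed(census)
    gap = need - staffed
    print(f"hour {hour}: census={census} need={need} staffed={staffed}"
          + (f"  SHORT by {gap}" if gap > 0 else ""))
```

When staffing is computed from demand, the shortfalls become explicit and schedulable rather than being papered over with an undefined float role.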
Incorrect Installation and Poor Maintenance - The error could not have happened if the computerized medication cabinet had been interfaced correctly with the electronic health record: the doctor orders a medication, and only that medication can be dispensed by the nurse (see the sketch below). Failure of the cabinet is another latent error of the system.
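A minimal sketch of that closed loop, assuming the cabinet can query the patient’s active orders; the names and the override behavior are illustrative, not any vendor’s actual interface.

```python
# Illustrative closed-loop check: the cabinet releases a drug only if it
# matches an active order in the EHR for that patient.
active_orders = {
    "murphey_c": {"midazolam"},  # the ordered drug (Versed)
}

def dispense(patient_id: str, drug: str, override: bool = False) -> bool:
    ordered = active_orders.get(patient_id, set())
    if drug in ordered:
        return True   # verified against the active order
    if override:
        # The override path is the latent failure in this case; at minimum
        # it should alarm and require a second sign-off, not silently open.
        raise PermissionError(
            f"{drug} is not ordered for {patient_id}; override needs review")
    return False      # blocked: no matching order exists

print(dispense("murphey_c", "midazolam"))   # True
print(dispense("murphey_c", "vecuronium"))  # False: cabinet refuses
```

With a working interface, retrieving vecuronium for this patient is impossible without a deliberate, supervised exception.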
Poor Design - The electronic health record systems were deployed because of a well-intentioned government mandate, but they were not fully functional for the clinical needs of practitioners. Even today, doctors and nurses find EHRs cumbersome at the bedside, so much of the record is completed long after care is rendered or simply cut and pasted for billing.
The Chilling Effect: How Fear of Reporting Errors Endangers Patients
This case created a legal precedent for the criminalization of medical errors. Punitive measures instill a chilling effect within the profession, discouraging the reporting of errors and thereby compromising the transparency on which patient safety depends. From a purely self-protective standpoint, Nurse Vaught would have been better off saying nothing. Her mistake might never have been recognized. The hospital tried to cover it up. If asked, she could have denied giving the wrong drug; if questioned under oath, she could have taken the Fifth. Despite the system’s failures, Nurse Vaught chose transparency, owned her mistake, and paid a steep price, losing her job, her license, and nearly her freedom. No one else in the system was held accountable. It is unclear that the health system learned anything from this preventable death.
The Illusion of Safety: The Absence of Patient Safety Reporting
The case has prompted renewed discussion of reporting systemic shortcomings (latent errors) rather than only individual (active) errors. Such an approach could promote transparency and continuous improvement. Every event could highlight areas where systems, procedures, or training can be strengthened to make healthcare safer in the future.
Foundation models (AI) are well-suited to analyzing large datasets and uncovering latent errors, but there are no existing plans to aggregate healthcare safety data for comprehensive study with these tools. Worse, academic researchers are routinely denied access to healthcare data for specious privacy reasons, erroneously citing the Health Insurance Portability and Accountability Act (HIPAA).
Other industries have established national monitoring systems that have enhanced safety. The Aviation Safety Reporting System is a voluntary, confidential system that allows individuals to report near misses and close calls to improve aviation safety. The Occupational Safety and Health Administration (OSHA) operates a similar system for industrial safety. Healthcare has no nationwide patient safety reporting system and no organization authorized to address patient safety at the national level.
The system works exactly as designed. The purpose of a system is what it does.3 As Dr. Stafford Beer taught, there is no point in claiming that the purpose of the healthcare system is to improve safety by learning from medical errors when it consistently fails to do so.
Conclusion
Healthcare is not broken because of bad nurses. Reporting of medical errors by the people who commit them (transparency) does not appear to be practical without changes to the basic design of the health system.
Preview
Following the holiday, we continue to examine how medical errors can be reduced through transparency, redundant safety measures, and building a culture of continuous learning and adaptability.
Welch G, Dossett L. Routine Surveillance for Cancer Metastases — Does It Help or Harm Patients? N Engl J Med. 2025;392(17):1667-1670.
Stafford Beer, 20th-century British management theorist and the father of systems thinking in organizations.