Crew Resource Management Applied to Ophthalmology Practices
How to create a culture of safety.
BY RICHARD J. RUCKMAN, M.D., F.A.C.S.
Medical errors kill between 44,000 and 98,000 hospitalized patients a year. The Institute of Medicine (IOM) published that statement in 1999 in "To Err is Human: Building a Safer Health System," which followed the death of prominent Boston Globe health columnist Betsy Lehman. Lehman died in December 1994 after accidentally receiving an overdose of chemotherapy. At the time, a great deal of attention centered on the terrifying "epidemic" of medical mistakes being made by the healthcare system.
The IOM report also suggested that one possible means of reducing errors in the medical setting would be to implement formal training in teamwork, analogous to Crew Resource Management (CRM) in aviation.1 This concept has been embraced by The Joint Commission, formerly known as the Joint Commission on Accreditation of Healthcare Organizations (JCAHO), which has recommended team training as part of a comprehensive safety plan.2
Why Crew Resource Management?
In the 1970s, investigators determined that more than 70% of air crashes resulted from human error rather than equipment failure or weather. A 1979 NASA workshop concluded that the majority of these errors consisted of failures in leadership, team coordination and decision making, a finding that led to the development of CRM. CRM emphasizes the role of human factors in high-stress, high-risk environments and is defined by the National Transportation Safety Board as "using all available resources — information, equipment and people — to achieve safe and efficient flight operations."3 CRM training emphasizes team building and flattening the chain of command to improve decision making. Does it work? In surveys, airline crew members consistently cite CRM training as relevant, useful and effective.4
Prior to the introduction of CRM, the aviation industry provided many examples of failed group dynamics. One was the 1982 crash of an Air Florida airliner, in which ice on an airspeed indicator caused the pilot to apply too little power on takeoff:
First Officer: Ah, that's not right.
Captain: Yes, it is; there's 80 [referring to speed].
First Officer: Nah, I don't think it's right. Ah, maybe it is.
Captain: Hundred and twenty.
First Officer: I don't know.
The speed was not correct, and the first officer's timid response contributed to the plane stalling and crashing into the Potomac River.5
Breakdown in Communication Leads to Errors
Errors in ophthalmology can be much harder to define than those in aviation, but all of us have been in situations where a breakdown in communication or protocol affected patient care. As an example, many years ago, long before the "time out" was part of the surgical routine, I was doing insurance-required surgery at our local hospital instead of at my ASC. By that time, most cataract surgeries were done in ASCs, and many of the hospital OR personnel were not entirely familiar with the procedure.
In this case, the anesthesiologist discovered that some requested lab work was not ready and changed the sequence of the surgical cases. However, this information was not conveyed to everyone and, as a result, the wrong IOL was implanted. The patient did well following an IOL exchange but had to endure a second surgery. Analyzing the situation revealed not only my failure to communicate with all team members but also a failure of protocols that would normally have been in place in my ASC.
As ophthalmologists, we have been trained to develop technical skills, with little emphasis on how we interact in the complex world of a busy clinic or OR. CRM recognizes that "to err is human" and provides a more formalized structure for group dynamics, one that identifies potential errors as "threats" to be avoided or mitigated. CRM is structured to manage these threats. In a small way, we already use the concepts of CRM: they exist in our guidelines for marking surgical sites and requiring a time out in the OR.
Risks and Errors
In the OR, we are focused. The surgeon and his team have completed a "pre-flight" checklist, minimized distractions and made a plan for the expected as well as the unexpected. Outside the OR, it may be a different story. Complacency about "minor" surgery may lead to inattentiveness to protocols or documentation. OMIC has reported that some of its largest claims have involved such "minor" procedures as eyelid biopsy and papilloma or cyst removal. One such claim was for $975,000 for visual loss after excision of a chalazion.6
The IOM study identifies adverse drug events and medication errors as the most common medical errors. As ophthalmologists, we write numerous prescriptions: we prescribe in more than 50% of office visits, which represents more than 40 million prescriptions per year. At that volume, even a 1% error rate would mean roughly 400,000 errors annually, and a study from a British eye hospital reported a medication error rate of 8%.7 In the United States, medication errors are the third most common complaint in medicolegal cases.8
The Principles of CRM
CRM training teaches that we are all fallible and prone to fatigue and error. While as physicians we may feel we can always add a few more patients to a busy day, that confidence is not shared by all. One CRM study asked OR staff to respond to the following statement: "Even when fatigued, I perform effectively during critical times." Surgeons agreed 70% of the time, while nurses and pilots agreed 60% and 26% of the time, respectively.3 And while we may pride ourselves on our communication skills, our colleagues in the OR disagree. A British Medical Journal survey found that while 77% of surgeons thought they had good communication within the OR, only 40% of the nurses and anesthesia staff agreed.4
CRM's Seven Key Concepts
CRM training focuses on seven concepts, which were reviewed from the surgeon's perspective in the Bulletin of the American College of Surgeons in 2006.9
■ Command. The surgeon remains the final authority in the operating room, but the hierarchy is flattened so that no team member is afraid to offer his or her input.
■ Leadership. Leaders allow team members to fulfill their responsibilities and understand that conflict is not only expected but that effective management of conflict can lead to better outcomes. An article in the Journal of the American College of Surgeons states, "There is growing and compelling evidence that surgeons need to be capable managers of conflict … Surgeons must improve the quality of their communications in addition to increasing the quantity of communication."10
■ Communication. Communication is the timely exchange of pertinent information in a manner appropriate to the situation. A surgeon about to operate on the wrong eye needs a far more forceful intervention than one whose IOL power is simply being verified. The key is that communication must flow both ways, from top to bottom and bottom to top.
■ Situational awareness. Although it seems intuitive, situational awareness means knowing not only what is going on now but also what happened before and what may happen next. For example, I may know that I am operating on a high-risk patient with pseudoexfoliation, but does my staff know that I may urgently need a capsular tension ring?
■ Workload management. A problem for all of us trying to run efficient ASCs is having the right type and number of staff for the right amount of time. This requires factoring in not only the number of cases but also contingencies, breaks and absenteeism. The risk of poor workload management is fatigue, which is important enough to remain a JCAHO National Patient Safety Goal for 2008.11
■ Resource management. For most of us, resource management means making sure we have the right IOL, adequate packs and viscoelastic, but it is much more. Resources are people, equipment and supplies; resource management is the ability to locate, manage or troubleshoot all of them.
■ Decision making. Although the hierarchy is flattened, CRM leadership is not a democracy. It may be 60/40, with the surgeon seeking input until he decides he has enough information to make a final decision.
As stated in Error Reduction Through Team Leadership: Seven Principles of CRM Applied to Surgery, "The theory behind team training and CRM is that complex systems break down not because of flaws in their engineering, but rather because the people operating within the system fail to interact in a manner that ensures efficiency and good outcomes."9 In aviation, this training includes team exercises, flight simulators, group debriefings and observation of crews. It has been a dynamic process, evolving from the rigid theory of the early years to more "real world" training.
CRM does not reduce the authority or accountability of the surgeon. It does encourage each team member to contribute his or her skills and knowledge in a disciplined but non-threatening environment. For the OR team, this may include briefings prior to surgery, procedure cards, checklists and familiarity with the equipment.
CRM Evolution
CRM has continued to evolve in aviation and medicine. Drs. Robert L. Helmreich and J. Bryan Sexton of the University of Texas Center of Excellence in Patient Safety have taken a "Threat and Error Management" model from aviation and applied it to medicine.12 Threats are any factors that affect the decision-making process. Some are obvious, such as the patient's condition or proposed course of therapy, while others are latent, such as a failure of communication. In this model, an "error" is an actual or potential adverse event that must be categorized and addressed.
Types of errors may include:
► Task execution errors: unintentional physical acts that deviate from the intended course of action. For example, an unplanned vitrectomy.
► Procedural errors: failure to follow mandated procedures. For example, informed consent is not signed.
► Communication errors: failure to share information. "Oh, are we doing a pterygium also?"
► Decision errors: selecting the wrong course of action. "It doesn't look malignant to me."
► Violations: consciously failing to follow formally mandated procedures such as treatment protocols. "I don't think that cornea is too thin for LASIK."
Threat and error management helps identify problems, while the seven principles of CRM (see "CRM's Seven Key Concepts," above) can be used as tools to avoid or mitigate errors. In medicine, these threats may include the expected risks of a patient's care, the status of support staff and equipment, or unexpected errors of diagnosis or communication.
CRM and Ophthalmology
Do the principles of CRM apply to ophthalmology? As Dr. Helmreich has said, "The operating theatre is a more complex environment than the cockpit with multiple disciplines interacting while dealing with a patient whose reactions may be highly variable."12
Medicine already has a track record with CRM. Anesthesiology has used a medical simulator, essentially a complete OR, to teach surgical team skills. One hospital that adopted team training reportedly reduced its ICU length of stay by 50%.14 Other reports note that "adverse outcomes for OB/GYNs declined by 53% over a 4-year period" and that a study of emergency department staff showed "a 58% reduction in observable errors."13 The Veterans Health Administration has implemented a program called Medical Team Training (MTT) in 43 centers with the goal of improving patient outcomes and enhancing job satisfaction. The VA experience showed that for effective implementation, "the chiefs of surgery and anesthesiology, as well as the OR nurse manager, must be active members of the implementation team."14
These programs, although successful, involve a huge commitment of time and personnel. For the small private practice, the concepts of CRM, along with threat and error management, are the first steps toward creating a culture of safety.
Creating a Culture of Safety
In medicine, we are guided by "primum non nocere": first, do no harm. Yet we know that because we are human, errors will occur. A culture of safety seeks to create a pattern of behavior that minimizes the patient harm that can result from the process of healthcare delivery. The ophthalmologist, as the leader, upholds these expectations through example, good management practices and trust.
A culture of safety is proactive in that it addresses not only errors but also near misses. Errors are avoided through checklists, standard operating procedures and active review of one another's work. Technology can be used to create a "forcing function" that helps prevent errors; for example, my phaco unit will not allow me to proceed if it is improperly set up and displays an error message explaining why. If an error does occur, protocols exist to mitigate the damage. In the OR, we keep our vitrectomy instruments readily available and the settings programmed.
A barrier to error reporting and ultimately a culture of safety is the fear of reprisal. In a New York Times article titled "What Pilots Can Teach Hospitals About Patient Safety," the author writes, "The definition of an error in health care is 'fuzzier' than in aviation … Healthcare providers' fear of litigation and losing their medical licenses also hinders the honest reporting of mistakes, whereas aviators are often inoculated against punishment if they promptly report incidents."15
Creating a culture of safety will be stymied until the concept is embraced by both legislators and medical boards. Some progress has been made through the Patient Safety and Quality Improvement Act of 2005, which provides legal protection for reporting errors to patient safety organizations.16 In aviation, CRM evolved over many years. In medicine, successful programs have emerged in anesthesiology, emergency medicine and ICU care. These concepts should blend well with the discipline already required of ophthalmologists to create a culture of safety. OM
References
1. Kohn LT, Corrigan JM, Donaldson MD. To err is human: building a safer health system. Committee on the Quality of Health Care in America, Institute of Medicine. Washington, D.C.: National Academy Press; 2000.
2. The Joint Commission. Health Care at the Crossroads: Strategies for Improving the Medical Liability System and Preventing Patient Injury. 2005. Available at: http://www.jointcommission.org. Accessed June 5, 2007.
3. Making health care safer: a critical analysis of patient safety practices. Crew resource management and its applications in medicine. Washington, D.C.: AHRQ; 2001. Evidence Report/Technology Assessment No. 43.
4. Sexton JB, Thomas EJ, Helmreich RL. Error, stress, and teamwork in medicine and aviation: cross sectional surveys. BMJ. 2000;320:745-749.
5. Psychology Matters. Making Air Travel Safer Through Crew Resource Management (CRM). Feb. 6, 2004. Available at: http://www.psychologymatters.org. Accessed January 30, 2007.
6. Shore JW. Minor distractions lead to major problems in the OR. OMIC Digest. 2006;16:3.
7. Mandal K, Fraser SG. The incidence of prescribing errors in an eye hospital. BMC Ophthalmol. 2005;5:4.
8. Minimizing medication errors: communication about drug orders. American Academy of Ophthalmology. Patient Safety Bulletin 3. Oct. 2001.
9. Healy GB, Barker J, Madonna G. Error reduction through team leadership: seven principles of CRM applied to surgery. Bull Am Coll Surg. 2006;91:10-26.
10. Rogers D, Lingard L. Surgeons managing conflict: a framework for understanding the challenge. J Am Coll Surg. 2006;203:568-574.
11. The Joint Commission. Potential 2008 National Patient Safety Goals and Requirements. Available at: http://www.jointcommission.org. Accessed June 5, 2007.
12. Helmreich RL, Sexton JB. Managing threat and error to increase safety in medicine. In: Dietrich R, Jochum K, eds. Teaming Up: Components of Safety Under High Risk. Aldershot, UK: Ashgate; 2004.
13. Powell SM, Haskins RN, Sanders W. Improving patient safety and quality of care using aviation CRM. Patient Safety and Quality Healthcare. 2005. Available at: http://www.psqh.com. Accessed March 26, 2007.
14. Dunn EJ, Mills PD, Neily J, Crittenden M, Carmack AL, Bagian JP. Medical team training: applying crew resource management in the Veterans Health Administration. Jt Comm J Qual Patient Saf. 2007;33(6).
15. Murphy K. What pilots can teach hospitals about patient safety. New York Times. October 31, 2006. Available at: http://www.nytimes.com. Accessed October 2006.
16. Patient Safety and Quality Improvement Act of 2005. Available at: http://www.whitehouse.gov. Accessed April 24, 2007.
Richard J. Ruckman, M.D., F.A.C.S., has been in practice since 1978, specializing in cataract surgery. He is the medical director of The Center For Sight, located in Lufkin, Texas, and may be reached by e-mail at rruckman@thecenterforsight.com.