
Origins and Application of Crew Resource Management

Elliot Carhart EdD, RRT, NRP

Introduction

Crew resource management (CRM) is a team-oriented concept of error management that originated in the aviation industry and has since been adopted in other dynamic, high-risk, and high-stress environments.1–3 This system of error management is based on the goal of improving safety through acknowledgment of the human contributions to error and the implementation of effective strategies for resource utilization.1,4,5 CRM training is intended to foster the development of both technical skills, which are procedural, and non-technical skills, which can be used with procedural skills to address the deleterious effects of such factors as stress, emotion, and fatigue on human performance.6,7

According to Salas, Wilson, Burke, and Wightman, “CRM training can take many shapes and forms.”8 However, the model adopted by the International Association of Fire Chiefs and subsequently shown to improve performance among EMS providers focuses on the following five non-technical skills:

Situational awareness;
Decision making;
Workload (task) management;
Teamwork; and
Communication.4,5

Origins of CRM

The concept of CRM originated with the aviation industry in response to a series of commercial airline crashes for which the cause had been attributed to human error. (See Table 1, below.) In one example, Eastern Airlines Flight 401 was bound for Miami when its landing gear indicator failed to illuminate. The captain aborted the landing, climbed back to 2,000 feet, and set the autopilot while the flight crew investigated. Over the next four minutes, the flight crew’s situational awareness gave way to tunnel vision, and they failed to notice that they were slowly losing altitude. According to the cockpit voice recorder, the flight crew recognized the loss of altitude just seconds before they crashed into Everglades National Park.9

Table 1: Notable Airline Crashes Attributed to Human Error

Year | Airline(s)       | Flight Number(s) | Location                        | Fatalities
1972 | Eastern Airlines | 401              | Miami                           | 101
1977 | KLM / Pan Am     | 4805 / 1736      | Tenerife, Canary Islands, Spain | 583
1978 | United Airlines  | 173              | Portland, Ore.                  | 10
1982 | Air Florida      | 90               | Washington, D.C.                | 78

In 1973, NASA began a research program to study human factors in aviation safety.10 NASA officials recognized the dynamic nature of aviation, including that aircraft must continue to be flown while malfunctions are resolved. Several years later, Pan Am Flight 1736 was taxiing down a runway on the island of Tenerife when a series of latent errors culminated in the deadliest accident in aviation history.11 A number of aircraft had been diverted to Tenerife North Airport, then called Los Rodeos Airport, following an explosion and subsequent bomb threat at Flight 1736’s intended destination in the Canary Islands. The overwhelmed capacity at Los Rodeos, along with reduced visibility caused by fog and the overall stress of the day’s events, taxed the airport’s resources and contributed to multiple breakdowns in communication.

After several hours of closure, Las Palmas Airport reopened and the aircraft waiting at Los Rodeos began taking off for their intended destination. Pan Am Flight 1736 was directed to taxi to an alternate position and hold for takeoff. After some initial difficulty understanding their instructions, compounded by the reduced visibility in the fog, the Flight 1736 crew missed their turn to exit the runway. Their radio transmission reporting their position encountered interference from another simultaneous transmission. Meanwhile, despite questions from his first officer regarding the location of the Pan Am aircraft, the captain of Dutch airliner KLM Flight 4805 hastily began his takeoff roll. Flight 4805 accelerated down the runway toward Flight 1736. By the time the crew realized the error, it was too late to stop the aircraft. The ensuing collision resulted in the loss of 583 lives.11

In 1978, United Airlines Flight 173 was bound for Portland, Ore., when it met a fate similar to that of Eastern Airlines Flight 401.12 In a similar series of events, the captain became distracted when a landing gear indicator lamp failed to illuminate. Despite the flight engineer’s confirmation that the landing gear was in fact down, the captain continued to troubleshoot the lamp. While the aircraft circled Portland International Airport, fellow crew members informed the captain that they were dangerously low on fuel, but he failed to act on this critical information. The culture and command structure present in the cockpit at that time kept the other crew members from speaking up and confronting the captain’s error. Consequently, the captain continued to troubleshoot the landing gear indicator lamp until the aircraft ran out of fuel and crashed. As a result, 10 lives were lost, including eight passengers and two crew members.

NASA’s analysis of aviation crashes demonstrated that human error was the prevailing cause. In response to the NASA findings, officials set out to address such human error through a program aimed at the management of resources in a dynamic flight environment.10 In 1979, NASA convened a workshop consisting of airline representatives, government officials, and researchers in the field of human factors. This workshop was the birth of CRM, which was then referred to as cockpit resource management.

Although several airlines had begun implementing CRM by the early 1980s, it was not adopted soon enough to prevent additional airline crashes resulting from human error. In 1982, Air Florida Flight 90 was taking off from Reagan National Airport in Washington, D.C., during icy conditions. Ice clogged one of the airplane’s external sensors, causing the speed indicators to read falsely high. The first officer questioned the accuracy of the readings as they sped down the runway, but the captain dismissed his concerns. Consequently, the captain did not apply enough power as they lifted off, and the airplane crashed into the Potomac River, killing 78 people (including four passing motorists).13

CRM has since been widely adopted throughout the aviation industry, and its impact has been remarkable. In 1989, United Airlines Flight 232 suffered catastrophic engine failure while en route from Denver to Chicago.14 The flight crew managed to land the crippled aircraft despite a lack of typical airframe controls, a feat they attributed to the use of CRM.5 Although the crash landing resulted in the loss of 111 lives, the behaviors of the flight crew overcame the technical failures of the aircraft and saved the lives of the other 184 passengers.

In a more recent example, Captain “Sully” Sullenberger, one of the pioneers of CRM in the aviation industry and an expert in aviation safety and human factors, became a household name when he and his crew safely landed crippled US Airways Flight 1549 on New York’s Hudson River following a bird strike, a rare event that caused the loss of both engines.15,16 The astute situational awareness, clear communication, effective decision making, task management, and teamwork demonstrated by the crew of Flight 1549 saved the lives of all 155 people aboard and now serves as a hallmark example of how the non-technical skills at the core of CRM can overcome the factors that might otherwise translate into human error.

In an Oct. 21, 2015 interview on the application of CRM in EMS, Sullenberger described the non-technical skills comprising CRM as “just as important, if not more important, than technical skills.” He went on to say, “What happened that day was no accident; it was the culmination of literally 40 years of effort, study, thinking about these important concepts, and then trying to hone my skills, both technically and the human skills, every day on every flight.”17

Following the results in the aviation industry, many other professions have adopted CRM, including several healthcare disciplines.3,6,8,18,19 Healthcare shares many of the characteristics of other high-risk fields, including the need for high-functioning teams and a high cost of error.2 According to Salas et al, the implementation of CRM training in the healthcare industry is still young but shows great potential.8

Human Error in EMS

The prehospital environment is rife with potential for error because EMS providers are often fatigued and sleep-deprived.20 This is due to long and irregular hours in a highly unpredictable environment in which they must quickly and competently make decisions that may mean the difference between life and death. Further, they must do so despite distractions such as loud noises, poor lighting, chaotic events, forces of nature, and other potential hazards.21–24 Such high-tempo situations, as well as the presence of “multiple, simultaneous” demands, can lead to task saturation and subsequent medical errors.18,25

A recent estimate of patient harm resulting from hospital care suggested that more than 200,000 people die from preventable medical errors each year.26 This is roughly the equivalent of three airline passenger planes crashing every day with no survivors.16 Despite the magnitude of this problem, Meisel et al found no available estimate of the incidence of prehospital medical errors, which is likely due to a lack of contextual research and an aversion to error reporting.27 Fairbanks et al elaborated on the causes of prehospital medical errors and identified five basic themes associated with these errors, one of which deals with the lack of error reporting due to a “blame and shame” culture in which punishment is associated with quality assurance.28 According to Vilke et al, such a punitive approach eliminates the benefit of an error reporting system and steals the opportunity for others to learn from our mistakes.29 As a consequence of this cultural misstep, Fairbanks et al found that EMS providers were more likely to assign blame to others rather than accept blame for an error.28 Nonetheless, serious injuries and death have been reported as a result of prehospital medication errors.30,31
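The airliner comparison above can be checked with simple arithmetic; note that the 180-seat capacity below is an assumed figure for a mid-size passenger jet, not a value taken from the cited studies:

```python
# Illustrative arithmetic only; the seat count is an assumption for
# demonstration, not drawn from the cited estimates.
annual_deaths = 200_000   # estimated deaths from preventable medical errors per year
seats_per_plane = 180     # assumed capacity of a mid-size airliner

deaths_per_day = annual_deaths / 365
planes_per_day = deaths_per_day / seats_per_plane

print(f"{deaths_per_day:.0f} deaths/day, roughly {planes_per_day:.1f} full airliners")
# → 548 deaths/day, roughly 3.0 full airliners
```

Under these assumptions, the daily toll does indeed work out to about three fully loaded passenger planes with no survivors.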

Borrowing from the field of anesthesia, Cooper et al examined the connection between medical errors and human factors. They reported that a combination of restricted vision, fatigue, loss of vigilance, distraction, and a lack of familiarity with equipment accounted for as much as 87% of fatal medical errors.32 Whereas EMS shares many of the in-hospital contextual factors associated with medical errors (e.g., time sensitivity and broad clinical presentations), Meisel et al identified some unique factors in the prehospital setting (e.g., resource limitations and fewer providers) and found that although skills and medication delivery were sources of errors, more than half of the total errors in their study resulted from clinical judgment errors.27

Such factors as stress, emotion, and fatigue can all contribute to medical errors by increasing susceptibility to cognitive errors.33–36 LeBlanc et al found that stress negatively affected the performance of paramedics, with excessively broadened and narrowed levels of attention, as well as impaired decision making, when participants were subjected to the complexities of typical EMS patient encounters.37 According to Barger et al, extended-duration shifts, defined as 24 hours or longer, were also associated with an increased incidence of medical errors, adverse events, and lapses in attention in a study of physician interns.33 Lockley et al reported that physician residents were 300% more likely to commit a fatal medical error when working a 24-hour shift than when working a 16-hour shift.38 These extended-duration shifts are similar to those worked by many firefighters and EMS providers.39

In addition to stress and fatigue, cognitive compromise can occur when individuals are faced with emotional events.40 In a study conducted in a simulated environment, LeBlanc et al found that paramedic participants experienced anxiety levels significant enough to affect the accuracy of their medication calculations.37 Such anxiety is likely not limited to a simulated environment, however, as Taber et al described the actual context of EMS as “emotionally tumultuous.”41

Even experienced healthcare providers are prone to error when subjected to the effects of fatigue, emotion, and stress.37,42 This is why it is essential to foster an environment that supports communication among team members. According to Sexton et al, steep hierarchies exist in the medical field, and the culture of these roles, including the perceived inappropriateness of questioning superiors, has a negative effect on teamwork and, as a result, might contribute to medical errors.43

Applying CRM to EMS

The successful application of CRM to reduce errors in the field of EMS requires that the industry first develop a culture that is accepting of the realities of human fallibility. Only in such an environment will providers ever truly understand the root cause of errors and therefore be able to target interventions to reduce and mitigate those errors. Then researchers must carefully study and measure the effect of the interventions to ensure their success.

A culture of acceptance: The concept of error management addresses the predictable patterns of errors that humans make and acknowledges that these errors are unavoidable and will occur.6 According to Etchells et al, it’s essential to accept “the immutable reality that humans make mistakes.”44 By accepting this, attention can be focused on improvement instead of “demanding perfection from individual health care professionals.”44 Error management differs from the idea of error avoidance by establishing a culture that takes a positive approach and views errors as potential learning experiences.45 Van Dyck et al suggested that a culture with established norms for responding to errors allows for more open communication and subsequently leads to earlier detection and correction of errors.45 This inherently reduces the negative effects of those errors.

Understanding human error: One explanation of human error is known as the Swiss-cheese model.46 This theory describes errors as evolving rather than simply occurring. It uses the analogy of the holes in several slices of Swiss cheese to represent gaps in multiple layers of defense. When those holes align, latent errors pass through every defense and become problematic.
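The model's logic can be sketched numerically: if each defensive layer stops most, but not all, errors, the chance that a latent error slips through every layer is the product of the per-layer "hole" probabilities. The rates below are invented purely for illustration, not empirical values:

```python
# Illustrative sketch of the Swiss-cheese model; the per-layer failure
# rates are assumptions for demonstration, not measured data.

def breach_probability(hole_probs):
    """Probability that an error passes through every layer of defense."""
    p = 1.0
    for hole in hole_probs:
        p *= hole  # the error only advances if this layer's hole lines up
    return p

# Three assumed layers of defense, each with a small chance of failing
layers = [0.10, 0.05, 0.02]
print(round(breach_probability(layers), 6))  # → 0.0001
```

The point of the sketch is that stacking imperfect defenses makes a breach rare, but never impossible, which is why the model emphasizes multiple independent layers rather than any single perfect safeguard.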

A similar view of error management was introduced by Helmreich et al and has since been widely recognized.3,6,45,47,48 These researchers referred to this concept as the error troika, as it involves three layers of defense: avoidance of errors, trapping of errors, and mitigation.1 This multilayer defense is intended to combat the evolution of undetected errors into larger, more critical errors. Other researchers asserted that minimal harm typically results from latent errors but acknowledged the potential for greater frequency of occurrence and, thus, increased opportunities for critical evolution of these errors when faced with the effects of stress and fatigue.6 Bleetman et al identified the following four occurrences likely to trigger such errors:

Interruptions and distractions,
Tasks required out of the normal sequence,
Unanticipated new tasks, and
Interweaving multiple tasks.6

Recognition of these potential triggers, the researchers suggested, prepares providers to avoid errors.

Targeting interventions: In 2010, Croskerry followed up on the Institute of Medicine’s report, “To Err is Human,” and suggested a focus on the human aspects of medical errors.20,49 Croskerry argued that it is in “the characteristics of human performance in the health care setting” that solutions should be sought.49 It is this pursuit that shifted the search for solutions to medical errors away from technical skills and toward non-technical skills, which form the basis of programs such as CRM and provide the tools needed to mitigate the effects of stress, emotion, and fatigue.50 In 2010, Flin et al defined non-technical skills as “the cognitive, social, and personal resource skills that complement technical skills, and contribute to safe and efficient task performance.”7 In this realm of non-technical skills, they included situational awareness, decision-making, task management, teamwork, and communication.

Fletcher et al divided these non-technical skills into two categories. Cognitive skills include decision making, task management, and situational awareness. Interpersonal skills include teamwork and communication.51

Appropriate use of these skills can help overcome limitations of human performance in dynamic and complex situations. In 2010, Flin et al pointed out that not only are both skill sets essential for optimal human performance, but a synergistic effect on performance exists when both skill sets are good.7 Several studies have shown that CRM training can lead to improvements in these areas.47,52,53

Outcome measurement: One researcher asserted that research must be conducted to “establish a link between CRM training and patient outcomes.”3 Although CRM has been shown to have a positive effect on behavior in the various fields that have implemented such strategies, the connection has yet to be made between CRM and improved clinical outcomes.54 One researcher found that teamwork, task management, and decision-making can be improved through CRM training in a simulated EMS environment, but his work failed to show a statistically significant difference in the number of medical errors committed by participants.4 Similarly, a recent meta-analysis failed to find evidence that these behavioral changes have translated into a quantifiable improvement in clinical outcomes in the acute care setting.2 It is not known whether this is the result of study techniques; further research is needed.

Conclusion

The origins of CRM in the aviation industry serve as a foundation for what has evolved into a widely adopted concept of error management based on principles of teamwork, communication, and a flattened leadership hierarchy. Together, these principles serve to reduce human error and improve patient safety. Further, this concept requires the members of a profession to embrace the importance of pursuing excellence while recognizing that “Just good enough is never good enough.”17

References

  1. Helmreich RL, Merritt AC, Wilhelm JA. The evolution of crew resource management training in commercial aviation. Int J Aviat Psychol. 1999;9(1):19-32.
  2. O’Dea A, O’Connor P, Keogh I. A meta-analysis of the effectiveness of crew resource management training in acute care domains. Postgrad Med J. 2014 Dec;90(1070):699-708. doi: 10.1136/postgradmedj-2014-132800. Epub 2014 Nov 4.
  3. Oriol MD. Crew resource management: Applications in healthcare organizations. J Nurs Adm. 2006 Sep;36(9):402-6.
  4. Carhart E. Effects of crew resource management training on medical errors in a simulated prehospital environment [dissertation]. Ft. Lauderdale (Fla.): Nova Southeastern University; 2012. Available from: ProQuest Dissertations and Theses database, UMI No. 3534980.
  5. International Association of Fire Chiefs [Internet]. Fairfax, Va.: International Association of Fire Chiefs; c1999-2016. Crew resource manual: A positive change for the fire service, 3rd ed. [32 pages] Available from: http://www.iafc.org/associations/4685/files/pubs_CRMmanual.pdf
  6. Bleetman A, Sanusi S, Dale T, Brace S. Human factors and error prevention in emergency medicine. Emerg Med J. 2012 May;29(5):389-93. doi: 10.1136/emj.2010.107698. Epub 2011 May 12.
  7. Flin R, Patey R, Glavin R, Maran N. Anaesthetists’ non-technical skills. Br J Anaesth. 2010 Jul;105(1):38-44. doi: 10.1093/bja/aeq134. Epub 2010 Jun 3.
  8. Salas E, Wilson KA, Burke CS, & Wightman DC. (2006). Does crew resource management training work? An update, an extension, and some critical needs. Hum Factors. 2006 Summer;48(2):392-412.
  9. National Transportation Safety Board [Internet]. Washington, D.C.: National Transportation Safety Board; c1973. Aircraft Accident Report. Eastern Airlines, Inc., N310EA. Report Number: NTSB-AAR-73-14. Available from: http://www.ntsb.gov/investigations/AccidentReports/Pages/aviation.aspx
  10. Cooper GE, White MD, Lauber JK, editors.  Resource management on the flightdeck: Proceedings of a NASA/industry workshop (NASA CP-2120).  Moffett Field (Calif.): NASA-Ames Research Center, 1980.
  11. Air Line Pilots Association [Internet]. Washington, D.C.: Engineering and Air Safety; c1978. Human factors report on the Tenerife accident. [cited 21 May 2015]. Available from: http://project-tenerife.com/engels/PDF/alpa.pdf
  12. National Transportation Safety Board [Internet]. Washington, D.C.: National Transportation Safety Board; c 1979. Aircraft Accident Report. United Airlines, Inc., N8082U. Report Number: NTSB-AAR-79-7. Available from: http://www.ntsb.gov/investigations/AccidentReports/Pages/aviation.aspx
  13. National Transportation Safety Board [Internet]. Washington, D.C.: National Transportation Safety Board; c 1983. Aviation Accident Report. Air Florida, Inc., N62AF. Report Number: NTSB-AAR-82-8. Available from: http://www.ntsb.gov/investigations/AccidentReports/Pages/aviation.aspx
  14. National Transportation Safety Board [Internet]. Washington, D.C.: National Transportation Safety Board; c 1990. Aviation Accident Report. United Airlines, Inc., N1819U. Report Number: NTSB-AAR-90-6. Available from: http://www.ntsb.gov/investigations/AccidentReports/Pages/aviation.aspx
  15. National Transportation Safety Board [Internet]. Washington, D.C.: National Transportation Safety Board; c2010. Aviation Accident Report. US Airways, Inc., N106US. Report Number: NTSB-AAR-10-03. Available from: http://www.ntsb.gov/investigations/AccidentReports/Pages/aviation.aspx
  16. Stableford D. (2014, January). Sully: 5 years after the miracle on the Hudson. [cited 4 Aug 2015]. Available from: http://news.yahoo.com/sully-sullenberger-captain-hudson-miracle-15014700...
  17. Sullenberger S. Personal communication with Elliot Carhart. 21 Oct 2015.
  18. Andersen PO, Jensen MK, Lippert A, Ostergaard D. Identifying non-technical skills and barriers for improvement of teamwork in cardiac arrest teams. Resuscitation. 2010 Jun;81(6):695-702. doi: 10.1016/j.resuscitation.2010.01.024. Epub 2010 Mar 20.
  19. Flin R, Maran N. Identifying and training non-technical skills for teams in acute medicine. Qual Saf Health Care. 2004 Oct;13 Suppl 1:i80-4.
  20. Kohn LT, Corrigan JM, Donaldson MS, editors. To err is human: building a safer health system. Washington, D.C.: National Academy Press;  2000.
  21. American College of Emergency Physicians. International trauma life support for prehospital care providers, 6th ed. Upper Saddle River, NJ: Pearson Education; 2008.
  22. Elliot DL, Kuehl KS. Effects of sleep deprivation on firefighters and EMS responders. Fairfax, Va.: International Association of Fire Chiefs; 2007.
  23. Institute of Medicine, Committee on the Future of Emergency Care in the United States Health System. Emergency medical services: at the crossroads. Washington, DC: National Academies Press, 2006.
  24. Bureau of Labor Statistics. Emergency medical technicians and paramedics. Occupational Outlook Handbook (2010-2011 ed.), 2010
  25. Fan X, Sun S, McNeese M, Yen J. Extending the recognition-primed decision model to support human-agent collaboration. Utrecht, Netherlands: 4th International Joint Conference on Autonomous Agents and Multiagent Systems; 2005.
  26. James JT. A new, evidence-based estimate of patient harms associated with hospital care. J Patient Saf. 2013 Sep;9(3):122-8. doi: 10.1097/PTS.0b013e3182948a69.
  27. Meisel ZF, Hargarten S, Vernick J. Addressing prehospital patient safety using the science of injury prevention and control. Prehosp Emerg Care. 2008 Oct-Dec;12(4):411-6. doi: 10.1080/10903120802290851.
  28. Fairbanks RJ, Crittenden CN, Ogara KG, Wilson MA, Pennington EC, Shaw MN. Emergency medical services provider perceptions of the nature of adverse events and near-misses in out-of-hospital care: An ethnographic view. Acad Emerg Med. 2008 Jul;15(7):633-40.
  29. Vilke GM, Tornabene SV, Stepanski B, Shipp HE, Ray LU, Metz MA, Vroman D, Anderson M, Murrin PA, Davis DP, Harley J. Paramedic self-reported medication errors. Prehosp Emerg Care. 2007 Jan-Mar;11(1):80-4.
  30. Haynes BE. Two deaths after prehospital use of adenosine. J Emerg Med. 2001 Aug;21(2):151-4.
  31. Horowitz B, Jadallah S, Derlet R. Fatal intracranial bleeding associated with prehospital use of epinephrine. Ann Emerg Med. 1996 Dec;28(6):725-7.
  32. Cooper JB, Newbower RS, Long CD, McPeek B. Preventable anesthesia mishaps: A study of human factors. Qual Saf Health Care. 2002 Sep;11(3):277-82.
  33. Barger LK, Ayas NT, Cade BE, Cronin JW, Rosner B, Speizer FE, Czeisler CA. Impact of extended-duration shifts on medical errors, adverse events, and attentional failures. PLoS Med. 2006 Dec;3(12):e487.
  34. Bernius M, Thibodeau B, Jones A, Clothier B, Witting M. Prevention of pediatric drug calculation errors by prehospital care providers. Prehosp Emerg Care. 2008 Oct-Dec;12(4):486-94. doi: 10.1080/10903120802290752.
  35. Jha AK, Duncan BW, Bates DW. Fatigue, sleepiness, and medical errors. In: Markowitz AJ, editors. Making healthcare safer: A critical analysis of patient safety practices. Agency for Healthcare Quality and Research Pub. No. 01-E058, 519-532 pp., 2001.
  36. Lim J, Dinges DF. A meta-analysis of the impact of short-term sleep deprivation on cognitive variables. Psychol Bull. 2010 May;136(3):375-89. doi: 10.1037/a0018883.
  37. LeBlanc VR, MacDonald RD, McArthur B, King K, Lepine T. Paramedic performance in calculating drug dosages following stressful scenarios in a human patient simulator. Prehosp Emerg Care. 2005 Oct-Dec;9(4):439-44.
  38. Lockley SW, Barger LK, Ayas NT, Rothschild JM, Czeisler CA, Landrigan CP; Harvard Work Hours, Health and Safety Group. Effects of health care provider work hours and sleep deprivation on safety and performance. Jt Comm J Qual Patient Saf. 2007 Nov;33(11 Suppl):7-18.
  39. Patterson PD, Suffoletto BP, Kupas DF, Weaver MD, Hostler D. Sleep quality and fatigue among prehospital providers. Prehosp Emerg Care. 2010 Apr-Jun;14(2):187-93. doi: 10.3109/10903120903524971.
  40. Gray JR. Integration of emotion and cognitive control. Curr Dir Psychol Sci. 2004 Apr;13(2);46–48. doi:10.1111/j.0963-7214.2004.00272.x
  41. Taber N, Plumb D, Joleman S. Grey areas and organized chaos in emergency response. Journal of Workplace Learning. 2008;20(4):272-285. doi:10.1108/13665620810871123
  42. Leonard M, Graham S, Bonacum D. The human factor: The critical importance of effective teamwork and communication in providing medical care. Qual Saf Health Care. 2004 Oct;13 Suppl 1:i85-90. doi:10.1136/qshc.2004.010033
  43. Sexton JB, Thomas EJ, Helmreich RL. Error, stress, and teamwork in medicine and aviation: Cross sectional surveys. BMJ. 2000 Mar 18;320(7237):745-9.
  44. Etchells E, Juurlink D, Levinson W. Medication errors: The human factor. CMAJ. 2008 Jan 1;178(1):63-4. doi: 10.1503/cmaj.071658.
  45. Van Dyck C, Frese M, Baer M, Sonnentag S. Organizational error management culture and its impact on performance: A two-study replication. J Appl Psychol. 2005 Nov;90(6):1228-40.
  46. Reason J. Human Error. Cambridge: University Press, Cambridge, 1990.
  47. Moorthy K, Munz Y, Forrest D, Pandey V, Undre S, Vincent C, Darzi A. Surgical crisis management skills training and assessment: A simulation-based approach to enhancing operating room performance. Ann Surg. 2006 Jul;244(1):139-47. doi:10.1097/01.sla.0000217618.30744.61
  48. Pizzi L, Goldfarb NI, Nash DB. Crew resource management and its applications in medicine. In Markowitz AJ, editor. Making healthcare safer: A critical analysis of patient safety practices. Agency for Healthcare Quality and Research Publication No. 01-E058, 501-510 pp., 2001.
  49. Croskerry P. To err is human—and let’s not forget it. CMAJ. 2010 Mar 23;182(5):524. doi: 10.1503/cmaj.100270. Epub 2010 Mar 15.
  50. Flin R, Fletcher G, McGeorge P, Glavin R, Maran N, Patey R. Rating anaesthetists’ non-technical skills: The ANTS system. Denver: 47th Human Factors and Ergonomics Society Conference; 2003.
  51. Fletcher GC, McGeorge P, Flin RH, Glavin RJ, Maran NJ. The role of non-technical skills in anaesthesia: A review of current literature. Br J Anaesth. 2002 Mar;88(3):418-29.
  52. Awad SS, Fagan SP, Bellows C, Albo D, Green-Rashad B, De la Garza M, Berger DH. Bridging the communication gap in the operating room with medical team training. Am J Surg. 2005 Nov;190(5):770-4. doi:10.1016/j.amjsurg.2005.07.018
  53. Police One [Internet]. San Francisco: Praetorian Group; c2016. [updated 8 Feb 2008]. Rahman M, Grossman D, Asken MJ. High-velocity human factors: Factoring the human being into future police technology. [about 10 screens.] Available from: http://www.policeone.com/police-products/vehicle-equipment/articles/1646301
  54. O'Connor RE, Slovis CM, Hunt RC, Pirrallo RG, Sayre MR. Eliminating errors in emergency medical services: Realities and recommendations. Prehosp Emerg Care. 2002 Jan-Mar;6(1):107-13.

 


Published: March 15, 2016
Revised: January 5, 2017