Good afternoon ladies and gentlemen. I am pleased to be here at the Center for Maritime Education and to address you concerning human error in marine accidents.
As you are aware, the National Transportation Safety Board is charged with improving the safety of the public by recommending changes in government and industry transportation policies, practices, and systems through independent accident investigations.
The CHALLENGER accident; the EXXON VALDEZ; the derailment of Amtrak's Sunset Limited in Mobile, Alabama; the 1991 runway collision in Los Angeles; the nation's worst drunk driving accident, which killed 27 schoolchildren and adults in Carrollton, Kentucky. All of these accidents captured the attention of the nation and the world, all of them were investigated by the National Transportation Safety Board, and all of them involved deficiencies in human performance.

Since its inception 28 years ago, the NTSB, as the world's premier transportation accident investigation agency, has investigated more than 100,000 aviation accidents and thousands of surface accidents. We maintain a list of Most Wanted Safety Accomplishments, which we update every year. These are the recommendations that, in the Board's estimation, would bring the greatest benefit to the traveling public. The 17-item list currently includes, for example, the safety of commuter airlines, heavy trucks and fishing vessels, and fatigue in transportation.
On call 24 hours a day, 365 days a year, Safety Board investigators travel throughout the country and to every corner of the world to investigate significant accidents, developing a factual record that often leads to the issuance of safety recommendations aimed at ensuring that such accidents never happen again.

We have issued almost 10,000 recommendations in all transportation modes to more than 1,200 recipients. The performance -- or lack of performance -- by transportation agencies in carrying out their safety responsibilities is fully analyzed in Safety Board reports.
Transportation accidents kill tens of thousands of Americans a year and cost our economy hundreds of billions of dollars annually. We oversee the safety responsibilities of all the DOT modal agencies, yet to give you an idea of our relative sizes, the Safety Board's annual budget would fund the Department of Transportation for just nine hours.
In its role, the Safety Board attempts to determine first what happened in an accident and then, by establishing a "probable cause," why it happened. The Board has increasingly found the answer to that question in what is termed "human error," for two reasons. First, over the years the machinery has become more reliable -- jet engines replacing piston engines, for example. Second, we have a better appreciation for the role played by the interaction between people and machines, and for the importance of avoiding an atmosphere susceptible to human frailties. Concepts unheard of years ago, like "corporate culture," now illuminate for us how some human error problems are made inevitable by the context established for the human operator long before the accident scenario begins.
In almost every facet of accident investigation, from operator performance to routine maintenance, there are individual human errors that influence accident causation. They include errors in:
So you can see, human errors are not only pervasive, they are also diverse. And they occur in all modes of transportation. The good news is that strides are being achieved in all of those modes, and that responsible leaders of the transportation world are not reluctant to apply successes from one mode to another.
Individual human errors do not occur in a vacuum. They take place within a cultural, social and organizational context. That is, there are underlying causes and conditions that shape, facilitate or even nurture the behavior and actions of an accident-causing individual. These causes and conditions arise from government, industry, or individual company policies, procedures and programs that either do not exist or do not properly address the issues at hand. The Safety Board often cites such underlying contextual factors as precursors to human error accidents. The Mobile accident that I'll talk about shortly was such a case.
When our investigations identify human errors and the factors that help facilitate them, we recommend changes, corrections, and mitigating strategies to the government and industry entities best able to effect change and enhance safety. Our process, though, does not end with a recommendation. The Safety Board is also proactive and seeks to disseminate information to a wide audience to facilitate safe transportation and protection of the public.
The impact of transportation accidents on American society is significant. Forty-three thousand (43,000) people died in all modes of transportation in 1994; 874 of those were from marine accidents, 74 in commercial shipping. Monetary cost estimates run to $100 billion, depending on how they are calculated. The costs to individual companies can be staggering. The EXXON VALDEZ has cost billions to date.
I'll take a moment to remind you that while the Safety Board listed several elements in its probable cause of that accident, the first was an individual human error. The third mate was cited for his failure to properly maneuver the vessel because of fatigue and excessive workload.
Three other elements of the probable cause were contextual, in that they provided the environment in which the first could occur. Personnel and manning policies encouraged employees to work long hours, particularly during cargo handling operations. The Coast Guard Vessel Traffic Service was ineffective because of inadequate equipment and manning levels, inadequate personnel training and deficient management oversight. And, the Board cited a lack of effective pilotage service. Consequently, in its simplest terms, the accident was the result of individual human errors within an organizational context that allowed them to occur.
The impact of such accidents, of course, extends far beyond the individuals or organizations involved. They have an effect on U.S. public policy and government, and thus can affect you.
The same is true of accidents in other modes of transportation. A 1987 collision between an Amtrak train and Conrail freight engines in Chase, Maryland resulted in the tragic deaths of sixteen people. The monetary cost for Amtrak was $82 million.
I mention this particular railroad accident because it had ramifications for all modes of transportation, including the marine industry. If you recall, that accident involved a train crew who had been smoking marijuana on duty. The public and Congress were outraged, and the accident became a catalyst for producing the federal drug testing regulations that the marine industry operates under today.

Given this backdrop of accident costs and implications, and the pervasive nature of human errors, it is vitally important to recognize and address the interrelationship between the two.
To illustrate the types of human errors we see, how they lead to an accident and what preventive measures might be taken to preclude their future occurrence, I will touch on a few recent accidents.
Errors resulting from a lack of training and sufficient experience came to the forefront in 1993 as a result of the nighttime collision of the towboat MAUVILLA and its nearly 400-foot barge flotilla with a railroad bridge over the Big Bayou Canot near Mobile, Alabama. The impact displaced a bridge girder, causing the derailment of an Amtrak train eight minutes later. Forty-seven people died.
The cause of the accident was a towboat operator who became lost and disoriented in dense fog. He had no experience operating in fog, and though he had radar aboard his boat, he had no formal training in its use and interpretation. His radar knowledge came from on-the-job training on a different type and model from that on the MAUVILLA.
The operator committed several errors. He failed to get off the river and tie up before the fog seriously reduced visibility. He did not use his radar to help him reacquire his true position, because he did not know how. When he did use the radar, he misinterpreted the bridge return, thinking it was another tow. And he did not question his interpretation of the radar return even after he tried to call and, of course, got no answer. These errors were the operator's, but they happened within a context that allowed them to occur.
The enabling context included the towboat itself and the training of the operator. The boat was neither equipped, nor required to be equipped, with a compass or charts that might have helped the operator correct his error. Likewise, the radar could certainly have been of assistance had the operator's knowledge been more than rudimentary. Formal radar training was not required by either the company or the Coast Guard.
The MAUVILLA accident cost 47 lives and nearly 20 million dollars in equipment damage. Such a cost to society is unacceptable. However, the accident could have been prevented if regulators and overseers of safety, both in government and the private sector, had realized the need to more fully equip and train the operator, and recognized the vulnerability of bridges.
Two years ago in Tampa Bay, the nature of communications played into an accident involving three vessels, two of them carrying petroleum products. The outbound BALSA 37 and the inbound ITB (integrated tug and barge) SEAFARER collided essentially at the intersection of two channels. The result was a fire and explosion aboard the SEAFARER. Immediately, the BALSA 37 veered toward a second inbound ITB, the CAPT. FRED BOUCHARD, and collided with its barge, rupturing a cargo tank. Approximately $25 million was spent on the oil cleanup alone.
The Safety Board identified a contributing cause to that accident as the failure of the pilots and masters of the vessels to adequately communicate their intentions to each other. The vessels had talked to one another, but had not supplied all the pertinent information.
While port-to-port passing agreements were made by radio among all three of these vessels, the pilot aboard the BALSA 37 did not know that the other two vessels had also made an overtaking agreement between themselves. That overtaking agreement meant the BALSA 37 would pass the other two side by side in a bend, rather than in tandem as the pilot expected.
There are several possible reasons for the pilot's lack of understanding of the impending two-abreast passing. He didn't hear the agreement being made, possibly because he was distracted by other business. He also did not look at the ship's radar to monitor the approaching vessels, and he did not recognize the overtaking situation by observing the approaching vessels' running lights in the early morning darkness.
Of course, the pilot was not on the bridge by himself, though he might as well have been. The only other officer on the bridge was the ship's watch officer, who by and large did not contribute to the navigation of the ship. This brings me to another subject the Safety Board has discussed in depth in a number of accident investigations and has championed for many years: Bridge Resource Management, or BRM.
BRM is a direct outgrowth of cockpit resource management, or CRM, from the aviation industry, and is a shining example of what I said earlier about transportation modes being willing to learn from successes in other modes. In 1978, a United Airlines DC-8 crashed with multiple fatalities about 6 miles short of the runway when it ran out of fuel. From the flight recorders -- more commonly called the black boxes -- our investigation revealed that the pilot and co-pilot were preoccupied with a hydraulic anomaly while they circled the airport. The flight engineer was aware of the fuel situation and mentioned it, but not forcefully enough to shift the pilot's focus. Our investigation also revealed that pilot training generally focused on the individual, with each crewmember trained and certified in his or her own area of expertise. As a result of that accident, we recommended that the FAA review its flight training programs and place more emphasis on participative management for captains and assertiveness training for the other members of the flight team. That was the beginning of CRM.
BRM came about as the result of an investigation of the ramming of the Spanish bulk carrier URDULIZ by the U.S. Navy aircraft carrier USS DWIGHT D. EISENHOWER in August 1988. BRM has somewhat varying definitions, but in general it is the use of all available resources -- equipment, information, and personnel -- to achieve safe vessel operation. It involves a sense of shared responsibility. The master must integrate the resources available at any given time through his leadership and command authority, while at the same time indicating a willingness to accept operating information from subordinates. The role of the others on watch is to perform their assigned tasks responsibly, to help determine plans for vessel navigation, and to be aware of departures from the plan or from the expected performance of others. They must make any discrepancies known in time to avert operating errors. You can see how these methods could have prevented or mitigated the effects of the accidents I've described.
Using BRM methods and skills is not new. Many effective mariners use them intuitively. But, of course, others do not.
In another example of deficient use of resources, on November 6, 1993, the passenger ship NOORDAM collided with the bulk carrier MOUNT YMITOS near Southwest Pass, Louisiana. Fortunately, no lives were lost and no oil pollution occurred; however, damage to the two vessels and lost revenue for the NOORDAM amounted to more than $10 million.
Several human errors contributed to the accident. The watch officers assigned to the NOORDAM's bridge that night did not adhere to basic watchkeeping standards. As a result, despite numerous opportunities, the watchstanders on the NOORDAM did not identify the MOUNT YMITOS until it was about 1 mile away at a closing rate of 23 knots.
The master of the MOUNT YMITOS had identified the NOORDAM when it was 5 miles away. However, he assumed that the passenger ship would soon turn, and he made no attempt to confirm this assumption while evasive action was still an option.
On the NOORDAM, a watch change was occurring at the time of the collision. Nine people were on the bridge. However, they seem to have been more occupied with other tasks than with keeping watch. The departing watch officers did not use their radar for collision avoidance or for observing moving targets. The oncoming watch crew did check the radar, but did not see any approaching ships.
The bridge officers on the NOORDAM did not recognize the extent to which their performance had deteriorated. If the precepts of BRM training had been employed aboard the NOORDAM, it is likely this accident would not have occurred.
A final human performance issue that deserves attention is a modern one: society's need to adapt itself to the electronic age, the faith we put in electronic systems, and how easily we can confuse accuracy and ease of use with dependability.
Automation can provide great benefits to all modes of transportation, provided it is properly designed to incorporate human strengths and compensate for human weakness. In aviation, we've made great safety strides with the installation of such technological advances as TCAS, Traffic Collision Avoidance Systems, and GPWS, Ground Proximity Warning Systems. The former has played a hand in the fact that we haven't had a catastrophic midair collision in the United States since 1986; the latter has virtually eliminated controlled ground impact accidents by transport category aircraft in this country.
In railroads, we've been advocating Positive Train Control for a long time and have placed it on our Most Wanted list. It would have prevented almost every major railroad collision we've investigated in recent years.

GPS navigation, when used properly, has benefits in both aviation and marine transportation, and can prove to be a business benefit for the railroad industry.
As automation increases, the machine portion of the human/machine system is increasingly able to control its own performance. It can maintain a functional parameter at a given value without human involvement (like your car's cruise control); or progress to maintaining several parameters in the proper relationship (like ARPA); or even alter control schemes to optimize the relationship between functional parameters depending on conditions (like an autopilot).

As the level of automation increases, the role of the human in the system becomes increasingly supervisory. Humans verify, monitor and evaluate the performance of the hardware and software portions of the human/machine system, but these are less active tasks, and they bring human frailties, such as lapses in alertness and vigilance, to the forefront.
Humans bear the ultimate responsibility for recognizing, interpreting, compensating for, and correcting or mitigating the consequences of deficiencies, failures, and malfunctions in the hardware and software, and ironically in their own performance. Because the human retains responsibility for the system, regardless of its level of automation, human/machine system failures are often reported as human error.
But system failures can result from contextual factors masked as human failure. Such factors can include a mismatch between task design and human capabilities, inadequate training, or poorly conceived operating or maintenance procedures. The in-depth investigation and analysis of accidents provide proof that inadequate attention is often given to describing, evaluating and facilitating the human performance required of a human/machine system.
To illustrate my point, I will mention the grounding of the ROYAL MAJESTY, which the Safety Board is currently investigating. On June 10, 1995, the Panamanian passenger ship ROYAL MAJESTY grounded about 10 miles east of Nantucket Island, Massachusetts. There were no injuries but lost revenue and damage to the vessel were substantial.
The ROYAL MAJESTY was fitted with an integrated bridge system designed to assist bridge officers with voyage planning, navigation, shiphandling and collision avoidance tasks.
The autopilot portion, using programmed information and position data from the vessel's Global Positioning System (GPS) and LORAN-C units, was capable of automatically steering the vessel along a predetermined route. The autopilot was engaged and operating in this mode for the 32 hours preceding the accident.

During the investigation, the Safety Board learned that the shield wire component of the GPS antenna cable had separated from the connection at the antenna, causing the unit to transmit inaccurate position data to the integrated bridge system. Neither the system nor the bridge watchstanders detected the failure before the accident. In addition, investigators determined that the vessel's integrated bridge system did not have multiple independent inputs to the autopilot for comparison of position, and the GPS was not designed to sound a continuous audio alarm to alert watchstanders in the event of a system failure. Although the system was not required to have these additional capabilities, their absence facilitated the grounding of the ROYAL MAJESTY more than 17 miles off course.
The Safety Board's preliminary investigation into the grounding of the ROYAL MAJESTY raised such safety concerns that they warranted the issuance of urgent safety recommendations on August 9th of this year, relating to potential failures of integrated bridge systems.
The urgent recommendations, however, address only a portion of the potential issues related to integrated bridge systems. The bridge officers had several means, other than the GPS, by which to determine or evaluate their position. However, at this point in the Safety Board's investigation, it appears that the bridge officers relied solely on the GPS for navigating the vessel and did not effectively cross-check it against these other sources of position information. As is evident in this case, such advanced human/machine systems consist of personnel as well as hardware and software. Each of those three components comes together, to varying degrees, to accomplish the system's varying functions. The Safety Board intends to conduct a public forum this winter to address these and other safety concerns related to integrated bridge systems aboard ships. It is hoped that information gathered during the forum will lead to effective recommendations and help improve maritime safety.
But finding yourself miles from where you thought you were is not an embarrassment reserved solely for mariners. Only last month we learned of a transatlantic jetliner that landed in Brussels, Belgium, rather than where the crew thought it was going: Frankfurt, Germany.
Another major factor in many human performance accidents is fatigue -- a subject the Safety Board will address in a proactive effort in about two weeks. On November 1-2, the Board and NASA are sponsoring a multimodal symposium titled "Managing Fatigue in Transportation". In our almost 30-year history, we have investigated accidents in every mode of transportation in which the effects of fatigue, circadian factors and sleep loss have been found to be causal or contributory. The Safety Board has issued nearly 80 fatigue-related safety recommendations since 1972 to the modal administrations in the Department of Transportation, transportation operators, associations and unions. Yet, these issues continue to permeate our society and place a heavy burden on safety and productivity.
The NASA Ames Research Center has developed a fatigue education and training module designed to explain the physiological mechanisms underlying fatigue, to show how this knowledge can be applied to improving operator sleep, performance, and alertness, and to offer countermeasure recommendations. The Safety Board and NASA Symposium is a way to provide this information on what can be done now to reduce the incidence of fatigue related accidents. I would encourage you to attend.
In closing, it is clear that human error accidents have broad implications. They can have a catastrophic impact on lives and the environment, and they affect public policy, regulators, ship users and ship owners. We must work diligently to prevent them. We must educate ourselves about human capabilities and frailties, design our ships and their systems to accommodate those human abilities, effectively equip and train personnel to safely operate ships, and progressively refine the process until we get it right.
Thank you for your attention.