Remarks of Jim Hall, Chairman
National Transportation Safety Board
before the Symposium on Corporate Culture and Transportation Safety

Washington, D.C., April 24, 1997


Thank you, Secretary Slater, for those remarks, and thank you for expressing your commitment to exploring this vital safety issue by your presence here today.

I want to thank all of you for coming to this symposium, which I think will be remembered for years to come because it will, for the first time, focus all segments of our transportation system on the issue of corporate culture and its effects on safety.

I also want to take a moment to thank the members of the Safety Board staff who organized this symposium, especially Julie Beal, who has invested all of her waking moments and some of her sleep time in the past months heading up this effort. Many of you might know that Julie ran our highly successful Fatigue Forum in 1995, and I think she's done another excellent job on this symposium. Thank you, Julie.

We have with us today representatives of federal and state governments, representatives from domestic and overseas transportation industries, corporations and labor unions, and teachers and researchers from leading universities. All told, there are some 550 of you here.

When we are done tomorrow, I hope we can all agree on how organizational cultures can be committed to promoting safety as a major element of their strategic plans. Although "corporate culture" has provided either negative or positive reinforcement of operational safety since the dawn of the industrial age, I'd like to open this discussion with one of the most famous examples of the negative influence corporate culture can exert on safety.

Eighty-five years ago, mankind's belief in the infallibility of its work bore its terrible and inevitable result. The loss of the RMS TITANIC demonstrated the folly of management overconfidence in its operation, which led to its failure to prepare adequately for predictable, if unwelcome, events. But why would the finest ocean liner the world had ever seen fall victim to such failures?

Granted, there was a regulatory culture that allowed these things to happen, but where was the conscience, or just the common sense, of company management? Just because the TITANIC was allowed to have so few lifeboats, did no one at the company consider the ramifications should the lifeboats be needed?

The fact that this calamity occurred on her maiden voyage was, I think, the ultimate irony and a monument to individual arrogance, bought at the tragic cost of 1,500 lives. Yes, it was Captain Smith who refused to reduce his speed, but if investigators had stopped there, then we would surely have seen a repeat of that catastrophe. We wouldn't have had the imposition of ice patrols on the Atlantic, or international requirements for lifeboats to accommodate an entire ship's complement, for example. The loss of the TITANIC is a good example of how the proximate cause is not the same as the probable cause; we must dig deeper to get to the true safety issues.

As you may know, the National Transportation Safety Board is the nation's independent accident investigation agency, with authority to investigate aviation, marine, highway, railroad, pipeline and hazardous materials accidents. Our mission is to learn exactly what happened in these accidents and why they occurred, to determine causes and contributing factors. We are mandated to make those findings public, and, most important, to issue safety recommendations aimed at eliminating future accidents, deaths and injuries.

These recommendations are directed to carriers, equipment manufacturers, unions, oversight agencies and professional associations in all modes of transportation that can set standards or otherwise communicate our safety messages to their members.

The Safety Board's concept of transportation safety has long since gone beyond the criterion of mere accident statistics. An accident, which represents a major failure of the operating system in which it occurred, often results from a combination of circumstances. These circumstances can range from mechanical failures to environmental conditions to human errors to organizational failings. Let me paraphrase one of my former colleagues on the Board, Dr. John Lauber, who said that the absence of accidents does not necessarily indicate the presence of safety.

The safer transportation carriers are those that have more effectively committed themselves to controlling the risks that may arise from each and every one of these factors. The possibility of these factors affecting safety must be anticipated, and safeguards must be systematically developed and implemented. All other things being equal, the better this is done, the safer the carrier will be, and the accident statistics will reflect these conditions.

The Safety Board is in a unique position to convene this symposium because for 30 years we have been the eyes and ears of the American people at accident sites. We are a national archive - funded by the taxpayer - of what not to do, to provide lessons so that the same mistakes are not made over and over again. As the federal government's only multi-modal accident investigation agency, we have worked with industries and regulators covering the entire spectrum of our nation's massive transportation network. We have had the opportunity to examine the corporate cultures of our largest transportation suppliers, and many of our smallest.

What is corporate culture? I hope you will have your own definition by the end of our meeting, but let me suggest that it might be defined as the stable characteristics of one company or organization that distinguish it from another. Or, put more plainly, "the way things are done around here."

Although the Board might not describe "corporate culture" per se in its reports, it does investigate, and always has investigated, how culture may have set the stage for accidents. We look at management practices, policies and attitudes. And while we use the term "management" broadly, we understand that the best management in the world cannot overcome the influences of a corporate culture that is bent on emphasizing other attributes over safety.

As our knowledge and understanding of the role of corporate culture has improved, our investigations have evolved to encompass more than just management. It takes the full cooperation and dedication of every level in an organization to produce an atmosphere where safety is given pre-eminent status in a corporation's strategic planning.

As I said, the Board recognizes that accidents are not usually caused by one solitary factor, nor do they occur in a vacuum. Safety and accident prevention are everyone's concern and responsibility.

One of the earliest Safety Board recommendations on corporate culture was issued in 1968, our second year of existence, to the Federal Railroad Administration. Following a review of a number of railroad accidents, we told the FRA that we believed that the primary responsibility for improved railroad safety should rest upon railroad management and labor. Our recognition then that safety and accident prevention is the responsibility of management, the individual workforce and government, holds true today.

But how should management fulfill its responsibility to assure safe operation? One role for management is to develop, nurture and maintain a healthy and safe corporate culture. In our practice of accident investigation at the Board, corporate culture issues fall within the organizational factors area. In the most elementary terms, we have treated the culture of any given transportation organization as its collective mindset. Let me give you a specific situation that defines the sort of mindset I'm referring to.

Consider this scenario: A transit train operator believes an unsafe condition lies ahead, but stopping the train between stations without permission violates an operating procedure. He calls the dispatcher to obtain permission and is told not to stop. He stops anyway, and in doing so successfully avoids endangering the train and its passengers.

But now, by stopping and avoiding the danger, the operator has no way to prove that an unsafe condition actually existed. The question is, what action does his superior take? Is the train operator disciplined reflexively for his disregard of the rules, or is he recognized for his alertness? I'll talk more about this situation when I describe an accident where an operator had to make just such a choice and elected to conform to established procedures, with tragic results.

We at the Safety Board do not consider ourselves experts in the field of corporate culture; that is why we have organized this symposium, to hear from many of you who have devoted much of your professional careers to this subject. I acknowledge that there are different ways to describe corporate culture; let me suggest one possible way of describing its basic components:

I'll defer to our guests at this conference for a more scholarly treatment of the subject. However, in our practice of accident investigation, we have found this concept useful.

Unlike some causal factors, it is not easy to identify corporate culture problems in the early days after an accident. For instance, it can be apparent soon after a train derailment that perhaps a broken rail initiated the accident; or after a ship grounds on a shoal, investigators learn that there was confusion among the officers on the bridge about the ship's position; or that a truck drifted into oncoming traffic because the driver went to sleep.

But it takes additional information and analysis to conclude that the train derailment resulted because management had decided to postpone replacement of the defective rail. Similarly, confusion about the ship's position was because management had neglected to provide training for the deck officers on the computerized navigation system. And the truck driver went to sleep because management imposed an incentive for drivers to continue on duty rather than rest.

As a matter of good investigative practice, it is never assumed that any accident operator's actions occurred in isolation. Each driver, engineer, pipeline operator or ship's officer performs the job in an environment of policies, procedures, operating limitations and operating latitudes.

In the course of my remarks, I will be citing specific accidents that illustrate the evolution of our investigations of corporate culture issues. Our staff went through 30 years of NTSB records to find these examples. They are not offered to single out any particular person or company; indeed, in many cases, the organizations involved have taken steps to rectify their problems.

One flag for recognizing potentially unsafe cultures is management thinking and practices that are antagonistic or indifferent toward employees in safety-sensitive jobs. Another flag is an organization's practices that vary from the accepted standards found in the industry. Third, an unsafe culture may exist if an employee's operating performance conformed to carrier procedures, or reflected the accepted values and attitudes found in the carrier, and an unsafe situation still occurred.

Let me give you another example. Years ago, our assessment of corporate culture focused primarily on whether management actively discouraged its operators -- in aviation, that would be pilots -- from following established company and government rules and procedures. Even considering how much we have learned about corporate culture, it is difficult to accept the fact that accidents occurred because some companies actually encouraged rule breaking. For example, on May 30, 1979, a de Havilland Twin Otter operated by Downeast Airlines, on a regularly scheduled commuter flight from Boston, crashed near Rockland, Maine, while the pilot was attempting to land in restricted visual conditions. Both pilots and all but one of the 16 passengers were killed in the accident.

The investigation found that the visibility was so poor that the pilot could not have seen the airport at the decision height, the point at which he was required to abandon the approach if the airport was not in sight. Why then did he attempt to land anyway, given the known hazards and prohibitions against such attempts? Well, further investigation found a corporate culture in place at that airline that not only did not enhance safety but actively discouraged it. The owner of the airline, who as president directed its day-to-day operations, conveyed to the captain and to all pilots his expectation that they would cut corners in the interest of saving money. In fact, he criticized and threatened them when they did not.

In addition, we learned that the captain believed that he had no real authority to stand up to the company president and, moreover, feared for his job if he did not satisfy the president's wishes. Although this accident was certainly not the first one in which an abusive or threatening management style adversely affected flight safety, it was the first we could recall in which the Safety Board explicitly addressed the role of management in the cause of an accident. The lessons of this accident were unmistakable: a management climate that pressures pilots to ignore flight rules and safe operating practices, and threatens pilots if they do not conform to these practices, adversely affects the safety of the operation.

Several years later, an accident gave us new insights into how, even in the absence of explicit or implicit company orders to violate rules, management cultures can hurt the safety of transportation operations. On October 11, 1983, a Hawker Siddeley 748, operated by Air Illinois, crashed on its regularly scheduled commuter flight from Springfield to Carbondale, Illinois, killing the two pilots, the flight attendant and all seven passengers on board.

We learned that within minutes after takeoff from Springfield the airplane lost its two electrical generators, leaving only batteries to supply electrical power to the airplane's systems. Despite the fact that the pilots could have readily attempted a safe return to Springfield, the captain decided to continue the flight to its scheduled destination, estimated to be about 45 minutes away. After the batteries died, instrumentation was lost and the pilots were unable to maintain level flight. When the plane entered storm cells, it crashed.

We found that, to comply with FAA requirements, the airline had created a management structure to oversee maintenance, but it existed on paper only; key inspector positions were unfilled. Although there was no evidence that the captain had been directed to circumvent or ignore FAA rules, in contrast to the Downeast Airlines accident cited earlier, the NTSB learned that captains who had taken weather-related delays were questioned by the airline's management and asked to explain their conduct. A commendation from company management, found in this captain's files, complimented him for his efforts to maintain schedules.

The Air Illinois accident showed us that transportation companies can, through their actions, communicate to their employees an attitude that subsequently influences the degree to which employees comply with operating rules and with safe operating practices. Airlines and other organizations that question those who may be willing to risk on-time arrivals in the interests of safety, even if no further action is taken, may suggest to their personnel that safety is secondary to on-time performance.

I want to offer you one more example of corporate culture providing a negative influence on safety, the fatal collision on our local Metrorail system in the Maryland suburbs. I alluded to this accident earlier.

For those of you who have not heard about this accident, it was a collision in January 1996 between two trains on Metro's Red Line. A moving train struck an unoccupied standing train that was not in service, killing the operator of the moving train.

I selected this accident for presentation because the Safety Board's investigation found markers of an organizational culture that considerably detracted from safe rail transit operation.

At first, it appeared that a train operator simply did not comply with his training. Then a different picture emerged, suggesting that a superintendent at the train dispatching facility had ignored warnings and did not stop the train. Later still, it appeared that an executive manager had acted capriciously when he changed a long-standing operating policy without consideration of the consequences.

This accident resulted not from singular actions but from an organization-wide set of beliefs held about the infallibility of the automatic train control equipment, somewhat reminiscent of the perceived infallibility of an ocean liner's watertight compartments so many years ago.

The Metro accident occurred shortly after a major snowstorm had begun and trains had started to overrun station platforms at several of the above-ground stations. The accumulating snow and ice reduced the effectiveness of the train's braking system. All trains were operating in the fully automatic mode; that is, they were being controlled by the system computer, not by the operator on board the train or by controllers in the system's central control facility.

Shortly after 10:00 p.m., train 111, northbound out of Washington, D.C., en route to the Shady Grove station in suburban Maryland, emerged from below-ground to above-ground track. When it reached Twinbrook Station, 12 minutes before the accident, the automatic train control directed the train to stop at the platform. The train did stop, but it completely overran the platform.

At Rockville, too, the train partially overran the station. Because the operator had to secure the controls to assist passengers at that station, the train had lost its automated command to operate at the reduced speed of 44 mph. So, after departure from Rockville, the train began accelerating automatically beyond that speed, heading for 75 mph, still within the design limitations of the rails and signals, at least when weather conditions were favorable. The operator called the controller to report the over-speed situation and was told that this was due to his overrunning the previous station and that he was to continue in automatic operation.

As the train approached the Shady Grove station, the controller, who could see the location of the train on his monitor, called the operator and asked if the speed had dropped. Because it had, the controller later told Safety Board investigators that he had a feeling the system was doing what it was supposed to do and didn't believe that he had to put his job on the line by telling the operator to go into manual mode.

Previously, when the rails were slippery, operators were given the authority to run the trains manually so they could properly adjust the approach speeds to fit the weather. However, a new policy had been put into effect two months earlier that forbade any manual operation. I'll get to that in a moment.

At Shady Grove station, a gap train was parked 470 feet beyond the platform on the same track that the accident train was using. A gap train stands by to fill in for unexpected needs, such as when a scheduled train breaks down. It was parked there despite an unwritten Metrorail order that these trains were to be kept on the adjacent inactive track.

You must know what happened next. When train 111 arrived at the Shady Grove station, it slid past the platform at about 30 mph and struck the standing gap train. The operator was found crushed in wreckage near the cab door. There were no other injuries.

The operator had no advance warning of the gap train's position because the trackside signal showed a clear indication. Metrorail had developed a procedure for circumventing the signal system to keep trains from slowing or stopping before they reached the station platform when the gap train was on the same track.

The Safety Board investigation team was launched that night. Once the follow-up investigation work began, a fairly clear picture emerged for what had happened and why. Here are several of the findings from our investigation that explain the accident and also have a direct connection to the organizational and management issues of interest to us today.

  1. Metrorail had recognized that leaving the gap train on the incoming active track presented an unnecessary hazard. But we learned that the gap trains were frequently stored on the active track, apparently because of confidence in the automatic train control system.
  2. There was no formal training program for controllers at the center, other than an annual operating rules examination. Consider that the controller knew of the accident train's overruns at two previous stations; he knew about the deteriorating weather; he knew about the unusually high speed of the train; and he knew the gap train was parked beyond the Shady Grove station. And yet he did not see a need to stop the train.
  3. Equally disturbing was our finding that no one, including the superintendent of the central control center, had the authority to intervene and stop the accident train unless a collision was certain. The controller felt that he would be putting his job on the line if he were to allow the operator to take manual control of the train, just as the Downeast Airlines captain believed that his job, too, was on the line if he did not satisfy the president's wishes, even if they ran contrary to his better judgment about safety.
  4. Because manually operated braking was being blamed for a perceived flat-wheel problem, which would indicate poor braking technique, a decision was apparently made by the Deputy General Manager to eliminate all manual operation, even in inclement weather. No train operator knew of or was consulted about the change, and most of management, including the General Manager, did not learn of the decision until after the accident.

The Safety Board investigation examined the environment and culture that enabled experienced managers at Metrorail to produce such ill-advised and unsafe operating policies, and how such improper methods could then be tolerated. We found that the management style, processes and organizational structure were rigid and militaristic from the Deputy General Manager on down, with no tolerance for the opposition that could have produced better-informed decisions.

It would be tempting to blame the conditions and circumstances of this accident on one person, the Deputy General Manager. But that would not recognize corporate culture as a safety problem. Certainly he was part of the problem at Metrorail, but what about the seeming indifference and disregard by some employees for safety precautions, and the absence of informed opposition when flawed solutions to problems were being considered?

The causation of this accident only holds together when we consider the extent to which management and operating personnel believed the automated train control system would protect them, and that the system would provide adequate margins of safety regardless of the quality of their decisions and policies. Again, I am reminded of the mindset at the White Star Line leading up to the TITANIC disaster.

We made 20 recommendations to the Metrorail Authority as a result of the Shady Grove accident. I realize that I have been somewhat critical in describing Metro's operations, but such is the nature of accident investigation. I would like now to balance some of these concerns with an optimistic observation.

Earlier this month, I met with Metrorail officials to discuss their actions on the Safety Board's recommendations from this accident, and it was apparent that major changes in Metrorail management and in their operating practices were taking place. Not least, several top-level executive managers have left, including the Deputy General Manager. A new Director of Safety will be joining Metrorail, and he will be reporting directly to the General Manager. This is an organizational structure we have been recommending for years in all modes of transportation, and one that is particularly evident among the major airlines.

Frankly, I was impressed by the aggressiveness of Metrorail management's actions on many of our recommendations, and at this time we have every reason to believe there will be an effective follow-through.

If you look at all of the accidents I've cited, their root causes go beyond a mere lack of planning or poor personnel decisions. Each of the accidents was set up by one or more of the following characteristics:

There are myriad other accidents that illustrate corporate culture failings in all modes of transportation: tragedies like the Ledger, Montana, head-on train collision in 1991; the Brenham, Texas, salt dome petroleum storage facility explosion in 1992; the OMI CHARGER tankship explosion in 1993; the crash of an FAA aircraft in 1993; and the Fox River Grove grade crossing accident in 1995. Through our investigations of these and other accidents, as well as through the work of many of our speakers today, our understanding of the role of management in the safety of their organizations' operations has increased.

In the last 15 years, much has been written, learned, and communicated about the role of corporate culture in transportation safety, and we will hear from the leading authorities on this subject over the next two days. Largely because of the work of Dr. James Reason and the others who will follow shortly, transportation companies and governments around the world have come to recognize and better understand how operator errors, irrespective of their immediate causes, are often influenced by management conduct and attitudes.

We and the transportation community have come to recognize that management has responsibility for creating and fostering a climate that encourages safe operations. We hope to learn more today and tomorrow about how best to ensure that a climate of safety remains paramount in any organization's corporate culture.

Your presence here is important, not just because many of you are leaders of your organizations, but because you will have to be leaders of your industries. The practice of good corporate culture is not just good for safety, it is good for business. But you will have to show the way for others, because it only takes one or two bad apples to sully the reputation of an entire industry.

Thank you for coming. I look forward to a spirited discussion.

