
The Domino Effect
May 4, 2010 – By Paul Dixon
In six short years, the CHC Safety & Quality Summit has expanded from an in-house program to a successful conference attracting more than 600 delegates from around the globe, giving ample testimony to the hunger that exists for knowledge in creating and maintaining higher safety levels in aviation. One of the themes throughout this year’s Summit, held March 22-24 in Vancouver, was determining accident accountability. While the overall number of rotary-wing accidents has declined substantially over the years thanks to improvements in design, engineering and fuel systems, the number of accidents attributable to pilot error has remained constant – and with a shrinking total, pilot error now accounts for a sharply rising share of all accidents.
[Photo: Captain Gene Cernan (last man to walk on the moon) with CHC’s CEO Sylvain Allard.]
Of course, finding fault with pilots is easy: they are the last persons to have possession of an aircraft and the first to arrive at the scene. But it’s not that simple. Several questions remain. Is there a disconnect between the regulatory side of the equation, with government-mandated safety management systems (SMS) and corporate safety policies on one side, and the operational needs of the organization on the other? And if stringent guidelines are imposed by the appropriate regulatory bodies, and companies are following them, why are we seeing accidents at the rate we do? “Accident” is the wrong word in most cases, but it remains in our lexicon, as though these events were unpredictable, and therefore unpreventable, which is what a true accident is.
“Why do intelligent people make dumb decisions?” was a key theme at this year’s CHC Summit. As keynote speaker Gene Cernan said, reflecting on his career as a U.S. Navy aviator and NASA astronaut, “there are accidents and then there are deliberate acts.” It goes beyond a single person making a consciously wrong decision.
Steve Newton, superintendent of aviation management for the B.C. Ministry of Forests, had voiced a similar sentiment a week earlier at the Aerial Firefighting conference in Vancouver. As Newton noted, there are moments that make one say, “huh?” These are events we experience or witness that cause us to stop and ask, “what the heck was that?” It’s the moment when the output doesn’t match the input, or when we realize pure luck spared us from disaster. All too often, we register these moments and then, when the moment passes and no one is any the worse for wear, carry on as if nothing happened, without any regard for future consequences.
A Harrowing Incident
Step outside the aviation world for a true “huh?” moment. In 1988, the roof of a new supermarket in suburban Vancouver collapsed on the morning of its grand opening. The mayor, on hand to cut the ribbon – and, coincidentally, the former fire chief of the municipality – heard the creaking of the structural steel and saw cracks spreading across the inside of the roof, and ordered the evacuation of the more than 1,000 people in the building at the first hint of a problem. His quick thinking was credited with saving many lives. A design flaw in the main roof beam caused it to fail when vehicles were driven onto the rooftop parking lot for the first time.
A board of enquiry determined that fault lay with the professional engineer who had designed the beam. He had allowed a student intern to complete the design calculations and had merely added his stamp of approval and signature, failing to notice that the student had made a simple mathematical error, dividing where he should have multiplied and producing a woefully under-designed beam. While the engineer was found liable and subsequently lost his professional standing, many others had the opportunity to say, “huh, there’s something wrong here,” but didn’t.
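To see how a single inverted operation can gut a design, consider a hypothetical version of the calculation. The load and safety factor below are invented for illustration; they are not the figures from the actual enquiry.

```python
# Hypothetical illustration of the divide-instead-of-multiply error.
# All numbers are invented; they are not the enquiry's actual figures.

design_load_kn = 400.0  # assumed load the beam must carry, in kilonewtons
safety_factor = 1.5     # assumed required factor of safety

required_capacity = design_load_kn * safety_factor  # correct: 600 kN
as_built_capacity = design_load_kn / safety_factor  # the error: ~267 kN

print(f"required capacity: {required_capacity:.0f} kN")
print(f"as-built capacity: {as_built_capacity:.0f} kN")
print(f"shortfall factor:  {required_capacity / as_built_capacity:.2f}x")
# One inverted operation leaves the beam at well under half the required
# strength, invisible on paper until the roof is loaded for the first time.
```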
Larry Mattiello, president of Dallas, Texas-based insurer Airsure, noted at the CHC Summit that “in eight out of 10 investigated accidents, someone somewhere knew something that would have prevented the accident, but they felt no compulsion to speak out.” Steve Walters of New Zealand Helicopters calls this the “Cassandra Syndrome”: predictions are made (as in an SMS), yet the steps toward prevention are not taken. The result? “Pilot error is likely the last of many events in a long chain that contributes to the air crash,” he said. “In a preponderance of accidents, pilots and air crew are set up for failure.”
A Trail of Events
If SMS is to ensure people are set up to succeed, how is it that pilots and air crew can arrive at a point where failure is the only option? New Zealand Helicopters uses Zotov error mapping to establish a timeline that works backwards from the incident, identifying every process that had any impact on the final outcome; the trail may go back years in order to capture all the factors. It becomes clear that what is easy to write off as “pilot error” at the end of the equation was really a combination of events, perhaps a series of errors that began on the drawing board where the aircraft was first conceived and continued through the life of the aircraft until final impact. These are small errors: no one error leads directly to a crash, but a series of dominoes, once set in motion, can have only one conclusion. “Most accidents result from a long chain of events that develop over time,” Walters said. “A slow drift from safe practices to at-risk practices is not easily detectable.”
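That kind of error mapping can be pictured as a backward walk through a causal graph, from the final event to its earliest contributors. The sketch below is a minimal illustration, not New Zealand Helicopters’ actual tooling; the events and their links are hypothetical.

```python
# Minimal sketch of backward error mapping: start at the final event and
# walk the graph of contributing factors to recover the whole chain.
# The events and links below are hypothetical, for illustration only.

from collections import deque

# Each event maps to the earlier events that contributed to it.
contributors = {
    "crash": ["pilot decision under pressure", "unreliable fuel gauge"],
    "pilot decision under pressure": ["schedule pressure", "incomplete briefing"],
    "unreliable fuel gauge": ["deferred maintenance"],
    "deferred maintenance": ["parts shortage"],
    "schedule pressure": [],
    "incomplete briefing": [],
    "parts shortage": [],
}

def map_error_chain(outcome: str) -> list[str]:
    """Breadth-first walk backwards from the outcome, so no single
    'pilot error' ever stands alone at the end of the chain."""
    seen, order, queue = set(), [], deque([outcome])
    while queue:
        event = queue.popleft()
        if event in seen:
            continue
        seen.add(event)
        order.append(event)
        queue.extend(contributors.get(event, []))
    return order

for step in map_error_chain("crash"):
    print(step)
```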
[Photo, from left to right: Dennis Rigo (director training & licensing, CHC), Terry Palmer (base manager, CHC), Guido Lepore (director flight standards, CHC).]
Attitude, or the human factor, is a key ingredient in creating and maintaining a safety culture. The corporate attitude has to be in line with the message, and front-line pilots, air crew and mechanics have to see that connection and invest themselves in the process. Regulations can be imposed by governments, insurance companies and senior management, but if the safety culture does not exist on the shop floor or the flight deck, it’s all moot.
Inside Information
While determining accountability and responsibility in aviation accidents was a strong theme throughout the conference, Scott Shappell addressed two human factors of note: the effects of fatigue on aviators, and “Spin and Puke” – an overwhelming crowd favourite. Shappell is a professor of engineering at Clemson University in South Carolina, with a long and storied background with the U.S. Navy and NASA in the human factors affecting aviators. Fatigue is difficult to gauge, and we are often the worst judges of our own state of fatigue and of its effects on our ability to perform complex tasks. “We don’t have a fuel gauge on our foreheads,” Shappell says.
He has tested pilots who described themselves as totally fatigued and then aced complex simulator tests; the same pilots have failed those tests miserably after reporting themselves fully rested. In the aggregate, rested pilots will score much higher than fatigued pilots, but the individual is often the worst person to judge his or her own performance level at any given time.
“Spin and Puke” addressed the balance mechanism of the inner ear and the behaviours that result when a pilot’s perception of events does not match reality. Of particular note was Shappell’s description of the effects of alcohol on the inner ear and the resulting impairment of balance and judgment. Alcohol that enters the endolymphatic fluid of the inner ear does not metabolize at the same rate as alcohol in the blood, so balance can remain significantly impaired long after the blood alcohol level has returned to zero. Shappell asked his audience whether anyone could list the determining factors behind the eight-hour “bottle to throttle” restriction for pilots, or the U.S. military’s 12-hour restriction. It was a trick question: by his own research, there is no supporting scientific data, simply an arbitrary decision based on “best guess.”
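One way to picture that lag is a toy two-compartment model, in which the blood clears alcohol at a steady rate while the inner ear slowly equilibrates toward the blood level. The sketch below uses invented rate constants, not physiological data; it only illustrates how a slow compartment can remain elevated long after the blood reads zero.

```python
# Toy two-compartment sketch: blood clears alcohol at a steady rate while
# the inner ear slowly equilibrates toward the blood level. All constants
# are invented for illustration; this is not physiological data.

blood = 0.08         # hypothetical starting blood alcohol (g/100 mL)
ear = 0.0            # hypothetical inner-ear (endolymph) level
ELIMINATION = 0.015  # assumed blood clearance per hour
EXCHANGE = 0.25      # assumed blood-to-ear equilibration rate per hour
STEPS_PER_HOUR = 10
dt = 1.0 / STEPS_PER_HOUR

step = 0
while blood > 0 or ear > 0.005:
    step += 1
    blood = max(0.0, blood - ELIMINATION * dt)
    ear += EXCHANGE * (blood - ear) * dt  # the ear lags behind the blood
    if step % (2 * STEPS_PER_HOUR) == 0:  # report every two simulated hours
        print(f"t={step * dt:4.1f} h  blood={blood:.3f}  inner ear={ear:.3f}")
```

Run as written, the inner-ear level keeps printing nonzero values for hours after the blood value has fallen to zero, which is the shape of the effect Shappell described.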