Terrain awareness and warning system (TAWS) technology, which has been credited with preventing several potentially major accidents, underscores the need for continued flight-operations vigilance, especially during the approach and landing phases, according to safety consultant Capt. Dan Gurney.
"Industry [must] maintain focus on the problems of human error, particularly situations that have potential for error or contain threats that were not identified or were mismanaged. [We still have] much to learn (or remember) about how to identify and counter latent threats," the former UK Royal Aircraft Establishment and British Aerospace Regional Aircraft test pilot told the Flight Safety Foundation European safety seminar.
Large airplanes began to carry ground-proximity warning systems in 1974, and since then the number of controlled flight into terrain (CFIT) events has declined significantly, said Gurney, who analyzed six sample incidents in which TAWS prevented potentially fatal CFIT accidents.
Ground-proximity warning systems (GPWS) were extended to commuter operations in 2000. But that technology is limited to detecting terrain immediately below, rather than ahead of, the aircraft; it does not detect a sharp rise in the terrain ahead until it is too late to avoid it.
Further, to allow a landing free of unwarranted warnings, the GPWS detection logic must be disengaged, leaving the system unable to issue a warning if ground clearance becomes insufficient.
Enhanced GPWS (EGPWS, or TAWS) combines a digital terrain database with accurate navigation equipment, allowing it to warn of any perceived conflict between the aircraft's navigation position and the terrain almost down to the runway threshold, and to advise when steeply rising ground is detected ahead. Warnings are now available for obstacles as well as terrain, and the equipment is mandatory for all turbine-powered aircraft with six or more seats.
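The difference between the two detection concepts can be sketched in a few lines of code. The Python fragment below is a hypothetical illustration only; the function names, clearance threshold and terrain samples are invented for the example and do not represent any manufacturer's alerting logic.

```python
# Hypothetical sketch of the two detection concepts, not real avionics logic.

def gpws_warning(radio_altitude_ft: float, floor_ft: float = 500.0) -> bool:
    """Basic GPWS concept: warns only when clearance directly below is already small."""
    return radio_altitude_ft < floor_ft


def taws_lookahead_warning(aircraft_altitude_ft: float,
                           terrain_ahead_ft: list[float],
                           required_clearance_ft: float = 500.0) -> bool:
    """EGPWS/TAWS concept: compare aircraft altitude with database terrain
    elevations sampled along the projected track ahead of the aircraft."""
    return any(aircraft_altitude_ft - elevation < required_clearance_ft
               for elevation in terrain_ahead_ft)


# Level flight at 3,000 ft toward a ridge rising to 2,800 ft:
ridge_profile_ft = [500, 900, 1500, 2200, 2800]        # database elevations ahead (ft)
print(gpws_warning(radio_altitude_ft=2500))            # False: terrain below is still flat
print(taws_lookahead_warning(3000, ridge_profile_ft))  # True: rising ground detected ahead
```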
In the five years since EGPWS/TAWS entered airline and corporate-aviation service, there has been no CFIT accident involving an aircraft so equipped, said Gurney. Nevertheless, there have been a number of potential mishaps, including serious incidents requiring formal investigation (for which published reports are expected). Other incidents have been studied by operators or manufacturers trying to understand what prevented crews from detecting the circumstances that triggered TAWS warnings.
Gurney's analysis of six incidents since March 2003 involving premature descents for landing has found many threats and errors that pilots may encounter routinely. "Most crews on most days will manage these and avoid error-prone behavior, but [circumstances] arise where threats and opportunity for error overcome human capability and a technological solution is required. Ultimately, the crew [is] responsible for managing threats and avoiding error before and after the [TAWS] alert," said Gurney.
The incidents involved two kinds of threat: pre-existing conditions not posing independent risk (such as airport susceptibility to "black-hole" illusion; charts without altitude/range tables; ambiguous procedures; unsatisfactory chart layout, scaling or format; non-precision approaches; and offset distance-measuring equipment) and conditions arising from variable situations (such as darkness; instrument flight rules; a late plan change; or failure to react to alerts or warnings).
Improving Risk Assessment
Crew vigilance or audit management would have identified all the pre-existing conditions, said Gurney. Risk assessments should consider other likely conditions that could increase risk. For example, flying a non-precision approach (NPA) in conjunction with weak charts is "a particularly high risk."
Gurney suggested that the errors made in the events he analyzed appeared to originate from circumstantial conditions, or from unidentified or mismanaged threats. Crews did not understand situations, or they chose incorrect courses of action. Misunderstanding included visual illusions or misidentified visual cues, misinterpreted or misunderstood procedures, mental models that were absent or not shared, and mental-map slips. "These errors originate in the cognitive (thinking) processes: what we think about, how and on what we focus, and why we think that something is important," said Gurney.
Cognitive Errors
Choosing wrong courses of action often involved simple mistakes or memory lapses, perhaps through lax training or poor discipline. "They originate from weaknesses in cognitive control, the way in which we control thinking: self-discipline, double-checking, managing time, avoiding preconceptions and not rushing to conclusions."
The errors should have been detected by self-monitoring or cross-crew monitoring. "It is essential that individuals self-debrief to clarify their understanding of any error, the situational circumstances and threats, or the behavior that may have led to the error."
Monitoring, which had failed in every incident he analyzed, must be accurately defined, trained and practiced to enable skillful application, according to Gurney. "Monitoring must be truly independent, [which] starts with the approach briefing. Each pilot monitors by crosschecking the [chart] details and his understanding of the plan for the approach. Briefings (flight plans for the mind) provide a pattern for subsequent comparisons. The crew needs a shared mental model [that must] be the correct model for the situation."
Because people tend to build internal models (patterns) of how things should be, Gurney advised that crews must guard against short-term tactical thinking, in which responding to expectations often displaces the sounder assessment and judgment of strategic thought. "[They should] make an earlier consideration of what a situation could be, consider options and alternatives, and if in doubt ask. In [every] incident, the crew lost awareness of position relative to the runway in altitude, distance and time."
Since the objective was to land safely, crews' focus must include situational awareness of the runway location and continual updating of the mental model. They should use the available physical tools, said Gurney: "Display runway position on the EFIS [screen], pay attention to vertical displays, and select terrain maps for all approaches as well as for departure."
Gurney emphasized the need to react to alert messages. "TAWS warnings require action without thought. To gain this skill crews need to practice pull-up techniques in response to a TAWS warning, [preferably] in surprising, stressful training situations. Use a 'glass mountain' terrain model during simulator training."
In debriefing, if crews argue that they "knew where they were" and there was no terrain threat, they must recognize that this was "exactly the erroneous mindset that all the incident crews may have held. They were convinced that they knew where they were and that 'it was the TAWS warning that was wrong,' not them. It is essential that training overcomes the desire to understand before acting; a pull-up must be flown without hesitation," he explained.
Gurney pointed out that in the analyzed incidents, the industry was fortunate to maintain its good safety record. Every event involved an aircraft with a modern "glass" cockpit equipped to enhance situational awareness, yet each had been exposed to a terrain hazard. In most cases, crews were apparently unaware of their position. In two incidents, the aircraft were at very low altitude while still 1.5 nm from the runway. The single incident that triggered an obstacle warning involved the operator's only aircraft with "obstacle mode" activated.
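For scale, a rough calculation (illustrative only, not taken from the incident reports) shows the height a nominal 3-degree glidepath would provide at that distance, and therefore how far below a normal approach profile an aircraft at very low altitude 1.5 nm from the runway would be.

```python
import math

# Illustrative only: height above the threshold given by a nominal 3-degree
# glidepath at 1.5 nm, for comparison with "very low altitude" at that range.
distance_nm = 1.5
glidepath_deg = 3.0
feet_per_nm = 6076
height_ft = distance_nm * feet_per_nm * math.tan(math.radians(glidepath_deg))
print(round(height_ft))  # roughly 480 ft above the threshold elevation
```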
All identified potential threat conditions "must be reported, removed, avoided or any residual effects countered. Crews [must] recognize situational threats as they are the last line of defense," according to Gurney. "Luck could be defined as having safety defenses that just matched the hazard or risk. However, [when each] incident involved the crew pulling up following a warning, this definition of luck is unacceptable. We cannot expect the last line of defense always to hold."
Active threat and error management, at all management and operational levels, requires constant vigilance, risk assessment and timely decisions to select corrective courses of action, concluded Gurney. "These processes depend on critical thinking skills, the foundations of airmanship, leadership and professional management."