Read accident case studies and aviation stories to help you stay sharp.
By Dan Sobczak
Editor's note: This content does not constitute flight instruction. Consult a certified flight instructor in your area for proper flight instruction.
If you're a pilot, you've undoubtedly heard about the DECIDE model for decision making in aviation.
The DECIDE model for aeronautical decision making.
In 2003, a pilot with four passengers on board attempted an ILS instrument approach to runway 32 at Jacksonville Executive Airport (KCRG), where weather was reported and observed as 100-foot ceilings with 1/4-mile visibility, far below the minimums for the ILS approach.
The aircraft collided with trees while on approach. Instrument meteorological conditions (IMC) prevailed and an instrument flight rules (IFR) flight plan was filed. The airplane was destroyed.
The NTSB determined the probable cause to be:
"The pilot's descent below decision height while performing an ILS approach with low ceilings and fog, resulting in an in-flight collision with trees and the ground. A factor associated with the accident was the pilot's decision to attempt the instrument approach with weather below the prescribed minimums."
A screenshot from Flight Chain App showing this accident, in which the pilot died after attempting the ILS approach to runway 32 at KCRG in weather far below minimums.
The pilot was attempting a precision approach: an ILS approach, which is meant to keep the pilot on a precise course both laterally and vertically, leading directly to the runway.
So how could this have happened? And more importantly, why did it happen?
To answer that, let's look at this accident in terms of the DECIDE model embedded within an accident chain.
The quotes below are taken directly from the FAA's Risk Management Handbook (publication FAA-H-8083-2), which provides a unique look at this particular accident through the lens of the DECIDE model.
As the Risk Management Handbook states:
Analytical decision-making is a form of decision-making that takes both time and evaluation of options. A form of this type of decision-making is based upon the acronym 'DECIDE.' It provides a six-step process for the pilot to logically make good aeronautical decisions. For example, a pilot who flew from Houston, Texas to Jacksonville, Florida in a Merlin failed to use the decision-making process correctly and to his advantage. Noteworthy about this example is how easily pilots are swayed from taking best courses of action when convenient courses are interpreted as being in our best interest.
DETECT: Detect the fact that change has occurred.
"In the case at hand, the pilot was running late after conducting business meetings early in the morning. He and his family departed one hour later than expected. In this case, one would assess the late departure for impact to include the need to amend the arrival time.
However, if the pilot is impetuous, these circumstances translate into a hazard. Because this pilot was in a hurry, he did not assess for impact and, as a result, did not amend the arrival time. Key in any decision-making is detecting the situation and its subtleties as a hazard; otherwise, no action is taken by the pilot. It is often the case that the pilot fails to see the evolving hazard.
On the other hand, a pilot who does see and understand the hazard, yet makes a decision to ignore it, does not benefit from a decision-making process; the issue is not understanding decision-making, but one of attitude."
ESTIMATE: Estimate the need to counter or react to the change.
"As the pilot progressed to the destination, it became apparent that the destination weather (at Craig Field in Jacksonville) was forecast as below approach minimums (due to fog) at the time of arrival.
However, weather at an alternative airport just 40 miles away was visual flight rules (VFR). At this time, the pilot should have assessed several factors to include the probability of making a successful approach and landing at Craig versus using an alternative field. In one case, the approach is certainly challenging, but it is an approach at the intended destination. The other location (unencumbered by weather) is inconvenient to the personnel waiting on the ground, requiring that they drive 40 miles to meet the pilot and his family."
CHOOSE: Choose a desirable successful outcome for the flight.
"Selecting a desirable outcome requires objectivity, and this is when pilots make grave errors. Instead of selecting the course of outcome with consideration to challenges of airmanship, pilots typically select an outcome that is convenient for both themselves and others.
And without other onboard or external input, the choice is not only flawed but also reinforced by their own rationale. In this case, the pilot of the Merlin intends to make the approach at Craig despite 100 feet ceilings with 1/4-mile visibility."
IDENTIFY: Identify actions which could successfully control the change.
"In the situation being discussed, the pilot looks at success as meeting several objectives:
The pilot failed to be objective in this case. The identification of courses of action were for his psychological success and not the safety of his family."
DO: Do the necessary action.
"In this case, the pilot contaminates his decision-making process and selects an approach to the instrument landing system (ILS) runway 32 at Craig where the weather was reported and observed far below the minimums."
EVALUATE: Evaluate the effect of your action countering the change.
"In many cases like this, the pilot is so sure of his or her decision that the evaluation phase of his or her action is simply on track and on glideslope, despite impossible conditions. Because the situation seems in control, no other evaluation of the progress is employed.
The outcome of this accident was predictable considering the motivation of the pilot and his failure to monitor the approach using standard and accepted techniques. It was ascertained that the pilot, well above the decision height, saw a row of lights to his right that was interpreted as the runway environment.
Instead of confirming with his aircraft's situational position, the pilot instead took over manually and flew toward the lights, descended below the glidepath, and impacted terrain. The passengers survived, but the pilot was killed."
I'm particularly struck by this exchange, quoted directly from the NTSB report, and especially by the question the pilot asks the controller:
Jacksonville Approach Control advised CRG tower at 0751:06, that Jacksonville International Airport was reporting an RVR of more than six thousand feet and airplanes are making it in. In addition, Jacksonville Approach Control advised the tower controller what headings to issue to the pilot, if the pilot wanted to divert. CRG tower contacted the pilot and relayed the information from Jacksonville Approach Control. The pilot was asked what his intentions were in the event of a missed approach. The pilot replied "I got my brother bringing my Mom there into your airfield, so I don't know, what do you think is best, what's closest?" The CRG controller replied Jacksonville was closer than St. Augustine. The pilot informed the controller at 0752:03, that he would go to Jacksonville in the event of a missed approach. The controller cleared the pilot to land and there was no other radio communications between the pilot and CRG tower.
Because the pilot was asking the controller what the controller thought he should do, it would seem the pilot may not have had a solid plan for diverting if conditions at the destination warranted a missed approach and a diversion to an alternate airport.
It also seems that get-there-itis was a factor in the pilot's decision to continue an ILS approach in below-minimum conditions: the pilot apparently didn't want to inconvenience the family waiting on the ground, who would otherwise have had to drive 40 miles to the alternate airport to meet him.
In the end, the unfortunate outcome of this accident flight was far more than an inconvenience, and it could have been avoided had the aeronautical decision making been carried out differently that day.
The Flight Chain App team
Dan Sobczak is the founder of www.FlightChainApp.com, a mobile app that helps pilots learn from accident chains by making NTSB reports more convenient and easier to digest. Dan received his private pilot certificate in 2003.
Flight Chain App and its companion blog www.AheadOfThePowerCurve.com are committed to reducing general aviation accidents, helping improve aviation safety, and growing the pilot population.
The only aviation accident app that helps you see and understand the accident chain from NTSB reports.