
From SKYbrary Wiki


Latest revision as of 13:59, 30 October 2019

Plan Continuation Bias

Article Information
Category: Human Behaviour
Content source: SKYbrary
Content control: SKYbrary


(Plan) Continuation Bias is the unconscious cognitive bias to continue with an original plan in spite of changing conditions.


The following explanation of continuation bias is derived from a Transportation Safety Board of Canada accident report.

To make decisions effectively, a pilot or controller needs an accurate understanding of the situation and an appreciation of its implications, and must then formulate a plan with contingencies and implement the best course of action. Equally important is the ability to recognize changes in the situation and to reinitiate the decision-making process, so that changes are accounted for and plans modified accordingly. If the potential implications of the situation are not adequately considered during decision making, there is an increased risk that the decision and its associated action will result in an adverse outcome leading to an undesired aircraft state.

A number of different factors can degrade a pilot's decision-making process. For example, increased workload can impair a pilot's ability to perceive and evaluate cues from the environment and may result in attentional narrowing. In many cases, this attentional narrowing leads to Confirmation Bias, which causes people to seek out cues that support the desired course of action, possibly to the exclusion of critical cues that support an alternative, less desirable hypothesis. The danger is that potentially serious outcomes may not be given the appropriate level of consideration when determining the best possible course of action.

One specific form of confirmation bias is (plan) continuation bias, also known as plan continuation error. Once a plan is made and committed to, it becomes increasingly difficult for stimuli or conditions in the environment to be recognized as necessitating a change to the plan. Often, as workload increases, the stimuli or conditions will appear obvious to people external to the situation; however, it can be very difficult for a pilot caught up in the plan to recognize the salience of those cues and the need to alter the plan.

When continuation bias interferes with the pilot's ability to detect important cues, or when the pilot fails to recognize the implications of those cues, breakdowns in situational awareness (SA) occur. These breakdowns in SA can result in non-optimal decisions that compromise safety.

In a review by the U.S. National Aeronautics and Space Administration (NASA) Ames Research Center of 37 accidents investigated by the National Transportation Safety Board, it was determined that almost 75% of the tactical decision errors involved in those accidents were related to decisions to continue with the original plan of action despite the presence of cues suggesting an alternative course of action. Dekker (2006) suggests that continuation bias occurs when the cues used to formulate the initial plan are considered to be very strong: if the plan seems sound based on the information available at the time, subsequent cues indicating otherwise may not be given equal weight in decision making.

It is therefore important for pilots to remain aware that continuation bias can occur, and to remain cognizant of the risks of failing to analyse changes in the situation and to consider their implications, in order to determine whether a revised course of action is more appropriate. As workload increases, particularly in a single-pilot operation, less and less mental capacity is available to process these changes and to consider the potential impact they may have on the original plan.

Accidents and Incidents

SKYbrary includes the following reports relating to events where continuation bias was considered to be a factor:

  • A332, Caracas Venezuela, 2013 (On 13 April 2013, an Air France Airbus A330-200 was damaged during a hard (2.74 G) landing at Caracas after the aircraft commander continued despite the aircraft becoming unstabilised below 500 feet agl with an EGPWS ‘SINK RATE’ activation beginning in the flare. Following a superficial inspection, maintenance personnel determined that no action was required and released the aircraft to service. After take off, it was impossible to retract the landing gear and the aircraft returned. Considerable damage from the earlier landing was then found to both fuselage and landing gear which had rendered the aircraft unfit to fly.)
  • AS3B, vicinity Sumburgh Airport Shetland Islands UK, 2013 (On 23 August 2013, the crew of a Eurocopter AS332 L2 Super Puma helicopter making a non-precision approach to runway 09 at Sumburgh with the AP engaged in 3-axes mode descended below MDA without visual reference and after exposing the helicopter to vortex ring conditions were unable to prevent a sudden onset high rate of descent followed by sea surface impact and rapid inversion of the floating helicopter. Four of the 18 occupants died and three were seriously injured. The Investigation found no evidence of contributory technical failure and attributed the accident to inappropriate flight path control by the crew.)
  • B722, Lagos Nigeria, 2006 (On 7 September 2006, a DHL Boeing 727-200 overran the runway at Lagos by 400 metres after the First Officer was permitted to attempt a landing in challenging weather conditions on a wet runway off an unstable ILS approach. Following a long and fast touchdown at maximum landing weight, a go around was then called after prior selection of thrust reversers but was not actioned and a 400 metre overrun onto soft wet ground followed. The accident was attributed to poor tactical decision making by the aircraft commander.)
  • IL76, St John's Newfoundland Canada, 2012 (On 13 August 2012, an Ilyushin IL76 freighter overran landing runway 11 at St John's at 40 knots. The Investigation established that although a stabilised approach had been flown, the aircraft had been allowed to float in the presence of a significant tail wind component and had not finally touched down until half way along the 2590 metre long runway. It was also found that reverse thrust had then not been fully utilised and that cross connection of the brake lines had meant that the anti skid pressure release system worked in reverse sense, thus reducing braking effectiveness.)
  • RJ85, vicinity Medellín International (Rionegro) Colombia, 2016 (On 29 November 2016, a BAe Avro RJ85 failed to complete its night charter flight to Medellín (Rionegro) when all engines stopped due to fuel exhaustion and it crashed in mountainous terrain 10 nm from its intended destination killing almost all occupants. The Investigation noted the complete disregard by the aircraft commander of procedures essential for safe flight by knowingly departing with significantly less fuel onboard than required for the intended flight and with no apparent intention to refuel en route. It found that this situation arose in a context of a generally unsafe operation subject to inadequate regulatory oversight.)




Further Reading

  • The “Barn Door” Effect by C. West, Ph.D., NOAA - a paper about pilots’ propensity to continue approaches to land when closer to convective weather than they would wish to get while en route.