Human behaviour can generally be broken down into three distinct categories: skill-based, rule-based and knowledge-based behaviour.
These are covered in greater detail in Professor James Reason’s book “Human Error”. Each of these behaviour types has specific errors associated with it.
Examples of skill-based errors are action slips, environmental capture and reversion.
Action slips, as the name implies, are actions not carried out as intended.
In the example given in the figure below, a pilot intending to key the Initial Approach Fix SCBSG into the GPS keys in SCBSD by mistake, having been distracted by a query from his co-pilot.
Image: Example of an action slip.
Environmental capture may occur when a pilot carries out a certain task very frequently in a certain location. Thus, a pilot used to reaching for a certain switch to select function A on an Airbus A320 may inadvertently select the same switch on an Airbus A321 when, in fact, it has a different function.
Reversion can occur once a certain pattern of behaviour has been established, primarily because it can be very difficult to abandon or unlearn it when it is no longer appropriate.
Thus, a pilot may accidentally carry out a procedure that he has used for years, even though it has recently been revised. This is more likely to happen when people are not concentrating or when they are in a stressful situation; reversion to originally learned behaviour is not uncommon under stress.
Rule-based behaviour is generally fairly robust and this is why the use of procedures and rules is emphasised in aircraft maintenance.
However, errors here are related to the use of the wrong rule or procedure. For example, a pilot may misdiagnose a fault and apply the wrong SOP, thereby failing to clear the fault.
Errors here are also sometimes due to faulty recall of procedures. For instance, an engineer may not remember the correct sequence of steps when performing a procedure.
Errors at the knowledge-based performance level are related to incomplete or incorrect knowledge, or to misinterpretation of the situation.
An example of this might be when a pilot makes an incorrect diagnosis of a situation without having a full understanding of how the aircraft systems work. Once he has made such a diagnosis, he may well look for information to confirm his (mis)understanding, while ignoring evidence to the contrary (known as confirmation bias).