Human error
Latest revision as of 13:02, 26 November 2019
A human error (hereinafter, the Error) is any action or inaction of a person working on a system that can potentially and unintentionally degrade that system. In other words, the Error can be defined as an individual's deviation from acceptable or desirable practice that culminates in undesirable or unexpected results.
Actions or inactions that can degrade a system and are made intentionally are called violations. Residual Errors, together with those violations that were not made with the purpose of degrading the system, are collectively known as unsafe behaviors.
Classification
Dr. Jens Rasmussen developed a scientific classification of the Errors. Under this classification, a person commits an Error while performing either:
- An incorrect task. This type of Error is categorized as a mistake, which is further classified as either rule-based or knowledge-based; OR
- A correct task performed incorrectly. This type of Error is commonly categorized as a slip or lapse.
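Rasmussen's scheme above can be sketched as a simple decision rule. This is only an illustrative simplification: the parameter names (`correct_task`, `executed_correctly`, `memory_failure`) are hypothetical and not terms from the source, and real classification requires judgment about intent, planning, and execution.

```python
def classify_error(correct_task: bool, executed_correctly: bool,
                   memory_failure: bool = False) -> str:
    # Illustrative decision rule for the mistake/slip/lapse taxonomy above.
    if not correct_task:
        return "mistake"   # wrong plan chosen: rule-based or knowledge-based
    if memory_failure:
        return "lapse"     # right plan, but a step was forgotten
    if not executed_correctly:
        return "slip"      # right plan, but executed incorrectly
    return "no error"

print(classify_error(correct_task=False, executed_correctly=True))  # mistake
print(classify_error(correct_task=True, executed_correctly=False))  # slip
```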
Mistakes
- Main wikipage: Plan mistake
Slips
- Main wikipage: Slip error
Lapses
- Main wikipage: Memory lapse
Other groupings
Like human performance, Errors can be grouped in many ways.
Well-adjusted vs residual
- Errors tend to be inevitable; the Roman philosopher Cicero stated: “It is the nature of man to err”.
- Most Errors can be corrected. A well-adjusted Error is one that has been corrected in time and therefore cannot harm the system. By contrast, an Error that has not been corrected in time within the same set of enterprise efforts is called residual and should be considered an unsafe behavior.
Dirty Dozen
- Main wikipage: Dirty Dozen of Human Factors
- Originally developed by Transport Canada, the Dirty Dozen of Human Factors is a popular grouping of the Errors that is used in aviation.
Unsafe behaviors
- Main wikipage: Unsafe behavior
- Residual Errors and those violations that were not made with the purpose of degrading the system are collectively known as unsafe behaviors or unsafe acts. J.T. Reason, as presented in the CAA Flight-crew human factors handbook CAP737, developed a classification of unsafe acts that distinguishes between two types of Errors. They cause either:
- Active failures, whose effects are felt immediately in a system. Active failures are usually the result of actions taken (or not taken) by front-line operators such as pilots, air traffic controllers, or anyone else with direct access to the dynamics of a system.
- Latent failures, whose effects may lie dormant until triggered later, usually by other mitigating factors. Latent failures, on the other hand, are caused by those separated by time and space from the consequences of their actions in the dynamics of the system. Personnel working in vocations such as architectural design, hardware design and equipment maintenance are more prone to cause latent failures than active failures. For example, consider the case of a mechanic who assembled a component incorrectly, which eventually led to a plane crash days or even weeks later. The defenses that should normally have caught this mistake were not in place. These defenses include proper training (the mechanic was taught to fix this particular component very informally and on-the-job), good situational awareness (the mechanic was tired from a double shift the night before), and independent inspection (the job was "pencil-whipped" to save time).
- Major failures related to aircraft maintenance are listed on the list of maintenance-related failures wikipage.
Prevention
The Error is one of many contributing causes of risk events and a significant cause of disasters and accidents in industries such as aviation, nuclear power, space exploration, and medicine. Prevention of Errors and/or their impact is a major contributor to the reliability and safety of complex systems.
Pyramids
- Main wikipage: Accident pyramid
- Unsafe behaviors may lead to incidents, and incidents may cause accidents, including fatal ones. The proportions of unsafe acts that escalate into catastrophes are known as accident pyramids, such as:
  Type         Description                                             Number
  Unsafe act   Errors that have not been corrected properly            300
  Incidents    Unsafe events that lead to minor failures               29
  Accidents    Incidents that lead to fatal accidents or catastrophes  1
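The 300 : 29 : 1 ratio in the pyramid above can be applied as a simple scaling rule. The function name and the scaling approach here are illustrative assumptions, not part of the source; only the ratios come from the table.

```python
# Accident pyramid ratios from the table above: 300 unsafe acts to
# 29 incidents to 1 accident.
PYRAMID = {"unsafe_acts": 300, "incidents": 29, "accidents": 1}

def expected_counts(observed_unsafe_acts: float) -> dict:
    """Scale the pyramid ratios to an observed number of unsafe acts."""
    scale = observed_unsafe_acts / PYRAMID["unsafe_acts"]
    return {level: count * scale for level, count in PYRAMID.items()}

print(expected_counts(900))
# {'unsafe_acts': 900.0, 'incidents': 87.0, 'accidents': 3.0}
```

Under this reading of the pyramid, tripling the number of uncorrected unsafe acts triples the expected incidents and accidents.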
Employee resource management
- Main wikipage: Employee resource management
- Studies of human factors and ergonomics aimed at reducing Errors are the focus of several disciplines, such as crew resource management (CRM) and maintenance resource management (MRM).
Safety culture
- Main wikipage: Safety culture
Applications
In maintenance training
- According to the FAA AC 120-72,
- The way to understand MRM is to explore the nature of errors in maintenance operations. A widely accepted model of human error is the classification of unsafe acts developed by J.T. Reason. This classification distinguishes between two types of errors:
- Active failures, whose effects are felt immediately in a system, and
- Latent failures, whose effects may lie dormant until triggered later, usually by other mitigating factors.
- The presence of defenses or safeguards in a system can usually prevent the effects of latent failures from being felt by closing the window of opportunity during which an active failure may be committed. For example, consider the case of a mechanic who assembled a component incorrectly, which eventually led to a plane crash days or even weeks later. The defenses that should normally have caught this mistake were not in place. These defenses include proper training (the mechanic was taught to fix this particular component very informally and on-the-job), good situational awareness (the mechanic was tired from a double shift the night before), and independent inspection (the job was "pencil-whipped" to save time).
- Active failures are usually the result of actions taken (or not taken) by frontline operators such as pilots, air traffic controllers, or anyone else with direct access to the dynamics of a system. Latent failures, on the other hand, are caused by those separated by time and space from the consequences of their actions in the dynamics of the system. Personnel working in vocations such as architectural design, hardware design and equipment maintenance are more prone to cause latent failures than active failures.
- Both active and latent failures may interact to create a window for accidents to occur. Latent failures set the stage for the accident while active failures tend to be the catalyst for the accident to finally occur. A good way to think of this model of accident creation is as slices of Swiss cheese. Each slice can be thought of as a defense to an accident (training, good management, teamwork, etc.) and each hole is a failure in that defense. The last slice is the final action which could serve as a defense before the accident event. The failure in that defense would constitute the active failure precipitating the accident. If the defenses to a situation contain a sufficient number of failures, which allow the holes to "line up," an accident will occur.
- Differences between active and latent failures cannot be overemphasized; each type of error helps to shape the type of training required to correct it. For example, because of the immediate demands and consequences of their actions, flight personnel require training that includes the psychomotor aspects of physical skills, such as improving reaction time in emergency training. The strict physical requirements for employment as a flight officer demonstrate this emphasis clearly. On the other hand, maintenance personnel may require human factors and operations training to account for their susceptibility to latent failures. In addition, the range of physical activities of maintenance personnel on the job also requires emphasis on workplace ergonomics. For example, maintenance personnel may be asked to lift heavy objects, work in awkward positions, or perform tasks in extreme weather conditions. These difficult work conditions all require knowledge of ergonomics to ensure safe, error-free performance.
- Though CRM and MRM share the basic concepts of error prevention, the content of what is taught is specific to what is actually performed on the job.
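The Swiss cheese model quoted above can be sketched as a small Monte Carlo simulation: each defense is a slice whose "hole" is a failure probability, and an accident occurs only when every slice fails at once, i.e. the holes line up. The per-slice probabilities below are invented purely for illustration and do not come from the source.

```python
import random

# Hypothetical, illustrative failure probabilities for three defenses
# (the "holes" in each cheese slice); not values from FAA AC 120-72.
DEFENSES = {"training": 0.05, "situational_awareness": 0.10, "inspection": 0.02}

def accident_occurs(rng: random.Random) -> bool:
    # An accident happens only when every defense fails simultaneously,
    # i.e. the holes in all the slices line up.
    return all(rng.random() < p for p in DEFENSES.values())

rng = random.Random(42)
trials = 100_000
accidents = sum(accident_occurs(rng) for _ in range(trials))
print(accidents / trials)  # roughly 0.05 * 0.10 * 0.02 = 1e-4
```

The sketch makes the model's main point concrete: adding an independent defense, or shrinking any one hole, multiplicatively reduces the chance that an active failure finds every latent gap lined up.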
See also
- https://www.iwolm.com/en/do-you-know-the-3-types-of-human-errors-learn-from-them/
- https://www.nopsema.gov.au/resources/human-factors/human-error/
- https://www.safetyandhealthmagazine.com/articles/print/6368-
- https://www.isnetworld.com/events/ugm/osha2016/Sessions/Micah%20Backlund%20-%20Comparing%20the%20Heinrich%20Triangle%20Theory.pdf
- https://risk-engineering.org/concept/Heinrich-Bird-accident-pyramid
- https://www.lifetime-reliability.com/cms/free-articles/team-building/human-factors-human-error-mistake-proof/
- https://www.faa.gov/about/initiatives/maintenance_hf/library/documents/media/hfacs/5_intervention.pdf
- https://www.nsf.org/newsroom_pdf/pb_human_error_prevention_solutions_and_answers.pdf