Human Factors Experts, Article 3; Buffalo Law Journal, July 31, 2003
Robert C. Sugarman, PhD, PE; RCS Performance Systems, Inc., Buffalo NY
Forensic Human Factors specialists help lawyers analyze the root cause of an accident by determining who erred and why.
Human factors applies research from a number of fields to design and evaluate things that people use in work and everyday activities. The goal of human factors, now also known as ergonomics, is to provide efficient, safe, and comfortable equipment and work environments. Human factors is a specialty that began to mature during World War II when the need became urgent to make military equipment safer for our personnel to use. Military psychologists determined the capabilities and limitations of operators and maintainers so that equipment and jobs could be better designed and people could be selected and trained properly for their jobs.
When people become involved in an accident, observers often apply the phrase “human error” (or the related “pilot error” when aircraft go down) and seek out an obvious individual to blame. A human factors analysis that takes the entire situation into account might find that “an accident waiting to happen” is a more applicable description.
To understand an accident, human factors looks at what caused the chain of unfortunate events and differentiates between errors made by the people directly involved and errors made by the “designers” of the accident environment, while also considering the role of chance (“acts of God”).
Part of the accident environment is made up of the pre-existing natural environment and ongoing natural processes such as the weather on the highway and aging processes in the body. The rest of the environment includes the features designed by people and their corresponding policies, procedures, regulations and laws.
The people who are the designers may make errors and those errors may be caused by knowledge limitations or by the designer’s intentions.
Knowledge limitations include unforeseen or unknown phenomena that cause accidents. For example, before it was recognized and studied, wind shear caused aircraft to crash during landing. It was not predictable when airplanes were first invented. Because of its importance, engineers and scientists developed instruments and procedures that have taken wind shear out of the realm of acts of God and greatly reduced the incidence of such crashes.
But when designers make no attempt to understand the implications of their designs for the human users, or ignore the relevant science, the result can be design-induced errors. The infamous Three Mile Island nuclear power plant accident is a classic example of the “accident waiting to happen.” Displays were located far from the controls to which they were related. Displays showed inappropriate information, and some important information was left out. Information overload as the accident progressed made matters worse. These errors would have been foreseeable had the designers only looked beyond their own specialties.
And then there were Florida’s butterfly ballots!
All too often, engineers and designers do not seek out the wealth of knowledge we have about human abilities and limitations, or the decades-old scientific methods we have for incorporating human characteristics into the design of their systems.
This happens when designers would rather count on the adaptability and trainability of people to overcome any leftover or future problems.
The remaining design errors are caused when knowledge is available but is incorrect or misapplied. When laws regarding safety are influenced by politics or economics, designers may follow the regulations and still end up with, for example, no sprinklers in a nightclub where knowledge of human behavior would dictate otherwise. Then, of course, there are designers and inspectors who know the rules and regulations and choose to ignore them. When an airline installed a cabinet in the bulkhead of an aircraft’s galley, the FAA inspection ignored the agency’s own prohibition against objects projecting into workspaces. Sure enough, a flight attendant was injured by the cabinet during turbulence.
Design errors in the medical field are finally receiving the attention they demand. These include instructions that are not standardized, names of medications that are easily confused, equipment with inadequate instructions or operation that is contrary to other equipment, interchangeable connections for different gases so that nitrogen can be connected where oxygen is needed, inadequate instructions for patients, and on and on. We have long known how to identify and eliminate these design problems.
Design error leads to accidents when designers do not take into account people’s abilities, characteristics, experience and the task’s complexity.
Finally, we come to the errors made by people directly involved in the accident. When people make errors because of inherent human limitations, the negative connotation attached to their behavior is a bad rap.
People are limited in their information gathering and processing capabilities, and those capabilities differ from one person to the next. When a situation places sudden or unusually high demands on a person, he or she may not be able to detect the cues that would otherwise warn of the need to alter course. Any event that causes a person to narrow the focus of attention may result in other events going unperceived. It is understandable that a driver might stop looking at the road when suddenly coming upon a mass of warning, directional, and advertising signs. Inadequate training or defective equipment may also cause or allow a person to take a dangerous action. These errors fall into the category of design-induced errors.
That leaves the final category of error, the one to which the negative connotations of “human error” correctly apply. This is when a person willfully disregards procedures, rules, and laws, or commits inappropriate acts that put other people in danger. It includes actions like removing a guard from a machine, driving too fast for conditions, showing up for work drunk, not wearing a hard hat in a construction area, and not attending to one’s responsibilities. But even then, the person must have been given sufficient training, warnings, instructions, and equipment to be expected to perform adequately or safely.
When the same person is both the designer and the accident initiator, analysis becomes more complex. Consider a surgeon who operates on the wrong knee. This would be “human error” because it is likely that an established procedure was not followed. But what if in the middle of a delicate operation, the surgeon must make a choice between two procedures that seem equally likely to be acceptable, and the choice made ultimately leads to disaster? Was a “human error” committed? This depends on many factors, both natural and fabricated, including the knowledge available to the surgeon before and during the operation, but also on a multitude of “behavioral” factors that could affect perception and bias judgment.
Errors made by humans are rarely simple. In any accident, many errors may have been made by many people.