AS9100 currently mentions the concept of “human factors” in two places. The first comes from a note included in the original ISO 9001:2015 text for clause 7.1.4 on work environment, saying, “a suitable environment can be a combination of human and physical factors.” In this case, the term is used more generally and is not related to anything specific to aerospace or aviation.

The AS9100D standard, however, adds a slightly more specific reference under clause 10.2, Nonconformity and Corrective Action, requiring the organization to determine “the causes of the nonconformity, including those related to human factors.” The standard offers no note or guidance on this language, but the surrounding text suggests AS9100 is using the term human factors (HF) in the context of human error.

This is problematic, as HF has two distinct yet (at times) overlapping definitions. The next edition of AS9100, to be called “IA9100,” will add more text on HF but will still largely isolate it to something considered during a corrective action’s root cause analysis. Professionals in the aerospace industry, however, need to be aware of both definitions and should apply both to their QMS.

The FAA, meanwhile, tries to use a single definition for both contexts and fails on both fronts. That definition, per FAA Order 9550.8A, reads as follows:

Multidisciplinary effort to generate and compile information about human capabilities and limitations and apply that information to equipment, systems, facilities, procedures, jobs, environments, training, staffing, and personnel management for safe, comfortable, and effective human performance.

It’s bad because it defines HF as an “effort” focused on “compiling information,” making it sound like a purely academic exercise. This definition doesn’t reflect how HF is actually used in aviation and aerospace, so let’s break the concept down for practical QMS purposes.

HF in the Human Error Context

In the context of human error, the FAA says one of the two “important subgoals” of HF is “to maintain, and when possible improve, aviation safety by reducing the impact of human error.” This ties HF directly to root cause analysis and human error, and is what the authors of AS9100 / IA9100 are fixated on.

This, first of all, raises a conflict in the world of quality management. Many quality management professionals wrongly assert that “human error can never be a root cause”; examples of so-called experts making this bogus claim are easy to find. The posture came about because of an encroaching belief within quality and safety management circles that “blaming people” for an error is bad for morale, and that morale is more important than ensuring a safe and high-quality product. It pictures an imaginary world in which people are victims of processes and systems rather than their designers, and in which one cannot identify humans as the source of error while still maintaining a just culture or a systemic approach to corrective action.

That’s false on its face and, quite frankly, deadly. Corrective action can and should be taken when humans make mistakes, without harming morale. Companies that can’t crack that have other problems entirely. Ironically, such a toxic culture is often the “error” of the “human” executives running the company, who thrive on fear and intimidation. I’m looking at you, Elon.

The FAA disagrees with that claim as well, noting that “human error has been identified as a factor in two-thirds to three-fourths of recent aviation accidents and incidents, including several recent high-profile cases.”

FAA human factors personnel seek to understand the many potential contributors to human error, such as inadequate training and procedures, conflicting roles and responsibilities, badly designed equipment, poor communication, fatigue, distraction, and organizational factors.

What the FAA fails to understand, however, is that the “inadequate training and procedures, badly designed equipment, and poor communication” are all products of humans, too.

In a real-world QMS, however, HF must be considered when human error is identified as a root cause. This most certainly applies to AS9100, since it’s explicitly called out in clause 10.2, but should also apply to any non-aerospace QMS, such as one designed under ISO 9001.

How does that work, then? First, the root cause analysis (RCA) must have identified human error as the root cause. This assumes the company does not have a toxic culture and has not bought into the false arguments of the internet consultant crowd.

Next, the contributing factors must be considered. These are also created by humans (thus the name “HUMAN factors”) and would include the various aspects called out by the FAA above. Blaming “procedures” alone ignores the deeper question: why were the humans writing shitty procedures? If you don’t address that, one procedure might get fixed, but a new, shittier procedure could be produced a week later.
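For the programmers in the audience, here is a minimal, purely illustrative sketch (in Python; nothing in AS9100 or IA9100 prescribes this, and all the names are invented for the example) of a hypothetical CAPA record that refuses to let a contributing factor stop at “the procedure”: every factor must eventually trace back to a human decision or action, or it gets flagged for further digging.

# Purely illustrative sketch: a hypothetical corrective-action (CAPA) record that
# flags any contributing factor not yet traced back to a human decision or action.
# Nothing here comes from AS9100/IA9100; the field names are invented for the example.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ContributingFactor:
    description: str                                   # e.g., "work instruction omitted the torque value"
    human_origin: Optional[str] = None                 # the human decision/action that created this condition
    deeper_why: Optional["ContributingFactor"] = None  # the next "why" in the chain, if known

@dataclass
class CorrectiveAction:
    nonconformity: str
    root_cause: str                                    # may legitimately be a human error
    contributing_factors: List[ContributingFactor] = field(default_factory=list)

    def open_questions(self) -> List[str]:
        """List factors whose chain of whys ends at a thing (procedure, tool,
        equipment) without identifying the human origin behind it."""
        questions = []
        for factor in self.contributing_factors:
            node = factor
            while node.deeper_why is not None:          # walk to the end of the chain
                node = node.deeper_why
            if node.human_origin is None:
                questions.append(f"Trace further: {node.description}")
        return questions

# Example: stopping at "the procedure was wrong" is flagged until someone records
# why the humans produced that procedure in the first place.
capa = CorrectiveAction(
    nonconformity="Fastener installed without required torque",
    root_cause="Technician skipped the torque step",
    contributing_factors=[
        ContributingFactor(description="Work instruction omitted the torque value"),
    ],
)
print(capa.open_questions())  # ['Trace further: Work instruction omitted the torque value']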

Expect that IA9100 will increase the requirements for HF in RCA when the updated standard is published.

HF in Product and Method Design

The second definition of HF is related to the design of products and methods. 

Within this context, the FAA describes human factors as “practices and principles integral to the procurement, design, development, and testing of … systems, facilities, and equipment.” To this end, the FAA has published the (excellent) standard HF-STD-001, “Human Factors Design Standard.”

First, design engineers must consider HF when designing products, remembering that humans must be able to use, install, and maintain the products they create. This means designing a product’s form to allow, say, a tool to reach an area where a screw or bolt will be inserted, or ensuring clearance for a wrench or a hoist lift point. Failure to consider HF during design can result in a product that is manufactured per print, yes, but is then impossible to pick up or physically install. If a product must be disassembled during maintenance, designers must consider how humans will perform that disassembly and make it easier to accomplish.

Let’s say a part will need to be maintained by occasionally adding oil. Designing the part with the oil port sealed behind a welded panel would make it impossible for a human to add the oil. Instead, the port should be covered by a removable door or hatch, sealed perhaps with bolts or a clamp. Or perhaps an enclosed area will need to be inspected via a borescope later; in that case, a port for inserting the borescope might need to be designed into the product.

For both aircraft and spacecraft, the need to maintain parts must also be considered. This can lead to physical dimensions and characteristics that must be designed up front.

Next, design engineers must consider HF in the context of methods engineering, remembering that it will be humans working on these parts. When designing a maintenance or manufacturing operation, human factors such as accessibility, noise, lighting, and other environmental factors must be considered.

Let’s say a part is intended to be installed on an aircraft parked in a hangar; is a ladder or scaffolding system available to allow the installers to access the aircraft?

For plating shops, where parts are dipped into tanks for a set dwell time, HF considerations would include the timers used on the tanks. If the shop is noisy, a visual (strobe light) timer might be better than an audible one, since no one will hear the alarm.

The FAA standard HF-STD-001 is fantastic in that it lists a fairly comprehensive set of controls for these real-world cases. It discusses computer-human interfaces, workstation design, ventilation, illumination, biomechanics, user documentation, and more. The standard is from 2016, though, so a few things would need to be updated to reflect a shop in 2026.

However, AS9100 is silent on this topic, and IA9100 does not seem to be adding any language on it. Nevertheless, if you work in aviation or aerospace, HF in product and method design is absolutely crucial.

Conclusion

In summary, aerospace and aviation shops must consider human factors in two settings: in root cause analysis (when human error is determined to be the cause) and in the design of products and methods. The two will sometimes overlap: occasionally a human-error cause will point to an underlying root cause of poor product or method design, and likewise, an error in a product design might trace back to a simple human error, like someone typing the wrong number into SolidWorks.

But we need to abandon the idea that any mention of “human error” or “operator error” is automatically taboo. That cuts off a dispassionate, objective analysis of a problem and, in an attempt to avoid a culture of blame, creates a culture that erases accountability. Instead, an understanding of human factors allows us to talk about human error like adults, without infantilizing our workforce, and to take corrective action without harming morale.
