Do You Know Enough About the Airplane You Fly?

It’s never been an interest of ours to “pile on” during the days, weeks, and months following an aviation accident. Our approach is to evaluate the facts and findings, and then view them through the lens of contemporary airmanship. In fact, our book (Automation Airmanship, McGraw-Hill Education, 2012) was written in the aftermath of the Air France 447 accident of June 2009, and we took particular care during our writing to avoid speculation while investigators carefully evaluated every available lead during the years that followed that accident.

We have patiently stood by since October 2018, watching the facts and findings build following two recent and now notorious accidents involving the same aircraft model (the Boeing 737 MAX) flown by operators on opposite sides of the globe. October 2018’s Lion Air Flight 610 (JT610) and March 2019’s Ethiopian Airlines Flight 302 (ET302) have generated renewed interest in the challenge of designing and operating advanced aircraft, interest which seems only to build after a very public tragedy. Before this interest recedes (as it most certainly will), there is much to be learned and discussed. We’ll stick to what we know best: that which applies directly to all flight crews of all aircraft, all the time, not just to the potential shortcomings of one aircraft’s design featured in the daily headlines.

One of the most concerning aspects of both of these accidents—as it relates to all cockpit crewmembers—is the evidence of a trend across our industry over the past two decades when fielding new and complex aircraft:

Manufacturers limit what they provide in the way of system operation and logic knowledge, and operators then use these limited resources as the basis for training crews to fly the new aircraft.

Ultimately, cockpit crews must know, and be able to put into immediate practice, the crucial steps to restore a safe flight path quickly, no matter the cause behind the undesired flight path. A key component of this critical responsibility is expert-level knowledge of the aircraft’s systems. Pilots must have this knowledge at their fingertips, both during initial and recurrent training and for their own self-study.

The “logic” behind the 737 MAX Maneuvering Characteristics Augmentation System (MCAS) flight control law has been a central target of both accident investigations—exactly the kind of “logic knowledge” we identified over a decade ago as one of the foundational principles of automation airmanship that falls into “that special knowledge reserve of the best glass cockpit pilots” that all cockpit crewmembers should be expected to master.

Regarding the grounding of the 737 MAX fleet, investigative journalists writing in the March 25–April 7 edition of Aviation Week & Space Technology observed that, “The [737 MAX] MCAS’ risks were not well-enough understood, by pilots or regulators, to allow the MAX to keep flying.”1 How this lack of knowledge and understanding manifested itself prior to the global grounding of the aircraft is summarized best in a letter to the editor from a 737 MAX Captain, excerpted from the same edition:

“This system is still not referenced by name nor is detailed systems information provided in the FAA-approved flight manual supplied by my company…”

This Captain goes on to say that his crew briefing (presumably delivered prior to the global grounding of the MAX 8) now includes the following:

“We are flying an aircraft which, if not properly and closely monitored, may attempt to kill us. If, at any time in flight, we experience continuous un-commanded stabilizer trim movement we will execute the Runaway Stabilizer procedure and, if necessary, disconnect the trim cutout switches below the throttles. We will then use manual trim to return the aircraft to a trimmed condition and continue to use manual trim for the duration of the flight and land as soon as conditions permit.”2

This kind of knowledge has been called many things by manufacturers, training providers, management, and even flight crews: “nice to know, not need to know,” “geek-speak,” “techno-babble,” and “technical mumbo jumbo,” to name a few. No matter the outcome of these investigations, and whether decision makers choose to add this key knowledge back into operating publications and training or keep it out for reasons of cost or efficiency, pilots will find a way to remain responsible for the safe operation of their aircraft, just as this anonymous Captain has done.

In our own 2012 book, we introduced the 9 Principles for Operating Glass Cockpit Aircraft. The list can be viewed as a build-up from basic principles to the final capstone principle that we titled “Logic Knowledge.” We organized our thesis around these 9 Principles because of their durability: they matter to every pilot, in any kind of aircraft, even as the technology evolves. We now see, over a decade after the first mention of the concept of Automation Airmanship in 2008, that much of what our field work produced has remained, if not timeless, certainly durable, and over time increasingly valuable as a factor in safe outcomes.

It seems obvious to us that every stakeholder in the industry, from the CEO to the infrequent flyer, would be interested in providing flight crews with knowledge of, and training in, the most technical aspects of how the aircraft’s flight path is controlled by the automation.

Whenever I am compelled to write on the subject of failures, accidents, and how they relate to contemporary airmanship, I reach for the wisdom of Professor Henry Petroski, who in 2012 wrote these words about accidents that have a technology component:

“In all cases of surprise or failure, the greater technological tragedy is not having failures but not learning the correct lessons from them. Every failure is a revelation of ignorance, an accidental experiment, a found set of data that contains clues that point back to causes and further back to mistakes that might have been made in design, manufacture and use. Not to follow the trail to its source is to abandon an opportunity to understand better the nature of the technology and our interaction with it…every new failure—no matter how seemingly benign—presents a further means toward a fuller understanding of how to achieve a fuller success.”3

Whether or not key information about how any aircraft’s flight control, flight guidance, autoflight, and flight path monitoring systems work is part of the provided guidance and training, every pilot must individually seek an understanding of these systems. This understanding must go beyond mere familiarity and ultimately lead to proficiency and mastery on the flight deck, across all situations and during all contingencies. Achieving “a fuller success” depends on it.

Think about it.

Until our next post, fly safe, and always, fly first.

References:

1 Sean Broderick and Thierry Dubois, “MAX Chaos,” Aviation Week & Space Technology, March 25–April 7, 2019, p. 14.

2 From “Feedback,” Aviation Week & Space Technology, March 25–April 7, 2019, p. 5.

3 Henry Petroski, To Forgive Design: Understanding Failure (Cambridge, MA: Harvard University Press, 2012), p. 45.

2 thoughts on “Do You Know Enough About the Airplane You Fly?”

  1. “Fly first” – a simple yet clear expression of the responsibilities of a pilot. How many amongst us need this reminder that it all boils down to flight path control, and the simple basics of pitch, thrust, trim?
    By all means, use the wonderful technologies that we have today – autothrust, flight asymmetry computers, auto trim, flight envelope protection, etc. – but if she is going to do something unsafe, you need a sharp enough pilot at the controls to save the day. Worryingly, as pilots across the industry rely more and more on this wonderful tech, what happens when the tech fails or goes awry?

  2. It’s surprising to me the number of times I’ve asked a manufacturer a question about a particular system or procedure only to receive a response akin to “I don’t know.” Proprietary systems with guarded mechanics and logic hidden inside a “magic box” seem to be more and more prevalent in modern aircraft and, unfortunately, they seem to be a more and more frequent cause of mishaps. It’s great when systems are working the way the pilot intends, but knowing what to do when a system stops providing the expected output is a necessary part of airmanship. Without knowing how that system operates, it is difficult to establish procedures for when that system malfunctions (or does exactly what the pilot told it to do but not necessarily what the pilot intended it to do).

