unsubject
Simon's Unsubject Podcast
Boeing's Downfall

Boeing's Downfall

When engineering processes give way to bureaucratic performance metrics, and close relationships with regulators result in oversight failures

Boeing was once the gold standard in aviation, synonymous with safety, reliability, and engineering brilliance. For decades, its aircraft not only filled the skies but did so with unmatched precision and excellence. However, the Boeing of today is a far cry from the Boeing of yesterday. In the last few decades, the company has been embroiled in scandal, tragedy, and strategic miscalculations that seem almost unthinkable for an entity of its reputation.

In many ways, Boeing is a case study of what happens when engineering processes give way to bureaucratic performance metrics, and when close relationships with regulators result in oversight failures. It also shows the inevitable problems that arise when engineering culture—built on responding to feedback, improving incrementally, and adhering to evidence-based practices—is replaced by the desire to accommodate the wishes of higher authority.

The 737 MAX: A Tragic Consequence of Shortsighted Decisions

The most glaring example of Boeing’s downfall came with the 737 MAX. In 2018 and 2019, two of these jets crashed—Lion Air Flight 610 and Ethiopian Airlines Flight 302—taking the lives of 346 people. The tragedy sent shockwaves through the aviation industry and left the world questioning how Boeing, once the industry leader in safety, could have allowed such catastrophic failures.

The root cause of these crashes was the Maneuvering Characteristics Augmentation System, also known as MCAS. MCAS was supposed to help pilots handle the 737 MAX's altered flight dynamics due to the addition of larger, more fuel-efficient engines. Rather than redesign the entire airframe—a costly endeavor—Boeing opted for a software patch that effectively disguised the aerodynamic challenges presented by the new engines. It was an engineering shortcut made under immense pressure to bring the aircraft to market quickly and keep pace with Airbus’s A320 Neo.

The introduction of MCAS without sufficient pilot training or transparency was a denial of the evidence-based, safety-first philosophy that Boeing was once known for. When you ignore critical feedback—in this case, concerns from engineers about the rushed pace of the project—and fail to adapt your designs accordingly, the outcome is rarely a positive one. Boeing’s reluctance to acknowledge MCAS as a significant system worthy of attention and training ran directly counter to the fundamentals of sound engineering.

Beginning of the End: The 1997 Merger with McDonnell Douglas

To understand Boeing's broader missteps, we must revisit the 1997 merger with McDonnell Douglas, a pivotal moment that shaped the company’s subsequent trajectory. This merger was heralded as a bold move to create a dominant force in the aerospace industry. However, instead of being a union of equals, it represented a cultural collision that transformed Boeing from an engineering powerhouse into a bureaucratic government contractor.

Before the merger, Boeing’s culture was rooted in engineering excellence, where decisions were made methodically, and always with an eye on the safety of its aircraft. McDonnell Douglas, however, came with a different ethos—one that emphasized the metrics for winning government contracts. Post-merger, it was this latter culture that won out.

"The bean counters are now in the cockpit," lamented one former Boeing engineer. This encapsulated the reality that the decisions steering the company were no longer driven by engineering but by key performance indicators. As Boeing moved away from its engineering roots, it also moved away from its capacity to identify problems early and make the necessary improvements—a vital aspect of maintaining safety and quality.

The problems Boeing faced were not limited to the 737 MAX. The KC-46 Pegasus tanker, developed for the U.S. Air Force, was plagued by delays, cost overruns, and quality control problems. Boeing underbid to secure the contract, hoping that subsequent defense spending would offset the initial losses. Instead, the program faced repeated setbacks, including issues with debris left in the aircraft and malfunctioning systems—issues that were emblematic of a company increasingly willing to cut corners. This wasn’t just about technical errors; it reflected a broader culture that no longer prized engineering discipline and rigor but prioritized speed and cost-saving measures instead. Without the iterative improvement and evidence-based engineering that had once defined Boeing, quality suffered.

The CST-100 Starliner project, part of NASA’s Commercial Crew Program, was another high-profile misfire. The 2019 test flight failed due to a software error that caused the spacecraft to miss its intended orbit, ultimately uncovering more than 80 software issues. The failure was shocking given Boeing's storied aerospace expertise but not surprising considering the systemic issues that had come to plague the company. The Starliner underscored Boeing’s broader difficulty in transitioning from traditional aerospace engineering to the software-intensive demands of modern spaceflight. Where other companies like SpaceX have succeeded through deep integration between hardware and software development—leveraging iterative testing and rapid feedback—Boeing seemed ill-equipped to manage this new frontier.

One of Boeing's major challenges was adapting to the increasingly software-driven nature of modern aerospace engineering. Boeing's engineering culture had been built on mechanical precision, aerodynamics, and structural design. But modern aircraft are no longer just flying machines—they are highly sophisticated networks of sensors, computers, and millions of lines of code. Boeing treated software development as an afterthought, something to be handled separately, rather than a core part of aircraft design.

The essence of successful engineering is its reliance on iterative processes—constant testing, feedback, and adaptation. Software, even more so than mechanical systems, demands a fluid and responsive approach to development. The contrast with companies like Apple or Tesla, which pioneered concurrent hardware-software integration, is stark. In those companies, hardware and software teams work side-by-side from the very beginning of a project, ensuring seamless integration and continuous adaptation based on real-time feedback. Boeing, however, kept these teams siloed, treating software as an add-on rather than an integral component. This was a departure from the iterative, integrated approach that had previously defined Boeing’s success and contributed to the poor integration between MCAS and the aircraft’s other systems.

Regulatory Capture and Too-Big-to-Fail Mentality

The 737 MAX debacle was not just an internal Boeing failure. It was a failure of the regulatory system as well—a glaring example of regulatory capture, where the Federal Aviation Administration (FAA) became too closely aligned with the industry it was supposed to regulate. The FAA, due to both capacity issues and an overreliance on industry expertise, essentially delegated key parts of the certification process to Boeing itself.

The idea of self-certification, which might sound reasonable when dealing with an experienced, trustworthy company, proved to be a fundamental misstep. Boeing was allowed to conduct its own safety checks, and unsurprisingly, it chose to minimize the MCAS system's significance to avoid delays and extra costs. This arrangement allowed Boeing’s flawed designs to go unchallenged, creating a systemic vulnerability that only became apparent after lives were lost. In an environment that no longer adhered to rigorous, evidence-based review and improvement, this was the tragic but inevitable outcome. It was not just a failure of engineering; it was a failure of oversight, and a striking reminder of what happens when the boundaries between regulator and industry become too blurred.

Boeing's relationship with regulators also shielded it from facing the full consequences of its actions. The FAA’s cozy relationship with Boeing, born out of both necessity and complacency, allowed the company to operate with far too much freedom and far too little scrutiny. This wasn’t just a bureaucratic failure; it was a structural issue with deep implications for how aviation safety is ensured.

The notion of being "too big to fail" wasn’t just implicit in Boeing’s relationship with the FAA; it was evident in how the company operated internally. Boeing’s sheer size and market dominance created an environment where executives seemed to believe they were insulated from real risks. They assumed that no matter what, Boeing would weather the storm—after all, it had the government as a client, a regulator in its corner, and a reputation that seemed unassailable. But as the events with the 737 MAX showed, complacency is dangerous, and unchecked power even more so.

Boeing’s downfall is not just a lesson for aviation; it is a cautionary tale for any large entity that loses touch with its core values. When a company grows too big to listen, too bureaucratic to innovate, and too comfortable to adapt, it sets itself up for failure. Boeing’s history of excellence was built on engineering integrity, a commitment to quality, and a culture that empowered those who built its planes to speak up and lead. When that culture shifted—when cost-cutting, shareholder appeasement, and regulatory coziness took precedence—Boeing lost its way.

The tragedy of the 737 MAX, the setbacks of the KC-46, and the failures of the Starliner were not isolated events. They were symptoms of a deeper rot within Boeing—a culture that placed financial engineering over aerospace engineering, and corporate metrics over safety. To rebuild itself, Boeing must fundamentally change its culture, its incentives, and its relationship with regulators. It needs to return to the basics of engineering excellence, to re-emphasize the importance of safety, and to learn from its own history before it's too late.

Engineering development is about responding to feedback, continual improvement, and adhering to evidence-based practices. It requires a deep commitment to adaptability and an openness to challenge assumptions. Boeing’s journey back to greatness will require all of these traits. It requires a cultural reckoning—a return to a time when safety was the foundation of everything it did, and when engineers were empowered to challenge decisions that compromised quality. If Boeing cannot rediscover that ethos, it will become a relic of the past, a sobering example of what happens when an entity becomes too big and too out of touch with reality and the very values that once made it great.
