I really didn’t want to comment on any of the recent discussions about software on planes; nevertheless, I have been reading a lot about them lately.
In my opinion we should not blame a single company for what happened, but should instead try to learn from it for every future engineering project.

Today I want to share with you one of the articles I enjoyed reading, by Gregory Travis.
Travis explains what he thinks happened and compares it to the recently upgraded autopilot in his Cessna 172.

https://spectrum.ieee.org/aerospace/aviation/how-the-boeing-737-max-disaster-looks-to-a-software-developer

The emphasis on simplicity comes from the work of Charles Perrow, a sociologist at Yale University whose 1984 book, Normal Accidents: Living With High-Risk Technologies, tells it all in the very title. Perrow argues that system failure is a normal outcome in any system that is very complex and whose components are “tightly bound”—meaning that the behavior of one component immediately controls the behavior of another. Though such failures may seem to stem from one or another faulty part or practice, they must be seen as inherent in the system itself. They are “normal” failures.

Nowhere is this problem more acutely felt than in systems designed to augment or improve safety. Every increment, every increase in complexity, ultimately leads to decreasing rates of return and, finally, to negative returns. Trying to patch and then repatch such a system in an attempt to make it safer can end up making it less safe.

This is the root of the old engineering axiom “Keep it simple, stupid” (KISS) and its aviation-specific counterpart: “Simplify, then add lightness.”