Software-intensive systems are generally quite flexible, able to evolve and adapt to changing product and market requirements. However, that very flexibility makes it difficult to adequately anticipate and test all the interactions between the various components of the system. Even if every component is highly reliable, problems can still occur if a rare set of interactions arises that compromises the overall behavior and safety of the system.
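A rough back-of-envelope calculation suggests why component reliability alone is not enough. This is purely an illustrative sketch, not a model from the paper: the 99.9% figure and the assumption that each component and each pairwise interaction behaves correctly independently are both invented for the example.

```python
# Toy illustration (assumed numbers): each component, and each pairwise
# interaction between components, behaves correctly with probability
# 0.999, independently. The chance that the whole system behaves
# correctly then falls off sharply as the number of components, and
# hence the number of pairwise interactions, grows.
reliability = 0.999  # assumed per-component and per-interaction reliability

for n in (10, 100, 1000):
    interactions = n * (n - 1) // 2  # number of distinct component pairs
    p_system_ok = reliability ** (n + interactions)
    print(f"{n:5d} components, {interactions:7d} interactions "
          f"-> P(system ok) = {p_system_ok:.4f}")
```

With 10 components the system is still fine well over 90% of the time, but by 100 components the roughly 5,000 pairwise interactions drive the overall probability of correct behavior below 1%. Real interactions are neither uniform nor independent, so the numbers themselves mean little; the point is only that interaction counts grow quadratically while component counts grow linearly, which is why testing every interaction quickly becomes infeasible.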
Things are even tougher with people-centric sociotechnical systems, which must deal not only with potential software problems but also with the even more complex issues involved in human behaviors and interactions. And at the bleeding edge of complexity are data-driven AI systems, which are generally based on statistical analysis and machine learning methods.
What is it that makes these systems so intrinsically complex? Why aren't they simpler? What purpose does this complexity serve? Several years ago, I came across a reasonable answer to these seemingly Socratic questions in a very interesting paper - Complexity and Robustness - by professors Jean Carlson and John Doyle of UC Santa Barbara and Caltech, respectively. Their paper draws on evolutionary biology to help explain the complexity of man-made systems.
According to Carlson and Doyle, you can find very simple biological organisms in nature, and you can design very simple objects. What such simplicity gives up is not basic functionality but robustness - that is, the ability to survive, for biological organisms, or to perform well, for engineered objects, under a wide variety of conditions, including the failure of individual components or new, unanticipated events. Robustness implies the ability to adapt and keep going in spite of a rapidly changing environment.
There is a continuing struggle between complexity and robustness in both evolution and human design. A kind of survival imperative, whether in biology or engineering, requires that simple, fragile systems become more robust. But the mechanisms that increase robustness will in turn make the system considerably more complex. Furthermore, that additional complexity brings with it its own unanticipated failure modes, which are corrected over time with additional robustness mechanisms, which then further add to the complexity of the system, and on and on. This balancing act between complexity and robustness is never done.