The Greek philosopher Heraclitus once said, “Change is the only constant in life.” Yet, all too often, it is change that we struggle with the most. In business and technology, as in life, there is comfort in doing things the way we have always done them. We hold onto strategies, processes, and approaches that are familiar, “tried and true.”
But as the world itself changes, we have to change with it or suffer the consequences. We have to make sure that our comfortable, familiar paradigms don’t become inhibitors of our ability to be successful in the future.
Holding on too long to past successes and resisting change drove some of the world’s most successful companies into irrelevance: Kodak, Woolworths, Toys R Us, Blockbuster, and more.
To maintain a competitive edge, individuals, companies, and even industries must continually seek new ways to improve, to adapt, and to grow. To stand still is to perish.
These sorts of insights may seem obvious, particularly in the world of applied science and technology. Yet this instinctive, all-too-human resistance to change sets the stage for what I would consider the greatest challenge of modern software and systems engineering today—a battle between a familiar, comfortable, artisan-based methodology that has worked for decades and a component-oriented approach that has the very real potential to transform the future of complex cyber-physical systems development.
Lessons from the Past
Before automotive pioneer Ransom Olds introduced the assembly line manufacturing process to his factories in 1901, every major automaker in the world was constrained to making a few thousand cars a year. The limiting factors were people and knowledge. In short, to make a car, you needed a master craftsman with the engineering knowledge and skills to build the vehicle from start to finish; someone who understood and could describe the function of every single component of a car, how it worked, and how to put it all together. These kinds of people were difficult to find and took a long time to train.
With the assembly line, an automobile-in-progress moved from station to station as standardized components were added by workers focused on specific tasks. Instead of needing to know how the entire car worked, each worker needed only to know how to correctly install their piece of the puzzle. Teaching someone how to put a wheel on a stick is much easier than training someone in how to assemble an entire car from scratch.
Finding and training automotive factory workers suddenly became much easier, and the specialized componentization of the assembly line sped up the actual process of putting a car together. The end result was a faster, cheaper, dramatically more efficient production process.
Within two months of transitioning his automotive factories from a master craftsman-based system to an assembly line, Olds had increased his monthly throughput by 500%, churning out 15,000 cars per year compared to his competitors’ 3,500. Soon, Olds’s process was adopted by other car manufacturers, then by manufacturers of all sorts of products. The assembly line revolutionized the way we built physics-based goods, eliminating the dependencies and limitations of artisan-based manufacturing through componentization.
Growing Complexity
That was more than a hundred years ago. Now we’re in the information age, and software is king. From our communication networks to global logistics and business data, everything is managed with and connected by software.
But here’s the brutal irony: for all of the amazing advances and efficiency gains that software has made possible, we still design and build software and cyber-physical systems through an antiquated artisan-based process.
Much like automotive manufacturers in the early 20th century, companies are crippled by dependency on software engineers who know how to make the code work. In the early days of software, this made sense. Those early coders were pioneers, writing code and building software from scratch. And the systems created with software were, for a time, simple enough to be comprehensible. It was once realistic for a software developer to be able to understand and describe the function of every single component of a software program, how it all worked, and how to build software from the ground up.
Those days are over.
Modern software is dramatically more complicated, with programs often built on hundreds of thousands or even millions of lines of code. What’s more, today’s software is inextricably intertwined with complex cyber-physical systems composed of firmware, hardware, networks, regulatory requirements, and people. Understanding how all these puzzle pieces work—and ensuring that the software and processes involved are correct, efficient, and secure—is time-consuming, expensive, and often impossible to accomplish at high confidence levels.
For the past 40+ years, software and systems engineering has followed a methodology known as the “V-model.” This approach breaks the development process into discrete phases, moving from project definition to implementation to testing and integration. It has been effective in driving the development of physics-based systems and simple software systems for decades, and it has been used to build, maintain, and modify products such as planes, automobiles, medical devices, refrigerators, and more.
This is not necessarily a bad system, but it is insufficient. The world has changed, and as systems grow more and more complex, the V-model has become outdated and incapable of handling current challenges.
Bridging the Gap
To illustrate, if I were to build a bridge using the same V-model that is used to build software systems, I would:
- Identify the requirements the bridge must meet (weight, throughput, length, etc.)
- Design the bridge (best estimate of an effective bridge with aesthetic and user experience considerations)
- Build the bridge
- Test the bridge (drive a set of fully loaded semi-trucks across the bridge to test the weight requirements)
- FAIL! If the design is flawed, the bridge may collapse under the weight of the trucks (in software, the inevitable failure surfaces when a unit test or an integration test fails)
- Analyze why the bridge failed
- Tear the bridge down
- Rinse and repeat steps 1–7 until we get it right.
It seems crazy to build a bridge that way, but that’s the way many build life-critical, safety-critical, and security-critical software and cyber-physical systems every day. We plan, we build, we test, we fail, we rebuild.
There is a better way.
To bridge the gap (pun intended) between growing system complexity and the practical limitations of human engineers, we need to shift the software and systems engineering paradigm to a component-based approach in which the parts of a system are broken down into mathematically described pieces that enable accuracy, repetition, and specialization. Just like with Olds’s assembly line, this adaptive shift would ultimately result in dramatic cost, quality, and efficiency gains—all through a new process called Digital Engineering.
Modeling for Success
Digital engineering is a kind of virtual cartography for complex systems. By using math to describe each component of a system, as well as its relationships with the other pieces of the system, systems engineers create detailed models that serve as a virtual workbench. These digital models allow users to examine a complex system at varying levels of detail, and to experiment with new designs, upgrades, or changes within a system, its software, and its environment in a virtual workspace.
This kind of experimentation and testing in a digital environment is often dramatically faster and cheaper than building and testing prototypes in the real world, allowing for quick iteration. What’s more, engineers can use these mathematical models to automate the generation of code that has been mathematically optimized for the highest possible quality, precision, and accuracy.
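To make that concrete, here is a minimal sketch in Python, using a hypothetical cruise-control component invented purely for illustration: the component’s behavior is written as a pure mathematical function, its requirement is an explicit predicate, and the “virtual workbench” amounts to exercising that predicate across thousands of simulated scenarios before any hardware or production code exists.

```python
import random

# A hypothetical component: a proportional speed controller.
# Its behavior is described as a pure function of its inputs,
# so it can be analyzed and simulated without any hardware.
def throttle_command(target_speed: float, current_speed: float, gain: float = 0.5) -> float:
    """Return a throttle command in the range [0.0, 1.0]."""
    error = target_speed - current_speed
    command = gain * error
    return max(0.0, min(1.0, command))  # saturate to the valid range

# A requirement, written as an explicit, checkable predicate:
# the command must always stay within its physical limits.
def requirement_holds(target: float, current: float) -> bool:
    cmd = throttle_command(target, current)
    return 0.0 <= cmd <= 1.0

# The "virtual workbench": exercise the model across many
# simulated scenarios long before anything is built.
if __name__ == "__main__":
    random.seed(0)
    trials = [(random.uniform(0, 200), random.uniform(0, 200)) for _ in range(10_000)]
    failures = [(t, c) for t, c in trials if not requirement_holds(t, c)]
    print(f"checked {len(trials)} scenarios, {len(failures)} requirement violations")
```

In a real digital engineering workflow, the model and its requirements would typically live in a modeling language and be checked by formal analysis rather than random sampling; the sketch only shows the shape of the workflow.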
This is the same kind of process used to design bridges and buildings. To ensure that their designs are safe and correct, architects use software such as computer-aided design (CAD) tools, 3D rendering tools, and physics analysis tools to create models. These models enable them to analyze a building’s design, evaluate how all the components fit together, and produce a precise set of instructions (blueprints) used to construct the building. They leverage standards (building codes) to ensure that what they build meets expectations for safety, security, and structural integrity.
Using existing tooling, it is already possible to apply this same digital-first approach to designing and building airplanes, automobiles, smart city technologies, and other complex cyber-physical systems. We can leverage standards (FACE, OMS, MAVLink, and more) and use models derived from SysML, AADL, and other modeling languages to design and evaluate systems and their architectures; we can harness mathematical models of software to prove code security and safety; and we can use these models as baselines to expedite future software development and maintenance.
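For a flavor of what evaluating an architecture from such models can look like, here is a small, AADL-inspired sketch in Python (all component and port names are invented for illustration): the architecture is captured as plain data, and a few lines of analysis confirm that every connection wires an output port to an input port of a matching type, long before anything is implemented.

```python
# An architectural model, AADL/SysML in spirit: components expose typed
# ports, and connections wire ports together. All names are illustrative.
components = {
    "radar":   {"out": {"track": "TrackData"}, "in": {}},
    "fusion":  {"out": {"picture": "AirPicture"}, "in": {"track": "TrackData"}},
    "display": {"out": {}, "in": {"picture": "AirPicture"}},
}

connections = [
    ("radar", "track", "fusion", "track"),
    ("fusion", "picture", "display", "picture"),
]

def check_architecture(components, connections):
    """Report connections whose ports are missing or whose types don't match."""
    errors = []
    for src, src_port, dst, dst_port in connections:
        src_type = components[src]["out"].get(src_port)
        dst_type = components[dst]["in"].get(dst_port)
        if src_type is None or dst_type is None:
            errors.append(f"{src}.{src_port} -> {dst}.{dst_port}: missing port")
        elif src_type != dst_type:
            errors.append(f"{src}.{src_port} -> {dst}.{dst_port}: {src_type} != {dst_type}")
    return errors

if __name__ == "__main__":
    problems = check_architecture(components, connections)
    print("architecture OK" if not problems else "\n".join(problems))
```

Real modeling tools perform far richer analyses (timing, resource budgets, fault propagation), but the principle is the same: the model is precise enough for a machine to check it.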
Digital engineering enables precision, accuracy, and efficiency through component-oriented engineering defined by the models. From there, we can use automated code engines to generate our end product in days rather than months or years.
In other words: faster build times, lower costs, and better, safer, more efficient systems.
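As a toy illustration of that code generation step (not a sketch of any particular code engine), the snippet below emits a small C function directly from a declarative description of the hypothetical controller used earlier; in practice, qualified generators work from much richer models and carry evidence of correctness along with the code.

```python
# A declarative model of the same hypothetical controller: just data,
# the kind of description that could be derived from a SysML/AADL model.
controller_model = {
    "name": "speed_controller",
    "inputs": [("target_speed", "double"), ("current_speed", "double")],
    "output": ("throttle_cmd", "double"),
    "gain": 0.5,
    "limits": (0.0, 1.0),
}

def generate_c(model: dict) -> str:
    """Emit a small C function implementing the modeled behavior.

    The error computation is hard-coded for this toy example; a real
    code engine would derive the behavior from the model as well.
    """
    params = ", ".join(f"{ctype} {name}" for name, ctype in model["inputs"])
    out_name, out_type = model["output"]
    lo, hi = model["limits"]
    return (
        f"{out_type} {model['name']}({params}) {{\n"
        f"    {out_type} {out_name} = {model['gain']} * (target_speed - current_speed);\n"
        f"    if ({out_name} < {lo}) {out_name} = {lo};\n"
        f"    if ({out_name} > {hi}) {out_name} = {hi};\n"
        f"    return {out_name};\n"
        "}\n"
    )

if __name__ == "__main__":
    print(generate_c(controller_model))
```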
Looking to the Future
The future is clear: change is upon us. Digital-first approaches will soon become the mainstream foundation for software and systems development. Already, a growing body of evidence from numerous industries and technology sectors is demonstrating the benefits of a digital engineering approach. Companies like Volkswagen are using these methods to build their battery management systems. The Army has adopted an “Architecture Centric Virtual Integration Process” (ACVIP) to model and detect defects in complex systems in aviation and missile command. The Air Force is pushing a digital-first approach to acquisition and lifecycle management, diving headfirst down Will Roper’s rabbit hole towards a “faster, agiler, and more competitive weapons-buying process our nation needs to succeed long term.”
Galois itself has worked with numerous government agencies and private sector organizations, including the Defense Advanced Research Projects Agency (DARPA) and NASA, to pioneer digital engineering methods and tools for everything from protecting military systems from hackers, to auto-generating virtual models and code that help enhance and secure existing systems, to generating dynamic failure recovery plans for space missions.
This new methodology requires software developers to move away from their monolithic code bases and better understand the components that make up their code. We must realize the benefits of models and start using the tools available to us to move from artisan approaches to engineering paradigms that create repeatability and reproducibility. With the growing complexity of critical systems, this kind of adaptation is not only advisable, it is necessary.
I’ll leave you with a quote from an old coach of mine: “You either get better or worse, but you don’t stay the same. Others are working to get better. What are you doing to get better today?”
The shift is on. Those who fail to adapt will be left behind.