I attended an interesting systems engineering forum this past week. A fair number of government organizations and contractors were participants. There were many interesting observations from this forum, but one presenter from a government agency said something that particularly struck me. He was saying that one of the major challenges he faced was finding people who were trained in UML and SysML. It made me think: “Why would it be difficult to find people trained in UML? Wasn’t UML a software development standard for nearly the last 20 years? Surely it must be a major part of the software engineering curriculum in all major universities?”
The Unified Modeling Language (UML) was developed in the late 1990s and early 2000s to merge the competing diagrams and notations of earlier work in object-oriented analysis and design. Many software development organizations adopted the language in the 2000s. But as the graph below shows, search traffic for UML has declined substantially since 2004.
This trend is reinforced by a simple Google search of the question: “Why do software developers not use UML anymore?”
It turns out that the software engineering community has moved on to the next big thing: Agile, which systems engineers are now also trying to morph into a systems engineering methodology, just as they did when they created the UML profile known as the Systems Modeling Language (SysML).
This made me wonder, “Do systems engineers apply software engineering techniques to systems engineering in an effort to communicate better with the software engineers?” I suddenly realized that I have lived through many of these methodology transitions, from one approach to software and systems development to another.
My first experience with modeling came in my freshman-year FORTRAN class, where we used flowcharts. FORTRAN was the computer language used mostly by the scientific community from the 1960s through the 1990s. We created models using a standard symbol set like the one below.
Before the advent of personal computers, these templates were used extensively to design and document software programs. However, as we quickly learned in our classes, it was much quicker to write, execute, and debug the code than it was to draw the charts by hand. Hence, we primarily used flowcharts to document the programs, not to design them.
Later in life, I used a simplified version of this notation to convert a rather large (at the time) software program from Cray computers to VAX computers. I used a rectangular box for most steps and a rectangular box with a point on the side to represent decision points. This simplified approach produced the same results in a form that was much easier to read and understand; you didn’t have to worry about the nuanced notations or become an expert on them.
Later, after getting a personal computer (a Macintosh 128K), I discovered some inexpensive software engineering tools available for that platform. These tools could create Data Flow Diagrams (DFDs) and State Transition Diagrams (STDs). By that time, I had moved from software development into project management (PM) and systems engineering (SE). So, I tried to apply these software engineering tools to my systems problems, but they never seemed to satisfy the needs of the SE and PM roles I was undertaking.
In 1989, I was introduced to a different methodology through the RDD-100 tool. It provided places to capture my information and automatically produced diagrams from that information; I could also use the diagrams to capture the information. All of a sudden, I had a language that really met my needs. Later, CORE applied a modified version of this language and became my tool of choice. The only problem was that no one had documented the language or gone to the effort of making it a standard, so arguments abounded throughout the community.
In subsequent years I watched systems engineers argue over functional analysis versus object-oriented methods. The UML camp pushed object-oriented tools, such as Rational Rose; the functional analysis camp pushed tools such as CORE. We used both on a very major project for the US Army (yes, that one), and customers seemed to understand and like the CORE approach better (from my perspective). On other programs, I found a number of people using a DoD Architecture Framework (DoDAF) tool called Popkin System Architect, which was later acquired by IBM (and more recently sold off to another vendor). It included two types of behavioral diagrams: IDEF0s and DFDs. IDEF0 was yet another software development language adopted by systems engineers after software developers had moved on to object-oriented programming and modeling languages.
I hope you can now see the pattern: software engineers develop and use a language, which is later picked up by the systems engineering community, usually at a point where its popularity in the software world is declining. The systems engineering community eventually realizes the problems with that language and moves on. However, one language has endured: the underlying language represented in RDD-100 and CORE. It traces its roots back to the heyday of TRW in the 1960s, when it was invented and used on programs that went from concept development to initial operational capability (IOC) in 36 months. It was used for both systems and software development.
But the problem was, as noted above, there was no standard. A few of us recognized the problems we had encountered in trying to use these various software engineering languages and wanted to create a simpler standard for use in both SE and PM. So, in 2012 a number of us got together and developed the Lifecycle Modeling Language (LML). It was based on work SPEC Innovations had done previously, which this group validated and greatly enhanced. The committee “published” LML on an open website (www.lifecyclemodeling.org) so it could be used by anyone.

But I knew even before the committee started that the language could not be easily enhanced without being instantiated in a software tool. So, in parallel, SPEC created Innoslate®. Innoslate (pronounced “In-no-Slate”) provided the community with a tool to test and refine the language and to map it to other languages, including the DoDAF MetaModel 2.0 (DM2) and, in 2014, SysML (incorporated in LML v1.1). Hence, LML today provides a robust ontology for SysML (and UML). But it goes far beyond SysML: Innoslate has proven that many different diagram types (over 27 today) can be generated from the ontology, including IDEF0, N2, and many other forms of physical and behavioral models.
Someone else at the SE forum I attended last week also said something insightful. They talked about SysML as the language of today and said that “10 years from now there may be something different.” That future language can and should be LML. To quote George Allen (the long-departed coach of the LA Rams and Washington Redskins): “The Future is Now!”