In any engineering organization, managing communication among stakeholders is a key challenge. The greater the complexity and scope of the endeavor, the greater the need to communicate. From senior leadership to the shop floor, communication is key. Digital engineering transformation is increasingly driving the methods and means to improve how we share information.
In November of 2024, the SEI brought together stakeholders from the Department of Defense and the intelligence community (IC) who have been engaging and actively innovating in the dynamic environment of digital engineering. Our workshop focused primarily on model-based systems engineering (MBSE) as a pillar in aligning best practices in modeling with systems engineering practices. The SEI’s work in this field is focused on developing approaches and processes to document, assess, and optimize MBSE methodology. We aim to help government stakeholders choose better systems engineering pathways for large mission-critical programs. This blog post, which is adapted from a recently published technical note, highlights a research agenda and calls to action for future work in MBSE and digital engineering from practitioners in the field.
Distilling MB Synergies
DoD Instruction 5000.97 states that MBSE and digital engineering would “enable faster, smarter, data-driven decisions through the system life cycle.” Yet reproducing MBSE and digital engineering benefits at scale remains a challenge.
Recognizing the interwoven nature of modeling and engineering activities, MBSE balances the desire for agile velocity and responsiveness with the need for carefully designed capabilities by using models as a common exchange format. Model-based techniques can often address scalability challenges for complex systems. Integrating the complex elements of modern systems as a synergistic whole requires the power of modeling and modern computing. Advances in engineering agility cannot be fully realized without a sufficient understanding of digital engineering infrastructure architecture and its relation to MBSE.
We use the term MBSynergy to refer to a community approach to cultivating learning and the derived benefits of MBSE and digital engineering efforts spanning the DoD and IC. The goal of the effort is to develop an organized, integrated approach that will enable us to provide a consistent, effective level of understanding to the DoD, the IC, and ultimately industry at large. The initial MBSynergy workshop engaged participants who work on behalf of the warfighter and civilian emergency services. In these roles, they engage lead systems integrators or any of the well-known providers in the defense industrial base. This selection of participants ensured that they could relate to one another’s similar working contexts. We employed the Chatham House Rule so that participants could speak freely without having comments attributed to them.
The issues raised during the workshop were grouped into five areas: “hot” topics, DoD/IC policies, digital engineering environment, training, and MBSE processes.
“Hot” Topics in MBSE Technologies
MBSE and digital engineering continue to evolve with new areas of interest emerging in SysMLv2 and artificial intelligence.
Finalization of SysMLv2 and the migration from SysML1.x to SysMLv2 are major concerns in the defense industrial base, the DoD, and the intelligence community. The prevalence of the Unified Architecture Framework (UAF) and shortcomings in language and tool support for SysML1.x were among the key subjects discussed. Early evaluations of SysMLv2 demonstrate the importance of this subject for major programs.
Workshop participants also expressed interest in exploring how AI might help support MBSE applications in modeling. They expect a new and revolutionary user experience for development environments.
Calls to action include
- Investigate the SysMLv2 transition.
- Explore the use of AI to support MBSE.
DoD/IC Policies
Practitioners in the DoD and intelligence communities use MBSE with common policy, deployment, and sustainment concerns. These practitioners also use similar tools for capturing requirements, modeling in UAF or SysML, and so on. The paradox reported by participants is that bespoke environments deployed in different settings build in obstacles to applying MBSE concepts across the boundaries of differing implementations. These unique environments create myriad operational issues that projects must contend with above and beyond daily operations.
Defining, deploying, and maintaining a common MBSE baseline for digital engineering environments across organizations would help achieve the following:
- Promote the development and dissemination of standard practices and conventions.
- Reduce the cost to operate these platforms.
- Address common issues (e.g., access control, configuration management).
Policies for sustaining models and data are also required to address updates in modeling standards and the tools they rely on.
Acquisition policies are another key element to consider. On multiple occasions, workshop participants noted a lack of guidance for defining deliverables. MBSE is likely to reside at the hub of fundamental human and system behaviors, so it is essential that the architecture and design of MBSE-affected program parts be keenly understood and managed appropriately. Otherwise, the opportunities for unsatisfactory results can outstrip the real synergies gained in a sound program campaign of action. Insufficient guidance starts with the media to be delivered. Saying, “We want a SysML model,” is not enough. Practitioners need to understand the precise version of the language and tool being used (e.g., Is it a Cameo model as an .mdzip file? Is it a model that is compatible with Cameo v2022 or v2024? Is it a report that has been generated from a Cameo model? Does the model use additional standard and/or custom profiles?). Clarifying these details ensures that the model received is accessible to and relevant for stakeholders.
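To make this concrete, the deliverable questions above could even be captured as machine-checkable contract terms. The sketch below is purely illustrative, not an established practice or tool: the metadata field names and version strings are hypothetical, standing in for whatever a real contract and model-management process would define.

```python
# Hypothetical sketch: checking a delivered model's metadata against
# the deliverable terms stated in a contract. Field names and values
# are illustrative assumptions, not a real standard.

def check_deliverable(metadata: dict, requirements: dict) -> list[str]:
    """Return a list of mismatches between delivered-model metadata
    and the contract's deliverable requirements."""
    problems = []
    for field, required in requirements.items():
        actual = metadata.get(field)
        if actual != required:
            problems.append(f"{field}: expected {required!r}, got {actual!r}")
    return problems

# Example contract terms: a Cameo v2024 SysML 1.x model as an .mdzip file
requirements = {
    "file_format": ".mdzip",
    "tool_version": "Cameo v2024",
    "language": "SysML 1.x",
}

# Example delivery that used an older tool version than required
delivered = {
    "file_format": ".mdzip",
    "tool_version": "Cameo v2022",
    "language": "SysML 1.x",
}

for problem in check_deliverable(delivered, requirements):
    print(problem)
```

Even a simple check like this makes the contract's expectations explicit and surfaces mismatches (here, the tool version) before a model reaches stakeholders who cannot open or use it.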
The lack of common terminology can lead to imprecision in naming objects in the models and artifacts to be delivered. It also works against a shared understanding of what is expected from a delivered model and ultimately makes integration more difficult.
Calls to action include
- Influence OSD and policymakers to provide a program objective memorandum (POM) for MBSE within the DoD/IC enterprise.
- Address the diversity in MBSE deployment across the branches of the military, and audit best practices for MBSE.
- Define a common platform for model exchange.
- Create a lifecycle sustainment plan for models and data.
Digital Engineering Environment
Establishing a common vocabulary that acts as a foundation for program execution was chief among the digital engineering concerns discussed at the workshop. OSD DEM&S, OMG, AIAA, NDIA, and INCOSE, among others, host exchange forums where users can discuss digital engineering topics.
Workshop participants also discussed how the deployment of MBSE at scale through a common digital environment creates new access control challenges. Because models can change rapidly, threat scenarios range from known threats, such as unauthorized read access that would result in unauthorized information dissemination, to unauthorized write access that would taint digital assets with malicious or corrupted information.
Further, workshop participants noted that unauthorized disclosure of the architecture or design could assist adversaries in identifying exploitable weaknesses or vulnerabilities that exist in the system. This concern is similar to securing software development environments in general. Because a model shows more aspects of a system than source code, attack vectors can evolve as progress is made on the system under development. A specific cyber threat analysis is necessary to fully evaluate this issue.
Calls to action include
- Define the relationships among MBSE, DevSecOps, Agile, digital engineering, and other methodologies.
- Evaluate the cyber risks associated with digital modeling environments and their integration.
- Create a Security Classification Guide for MBSE models.
Training
Training is a crucial component of technology transition, so the knowledge and skills that practitioners acquire from training and education are paramount. The Defense Acquisition University, the Air Force Institute of Technology, and other DoD/IC components support training for various aspects of MBSE and digital engineering. AIAA released a report on digital engineering workforce development.
Workshop participants unanimously agreed that training for a specific language (e.g., SysMLv2 or UAF) or tool (e.g., Cameo) is not their primary concern. Rather, training should focus on how modeling languages, architectural frameworks, and tools can help government personnel achieve greater effectiveness. Because engineering work occurs in highly regulated settings that constrain the adoption of new approaches, training should also address the workflows to which MBSE will contribute.
Calls to action include
- Apply MBSE to specific acquisition pathways.
- Help practitioners articulate the role of models and understand how to leverage modeling for a particular situation.
- Provide guidance about applying MBSE to legacy versus new systems or capabilities.
MBSE Processes
Defining a methodology and its associated processes is crucial for the success of a program; however, MBSE processes are often overlooked. Practitioners sometimes learn to use Cameo to model in UAF or SysML1.x, but they often lack the foundational concepts of systems engineering required for success (e.g., guidance found in the INCOSE Systems Engineering Handbook or ISO/IEC/IEEE 15288). Yet it is critical to understand the goals of systems engineering, its roles, and how to tailor it to a specific program.
Workshop participants had specific questions related to MBSE:
- Some questions focused on the short-term use of MBSE and were specific to a use case, such as how to improve model interoperability, define a minimum viable model for a specific evaluation goal, or understand the model lifecycle.
- Other questions focused on the longer-term use of MBSE, such as how to consider models an integral part of a system development lifecycle, how to update models regularly, how to audit MBSE processes to improve quality metrics, and how to define those quality metrics.
- Finally, participants asked about an MBSE starter kit to help programs initiate and support their modeling journey.
These inquiries share a common focus: determining the appropriate methodology, if any, for applying MBSE. It is not that standard MBSE methodologies such as the Object-Oriented Systems Engineering Method (OOSEM) cannot adapt to DoD/IC requirements; rather, they were not built with those considerations in mind. Historically, no single acquisition program has had a broad enough span of influence to define a complete solution for the DoD/IC context. Consequently, these methodologies lack the proper foundations to support activities mandated by acquisition policies, especially with respect to the definition of government-funded information and the review of models produced by other groups.
Whether to modify existing MBSE methodologies and processes or develop specific processes that align with an organization’s objectives is a question SEI researchers are currently addressing in the MBSynergy project; it will be the subject of a subsequent workshop.
Calls to action include
- Identify processes to improve model interoperability.
- Avoid models becoming shelfware by maintaining current and relevant data.
- Define criteria for determining the sufficiency of a model. Ask, “How do I evaluate models to determine whether they are good or can answer the questions I need answered?”
- Guide auditing MBSE processes with associated metrics to evaluate the maturity of MBSE adoption.
- Use an MBSE starter kit that includes a collection of templates.
- Optimize the MBSE approach by decreasing model complexity and team/organization complexity (e.g., geography, skills, career paths).
Five Future Areas of MBSE Work for the Department of Defense and Intelligence Communities
The observations captured during the workshop allowed SEI researchers to identify future areas of work to serve Department of Defense and intelligence communities.
- Systems Engineering and Architecture. Workshop participants agreed that digital engineering and architecture are essential enablers to the beneficial use of MBSE. Participants attributed many failed MBSE implementations to failures in these fundamental disciplines.
- Enterprise Value of MBSE. An immediately measurable expression of what a successful MBSE approach yields was not apparent in many government settings. The motivations for implementing MBSE were often disconnected from the day-to-day performance criteria that define program success.
- Community Building. The socio-technical nature of the challenges that practitioners face when using MBSE requires that they learn from their early experiences and from each other to accelerate beneficial change. Building on the experience of others in a forum for establishing a shared history and track record can accelerate this process.
- Tools, Training, and Policy. Not all the challenges to successfully using MBSE are easily solved by introducing training or new tools. Workshop participants helped us understand that these external drivers of MBSE adoption do not suffice on their own.
- Ownership of the MBSE Approach. Many participants described their experiences implementing MBSE as spanning the contractual boundaries and proprietary technologies that define the defense industrial base. However, shared ownership of an authoritative source of truth across boundaries, especially at a more detailed level, can be contentious.