Welcome to the CTS Learning Series, Chapter 4, Data Model Conformance Verification. Video 2, Data Model Conformance Verification Overview. In this video, we will provide the information necessary for understanding data model theory, explain the basics of the FACE metamodel and how the CTS enforces structural correctness, and explain data models in the context of the CTS. The FACE approach is designed to address the affordability initiatives of today's military aviation community. It encourages software portability and reuse across multiple platforms, reduces duplicative investment in the same capability, and thus reduces the time to field. It also provides a common operating environment and a data architecture to enable system-of-systems integration. To ensure interoperability, the FACE Technical Standard requires that UoPs, or Units of Portability, use the FACE Data Architecture. A data architecture is defined as a set of related models, specifications, and governance policies, with the primary purpose of providing an interoperable means of data exchange. The FACE Data Architecture consists of a series of data models that describe the information used by FACE components. The FACE Consortium's Data Architecture Working Group is responsible for defining the data architecture and data model aspects. To quote the Data Architecture Working Group, data modeling describes the data going into or out of a software component in the context of the entities of concern to that component. To enable an integrator to combine software components into a larger capability, data modeling captures the semantics of the data exchanged in a rigorous, machine-processable, validatable format. Data semantics in this context is defined as the meaning of data. In layman's terms, data models describe data well enough for the components of a system to clearly understand not just the format of the data, but its meaning as well.
So, why are we creating data models? It is important to remember that the FACE Technical Standard aims to achieve the goal of developing software-intensive systems that have a high level of complexity. These systems usually require a large amount of documentation to support their operation and integration. The data architecture was designed to provide a specification that supports engineering design documentation for data and interfaces, such that there are no ambiguities in these designs for the purpose of component integration. The data architecture provides unambiguous semantics and measurement systems to increase interoperability. Another reason is to provide a standard method for defining the data format and meaning for information shared between software components. In the end, data modeling results in better documentation of the system, which allows for better software. Given the rigorous syntax of the FACE Data Architecture, it is possible to enforce the conformance of data models. This, in turn, encourages increased interoperability, allows for widespread software reuse, and creates the ability to generate meaningful documentation, all in one action. In the context of FACE UoCs, data models represent the data traversing the Transport Services interface. An important aspect of the FACE Data Architecture is that it is intended to be domain-independent, meaning the application of data modeling does not have to be specific to aircraft or aircraft-related applications. To that end, the FACE Technical Standard Edition 3.1 leverages the Open Universal Domain Description Language, or Open UDDL, standard syntax and structure for data models, and extends Open UDDL's functionality to create FACE-specific aspects for describing data. For this video series, it is important to understand that the documentation for the concepts we will be discussing is contained in the Open UDDL 1.0 specification and the FACE Technical Standard Edition 3.1.
Prior to the FACE Technical Standard Edition 3.1, this information was contained entirely in the FACE Technical Standard itself. The FACE Data Architecture is an integral part of the FACE Technical Standard. An in-depth explanation of everything contained within the FACE Data Architecture would be required to fully understand data modeling and is out of scope for this video series. However, it is important to understand the high-level concepts of the FACE Data Architecture in order to understand the CTS's role in testing conformance. The FACE Data Architecture defines the structure for UoP Supplied Models, or USMs, which contain data models, sometimes written as DM; UoP models, sometimes written as UM; integration models; and traceability models. Additionally, the FACE Data Architecture defines a Shared Data Model, or SDM. The syntax for each of these models is defined in the FACE Technical Standard Edition 3.1 and the Open UDDL Standard Edition 1.0. The SDM and USM are each saved as a file with the extension .face. The contents of these files are written in FACE XMI. XMI stands for XML (Extensible Markup Language) Metadata Interchange, a standard created by the Object Management Group, or OMG, for exchanging metadata information for any metadata whose metamodel can be expressed in the Meta Object Facility, or MOF. The FACE Data Architecture is written as a MOF metamodel, with all data architecture components specified by Essential MOF, or EMOF, which is a subset of MOF, plus a set of Object Constraint Language, or OCL, constraints. Explaining the metamodel and the OCL constraints is out of scope for this video series. The SDM is a model written and maintained by the Data Architecture Working Group in the FACE Consortium, and it is designed to have elements added, removed, and edited as implementations of the FACE Technical Standard occur. The SDM defines basic elements that are used across the data models in a USM.
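Because a .face file is XMI, which is XML, its structure can be inspected with ordinary XML tooling. As a rough illustration, here is a minimal sketch in Python; the element and attribute names are invented for the example and do not reflect the actual FACE metamodel.

```python
# Hypothetical sketch: a .face file is XMI (XML), so standard XML tooling
# can read it. The tag and attribute names below are invented for
# illustration only; real FACE XMI follows the FACE metamodel.
import xml.etree.ElementTree as ET

SAMPLE_FACE_XMI = """<?xml version="1.0" encoding="UTF-8"?>
<datamodel name="ExampleUSM">
  <element kind="ConceptualEntity" name="CupOfCoffee"/>
  <element kind="Observable" name="Temperature"/>
</datamodel>
"""

def list_model_elements(xmi_text):
    """Return (kind, name) pairs for every model element found."""
    root = ET.fromstring(xmi_text)
    return [(e.get("kind"), e.get("name")) for e in root.iter("element")]

print(list_model_elements(SAMPLE_FACE_XMI))
# -> [('ConceptualEntity', 'CupOfCoffee'), ('Observable', 'Temperature')]
```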
The Shared Data Model is, at its core, a data model, containing a conceptual data model, or CDM, and a logical data model, or LDM. The SDM's conceptual data model provides the observables that data modelers may use in a USM. An observable is a specific type used to define the characteristics of a data model entity, and a data model entity is a single instance of a thing or concept. The SDM's logical data model provides the units, measurements, and coordinate systems that data modelers may use in a USM. These measurement systems measure the observables defined in the SDM's conceptual data model. In the end, the goal of the SDM is to be able to describe any concept or measurement pertinent to the environments in which the FACE Technical Standard is applied. As the need for new observables or measurements is encountered, the FACE Consortium will expand the SDM to meet the needs of data modelers using the FACE Data Architecture. The implementation of these concepts is in the USM, or UoP Supplied Model. The USM is defined as a data model supporting a unit of conformance. The USM defines the data model elements sent and received by a UoP. A USM is composed of zero-to-many data models, zero-to-many UoP models, and, optionally, zero-to-many integration and traceability models. In a USM, one of the data models must be the Shared Data Model. This clearly defines the USM's use of a certain version of the SDM, and the CTS may then compare the official FACE Consortium Shared Data Model against the provided one. Aside from the data model defined in the SDM, the USM may also define its own data model. The USM's data model is similarly split into a conceptual data model and a logical data model, with the addition of a platform data model. In the USM's conceptual data model, we define the abstract characteristics of an entity.
Each characteristic is defined as a type that is an observable stated in the SDM's conceptual data model, which we pull from the SDM into the USM. Picking the observables to include in a USM's conceptual data model depends on the scope of the system. For example, if our system were supposed to measure the temperature of a cup of coffee, we would need to measure temperature. The cup of coffee would be an entity, and temperature would be an observable of the entity cup of coffee. In the logical data model, we define how the observable characteristics will be measured. Recall that the SDM provides measurement methods that can be used to measure observables. Similar to the USM's conceptual data model, the USM's logical data model must also pull from the SDM's logical data model. Picking measurement systems depends on the scope of the system as well. In the example of the temperature of a cup of coffee, the temperature observable could be measured in Fahrenheit, Celsius, or Kelvin. In the platform data model, we define how the measurement system will be interpreted by the UoCs. The SDM does not define platform data model types; instead, the user must supply them. The data model metamodel, defined in the Open UDDL standard, lists the possible types that may be used in the platform data model. In the cup of coffee example, to measure the temperature with some precision, we might want to represent temperature as a float or a double. Each of the USM's conceptual, logical, and platform data models may create conceptual, logical, and platform views, respectively. A view is an abstract element whose function is to select a subset of a data model's entities and associations that are of interest. Views are composed of queries and templates. Queries, which are defined in the data model, define the data that is included in a message. Templates, which are defined in the UoP model, define the organization of the data in the payload.
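The cup-of-coffee walkthrough above can be summarized with a toy sketch. These Python classes are purely illustrative; the real conceptual, logical, and platform metamodel is defined normatively in the Open UDDL standard and the FACE Technical Standard, not here.

```python
# Illustrative sketch of the three data-model layers from the coffee
# example. Class and field names are invented; they are not the real
# FACE/UDDL metamodel elements.
from dataclasses import dataclass

@dataclass
class ConceptualCharacteristic:   # CDM: what the data means
    entity: str                   # e.g. "CupOfCoffee"
    observable: str               # drawn from the SDM's CDM, e.g. "Temperature"

@dataclass
class LogicalMeasurement:         # LDM: how the observable is measured
    characteristic: ConceptualCharacteristic
    measurement: str              # drawn from the SDM's LDM, e.g. "Celsius"

@dataclass
class PlatformRealization:        # PDM: how the measurement is represented
    measurement: LogicalMeasurement
    platform_type: str            # user-supplied, e.g. "double"

coffee_temp = PlatformRealization(
    LogicalMeasurement(
        ConceptualCharacteristic("CupOfCoffee", "Temperature"),
        "Celsius"),
    "double")
print(coffee_temp.platform_type)
```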
In order to select that subset of information in a model, a query modeling language and a template language were developed. The query modeling language is strongly derived from SQL. Queries describe how data in the entities and associations are combined to provide a cohesive data set, such as when needed for a message. The platform data types are mapped to each of the supported languages via IDL, the Interface Definition Language. IDL is used to define data types and interfaces in a way that is independent of the programming language, operating system, or processor platform you may be using. IDL is an Object Management Group standard, and it may be translated into a programming language via the IDL language mappings provided by the OMG. The template language is used to specify the presentation of data. A template refers to elements in the query through the query's selected or projected elements. Templates specify the format of the data selected by the query and can filter or down-select from the query results; they cannot select new data. The UoP model consists of elements that provide the means to formally specify the interfaces of a UoP. Effectively, the UoP model defines a high-level black-box diagram of UoP components and how data flows into or out of those UoPs. The contents of the UM directly reference elements in the data model. Because the data model and UoP model define the semantics of data, the interaction between multiple FACE UoPs must be addressed. The integration model documents these data exchanges, view transformations, and the integration of UoPs by modeling TSS connections between two or more UoPs. To do this, the integration model relies on the UoP model's views to show connection data. This is enough information to allow TSS message interfaces and TSS data transformations to be automatically generated, which is supported in some third-party tools.
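To make the query/template division concrete, here is a deliberately simplified sketch. The real query and template languages are defined in the standards; this Python stand-in only illustrates the rule that a query projects data out of a model, while a template may reorder or down-select that projection but never add new data.

```python
# Toy illustration of the query/template split. The real query language
# is SQL-derived and defined in the FACE/UDDL standards; these helper
# functions are invented for the example.
def run_query(entity, selected):
    """Select a subset of an entity's attributes (the query's projection)."""
    return {name: entity[name] for name in selected}

def apply_template(query_result, layout):
    """Order and down-select projected fields for the message payload.
    A template may only use fields the query already selected."""
    missing = [f for f in layout if f not in query_result]
    if missing:
        raise ValueError(f"template cannot select new data: {missing}")
    return [query_result[f] for f in layout]

coffee = {"temperature": 71.5, "volume_ml": 350, "blend": "arabica"}
projected = run_query(coffee, ["temperature", "volume_ml"])
payload = apply_template(projected, ["volume_ml", "temperature"])
print(payload)  # -> [350, 71.5]
```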
It is important to note that the integration model is not required to achieve FACE conformance. However, if an integration model is provided, errors in the integration model will appear as errors in the conformance process. Traceability models provide a way to map external sources to the FACE Data Architecture. For example, if a UoP is written to meet a specific requirement, it may be beneficial for documentation efforts to define a relationship between the instance of the FACE UoP and the requirement. Other examples of external model mappings include other data models, either FACE or non-FACE; interface control documents, or ICDs; and message definitions. The traceability model is likewise not required to achieve FACE conformance; however, if one is provided, errors in the traceability model will appear as errors in the conformance process. Similar to SDMs, domain-specific data models, or DSDMs, are models designed to capture domain-specific semantics for a domain of interest, and they generally do not contain UoP models. For example, a domain of interest could be fruits. The fruits DSDM might define entities, associations, and queries that relate to fruits, such as size, seed type, and parent plant. USMs written to DSDMs may derive their elements from the DSDM, the SDM, or both if needed. DSDMs are not required for FACE conformance; it is possible to use only the SDM each time a USM is written. A number of companies have developed data modeling tools that allow users to create data models. The explanation and introduction of these tools are out of scope for this video series, but they may be found at opengroup.org/face/third-party-tools. The CTS automates the validation process through the DMVT and the DIG, which stand for Data Model Validation Tool and Data Model to IDL Generator, and which are used to validate both USMs and DSDMs. The rules for testing a USM and a DSDM for FACE conformance are contained in the FACE Technical Standard.
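As a loose illustration of the kind of structural check a validator such as the DMVT performs, consider one toy rule: every observable a USM references must exist in the SDM it declares. The model shapes and the rule itself are invented for this example and are not the actual conformance rules from the FACE Technical Standard.

```python
# Toy sketch of a single structural validation rule: observables used by
# a USM must be defined in the SDM. Invented for illustration only.
def check_observables(usm_observables, sdm_observables):
    """Return observables referenced by the USM but absent from the SDM."""
    return sorted(set(usm_observables) - set(sdm_observables))

sdm = {"Temperature", "Position", "Speed"}
usm = ["Temperature", "Flavor"]         # "Flavor" is not in the SDM
print(check_observables(usm, sdm))      # -> ['Flavor']
```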
The USM is automatically tested through the GSL, or Gold Standard Library, generation process, as mentioned before and in Chapter 1. The CTS converts the USM into IDL, which is then used by the IDL compiler to generate data type definitions in C, C++, Ada, or Java, depending on the user's selection. The mappings between IDL and C, C++, Ada, and Java are defined as standards by the Object Management Group. The USM is also automatically tested when the user presses the Test UoC Conformance button, as the GSLs are generated again before a conformance test. The CTS does this to confirm that the UoC was compiled using the same options that were used when the GSLs it was built against were generated. Alternatively, pressing the Test USM button tests the USM without generating Gold Standard Libraries or testing for conformance. Thank you for watching. In the next video, entitled Verification of the USM Using the CTS GUI, we will demonstrate how to configure the CTS to test for USM conformance, along with showing a passing and a failing result of the conformance test.
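As a closing aside, the IDL mapping step described in this video can be caricatured with a toy generator. The type table and the emitted struct below are simplified inventions for illustration; they are not the actual OMG language mapping rules or real CTS output.

```python
# Toy caricature of IDL-to-C type generation. The mapping table and
# struct layout are invented for illustration, not the OMG mappings.
IDL_TO_C = {"double": "double", "float": "float",
            "long": "int32_t", "boolean": "bool"}

def idl_struct_to_c(name, fields):
    """Emit a C struct declaration from (idl_type, field_name) pairs."""
    lines = ["typedef struct {"]
    for idl_type, field in fields:
        lines.append(f"    {IDL_TO_C[idl_type]} {field};")
    lines.append(f"}} {name};")
    return "\n".join(lines)

print(idl_struct_to_c("CoffeeReading", [("double", "temperature"),
                                        ("boolean", "is_decaf")]))
```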