All right, you're logged in. So, Simon, we're ready to get started. Yep, I've started recording. So, off you go, if you'd like, Lauren. Great. I want to welcome everybody to the FACE Getting Started Guide and BALSA Overview two-part webinar series. Today we will be discussing the Getting Started Guide, and tomorrow's session will be on BALSA. This webinar will be recorded so you can access it at a later date if needed. I'd like to introduce myself: my name is Lauren Rousseau from The Open Group, and I'm the Forum Coordinator for the Future Airborne Capability Environment (FACE) Consortium. I also have here with me Dave Lounsbury, who is the Chief Technical Officer for The Open Group. Hello, everybody. And I'd also like to introduce the panelists that we have today. We have Dr. Alicia Taylor, an IT Project and Planning Analyst supporting U.S. Army PEO Aviation, who also serves as the Chair of the Integration Workshop Standing Committee; Stephen Simi, Vice President and Program Manager for Military Aviation Programs at TES-SAVi, who is also the Vice Chair of the Integration Workshop Standing Committee; and Tom Brixie, a Senior Software and Systems Engineer and member of the data modeling staff for TES-SAVi's Military Aviation Systems Division, who is also an active participant in the Integration Workshop and the Data Architecture Working Group, as well as many other subcommittees in the consortium. For those of you who are not familiar with the FACE Consortium, we are a government and industry collaborative organization under the umbrella of The Open Group, a global organization that focuses on achieving business objectives through the use of open standards.
The FACE approach combines a government-industry software standard and a business strategy to enable folks to acquire affordable software systems, rapidly integrate portable capabilities across global defense programs, and attract innovation and deploy it quickly and affordably. I mentioned earlier that we developed our industry and government collaboration via The Open Group, and we currently have 90-plus organizations and over 1,100 participants on our FACE Consortium mailing roster, which has grown exponentially over the past six years. The benefits to government of using an open-standard, collaborative approach between industry and government are better buying power, increased competition, improved affordability, and control of lifecycle costs across the multiple programs for the various aircraft out there. We want to incentivize productivity and innovation in industry and government, and ultimately reduce software development time, saving time and money through modularity and portability. Reuse is also primary to the FACE approach for cross-platform decision-making: not investing multiple times in the same capability, and enabling integration of cross-program requirements. Industry benefits as well. The FACE approach enables industry to create software-centric product lines — opportunities to develop capabilities once and use them for multiple customers across multiple platforms. It also provides the opportunity to offer a software capability across multiple aircraft types, which may not have been available before when suppliers served a single customer; it enables them to get into a market where multiple customers can benefit from the capability. It lowers the cost of doing business: common standards lower cost and schedule risks. Developed once, the quality assurance is done once, and if it's proven once, it's proven to work in multiple environments.
Standardization allows for rapid development of capabilities, and reuse of software applications enables integrators to optimize platform performance. If they have tested it once and integrated it once, the learning curve goes down and folks can do things more rapidly — and the ultimate goal is to get high-quality capability to the warfighter faster and in a more cost-effective way. For more information on the FACE approach, you can go to our FACE landing page, accessible at www.opengroup.org. This is our public site; we also have a collaboration site that we use for members only. And if you're interested in joining the consortium and want more information, we can provide it — send an email to ogface-admin at opengroup.org. The landing page gives you information on consortium activities, membership, published documents and tools that we use for developing FACE software, recent procurements, a link to the FACE registry, and information on how to navigate the conformance process. On the main landing page, you'll find a section called Documents and Tools, and under that section you can find published documents, as well as the newly published Software Supplier Getting Started Guide, which our panelists will be giving you more detailed information about shortly. Also, on the main Open Group website, www.opengroup.org, under Publications, you can search for the Software Supplier Getting Started Guide, and this will allow you to download the document as a PDF, as shown on the slide. And if you have any questions during this webinar, we ask that you use the Q&A section, and we will try to answer as many questions as we can. If we don't get to you, you can send your questions to ogface-admin at opengroup.org, and somebody from the team will get back to you with the answers you're looking for. And with that, I'd like to introduce our next speaker, Alicia Taylor.
She's going to give you a bit of background on the Integration Workshop Standing Committee and the Software Supplier Getting Started Guide. Over to Alicia. Thank you, Lauren. The Integration Workshop Standing Committee actually developed the Software Supplier Getting Started Guide. I noticed that several of the attendees participate in the Integration Workshop Standing Committee, but I just wanted to point out our charter. The second bullet is to discover, evaluate, and produce FACE reference implementation examples, and to facilitate the adoption and publication of those examples. The Software Supplier Getting Started Guide is just one example of the things we do. I've also highlighted a few other things, such as the FACE TIM (Technical Interchange Meeting) and conference events. One of the things the Integration Workshop does, in conjunction with the Technical Working Group, is review the papers and help with the coordination of the presentations, and you can see some information there on the previous TIMs. We also host what's called a BITS event, which is an integration event. Lauren mentioned that once you've integrated something, your learning curve goes down; that's the idea behind the BITS event — it gives you an opportunity to integrate software with BALSA, which we'll talk about a little later. And as you can tell, at this last event in June we had four teams, but we had 10 FACE organizations presenting, so there's a lot of integration going on between companies and between software. We've also run the code challenge. I mentioned the Software Supplier Getting Started Guide, BALSA code, data models, and user guides, so I just wanted to touch on that a little. What I'm going to do is walk us through at least the first parts of the Software Supplier Getting Started Guide. So bear with me, and I will share my screen and get up and running on that.
So looking at the table of contents, you can see the first chapter — the first section is devoted to primary documents. There are certainly more documents than the ones in the Software Supplier Getting Started Guide — excuse me, there are more documents that the FACE Consortium produces; these are just some that are highlighted for the software supplier. Then we've got Chapter 2, which walks us through some of the environments, units of conformance, data modeling, and artifacts. We'll spend a little more time on that tomorrow — Chris Cook, who's a subject matter expert for BALSA, will be one of the presenters, and he will focus a little bit more on BALSA. Today, following me will be Stephen Simi. Stephen is going to talk a little bit about the artifacts and the conformance test suite, and we'll key on appendix A with the testing for the FACE conformance — sorry, the FACE conformance test suite; I probably should finish my statements. Just scrolling down, we've got a number of tables in there where we tried to highlight some things. And then going down to our first section: the Software Supplier Getting Started Guide is designed to be a navigational quick-start guide for software suppliers. What it's not is an overall view of FACE; it is geared primarily to the software supplier. A software supplier is really anyone who's providing software to be certified as FACE conformant, or just anyone who's interested in knowing more. We use BALSA, the Basic Avionics Lightweight Source Archetype, as the application — it's an on-ramp example. One of our philosophies is that you learn by doing, and the BALSA software is actually available to you or anyone in the FACE Consortium to play with, to integrate, to basically do whatever you'd like. If you'd like to add to it, certainly there's that as well. And again, as I mentioned, more information is found in Chapter 2.
There are four primary goals of the Software Supplier Getting Started Guide: to provide startup guidance, highlighting basic information; to look at an application; the learn-by-doing approach I mentioned, where you can download, explore, analyze, update, integrate, test, et cetera; and to assist you in gaining more understanding of the verification process. And this is one of the things that distinguishes the FACE Consortium and the FACE open architecture — it's a little bit different. It has not just the technical approach but the business approach, and it also has the verification and certification that allow you to certify that software meets the requirements of the FACE Technical Standard. So I'm going to start with primary documents. Lauren mentioned the landing page — you can get to that — and she also mentioned the Documents and Tools section; that's where I'm pulling some documents from here. I do want to point out that we recommend you use the most current edition for new development and verification, unless you're contractually required to use a previous edition. Downloading information in some cases does require a password, but that is free of charge — you must have a valid email address. The first document I want to point out is the FACE overview. If you're brand new to the FACE Consortium or want to learn a little more about it, we recommend that you start with the FACE overview. There's also a FACE 101 technical briefing, which is very good. We are also working on some updated information — not just the FACE 101, but something a little more specific that will cover the technical, business, and data modeling overviews. Our FACE Technical Standard is basically our keystone document, and all the other documentation supports it.
You may not know it, but each edition of the Technical Standard has a corresponding RIG (Reference Implementation Guide), shared data model, data model governance plan, conformance verification matrix, and conformance test suite. The RIG just provides some best practices; it provides example scenarios, and it does go into a lot of detail, but it's something that helps you understand the Technical Standard. I mentioned the conformance publications and tools. We've got a number of things there, including some information on FACE conformance 101, a certification guide, the verification matrix, the matrix user's guide to assist you in interpreting information, and the conformance test suite — a test tool that measures whether or not your interfaces or applications are built to the Technical Standard. Please ensure that if you're using Technical Standard 2.1, the conformance test suite you're using aligns with it. There are also third-party tools and applications, which can be very beneficial to you. These are not necessarily verified to pass conformance, but they're additional tools supported by third-party vendors. And then we recognize in the FACE Consortium that our documents may have issues — probably not often, but occasionally that does happen — or somebody might like to have more information or something else included. We have what's called a PRCR process: problem report / change request. Basically, a PR (problem report) identifies an issue that prevents a unit of conformance from obtaining a conformance certificate, and a CR (change request) simply means you've identified something that you would like to see included in a consortium product. The last item I want to highlight is the FACE Contract Guide. And that brings us down to Chapter 2, which again provides an overview; it talks a little bit more about setting up an environment, BALSA, how to operate BALSA, and the data model.
Most of that will be covered tomorrow. However, Stephen Simi, I want to turn that over to you. I think you're our next presenter, and you're going to talk a little bit more about a brief overview here, and then maybe data modeling and running the conformance test suite. So, Stephen. Let me get control passed over on that. Okay, let me share my desktop screen. Well, good day. This is Stephen Simi, and I'm here with Mr. Tom Brixie, both of TES-SAVi. I'm also the vice chair of the FACE Integration Workshop with Alicia. TES-SAVi has been a FACE member since the inception in 2010, and we're also a VA (Verification Authority). Together, Tom and I are going to leverage Alicia's introduction. We're going to present the contents of the Getting Started Guide and step through an example, so you become familiar with the design and the flow of this important document. But first, how did we get this document? When the Integration Workshop was incubated, about three or four years ago, the issue was that there's a mass quantity of very good technical information that is unfortunately hard to navigate. Overwhelming is a word that was often used — and that's okay; it's still good information, and improving along with the standard. So the Integration Workshop was formed with the charter to help product development, help the ecosystem of products, and help facilitate others to embrace product development using the Technical Standard. The initial task was this Getting Started Guide — as we mentioned, a navigational quick-start guide that helps readers navigate through the plethora of FACE products. The reader should quickly locate what they need and what they want, and literally grab a working example and run that example on their own machine, which is useful indeed. Essentially, as a 30-plus-year developer, I've found that when you have something that works on your own machine, you can say you're on the on-ramp of the developer highway and ready to progress, so you can hack examples and go from there.
The other thing about this guide is keeping it simple. As we look at the table of contents, you can see that for a lot of these sections the authors limited it to about five pages per topic or per section. That's not always the case, but mostly the case, where you grab a working example and show execution. So that should be the common theme to the reader: get something, grab it — kind of akin to being at the grocery store, with the aisle contents listed at the end of the aisle; you can see what's there and go down there if you want it. So, as mentioned, there are a lot of very good documents up front. I typically go through a table of contents because it gives you a preview of the document from beginning to end. So we've got a bunch of helpful documents up front. Essentially, then, one of the first things you do is set up your work environment. This is one of the challenges we have in FACE, because FACE is a software product that's agnostic to hardware systems and operating systems. Where we try to be prescriptive, we can't be at the same time — so bear with us on the abstractions. We try to give you an example, but we can't say one operating system is better than another, or one tool suite is better than another. That's just the way we have to operate. So first you set up your environment; there are a couple of examples of how to target the operating systems and grab the example. Chris tomorrow is going to go through sections 2.3 and 2.4 on the data model, so I'll defer that to tomorrow's webinar. Then we have a descriptive section here — sorry, I'm jumping too fast. Data modeling is part of the FACE standard, and this section is very descriptive about how FACE data modeling works. After you're done getting an environment and the software applications together, you're only part of the way through the FACE conformance process. The next thing to do is to have the artifacts that are used with the FACE verification products. We'll actually show you how to test your application.
But then there are some other things that the verification authorities are going to require to ensure that all FACE requirements are met, and those are met through an analysis of artifacts. So we have a section in here on what the conformance artifacts are, and an example of how you would actually map your artifacts to a requirement. Then, last but not least, section 2.6: when you have your product running and your artifacts together, you locate a verification authority and go through the verification and conformance process. So we have how to log in, how to find a VA, and then it essentially ends with reporting problems and change requests — we'll talk about that at the end of this example, because it's actually quite useful and will be used. But what Tom and I are going to do is go through this example, appendix A: obtaining a FACE user-supplied data model and testing using the FACE conformance test suite. We assume, like most people who are new, you don't know what a data model is, you don't know what the conformance test suite is, and you don't know how to get access to anything. So we're going to just walk you through this one section — we authored it together — and run through it. I will step through the actual document first. It should all be hot-linked, and appendix A is what we're walking through. It describes what you're doing: basically, you'll pull down the FACE user-supplied data model, and then you'll add in your own data model, which is the description of the messages being passed through the architecture. One of the first things you need to do is get access to some documents — there are some conformance documents. And let's go ahead and grab the test suite. These are all linked, so if it loads correctly, it takes you to a place on the FACE website, and we're going to get a test suite. Which one are we grabbing here, Tom?
Let's take 2.1.3 — so we picked one to run through the example. There's 2.1.4 a little bit further down; with 2.1.3 we'll grab the zip file or the tar file. Let's take the tar file. Okay, click on it, and it should download — if I'm connected to the internet. It doesn't look like it's downloading... ah, there we go, it's downloaded. It takes a few seconds or a little longer, and you'll pull the file down onto your own machine. Back to the document. Conformance test suite. We then need to grab an example of a data model. Oh, here are the tools — a disclaimer on the tools: there's again an ecosystem, and these tools are not vetted by the FACE Consortium, but they are useful. What we need to do is grab the shared data model, and we're grabbing edition 2.1. Okay, and we will load the FACE data model; it should download. So we've downloaded the FACE data model. Are you seeing this now on the screen? Is it showing well? Okay, so we've got the conformance test suite and the data model loaded. As a consortium, we provided an example data model under the third-party tools, so if you wanted to access that, you could. There's a whole bunch of tools available under publications. Well, I wanted to show the third-party tools first — it's not taking me to that. Let me try again; this one was slower this morning when we tested it. What we can do is just pull it up this way. Okay, third-party tools and applications: a whole bunch of different test suites. Again, these are not vetted by The Open Group, but they're available. What we did is we had a data model, and we put the data model up there so you have access to it. Scrolling down — we took the 3.0, correct? 3.1. The 3.1. Pull that down. So now in my download files, as it's downloading, I have a conformance test suite, the shared data model, and we should also now have an example of a user-supplied model here.
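The grab-and-unpack step just described can be sketched in shell. This is a stand-in demo, not the real thing: the actual archive comes from the FACE website and its name will differ by edition, so we build a placeholder archive here purely to make the commands runnable anywhere.

```shell
# Stand-in demo of the unpack step. The real tar file (a conformance
# test suite archive from the FACE site) will have a different name;
# we build a placeholder archive so the commands run anywhere.
mkdir -p demo/conformance_test_suite
echo '#!/bin/sh' > demo/conformance_test_suite/run_cts.sh
( cd demo && tar -czf cts.tar.gz conformance_test_suite )   # what the site gives you
rm -r demo/conformance_test_suite                           # pretend we start from the tarball
( cd demo && tar -xzf cts.tar.gz )                          # unpack, same as for the real CTS
ls demo/conformance_test_suite                              # shows run_cts.sh
```

After extraction you would look inside the unpacked directory for the launch script the test suite ships with.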
So with that, I'm going to go through the deck first, and then we'll pass it over to Tom. I do that because the conformance test suite will not run on a Macintosh, so we have a Linux box next to us here. We're going to fire up the conformance test suite, configure the data model, bring in our user-supplied model, configure the test suite tools — which means you select which components you want to test — and then run the test. The test takes about three minutes to run, so while we're running it, he's going to pass the ball back to me and we'll show you the data model, and then we'll go back and interpret the results. So again, what we should have is access to all the tools and working examples as we run through. So I'm going to go ahead and pass this over to you, Tom. All right. Stop sharing your screen. Very good. So what we have here is a representative Linux platform. Stephen mentioned that we have the test suite — you're looking at a terminal containing test suite 2.1.3 right now. And we also have a terminal view of the data models themselves. We have our ADS-B model from the third-party site; today we're going to compare that against the FACE 2.1 shared data model, since it was built against that. So to fire this up, all I need to do is run the script that's provided within the CTS itself. We'll bring this up, and we'll see the UI that Stephen pointed out. There we go. There are configurations; today we're going to run a data model test, so we'll just concern ourselves with that. The shared data model, which is contained within the data model directory — I happen to have it conveniently on the desktop today — goes in first. Open that up. And then the USM is also in the same folder. On a correct load, what we should get presented with, from our USM, is all the UoPs that are capable of being tested. We're going to select them all. Let's go ahead and load it up.
Here we go — we have all four, and I'll check all four here. And after we run the test, you should see each one of these components loaded. Absolutely. And then we'll show you the data model and what it looks like while it's running. Correct. And we'll just take the defaults for saving our config today, just for the sake of example. Okay. And all seems good — let's test our data model. And we're off. Passing back over. Yes, sir. Or I can just grab it — I don't think I can, because I'm not the presenter. So, for those of you who would like to see what the data model looks like while we're running it: the components Tom checked off during the configuration of the user-supplied data model correspond to the ATC PCS component, the platform configuration component, the ADS-B transponder component, and the EGI component. So going from the top level of a FACE data model, here's what we downloaded: the conceptual module, the logical module, the platform model, and then the UoP model, which is the one we downloaded off the site — and that's what we're testing. Tom, are you finished testing? Not quite yet. There's some additional information in appendix B about data modeling; we'll probably get some more data modeling training with a dedicated course. So I'm going to stop sharing and pass the ball over to you so we can show the results of the test. Oh, and the results are in. What we're seeing here is that, true to form, our data model configuration ran the UoP testing for the components we selected — the four components are listed out, and sure enough, they all passed. There we go. There are three main checks within the test itself. The first is the metamodel validation: is it a properly formed model by virtue of the FACE 2.1 metamodel?
Along with that are the OCL constraints check, as found in the data model governance plan, and the shared data model conformance check — which would flag things that might be a question if we had, say, an observable that would affect follow-on activity and need to get processed through change control. We have a passing model across all three. Thank you very much. Send it back over; I'll share my desktop. And Stephen, while you're doing that, there's a question that asks: will we need a particular data modeling tool, or is one provided? On the third-party tools page there are some free tools and some tools for sale that you can peruse. I showed you one — that's actually the one we develop in-house, and the one we're used to using, so that's the one I showed you a little while ago. So from the third-party tools you can get data modeling capabilities there in the ecosystem. Just to recap here: the results passed — the data model passed. And there's additional data modeling training, also on the same site. Now, because the standard is evolving, what you're going to find is that as you create more robust products, it's very likely that the shared data model will need additional elements added. So we will need to use the PRCR process and add those in. So what I'm going to do is go back to section 2.6 really quickly, if we have more time. Oh, went too far. Sorry. Table of contents, 2.6. All right. So let's assume you have your product together and your artifacts together. Probably the first thing you need to do is locate a VA; this is the section that identifies how to do that, and the conformance process talks about it. Now, what happens is you'll need to identify whether there are issues — whether the data model needs a new element added for you to pass conformance. Here's the ticketing system for change requests, and not all changes are bad; this one means an addition to the shared data model.
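Stepping back, the conformance run Tom demonstrated has a simple shape: feed the test suite a shared data model and your user-supplied model, pick the UoPs, and read a pass/fail per component. Here is a rough shell sketch of that flow — `run_cts` below is a mock function invented for illustration, not the real CTS launcher, and all the file and component names are assumptions:

```shell
# Mock of the CTS data-model test flow. run_cts is a stand-in, NOT the
# real tool; it just shows the inputs the real suite asks for -- a shared
# data model, a user-supplied model, and the UoPs selected for test.
run_cts() {
  sdm=$1; usm=$2; shift 2
  echo "SDM: $sdm"
  echo "USM: $usm"
  for uop in "$@"; do
    # The real suite runs metamodel validation, OCL constraint checks,
    # and shared-data-model conformance per component; we just report.
    echo "PASS: $uop"
  done
}
run_cts FACE_SDM_2.1.face adsb_usm.face \
  ATC_PCS PlatformConfig ADSB_Transponder EGI
```

The real tool is a GUI launched from a script shipped inside the test suite archive; this sketch only mirrors its inputs and its per-component pass/fail reporting.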
Here's how you get to the actual processing and checking of that, and Tom did a good job of showing how we track that example; it's in another appendix. So with that, we've presented the contents of the Getting Started Guide, and we actually walked you through an example of the flow and design of the document. So I think we're at a good place to address any questions at this point. Simon, do you want to take it back? If anybody does have any questions — we don't have any right now in our Q&A section — please send them along in the Q&A section or the chat section and we can get you an answer. We have had a few folks ask if there'll be a link to the recording of the webinar, and I want to let everyone know that if you registered for the event, you'll get an email with the link to the recording, and we'll also make it available on our website. So please type your questions in, and we'll give everyone a few minutes if there are any questions. And just as a reminder, tomorrow at 2 o'clock Central time, 3 o'clock Eastern, we will spend a little bit more time on the sections dealing with BALSA — how to access BALSA and how to get it up and running. The BALSA guide and all the other documentation, including another data model, will be included in the package for those of you who are consortium members. If you're not a consortium member right now, just bear with us for a couple of weeks — you will not have access to BALSA yet; we're in the process of getting it through PAO approval. So it looks like there's a question: how are DO-178 issues managed? For example — oops, sorry, let me pull those back up — how are DO-178 issues managed, for example, tool certification, et cetera? Steven? Yeah, as a FACE VA, this is a very good question. Primarily, the FACE standard does not require airworthiness. But the A in FACE is airborne, and we all know there's an airworthiness component. So the question is, how are the DO-178 issues managed? They're managed outside of the standard.
Primarily, when you're configuring, conforming, and verifying, you go against the FACE requirements. That probably means your contracted application has additional DO-178 requirements, and those objectives need to be satisfied elsewhere in your documentation — but that is not within the scope of the FACE standard or the verification authority. Hopefully that answers your question. There's also a document that I helped co-author; it's an airworthiness supplement to FACE. And if you write me your email — I guess it's Peter Straub asking; this is Stephen Simi — I can send you a copy of that particular document. It's an adjunct document that hits the heart of your question. Okay, the next question is: will the BALSA example be updated to the Edition 3.0 Technical Standard? Currently, BALSA is aligned with Technical Standard 2.1. Actually, it's already in the process of being updated to 3.0. As a matter of fact, we used some updates to BALSA to test some of the supporting information and alignment with Technical Standard 3.0, so it has already been partially updated. It will be updated to Edition 3.0; I can't give you a timeline on that, but yes, the plan is to update it. Okay. Other questions? Alicia, in the chat, I think there was a follow-up to the first question. It came through via chat, and it reads: we are using a DO-178-certified and FACE-conformant RTOS and FACE TSS. Steven, thank you. This is Dave Lounsbury — it was just a comment, and, you know, that's the right kind of way of thinking about it: you need to have a component that satisfies the DO-178 regime for flight safety and FACE conformance for modularity and reusability. But I think Jeff's question is: in the current BALSA set, is there a DO-178, FACE-conformant RTOS? And no, there's not.
We do know one of the BITS participants actually integrated with a FACE-conformant RTOS, but that's not the baseline for the BALSA set. And there is no FACE-conformant TSS. Okay. Yeah, Jeff said it's just a comment. Yes. We do have a few more questions that have come through. Is there already a conformance test suite for FACE 3.0? No, it's in process. Usually the tools follow the standard, and the standard is not released yet — it was a snapshot. Next question: how hard is it to migrate from FACE 2.1 to FACE 3.0? We haven't tried that yet; we're learning. Yeah, I would say it depends upon what exactly you're doing. You know, a TSS is a little bit more complicated than, say, a PCS application or something like that, so I think it really depends on what you're doing. There are some changes, and I think there's a group working on some migration approaches, things like that. But really, I think BALSA is probably the only thing that has been migrated, and it hasn't been completely migrated — it's just been using parts of the Technical Standard. This may be one of the questions you want to ask Chris tomorrow; he's the one who has been working primarily on the updates, along with Joel Sherrill and some other folks as well. So I would encourage you to ask Chris that question tomorrow. Okay. There is a summary of key changes between 2.1 and 3.0 — that's usually published with the standard, so that should be coming out. That's Corey's question. And then Pete Strahl has a question: is the test tool a certified tool? Yes — you download it off The Open Group site; it is a sanctioned tool, and it's released with each standard, as is the conformance verification matrix, which is aligned with each standard. Good questions. I don't see any more questions in the chat window. Looks like we got them all. Great. And if you do have questions after this webinar ends, please send an email to ogface-admin at opengroup.org.
And we'll be sure to get you the right answers, directed to the right people. And again, everybody will get a link to this webinar sent out to share with others. We thank you for joining, and we also encourage you to attend part two tomorrow. And with that, we are done for the day. I want to thank our panelists for joining and Simon for facilitating the webinar for us. Thank you, everyone.