Obviously, Title 10, U.S. Code, all of the acquisition authorities, and all of the authorization and appropriation acts impact us. The Federal Acquisition Regulation, the Defense Federal Acquisition Regulation Supplement, DoD instructions, and every service's own instructions. So we get lots of help from everyone on how to do acquisitions. At last count, it was 547 distinct documents telling us how to acquire things.

But of late, we've evolved our understanding of MOSA. The modular open systems approach acronym first appeared, anyone want to guess? In the 90s. So we've been talking about it for some time. What we haven't been doing is defining what we meant. Now we're getting to the point of doing that, and our understanding has evolved. In the last NDAA, so now it's in 10 U.S.C., they defined a major system platform, a major system component, and a major system interface. Yay, the government now knows about interfaces; we're very happy for that. Congress actually caught up with the idea.

We also have a standards preference, again in 10 U.S.C., where we have to prefer commercial standards and other widely adopted standards ahead of our own. Military standards and government standards are generally our last choice. We're supposed to be using what industry is providing, but in some cases we have to go help industry get there.

Here are some general types of standards that we look at; this is a bit of an eye chart, I do apologize. We have government standards that take a long time to come together, but then they tend to have a longer life. Then we have program-specific standards, and I'll show you an example of that later, where a program says, I need a standard and I need it really fast. We'll get to that in a moment. And of course, we're supposed to favor non-government standards: consortia, like the FACE Consortium and the SOSA Consortium. We can also use a proprietary standard if it's widely adopted.
Sometimes we do that simply because we need the interoperability it provides, but I won't mention Microsoft by name. And then there are standards development organizations, something like SAE or IEEE, with a long history of putting standards together. Those bodies tend to be of various sizes depending on who they're talking to, and they tend to take a long time to get things done. How long did it take for us to get 802.11? We had draft standards for six, seven, eight years before the final standard came out. So sometimes those bodies move slowly as well.

There are two methods we have used in the government to get to standards more quickly than waiting for industry to provide them. The first is the collaborative working group. This is where we have a limited number of parties, often because it's a classified effort or there are sensitivities in the programs. This is paid performance: we put people on contract to develop a standard for the government, so we get all the data rights we need and we have everything put together. The benefit is a fairly predictable schedule of labor. We know people are going to be working on this for X hours per week; we're paying for it. So we can get fairly rapid development of a standard that way. The problem is that because you're limited in how many people you're willing to pay, and in who may or may not have access, that can really hinder further adoption. You're getting a soda-straw view of the requirements, so you don't tend to end up with a good, widely open standard.

Consortia, by contrast, are much more open. It's pay to participate, so people need to see an incentive; there needs to be a business model for the consortium to encourage people to see the value in it. It's diverse, it's voluntary labor. And of course, schedules, what's a schedule?
I mean, we can set up schedules, we can set objectives and goals, but we don't have as much determinism in achieving those in a broader open consortium. The undeniable benefit is that the broader the consortium, the more open the standard will be, and the more adoption and uptake will come about.

As an example of the collaborative working group, I'll talk about Open Mission Systems, or OMS. This is government owned. A collaborative working group was funded for years to do this, and we got to a standard; the first edition was out in less than two and a half years, which is pretty good, and it was actually fairly good. We're now out to version 1.3. This is a single abstraction, basically a service-oriented bus architecture where we defined just one layer; this is it. But we also used other work that had already gone on in a particular organization, the Universal Command and Control Interface, UCI, as it's now named, and that message set was used as part of the specification. What this does is allow us to integrate things more rapidly and keep things moving. We can put things together quickly, with high specificity, and have them integrated.

The issue with this standard is that it's focused on mission systems in the Air Force definition of that term: the non-safety-critical, non-real-time type systems. Additionally, it is restricted to US persons only, simply because of the genesis of the requirement and some of the messages that were built to go with it. That drives a level of sensitivity that makes it really hard for the rest of the world to adopt the standard. Now, it's adopted in many Air Force programs; it's been mandated for mission systems in the Air Force, and in one Navy program so far. So as you can see, there is a definite impediment to broader adoption.

Now, the FACE Technical Standard. This comes from one of the two consortia I work with in The Open Group.
And I'm happy to tell you that the FACE Consortium has just put live version 3.0 of its conformance test suite. That's a tidbit; the official announcement will be tomorrow. The FACE Consortium was, again, a volunteer consortium of people coming together. With the safety, security, and general-purpose profiles, it tried to cover a very broad area, and it defines three interfaces and five layers. After version 1.0 and into version 2.0, we decided we really needed to have a data model, and that's growing: a snapshot of the Universal Domain Description Language, or UDDL, has just been released as well. So this grew into much more than it was initially intended to be. The soda-straw view wasn't there; it was much broader. As a result it has been more open, and it's much easier for folks to adopt in other places.

Next, the Sensor Open Systems Architecture, or SOSA, Consortium. We use these words here because our general said to: integrative, inclusive, and incremental. What we mean is that the standard is trying to integrate a lot of other existing standards in a way that makes sense for sensors, for DoD and beyond. Most of our DoD sensors are one-offs or two-offs or three-offs. We put one or two of them onto a particular platform and have that platform go fly its missions. Then we get into the situation where, okay, now we need to do something different. Well, we need another platform, because this one's only got that on it. So one of the goals of SOSA is interoperability, the interchangeability of parts.

An update as of today: the second snapshot of the SOSA technical standard is published. Yay, or no, published this week, I'm sorry, I don't want to upset Judy or Sharon, but it will be published before the end of the week. The challenge here is that there's a lot going on in that standard and it's very active; it's doubled in size in less than a year.
So there are a lot of workers together, and the focus is being somewhat driven by the urgent needs of programs today. We're bringing that abstracted information into the consortium and pushing it forward. We're struggling with architectural independence because we're getting a lot of hardware-specification-type input as well, so it's a bit of a challenge to manage.

One thing about all of those efforts is that we're trying to keep them balanced. We've got conflicting interests: government interests and industry interests. So from a strategy perspective, balance is the key word. Obviously, you need to not tick off too many people, and please enough people to maintain your involvement and keep the work going forward.

In order to do that, I'm using the carpenter's rule here: use the right tool for the job. And to do that, you have to know what the job is. You have to be familiar enough with your challenges, your issues, your problem set, that you can effectively explain it to a five-year-old, all right? You also have to be familiar with your tools, and train if needed, because you want to have the right tools in the right place to do the job. You have to apply them properly. You can drive a nail with a hammer, or you can drive a nail with a pipe wrench, but I don't recommend it. And you cannot drive a deck screw with a pipe wrench. So use the tools you have properly, and then maintain your tools. One thing we've seen on the government side is that we have a terrible time keeping our software up to date, because it takes somewhere between six and twelve months for us to get certification to run a piece of software on our big, integrated, super-secure network. Remember, security is the enemy of usability.

Now, when it comes to supporting these standards forums, we have business strategies there. We can adopt any one of these roles. We can be passive: wait and see, it might be good, we'll just wait and see what comes out.
We're not going to worry about it too much. We can participate: we come in and say, hey, we're going to serve on a couple of committees and provide some input. We can also be pushing a standard, and that is roughly where we are with the SOSA standard today; the government is pushing that standard. Now, that last role, possessive, is generally only true of collaborative-working-group-type standards or of proprietary standards, where you own the standard and you're driving it through with all the funding effort.

When it comes to adopting standards, we can take all of these approaches too, and the old breakfast joke applies here: in a bacon-and-eggs breakfast, the cow's indifferent, the chicken's involved, but the pig is truly invested. If you're indifferent, that's one of those cases where, eh, I'm not going to worry too much about the standard; it's not a big deal to me, and I don't see it breaking my business model. We can also say, oh, this is one we need to be involved in. And then there are times, like with SOSA, where we're invested, because we started up the effort to get the standard going and chose a consortium method to do it. So the government adopts the different levels depending on the nature of the standard and who's driving.

How do we apply these standards once they're out there? Well, we have many approaches. We can mention them, like the MOSA approach back in the day: oh, yeah, yeah, do that, with no further instruction, no references, no nothing. That's not terribly useful. We can also mandate them: go use the standard, again without providing any instruction. We can say, you should use the standard. Now, standards should be applied in the right places, and they should be applied in the right ways, so you have to be careful with that. Next is mediated. That means you're actually out there telling people, this is how you use the standard. Here's some guidance for you, here's some do's and don'ts, here's a checklist.
You're giving them enough to work with. And the last one is one we have not achieved yet: modeled. If we model a particular standard or methodology and say, here's the model built from this, that's a much better starting point for our programs going forward. We're hoping to achieve this within the next decade.

Conformance strategies. Conformance is important to us. You want to know why? Because everyone will tell me, yeah, I did that. We need to know for sure; we need a way of assessing. So conformance programs are important to us. We have different ways of looking at this. We can have compliant: a supplier verbally tells us, hey, we've complied with the parts that were applicable to us, but we ignored these because we don't like them. Well, they're compliant to a degree; that's partial satisfaction. If they've got the key interfaces you're worried about, that might be okay in some cases. Conformant means there is complete conformance to all of the applicable standards; every last one of the applicable standards has been conformed to. That's important to us, because then we know that if we bring someone else in at that interface, they're conforming to the standard, and if we have an issue, there might be an issue in the standard, but generally not. Finally, confirmed. This is where we have an independent verification of conformance. This can be done in a number of ways, but if there's an independent verification, I find that to be very valuable from a government perspective, because it's not just the opinion of the party bringing it forward.

Now, lessons learned. Specificity is paramount. This is very important, because we've seen a lot of applications of standards in the government where they were just left out there and weren't properly mediated. They were mandated without any further guidance on when they didn't apply or how they fit together.
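As a rough illustration, here is a minimal sketch of the compliant / conformant / confirmed taxonomy as a classification over a set of applicable requirements. This is my own simplification for exposition, not anything drawn from an actual conformance program; the requirement IDs and the `Claim` structure are invented for the example.

```python
from dataclasses import dataclass
from enum import Enum

class Assurance(Enum):
    COMPLIANT = "compliant"    # partial: only some applicable requirements met
    CONFORMANT = "conformant"  # all applicable requirements met (self-asserted)
    CONFIRMED = "confirmed"    # all met, and independently verified

@dataclass(frozen=True)
class Claim:
    requirements_met: frozenset           # requirement IDs the supplier asserts
    independently_verified: bool = False  # did an outside assessor check it?

def assess(claim: Claim, applicable: frozenset) -> Assurance:
    """Classify a supplier's claim against the applicable requirement set."""
    if not applicable <= claim.requirements_met:
        return Assurance.COMPLIANT   # at best, partial satisfaction
    if claim.independently_verified:
        return Assurance.CONFIRMED
    return Assurance.CONFORMANT

# Hypothetical interface requirements for illustration.
reqs = frozenset({"IF-1", "IF-2", "IF-3"})
assert assess(Claim(frozenset({"IF-1", "IF-2"})), reqs) is Assurance.COMPLIANT
assert assess(Claim(reqs), reqs) is Assurance.CONFORMANT
assert assess(Claim(reqs, independently_verified=True), reqs) is Assurance.CONFIRMED
```

The point of the sketch is the ordering: self-asserted partial satisfaction is the weakest signal, and only independent verification moves a claim out of the realm of opinion.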
So you have to understand how and where each standard applies, or may apply. You may have an interface where two different standards could apply; how do you choose the best one? You need to be specific about your intentions in that particular case. One thing we're moving towards is to model what you know, or in some cases, model what you believe to be true. A more formal model actually drives you to the point of understanding your requirement, and in the government, understanding your requirement is a radical concept. We know we need something to go boom, but we have to get through the various steps of how to do that: what's acceptable, what's not, what the restrictions are, how it all comes together.

The second bullet there: be open to other views. We in the government have tended to be very narrowly specific in our views. All of our programs are funded in silos, so we have excellent silos. But if you don't want corn, you're in trouble, because that's what we put in our silos. How do we get between the two? There are no bridges across; there's nothing there. We've been incentivizing people to do things in an isolated fashion. The benefit is that they tend to be really good at that. The lack is that they tend to be very bad at talking to others about what could help them. So we have no enterprise-level crosstalk, and we're beginning to approach that from a digital engineering and modeling perspective to say, we need to change that. If you're open to other views, there may be another approach. It may not be optimal for your program to do X, but it may be optimal for your enterprise. Can you take a 95% solution and work with that in most cases? The answer is yes. In many cases, you can work with an 80% solution that is cheaper across the board and satisfies several other people's needs. We haven't had a way to do that yet, so we have to work on it.

The other thing is tooling.
There may be other tools available, and different methods than you've been using. If you've always done it this way, this way may not be the best way to do it. It might have been, 10 or 15 years ago when you started, but you have to look forward. The other thing we see is folks not planning and not taking reasonably sized steps. As we say, if you go for the all-or-nothing approach, nothing is definitely a possible outcome. So you have to be careful to understand how big your effort is and what you're going to try to do. That's why the Sensor Open Systems Architecture is the sensor open systems architecture, not the everything-OSA. You have to have a specific focus that scopes it appropriately for you to work in.

Here's another one, from a talk General Pawlikowski gave us at an open-standards event at Wright-Patterson Air Force Base: you need to understand the risks of the status quo. What she said is that we're really good at seeing the risk of change, but we're really terrible at seeing the risk of not changing. So you need to understand the risk of your status quo. Is your software 20 years old, accreting all that time, with nobody bothering to re-architect it because you've been trying to get the next capability in? A lot of our aircraft are in that position today, and how to get the re-architecting done that needs to be done is a difficult problem. But as we say, if you're a fan of The Princess Bride: are there rocks ahead? If there are, we'll all be dead. Assess the risk of change, of course, but also assess the risk of not changing. That's the thing we tend to miss the most.

So, in conclusion: open's not optional. You have to find the balance among all of those competing interests as you go forward. You know your situation from one perspective; others may see it from another. Talk to people about it.
Understand your situation from more than just your point of view. And collaborate wherever possible; there are other people out there who have had other experiences that may contribute positively to what you're trying to do. And finally, calculate all your risks, including the ones that you can't easily see unless you take a step back.

John, thank you. Please take a seat. You threatened not to leave enough time, but you did, so we've got some questions for you. You mentioned CWGs in the presentation. Is there any discussion of leveraging The Open Group for the open architecture CWG recently announced? I promise I didn't write that.

The answer is, I don't know. That's not my lead, so I can't provide an input on that. So, in the best government-ese: no comment.

Where does OMS fit in the support and adoption strategies, with part b being: is there a roadmap to more government direction?

The Open Mission Systems standard is successful in its own right because it started with a very small, very sharp focus, with some good people working hard to bring it forward. One of my goals, and this is a personal comment, not a government comment, is to get the OMS standard to separate itself from the UCI standard, such that the service-oriented architecture and the layering they've selected are retained but can then be used in a data-model environment similar to what the FACE Consortium is bringing forward. I think that is the place for it. A lot of folks don't know it, but the OMS effort was initially given a copy of the FACE 1.0 Technical Standard as a starting point so that they didn't have to reinvent wheels. And indeed, they ended up with significant commonality in the APIs, the POSIX APIs chosen. So there is a lot of commonality between them; the significant differences are in the message-set utilization.
You gave us a sneak peek that the latest version of the test suite is coming out for the FACE Technical Standard. What's the significance of that, of having a test suite generally, and specifically an updated version?

Well, the conformance test suite is important to the government from the perspective of giving us independent assurance that someone has achieved the intent of the standard in creating an open interface, so that we'll have the ability to interchange as needed later on. Our goals are to drive our costs down and drive our response times down so that we can make changes more rapidly. As it is today, with a legacy architecture, it can take years to make a change to a platform. We'd like to cut that down to a matter of months in the future, with the appropriate standards instantiated at the appropriate places in our platforms.

A question has just come in: have you considered the application of data science to your standards modeling efforts?

Indeed we have. The problem is, ask anyone in any of the services: what's the specialty code in the Air Force for a data scientist? We don't have one. What's the specialty code for a modeler? Oh wait, we don't have one yet either. We have to make some changes to our personnel and training systems, and in the meantime, we will be buying services. So if you offer those services, be aware that the government is interested in finding them, at the right price, with the right focus, going forward.

Specificity is often considered the enemy of innovation. How do you balance these things?

Well, specificity in understanding what you're trying to do is critical. But again, as I said, also leave yourself open to the views of others. When we take a view of something, we have our own perspective, our own mindset, coming forward with it. There's more than one way to skin a cat, if one were to try to skin a cat.
I prefer to just pet them, myself. But it's a matter of simply being specific in the right places. For instance, if we're talking architecture, from my perspective one of the main reasons to build an architecture is so that you can specify interfaces. You specify interfaces sufficiently to allow interoperability and interchangeability. To do plug-and-play the way Microsoft and Intel have done with USB took years, millions of dollars, and a lot of concerted effort to get to the point where you can literally just plug a device in, it will load its drivers, it will start up, and it will work. If you remember USB 1.0, it wasn't quite that advanced. Having that level of interoperability is a goal, and you don't get there unless you highly specify the right parts of your interfaces.

Which leads perfectly to our final question: you explained that SOSA was the Sensor Open Systems Architecture because it needed to be something you could focus on. Nonetheless, it's focused on military sensors right now. Is there thought in the group of how it could be more generally applicable in the world of IoT and IIoT?

Absolutely. We don't want to step on the IIoT guys at any point in time, but we are looking for commonalities between us. We're pursuing a dual path in the SOSA Consortium. The architecture level is intended to be generic enough and common enough to allow us to look at the modalities, not necessarily the military instantiations, but the modalities of the various sensors we understand today. It is certainly extensible in the future, and we do want to see if we can get it aligned with where we're going with all other types of sensors. We don't want to diverge from where industry is headed to any degree at all, but we do have some driving requirements on the specification side to get us to an initial capability for programs.
When you're talking about a consortium, a consortium needs the pull from the customer, from the government in this particular case. So we need to drive that forward by telling people we have a need. We do that through programs, and the programs themselves have the funds to help this happen. So we're balancing those two conflicting requirements: delivering near-term specifications for programs while also building the long-lasting architecture.

We'll leave it there. Thank you very much.