Good, hopefully you can see my screen. I wanted to frame this talk, which is about decentralized semantics, by giving an overview of a Dynamic Data Economy. That's really a decentralized trust infrastructure for safe and secure data exchange, acutely aligned with the European data strategy. As many of you know, in Europe there's been a huge drive, and quite a lot of finance, behind the paradigm shift in data management that we're about to embark on. The ultimate aim is for actors to have transactional sovereignty: to share accurate information bilaterally. But in order to do that, we need to underpin these distributed data ecosystems with governance, semantics and authentication. This is the Master Mouse model. It's basically a fractal model, so you can use it internally within an organization or externally within a distributed data ecosystem. One of the things we always say at the Human Colossus Foundation is that data is like electricity: it has value when it flows, but it's costly when it stagnates. That's where the dynamic part of the data economy comes in; dynamic means movement, so it's really about the movement of data. Just very briefly, everything in here is citizen-centric, and we've also got a space for guardianship. For instance, if a citizen has dementia or something like that, some of the legal authority might be passed over to a guardian; those are the two spaces at the top.
We very much look at this as a service model: purpose-driven services as opposed to insights-driven services. Purpose-driven services could be anything from a taxi service to a clinical service; that's really where new data is captured into these distributed data ecosystems. On the other side of the model are insights-driven services: organizations or individuals searching for existing data within the ecosystem. Gluing all of that together is the idea of a data governance administration, which sets the policy, notice, rules and regulations for safe and secure data sharing within its ecosystem. As I mentioned before, there are synergistic data domains that all have to work harmoniously together, and the DDE concept gives equal weight of importance to the four core data domains. The semantic domain is about data capture and objectual integrity within the system. The inputs domain is about data entry, the data inputs; that's the factual authenticity of any events in the system, any temporal events. Basically, as soon as you mark something with a signature, you can consider it a fact. Then there's the governance domain, which is all about data access, and that's really consensual veracity for the policy, notice and regulations of the ecosystem. Once all of that underpins the ecosystem, you can finally talk about the economic domain, which is about data exchange, and we call that transactional sovereignty. Interestingly, the transactional sovereignty part is also where you sort out consent agreements: you can really only agree to something once you have a peer-to-peer relationship with another entity. That's transactional sovereignty.
This is a model that has literally taken us about three and a half years to settle on; it's taken a long time. I've learned in this space that patience is a virtue, and we've listened to experts in all of these domains: semantic, inputs, governance and economic. Very simply, you can think of it like this: the semantic domain is all about objectual integrity, the objects in the system. The inputs domain is about the events and the factual authenticity of events in the system. The governance domain is about the rules for policy, that's consensual veracity. And finally there's the economic domain, which is about transactional sovereignty, where you agree to data sharing and consent. At Human Colossus we built the DDE trust infrastructure stack, and it is very much ordered. We say that if you put rubbish into the system you'll get rubbish out, so we really start with the semantics and objectual integrity, which is about data harmonization within the ecosystem. Once you have that, you can look at factual authenticity, which is really about data provenance: where the data has come from, with a verifiable, auditable chain of provenance. That's the inputs domain. Those first two layers are all about the machines. Once we have that digital network in place, we can put the human domains on top: the governance domain at layer three, which as I say is policy, notice and the rules and regulations within a distributed system, and finally the economic domain, about transactional sovereignty in bilateral, peer-to-peer exchange. Those top two layers are about human accountability, so we say a trust infrastructure equals cryptographic assurance plus human accountability.
In this particular series, we're really concentrating on the underpinnings that enable that transactional sovereignty from the economic domain, so we won't go into the economic domain too much, but rather into the underpinnings of the ecosystem that allow it to happen. As I say, the DDE is a next-generation data-agile economy offering a new paradigm in digital living, interaction and growth, with a vision of empowering people and businesses to make better-informed decisions based on insights from harmonized, accurate data, framed by sound data governance. This week I'll be talking about decentralized semantics. I'm very much a semantics nerd, if you like; I've been working in semantics for nearly 30 years in the pharmaceutical industry, where we deal with a lot of data to enable drugs to be approved by the FDA or the EMA. So yes, I know a lot about semantics, and I'll be talking about that this week. Next Tuesday my colleague Robert Mitwicki will talk about decentralized authentication, and in the third week I'll be talking about distributed data governance. This is what we call the accurate data pyramid, and it follows what I said before. We start with data integrity: what is it? That's the deterministic objects, and traditionally you can think of this layer as data management; the core characteristic here is data harmonization. If that's in place, you can look at the next layer, which is authenticity: where does the data come from? As I said, it's about provenance, the factual authenticity of the information being entered into the system, and you can think of that as key management. Finally, the third layer is governance; think of it as access management, the epistemic rules for policy and notice that enable consensual veracity within a distributed data ecosystem. Okay, that's enough of the overview.
I'll get straight into the decentralized semantics part. Decentralized semantics is a new way of thinking about semantics; it's really about separating the semantic tasks for maximum interoperability, and as I go through this presentation you'll get a better idea of what that means. First of all, for those new to semantics: what is data semantics? It's the study of the meaning and use of data in any digital environment. In data semantics the focus is on how a data object represents a concept or object in the real world. Within the semantic domain, the core foundational characteristic is objectual integrity. Objectual is an important word here, because it means not just the objects themselves but also the relationships between objects within a system. Integrity here means that the identifier of an object always refers to the same object wherever you see that identifier, across any data ecosystem. That provides the overall accuracy, completeness and consistency of objects and their relationships. In the four-domain model I showed you before, what we'll be talking about today is the top-left quadrant, which is all about digital objects. You can ignore most of the words here; I won't go into that amount of granularity at this stage, because I've got another slide later that goes into it in more detail. The important thing to note is that data is the kernel of the entire economy: all four domains link to data, whether it's capture, entry, access or exchange. There are a few core DDE principles for the semantic domain; we basically built on top of the FAIR principles, which I'm sure many of you have heard of as you've delved into the semantic space.
But we add these four additional items. First, rich contextual metadata: the captured context and meaning, the metadata for all payloads, must be rich enough to ensure complete comprehension by all interacting actors, regardless of written language. Second, structured data capture forms: data governance administrations must publish structured data capture form specifications and standards, driven by member consensus, for a common purpose or goal that will ultimately benefit the global citizens and legal entities they serve. Third, and this is really the key, data harmonization; I'll go into it a little further on, but this is really the crux of decentralized semantics and what it can offer. There are two areas of distinction to consider: data harmonization, which involves transforming data sets to fit together in a common structure, and semantic harmonization, which ensures the meaning and context of the data remain uniformly understood by all interacting actors, regardless of how it was collected initially. Harmonized payloads are a must for multi-source observational data, to ensure the data is in a usable format for machine learning and artificial intelligence. The last one is deterministic object identifiers, which is really important for distributed data ecosystems because, as I said before, we're looking at a paradigm shift. We're no longer looking at centralized silos of data whose owners name their identifiers however they like; as soon as you go into a distributed model, the identifiers should be deterministic for all objects. If the result and final state of any operation depend solely on the initial state and the operation's arguments, the object is deterministic. So all object identifiers must be resolvable via the object's digest to be deemed deterministic.
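As a minimal sketch of that last principle, the snippet below derives an identifier from a digest of the object's own canonicalized content, so anyone can recompute and verify it. This is illustrative only: it uses plain SHA-256 over sorted-key JSON, whereas the actual OCA derivation of self-addressing identifiers differs in detail.

```python
import hashlib
import json

def deterministic_id(obj: dict) -> str:
    """Derive an identifier from the object's own content.

    Serialize with sorted keys and no insignificant whitespace so the same
    object always yields the same bytes, then hash those bytes. Any party
    can recompute the digest to check the identifier resolves correctly.
    """
    canonical = json.dumps(obj, sort_keys=True, separators=(",", ":"))
    return "sha256:" + hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# A toy capture-base-like object (field names are illustrative).
capture_base = {"type": "capture_base", "attributes": {"name": "Text", "dob": "Date"}}
cb_id = deterministic_id(capture_base)

# The identifier is stable for identical content, and any change to the
# object produces a different identifier.
assert cb_id == deterministic_id(dict(capture_base))
assert cb_id != deterministic_id({**capture_base, "attributes": {"name": "Text"}})
```

The design point is that the identifier carries no naming authority at all: whoever holds the object can verify, offline, that identifier and content match.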
This is the digital network model. We also have a socio-economic model, which delves into the governance and economic domains, but at this stage we're just talking about the digital network, so this is really about objects and events. The objects sit in the bottom hemisphere, in the semantic domain, and the events are about the factual authenticity of what's put into the system, the data inputs, so they're in the northern hemisphere. We built this as a yin-yang model: there's always an opposite on each side. We can start with the schema and the record; they're the persistent objects in those two spaces. A schema is where you add the structural integrity of the objects, and on the other side is how that information is stored, usually as a database record. So schema and record are opposites of each other. I've also separated out the form and the credential, for those of you working in the SSI space who talk a lot about credentials. A credential is a transient object that allows you to do something, and the information that goes into a credential usually comes from a form. Think about your driving licence: you fill in some information on a form, you hand that to an authority, they stamp it, and then they give you the credential, which is your driving licence. Everything in the semantic domain has to be deterministic.
You've probably heard the term immutable data quite a lot. Actually, in a distributed system immutability is not particularly useful, because you might run into scalability issues with it, so we talk about deterministic objects instead. If you have the authorization to delete an instance of an object, that's totally fine, you can do that; it's not immutable, it's deterministic, so wherever that identifier ends up in any system, you know it refers to the same object. In the middle, just around the horizon, you'll see data entry versus data capture: active and passive. Active basically means control. If you enter information into a system, somebody has to control that information, so you sign it, and then it becomes a fact. On the other side of the model, the semantic domain, we don't talk about control at all; it's more like a language than a matter of control. I'll explain a little later how, if you issue something, you know it was issued by a certain legal entity. So what is decentralized semantics? It involves separating all semantic tasks into separate objects so that different entities can act as custodians of those granular objects, and I'll show you a couple of diagrams of what that looks like. Here you go, this is what it looks like. It's really about maximizing interoperability of the semantic structures within a distributed data ecosystem. You can think of a capture base as the schema in its most basic form. The only other thing a capture base has is a flagging block, where the issuer can flag any sensitive information within the capture base. Literally everything else is a separate semantic object that you can cryptographically link to that capture base.
So here you've got the semantics box, the inputs box, the transformation box and the presentation box, and I'll go into those in more detail in a minute. Okay, so why do you want to decentralize semantics? I thought I'd start with a question: how can insights-driven services perform accurate criteria searches within a distributed data ecosystem that has semantic properties from several data models or representation formats? A good one to think about is the healthcare sector. In clinical research they use data models from CDISC; on the hospital side, for the exchange of health records, there's HL7 FHIR; and there's also a big European project called EHDEN, whose data model of choice is OMOP. So how are you going to get these data models or representation formats to understand each other? That's why decentralized semantics is really important, and the answer to the question is data harmonization. This is a picture of a distributed data ecosystem, another visualization of it. All it really shows is that citizens need to interact with these distributed data ecosystems, so you'll see the citizens in the ring around the data ecosystem. Purpose-driven services, if you remember from the Master Mouse model, are where new data is entered into the system.
In this case I've drawn the data capture ring, if you like, with a bunch of icons for purpose-driven services, and pinning it all together is the data governance administration for the ecosystem. There are a few icons in there: the rules, the legalities behind them, the standards, transparency, policies, requirements and regulations. Okay, so I said that the core part of decentralized semantics is data harmonization; there's just no way of properly finding the information you want without harmonizing the data. The transformation overlays can be held and controlled by the purpose-driven services and then cryptographically linked to a capture base to harmonize that information. I've mentioned four transformation overlays here: one for unit mappings, one for attribute name mappings, one for entry codes, in case you need to map those, and then standards mapping, which is more of an informational piece saying you're going from one standard to another. There are a couple of interesting use cases that decentralized semantics brings about. One of them is internationalization, which is a common problem within data management; internationalization is the action or process of making something international. The internationalization of transient digital objects across distributed data ecosystems is essential for service providers to participate in a global market.
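To make the transformation overlays concrete, here is a small sketch of harmonizing a locally captured record with an attribute-name mapping and a unit mapping. The overlay shapes, field names and conversion are all illustrative assumptions, not the OCA wire format.

```python
# Hedged sketch: applying transformation-style overlays to harmonize a record.

def apply_attribute_mapping(record: dict, mapping: dict) -> dict:
    """Rename locally captured attribute names to the agreed capture-base names."""
    return {mapping.get(name, name): value for name, value in record.items()}

def apply_unit_mapping(record: dict, conversions: dict) -> dict:
    """Convert values captured in local units into the agreed units."""
    return {
        name: conversions[name](value) if name in conversions else value
        for name, value in record.items()
    }

# A purpose-driven service captured weight in pounds under its own field name.
local_record = {"patient_weight_lbs": 176.0, "name": "A. Smith"}

attribute_overlay = {"patient_weight_lbs": "body_weight"}               # name mapping
unit_overlay = {"body_weight": lambda lbs: round(lbs * 0.45359237, 2)}  # lbs -> kg

harmonized = apply_unit_mapping(
    apply_attribute_mapping(local_record, attribute_overlay), unit_overlay
)
assert harmonized == {"body_weight": 79.83, "name": "A. Smith"}
```

The service keeps capturing data its own way; the overlays, linked to the shared capture base, carry the mapping to the consensually agreed specification.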
Traditionally, presenting information for a purpose-driven activity in a language understandable to all recipients has involved replicating digital forms, credentials, contracts and receipts into various languages based on user preferences. With federated or centralized entities maintaining digital objects in multiple languages, internal data management inefficiencies are common to many organizations, institutions and governments. In this particular example we're looking at Switzerland as a quadrilingual country, where the national languages are German, French, Italian and a minority language called Romansh, which I'm sure not many people on the call will have heard of. Decentralized semantics is well suited to represent this principle in digital information systems. Since Switzerland is a federation, the sovereign cantons can define their official language according to the primary language spoken in their region. So, for instance, the Swiss government could issue a capture base and some core overlays, while some of the language-specific overlays could be controlled, or held in custodianship, by the cantonal regions. Another use case is presentation. This is really about the ability to cryptographically bind presentation overlays to the standard capture base used for authentic data entry. In many presentation instances, the legal entity that issues the original capture form may differ from the entity that issues the presentation objects required to produce an associated credential. For example, national passport issuance provides an opportune use case to demonstrate the advantages of this particular characteristic. Many of you will know ICAO, the International Civil Aviation Organization.
ICAO is a specialized agency of the United Nations tasked with planning and developing standards for safe international air transport, and among those are a lot of standards for machine-readable passports. They could be the custodian, if you like, of the data capture form for the information anybody globally would capture for a digital passport. For the presentation objects needed to produce a national passport, the custodianship would differ. As an example, for Switzerland, the Swiss government would be the authority acting as the primary issuer of the presentation overlays for the passport, whereas ICAO could be the controller of the initial capture base and some of the core overlays. So it's a really interesting dynamic we're looking at here: it's about multi-stakeholder collaboration on semantic objects, where literally any of these layers can be controlled by totally different legal entities, departments or individuals with authorization for custodianship of those objects. When we talk about semantic interoperability, this shows you where its power lies. The final part of my talk (let me just check how I'm doing for time; not too bad) is Overlays Capture Architecture, a global solution for data and semantic harmonization that we've been working on at Human Colossus for about the last four years. It's really a technical implementation of what I was saying about decentralized semantics: what does that look like from a technological perspective?
So here's the common diagram that we use, and I can give a little more information about some of these boxes at this stage. The semantics box is really all about the context of the data, giving meaning to the data that you're capturing. You'll see some of those are language-specific overlays, and what they enable you to do is, rather than changing the entire semantic structure, change just the labels from, say, English into French. Instead of rebuilding the entire semantic structure, you build a label overlay in the new language, slot it into the stack, and then you have the option of presenting in that new language. The inputs box is all about data inputs; this is where you can put a little control around what people are entering, which is really important when you think about data insights and analytics. If there are too many freeform text fields floating around, they become a potential point of attack, and more importantly they're not particularly useful for data insights. The transformation overlays are really about making sure you can go from the current way you capture data as a service provider, mapping to a consensually agreed data capture specification from the data governance administration. Finally, the presentation box allows you to present different objects, still cryptographically linked to the same capture base, with different legal entities controlling those different objects, whether it's a credential, a receipt, a contract or a form. It doesn't matter; different people can control them. Okay, so what is Overlays Capture Architecture?
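The label-overlay idea above can be sketched as follows: only the language-specific layer is swapped at presentation time, while the capture base and every other overlay stay untouched. The overlay shapes, locale codes and the placeholder digest are illustrative assumptions.

```python
# Hedged sketch: selecting a label overlay by the viewer's preferred locale.

label_overlays = {
    "en": {"capture_base": "sha256:abc...", "locale": "en",
           "labels": {"given_name": "Given name", "dob": "Date of birth"}},
    "fr": {"capture_base": "sha256:abc...", "locale": "fr",
           "labels": {"given_name": "Prénom", "dob": "Date de naissance"}},
}

def labels_for(locale: str, fallback: str = "en") -> dict:
    """Pick the label overlay for the requested locale, falling back
    to a default language when no overlay exists for that locale."""
    overlay = label_overlays.get(locale, label_overlays[fallback])
    return overlay["labels"]

assert labels_for("fr")["dob"] == "Date de naissance"
assert labels_for("de")["dob"] == "Date of birth"  # no German overlay yet: fallback
```

Adding support for a new language then means publishing one small overlay, not reissuing the whole semantic structure.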
It basically represents a form, which is domain-agnostic, and a schema, which is more domain-specific, as a multi-dimensional object consisting of a stable capture base and interoperable overlays. Let me explain the difference between a form and a schema. From my experience with clinical data management, you would usually have some sort of schema representation per domain, something like demographics or adverse events; they would be held as records within two different domains. A form, on the other hand, is domain-agnostic: on a form you could be capturing information that might go into multiple records held in different databases. By introducing overlays as task-oriented linked objects within the OCA bundle, the architecture offers an optimal level of both efficiency and interoperability in alignment with the FAIR principles. Primarily devised for semantic object interoperability and privacy-compliant data sharing, OCA is a proposed global standard for data capture that promises to significantly enhance the ability to define, manage and use data in terms of simplicity, accuracy and allocation of resources. Okay, so why OCA? As I said, it offers a solution to harmonizing data between data models and data representation formats. The key benefit of OCA is that different actors from different institutions, departments and so on can control specific task-oriented objects within the same OCA bundle. In other words, different actors may have dynamic control over assigned overlays rather than the entire semantic structure. Object interoperability is key in a data-agile economy, where multiple institutions can participate in complex use cases. If you think about something like a global supply chain, for example, this is where OCA can be particularly useful.
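The bundle idea can be sketched like this: each overlay binds to the capture base through the capture base's digest, so different custodians can publish overlays independently while staying verifiably linked to the same stable base. The digest function, object shapes and overlay names below are illustrative assumptions (real OCA uses self-addressing identifiers and a defined serialization).

```python
import hashlib
import json

def said(obj: dict) -> str:
    """Illustrative digest-style identifier over canonical JSON."""
    blob = json.dumps(obj, sort_keys=True, separators=(",", ":")).encode("utf-8")
    return "sha256:" + hashlib.sha256(blob).hexdigest()[:16]

capture_base = {"type": "capture_base",
                "attributes": {"given_name": "Text", "dob": "Date"}}
cb_id = said(capture_base)

# Overlays reference the capture base by digest, not by a minted name,
# so any holder can verify the linkage without a naming authority.
label_overlay_en = {"type": "label_overlay", "capture_base": cb_id,
                    "locale": "en", "labels": {"given_name": "Given name"}}
meta_overlay = {"type": "meta_overlay", "capture_base": cb_id,
                "name": "Passport application"}

bundle = {"capture_base": capture_base,
          "overlays": [label_overlay_en, meta_overlay]}

# Bundle check: every overlay must reference the capture base digest.
assert all(o["capture_base"] == said(bundle["capture_base"])
           for o in bundle["overlays"])
```

That content binding is what lets, say, one authority control the capture base while other entities control individual overlays in the same bundle.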
You can't expect everybody in a global supply chain to be on the same network and able to communicate with each other, and OCA enables different parts of the semantic structures to be controlled by different people. For instance, if you're dealing with suppliers in Japan, those Japanese suppliers could control language-specific overlays within some of the data objects, which are globally recognized by the data governance administration for that ecosystem. Here are some core OCA characteristics: task-oriented objects, content-bound objects, deterministic identifiers, simplified data pooling, stable capture bases, flagged attributes (I'll show you flagged attributes in a minute, when you see some code), internationalization, object presentation, and composability. OCA is totally ontology-agnostic, you can have multiple issuers, and it works cross-platform. A capture base is a stable base object that defines a single data set in its purest form, providing a standard base to harmonize data. The object defines attribute names and types. The construct also includes a flagging block, allowing the issuer of a form or schema to flag any attributes where personally identifiable information may be captured. It could be PII, but it might also just be sensitive information that the issuer wants to flag, and that's totally fine. We don't prescribe what to do with those flagged attributes, but at least they're flagged, so as soon as the data goes into a distributed data ecosystem, or across ecosystems, and it comes to encrypting that information, you know it has already been flagged. It just means that all corresponding data can be treated as high-risk throughout the data lifecycle and encrypted or removed at any stage, reducing the risk of re-identification attacks against blinded data sets.
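As a small sketch of how flagged attributes can be acted on downstream, the snippet below drops every attribute the capture base flags before a record is shared. The field names and the drop-rather-than-encrypt policy are illustrative assumptions; as noted above, OCA flags the attributes but leaves the treatment to the ecosystem.

```python
# Hedged sketch: treating flagged attributes as high-risk before sharing.

capture_base = {
    "type": "capture_base",
    "attributes": {"full_name": "Text", "dob": "Date", "blood_type": "Text"},
    "flagged_attributes": ["full_name", "dob"],  # issuer-flagged as sensitive
}

def redact_flagged(record: dict, base: dict) -> dict:
    """Remove every attribute the capture base flags as sensitive.

    A real deployment might encrypt rather than drop these values; the
    point is that the flag travels with the schema, so every downstream
    holder knows which fields need protection.
    """
    flagged = set(base["flagged_attributes"])
    return {k: v for k, v in record.items() if k not in flagged}

record = {"full_name": "A. Smith", "dob": "1980-01-01", "blood_type": "O+"}
assert redact_flagged(record, capture_base) == {"blood_type": "O+"}
```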
For the coders on the call, you'll probably be interested in seeing a little bit of code. I'm obviously not going to show you every single overlay, because that would be over the top and they all work in a similar way, but I'll show you roughly how they work. You'll see that there are two main blocks in the capture base. There are the attributes themselves, so you have the attribute names and some core data types; those types are defined in the OCA specification, and the last slide I'll show has a couple of links where you can see them. Then there's the flagged attributes block, where you flag any personally identifiable or sensitive information. Apart from that, it's just a very simple bit of metadata at the top: you can see the type of the object, capture base, and the classification, where you can put in some broad categorization. If you wanted to say this schema has come from the pharmaceutical sector, you could put in a GICS code; that stands for the Global Industry Classification Standard, I believe. It's basically an eight-digit code that gets quite granular and lets you go into a sub-sector of an industry sector. Within something like healthcare, for instance, it can go down as granularly as companies building medical devices. That's all that's in a capture base: a very, very simple object. So what is an overlay? These are all the linked objects I showed you before. Overlays are objects cryptographically linked to that base object, and they provide layers of task-oriented contextual information on top of the capture base.
An actor interacting with a published capture base can use overlays to transform how information is displayed to a viewer, or to guide an agent in applying a custom process to capture data. Here's a little bit of code. We have a whole bunch of overlays; the entry overlays are among my favorites, because I hate freeform text fields. I realize that every now and again you do have to have them, but banning them where possible is good, because they're no use for insights and analytics. There are two overlay types that are linked together: an entry code overlay and an entry overlay. The entry code overlay holds the coded values that you run your insights or analytics on, and the entry overlay is the human-readable version of those coded entries. The entry overlay is language-specific, so you can render it in multiple different languages, but the entry codes underneath stay the same. There are a couple of things in here to show you. You always have a capture base reference at the top; that's the deterministic identifier of the capture base, so the overlay is always linked to a specific capture base. Then you see the type, in this case an entry code overlay, and then you have your entry codes: for document type here, zero equals passport. The entry overlay gives you your English version, if you like, as the default. And then you see these deterministic identifiers for issuing state and issuing state code.
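The pairing described above can be sketched like this: analytics always run against the stable code, and only the human-readable rendering varies by locale. The object shapes, the placeholder capture-base digest and the example codes are illustrative assumptions.

```python
# Hedged sketch of the entry-code / entry overlay pairing.

entry_code_overlay = {
    "type": "entry_code_overlay",
    "capture_base": "sha256:abc...",               # digest of the linked capture base
    "entry_codes": {"document_type": ["0", "1"]},  # stable machine-readable codes
}

entry_overlays = {
    "en": {"type": "entry_overlay", "capture_base": "sha256:abc...", "locale": "en",
           "entries": {"document_type": {"0": "Passport", "1": "Identity card"}}},
    "fr": {"type": "entry_overlay", "capture_base": "sha256:abc...", "locale": "fr",
           "entries": {"document_type": {"0": "Passeport", "1": "Carte d'identité"}}},
}

def display_value(attribute: str, code: str, locale: str) -> str:
    """Resolve a stored code to its human-readable label for one locale."""
    return entry_overlays[locale]["entries"][attribute][code]

# The stored datum is always the code "0"; only its presentation varies.
assert display_value("document_type", "0", "en") == "Passport"
assert display_value("document_type", "0", "fr") == "Passeport"
```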
I'll show you what that means. Basically, it means you can point to another deterministic table somewhere. You might have a code table with a bunch of codes; think about something like country, where you have the ISO codes down one side and the human-readable country names down the other, and you could do that in multiple languages. The overlay just gives a pointer to those objects, which in this case might be controlled by ISO. And that's all that's in an overlay, really. All the overlays are very simple, but they're task-specific and very granular in what they can do. As I just mentioned, if you want to cryptographically link to an external code table, you can do that. A code table, also known as a lookup table, is an array that replaces runtime computation with a simpler array indexing operation. The savings in processing time can be significant, because retrieving a value from memory is often faster than carrying out an expensive computation or input/output operation, and an OCA code table is simply a lookup table that OCA can ingest. This is more of a visualization of a data form. This one is for a data agreement, but it could literally be anything. I just wanted to show how the different parts of that form are constructed from different overlay types, and I'll break this particular slide down into separate slides. So here's a meta overlay: a meta overlay is a core linked object that defines the contextual metadata about the schema, including the schema name, description and broad classification schemes.
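A minimal sketch of that lookup-table idea, assuming an externally maintained ISO-3166-style country table; the table content and access shape are illustrative, and in OCA the table would be referenced by its digest rather than embedded.

```python
# Hedged sketch: an externally maintained code table used as a lookup table.
# Resolving a value is a simple index operation, not a runtime computation.

iso_country_table = {
    "CH": {"en": "Switzerland", "fr": "Suisse"},
    "JP": {"en": "Japan", "fr": "Japon"},
}

def country_name(code: str, locale: str) -> str:
    """Index into the shared table: one authority maintains it, every
    ecosystem that links to it resolves codes the same way."""
    return iso_country_table[code][locale]

assert country_name("CH", "fr") == "Suisse"
assert country_name("JP", "en") == "Japan"
```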
So we separated this object out, because when we first built OCA you could translate the entire form into a different language, but the title and description of the form always stayed in the same language. We separated that out so those can also be done in different languages. The entry overlay, as I just mentioned, is a cryptographically-linked object that defines predefined field values in a specified language. It's good practice to avoid implementing freeform text fields wherever possible, to minimize the risk of capturing unforeseen personally identifiable information, quasi-identifiable information or sensitive information. This overlay type enables structured data to be entered, thereby negating the risk of capturing and subsequently storing dangerous data. You can think of these as the drop-down of predefined entries on a form; they're all defined in the entry overlay. And as I mentioned before, there's an entry code overlay underpinning this overlay as well, so they work in tandem: the entry code overlay holds the codes, and the entry overlay can be done in human-readable languages. Then the label overlay. The label overlay defines both attribute and category labels for a specific locale. Again, it's language-specific, so it can be done in multiple languages. This overlay type enables all labels to be displayed in a preferred language at the presentation layer, for better comprehensibility to the end user. And finally, I think this is the last one I'm going to show. As I say, there are a whole bunch more overlays, but these are the core ones that you'll pretty much have to use any time you build a data form. The information overlay defines instructional, informational or legal prose to assist the data entry process.
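The label overlay's per-locale behaviour can be sketched like this. The attribute name, digest and the fall-back-to-first rule are assumptions for illustration, not part of the OCA specification:

```python
# Illustrative sketch of two label overlays for the same capture base,
# one per locale. Digest, attribute names and fallback rule are invented.
label_overlay_en = {
    "capture_base": "E_example_digest",
    "type": "spec/overlays/label/1.0",
    "language": "en",
    "attribute_labels": {"dateOfBirth": "Date of birth"},
}
label_overlay_fr = {
    "capture_base": "E_example_digest",
    "type": "spec/overlays/label/1.0",
    "language": "fr",
    "attribute_labels": {"dateOfBirth": "Date de naissance"},
}

def labels_for(locale, overlays):
    """Pick the label overlay matching the viewer's preferred locale."""
    for o in overlays:
        if o["language"] == locale:
            return o["attribute_labels"]
    # Fall back to the first overlay as the default language.
    return overlays[0]["attribute_labels"]

print(labels_for("fr", [label_overlay_en, label_overlay_fr])["dateOfBirth"])
```

Because each locale lives in its own overlay, adding a new language means publishing one new object against the same capture base; nothing already deployed has to change.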
And what's really interesting about the information overlay is that the information displayed on a data capture form might be different from the information displayed on a credential, for example. By separating out all of these objects, we enable that kind of functionality between the different data objects, so you can have different information for different elements, all cryptographically linked to the same capture base, which is quite interesting. So what's the main significance of transforming different data models and representation formats into OCA? You'll have to excuse me; I'm just going to read this slide. OCA provides a fundamental step in building digital objects that, when resolved at the application layer, maintain semantically rich definitions to ensure that the meaning and context of data are uniformly understood by all interacting actors, regardless of how the data was initially collected. When I talk about transient objects, I mean things like forms, credentials, contracts, receipts and documents. Records are not so transient; records are more persistent. But in this case you could have records held in transient containers. Some of you will know things like Solid pods, authentic chained data containers or semantic containers, where you could have an authorization credential that provides transitive trust for authorized access into those containers. OCA can deal with all of those sorts of things. And by building within OCA, you can then start working with harmonized data, which, when pooled, provides a cleaner fuel for better AI, statistical analysis, machine learning and other insights-based solutions for a specified purpose. Notice that in this presentation I haven't really talked about AI, although I know that AI is a hugely important topic.
But the way we look at it, there's no point looking at ethical AI unless you've got the data harmonized within these ecosystems, so that your AI algorithms are working off accurate data. In terms of data harmonization, earmarked inputs from multiple sources can serve a common purpose, but those source inputs need to be transformed into OCA structures so that you can harmonize the semantics; only through this data capture process can the benefits of structural and contextual harmonization occur. There are a couple of links here you might be interested in. The official OCA website is there, so you can take a look at that, along with how you can contribute to the OCA specification. It's totally open source. The license we use is an EU public license at the moment, which is pretty much the friendliest license we could possibly use, so we've gone with that one. The only thing we're trying to do with the licensing is ensure that people don't just take the code and rebuild another semantic architecture, because that obviously starts to fragment things, which we really want to avoid. This is really a global community initiative to build out the specification. We won't have covered everything in the OCA spec so far, so as you get new use cases that use OCA, we might need to build a new overlay type. Where it says "how to contribute", that GitHub link is where you can ask questions, or, if you think we don't have the overlay types for your use case, we can go about building another overlay type as a community to ensure that the architecture can deal with it. So go ahead and have a look at those links. I've also put my email address here.
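That harmonization step can be sketched as a per-source field mapping into one shared attribute vocabulary. Every source name, target attribute and record below is invented for illustration; a real deployment would express the mapping as OCA transformation objects rather than a Python dictionary:

```python
# Two legacy sources capture the same facts under different field names.
# A per-source mapping (playing the role of a transformation step into a
# shared, OCA-defined attribute set) lets the pooled data be analysed
# together. All names and values here are invented for illustration.
SOURCE_MAPPINGS = {
    "clinic_a":   {"dob": "dateOfBirth", "doc_kind": "documentType"},
    "registry_b": {"birth_date": "dateOfBirth", "document": "documentType"},
}

def harmonize(source: str, record: dict) -> dict:
    """Rename a source record's fields into the shared attribute vocabulary."""
    mapping = SOURCE_MAPPINGS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

pooled = [
    harmonize("clinic_a", {"dob": "1980-01-01", "doc_kind": "0"}),
    harmonize("registry_b", {"birth_date": "1975-06-30", "document": "1"}),
]
# Both records now share one attribute vocabulary:
print(sorted(pooled[0]) == sorted(pooled[1]))  # True
```

Once every contributor maps into the same attribute set, pooled analytics no longer have to special-case where each record came from.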
You're more than welcome to email me directly with any questions. We're a friendly bunch and very collaborative in the way that we work, and we look forward to new use cases, new projects and new friends as we go. Although we're a nonprofit Swiss foundation, the reason we built it in Switzerland is that foundations in Switzerland are incredibly neutral, and that was really the idea when we built this foundation: there's no risk of centralization or anything like that with any of the IP at the foundation. It's very much meant for global contribution. So that's everything from my side. Thank you so much.

Yeah, this was really useful. I think a lot of the engineers on the call are normally familiar with the OSI seven-layer model used in data communication, and the four domains you've set up here, around semantics, inputs, governance and economics, are a really nice vocabulary to start talking about this. That's really important, and I think people should go to the links you shared at the end, because Paul has been setting the context, but for those of you trying to think about how to even start using it, there are actually libraries and tools around the capture architecture; I can even see a data form and view library there. So if you want to start doing this in your own ecosystem, as a business, it may be useful to start thinking about and building on this stuff. The links are there; I think Paul has just shared them in the chat as well, so that's really helpful.
I think with this idea of semantics, we don't usually talk in such an explicit way about presentation versus semantics; we talk a lot more about syntactic transformations and syntax. But the basic business of data capture, catalogs and harmonization, even within an organization, is tremendous, and we don't have a good shared vocabulary across everyone to talk about these things.

Thanks, Sam. Just on that point, those four data domains took a long time to properly separate out. One of the reasons for doing it is that we needed a communal language that everybody could use without tripping over each other. Obviously, when the data management people talk to the identity people, they talk a totally different language, and the same goes for the economists and the people setting up governance frameworks. To give you an example: in the economic domain we were working with somebody who started to use the word "attributes", and I said, you can't use that; the data management people will go nuts if you take that term. He asked what term he should use in the economic space, and I said, just think about everyday life: at an auction the next item for sale is this, or you talk about itemized receipts or itemized billing; your core term there is "item", so use item. He totally adopted it, and what was cool is that now we can talk to each other and say, okay, asset exchange and items belong to the economic domain, while attributes and schemas are left to the data management people; we're really trying to separate those domains. That was good fun. Well, I'd say it was good fun; it was like pulling teeth sometimes, but we got there.
I'm going to wait and see if there are any questions, but let me ask you one. This internationalization example that you shared, and I know you can talk about Switzerland and the cantons: in India we have the same challenge. We have many languages, we have state languages, and a federal system similar to some other countries. What you've just talked about is quite relevant for thinking about internationalization and localization for governments and public services. At least in India we are still in our infancy there; we have not articulated that all our services have a different presentation layer or a different information layer. So, any pointers from how you've seen that happening in your region, or in Switzerland, that we should be thinking about for public services?

Yeah. Well, the one thing I've always loved about decentralized semantics is that it supports minority languages. We've done a lot of work with the Canadian provincial governments, and they obviously have the First Nations in Canada. Inuktitut, for instance, is a language of the Inuit people, and those languages and cultures are in danger of dying. What's nice about all of this is that it enables the experts in those languages to do the translations and have full custodianship of those language overlays themselves, and it puts less pressure on the federal government. I know that over here,
Romansh is a minority language in Switzerland, but nothing is translated into Romansh, and I don't think the government could even do it, because I don't think they've got the expertise in that language. It needs to go to the canton that speaks the language; they should be controlling those translations.

I think it's a good point; we should be thinking much more in that direction, even here. We have one question from Sean; let me hand it over to him. Sean, do you want to come on the mic? Go ahead.

Yeah. Fantastic exposition from Paul; looking forward to the next two as well. This is probably the second time I have sat through a presentation on this topic, and I have made notes. I have a question for Paul that relates to some of the projects underway over here, particularly related to citizen services. We have a lot of legacy data, and one of the key things that happens when you're dealing with legacy data and making it more Web3-friendly, or whatever you might call it, is that the project gets bogged down in the data transformation phase. By that I mean you are taking in data that's on paper, or perhaps OCR-digitized in one way or another, and transforming it into records that you can manipulate and extract some knowledge from. If I go back and look at how you describe the workflow, transformation is an integral part but not the starting point of things.

That's correct.

So my question is: how do you think we should raise awareness in those projects that deal with legacy data, so they start looking at the system being transitioned from system A to system B by adopting some of the things you described today?

Yeah, that's a good question. Thank you, Sean.
So I think the important thing is that people are not going to change how they capture data, or they'll do it very reluctantly, because they have these legacy systems they've probably been using for ten years or longer in some cases. It's about saying to those people: you can capture data the way you want, but if you want to interact within a distributed data ecosystem, whatever that might be, whether it's e-commerce, health or finance, it doesn't matter, all you need to do is build some transformation overlays on your side, which you control. Then every time you push data, or people request data, into the data ecosystem, it can automatically be transformed into that harmonized form. And it's a short-term thing: once they start doing it that way, I'm sure they'll think, you know what, it's going to be so much easier for us to build our data capture straight into OCA without needing these transformation overlays. But the first starting point is to really push that we're not changing the way you capture data at all; if you want to contribute, you just need to work with the standards that the data governance administration has defined.

Right, thank you. Okay. Paul, let me ask one last question as we close this and set it up for the next two sessions. The semantic web: I go back twenty years, and the initial vision of the semantic web never really panned out. Give us some optimism about why this time would be different.

Yeah, well, it's a perfectly reasonable question.
I just got back from the MyData conference in Helsinki, which is really driven by data-centric organizations rather than the authentication side. In that community you can see there's always some buzzword or buzz topic, and this year the buzz topic was really "data spaces", as they call them, which is a horrible term because it doesn't really mean anything. But basically, the way the European Union has defined data spaces is what I would call a distributed data ecosystem. And the first thing coming up in that space is not really semantics; it's data governance: how does data governance work for these data spaces? Some of it will be distributed governance, some of it federated governance; the administration is usually separate from the authorities. But the cool thing is that when you talk about data governance within a data ecosystem, you very quickly get into the semantic topic, because they realize that within a data ecosystem you need to search for data, and where do you start? Let's look at the semantics. So semantics is a very gentle approach into governance.
And what's interesting about the four domains is that the inputs domain and the economic domain are really user-driven: the information comes from me, if I have a relationship with you, and it's still bilateral communication. But the other two are driven by the ecosystem. Governance is ecosystem-driven, and, as I mentioned, so is semantics, because the ecosystem needs some sort of agreement on how to have safe and secure data exchange. And that's all semantics.

Thank you so much, Paul, for giving us a vocabulary and a language, and for setting up these lenses around the topic. I'm hoping people will go back, share, and participate in the open-source work, and maybe also bring it here; I think we in India definitely need it. We are far behind on it, and I'm hoping we at least start thinking about this in the government space, even before businesses. I'm sure businesses have their own perspective on it, but I'm really hoping it has a big impact in public policy. This session has been recorded, so we will have an edited version up; you can go to the Privacy Mode page on Hasgeek and, from there, follow the link to see the upcoming talks. In the same series, the next talk is one by Robert on decentralized authentication, so we'll get to talk about the authentication, or credential, aspect that you saw in today's conversation.
And then Philippe will also come back and talk about what Paul showed around distributed governance: how to make this all work, especially looking at the governance side of it. So this talk was on schemas, the next one is on authentication, or identity, and the third one will go into the governance aspect. I think we all kind of buy into the economic part, but the governance part is what has to be set up right. This is a dialogue, so please come back; you can post comments on the talk, and we can continue the dialogue with Paul where he's available. If you want to come and talk about what you're doing, or how you're trying to think about this in the space of privacy, data privacy or fairness, please do come back. There's also a Telegram group for Privacy Mode, which you're welcome to join to continue this conversation. So, on that note, and looking forward to the next two talks, I would like to thank Paul. Thank you so much for spending time with us and setting this up; we hope to continue this collaboration and dialogue with you and learn a lot more, as we have in this session. Thank you so much.

Perfect, thanks. Thanks for having me, and thanks to everyone for joining the call. Any questions, I've put my email in the chat, so feel free to email me about anything you want.

Right, great. Thanks so much. Take care, guys. Thank you, everyone. Bye.