Where is...? Okay, you're live now and I'll drop the YouTube page link here in the chat. Perfect. Thank you, David. Happy to help.

As per custom, the Linux Foundation Antitrust Policy: Linux Foundation meetings involve participation by industry competitors, and it is the intention of the Linux Foundation to conduct all of its activities in accordance with applicable antitrust and competition laws. It is therefore extremely important that attendees adhere to meeting agendas, and be aware of and not participate in any activities that are prohibited under applicable U.S. state, federal or foreign antitrust and competition laws. Examples of the types of actions that are prohibited at Linux Foundation meetings and in connection with Linux Foundation activities are described in the Linux Foundation Antitrust Policy. If you have questions about these matters, please contact your company counsel, or, if you are a member of the Linux Foundation, feel free to contact Andrew Updegrove of the firm Gesmer Updegrove LLP, which provides legal counsel to the Linux Foundation. Hyperledger is committed to creating a safe and welcoming community for all. For more information, please visit our Hyperledger Code of Conduct.

So, my pleasure to have you all here this evening. This time it is my pleasure to introduce you to Harri Rantanen from Finland, who works for SEB, Skandinaviska Enskilda Banken. It's pretty interesting: I got in contact with him a few weeks back, and he showed me some interesting projects, so I invited him to join us today to let us see what he has been working on for some time. Harri, I'll leave the floor to you; you can step in and present your slides to the attendees.

Alright, thank you very much, Andrea. I'm really honoured to be here in your community, and to be a part of the community. Now there are some background noise problems. Andrea, you're set as host now, so you can mute people. Yeah, alright, thanks. Nice singing, anyway.

So, let's start. I have a couple of slides prepared for you, and then I will have a slight kind of multimedia show today: I will show you in practice the things that I'm working with, and hopefully there is some value in them for you as well. The header for the session today is "Do we understand our data?", and the key question today is "why". Like you see there, we ask "why" when we are three years old, and even pop bands are asking it. But finally, we as business people, IT people, technical people, blockchain people are also really interested in the question "why", and you will understand why this is the key question today.

So, let's see if I can get my presentation rolling. Yeah, I'm coming from SEB; it's Skandinaviska Enskilda Banken, to answer the question. It's a Nordic bank with a 160-year tradition, founded by the Wallenberg family, who are still really active both in banking and in a multitude of other good initiatives regarding artificial intelligence and so on, funding many, many items in the Nordic countries where we are delivering good things in a digital way. I have been working at SEB for 13 years, the last three years as a business developer looking at the future both from our own banking perspective and from the customer's perspective.
I'm serving mostly large corporations and financial institutions, but I have some background as well in the private customer and SME point of view. That's a little bit about SEB, not that much.

And here is our agenda today. We will discuss information management. We will talk about knowledge graphs and web semantics. I will present another community that we have been running for three years, actually four years already: Standardised Trust. Then some practical cases of how knowledge graphs can actually improve our understanding of our data, and then a summary as well. You will get this presentation in PDF format; it is filled with hyperlinks, hopefully interesting links, where you can dive deeper into the items whose surface I'm really only scratching today. I have a tight agenda; let's hope that I will be able to deliver everything that I was planning to.

But let's start with the data sharing mantras that we are dealing with. Especially from the banking perspective, we are really careful about what we share. But I was recently reading this Gartner report about how important it actually is to share data. The point is that the organizations following a "must share data" mantra instead of "don't share data" are making better business overall. That's really critical now that we are moving into a digital world of platforms and ecosystems that are based on information sharing; if you are not able to do that, then you are not really participating well in those kinds of ecosystems or platforms. So it's really critical to keep in mind, when you have your data and analyst organization, that they know these challenges, and that it's not only about being careful with sharing the data. Of course there is a lot of regulation around this, I can tell you that from the banking perspective, but there are also many ways to mitigate those risks and still participate in the new way of working and the new way of making business.

A good example of that: everyone knows football all over the world, and Kevin De Bruyne is a really good example of this. Just recently it was in the news in the Mirror that he actually got a pay raise from the Manchester City organization purely by analyzing the full data delivered on the games that he was playing and not playing. He was able to show, based on data analysis, without any agent, how important a guy he is for Manchester City. This is a really nice example, on a really practical level, of the important thing we are talking about.

So the key element here is that we have to understand the data hierarchy, and this pyramid is really good at showing what we are talking about. Many times we say that data is the new oil, but it can also be oil in a bad sense, meaning pollution, if we do not make any sense of that data. Practically, the data we are dealing with is only the "what". It's an extract of the real world by some kind of metrics that we have defined ourselves via standards or market practices. Do we measure kilos or pounds? That's up to the different metrics that we are using, but it's not telling us that much. It's only an extraction of the real world.
Then we start moving from "what" to "who", "where" and "when", combining the data together so that, for example, in my case I can show you that there is a bank account with some data, and there is a closing-date balance on a 2019 date of 5,000 euros, even on the credit side. Now it has an informational meaning. But to make it even more meaningful, answering the question "how", we are talking about semantics: we have to be able to compose that information onto a new level, meaning that we have an understandable set of information available. Now I can start drawing conclusions about how it is possible that my bank account is 5,000 euros on the credit side. And finally, the thing we humans are good at is exactly asking the question "why". Why is this information important? Why is this information meaningful from a knowledge point of view? When we answer the question "why", we can make decisions and we can start predicting things, like I have been doing in my slide here as well. So data must be refined into information, knowledge and wisdom to make it meaningful, and luckily there are now tools that deal not only with the data and information levels: via the semantic world and the semantic approach, we can make that information available at the knowledge and even the wisdom level, so that it can participate in decision making in a good way and we can take the trivial operations off the data.

So what about knowledge graphs? My history comes from ISO 20022, a financial services standard started in 2004. I have been working in financial services for practically 30 years already, and this was a really big revolution: finally we got a common standard for five domains — payments, trade, FX, cards and securities — giving us the capability to talk the same language. The ISO 20022 business model and repository, an electronic data dictionary and repository, is available on the ISO 20022 pages, but it still does not have true semantics yet. When I was still working in the ISO 20022 management group in 2019, we had already started a working group focusing on semantic technology, and hopefully in the coming weeks and months the ISO 20022 data and business model can also be transformed into this kind of semantic model, to be available for the further operations that we will discuss.

Everything here is related to the internet evolution. We started this evolution with web 1.0, which was really only about reading the web pages available out there; it was only output from different HTTP or HTTPS sites, and you could evaluate what kind of data there was on each and every home site. Then we moved into the social web, where we could also put our thumbs up and down, participate in different discussions, and contribute content to the different home sites and data storages that were available there.
That was the next step, taking the web to the next level, 2.0. And now we are talking about web 3.0, which actually started around 2016. The key visionary there is Tim Berners-Lee, nowadays with a "Sir" in front of his name — it wasn't there yet in 1996, but nowadays it is. Tim Berners-Lee stated the web 3.0 vision already in 1999: that we are moving into a linked web, where the web also executes some things by itself, and home sites and web pages, humans included, can operate by themselves. That means we have semantics available, combining the approach of us human beings together with the machines, and that is really the important thing we are now able to do, having web semantics available describing how to do it.

Moving into the semantic web also means that the latest developments in blockchains and distributed technology are turning the web into a stateful web: instead of storing data in multiple places as copies, we can use it as one immutable single instance and then change the states of that instance. Part of that development, of course, is the use of value, meaning we also have these crypto coins and stablecoins, and now a multitude of central banks are developing digital currencies as well, making value exchange available in an even easier way, so that we can start including data from connected things and trigger transactions automatically. And in the spatial web, we can even move humans into the web with augmented reality and virtual reality.

But finally, knowledge graphs: what is the meaning of that? It started in 2012, and it is part of the W3C semantic web vision. The idea, like in the picture, is that we can have whatever database, blockchain, or home-grown structure, and we define simple definitions of simple things: a city being a place, a person, a date, an artifact. Then we define them commonly with semantics, describing what they actually mean, and it doesn't matter what kind of technology you use for storing that kind of data. Then we start making triples, meaning simple statements like we use in human language as well. We can say that the sky is blue: we have a subject, we have a predicate and we have an object, and these kinds of triples are described in the knowledge graph. You can see on the screen that Bill is a person, Alice is also a person, Bill has visited the Eiffel Tower, the Eiffel Tower is a place, and the Louvre is a museum located in Paris. We can make quite complicated yet easily understandable statements out of these instances of data and information, and that makes it possible for both humans and machines to understand them. That's really critical, and that's the work W3C has done, describing also a multitude of technologies for making this available, and even a query language for it, so that we have a common ontology — so that we understand what we are talking about — and then agree a common language on top of that, on how it should be used in different contexts. This makes it possible for all the data and information available from different sources to be combined in an understandable way, and suddenly both humans and machines can understand what the context is and what kind of links there are between the nodes in the knowledge graph.
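To make the triple idea concrete: here is a minimal sketch in Python using the rdflib library (assuming it is installed via `pip install rdflib`). The `http://example.org/` namespace and the term names are invented for illustration, not any published ontology, and the SPARQL query at the end is an instance of the W3C query language just mentioned.

```python
# A minimal sketch of the triples above, using the Python rdflib library.
# The http://example.org/ namespace and term names are invented.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/")
g = Graph()

# Each statement is a subject-predicate-object triple, as on the slide.
g.add((EX.Bill, RDF.type, EX.Person))
g.add((EX.Alice, RDF.type, EX.Person))
g.add((EX.Bill, EX.hasVisited, EX.EiffelTower))
g.add((EX.EiffelTower, RDF.type, EX.Place))
g.add((EX.Louvre, RDF.type, EX.Museum))
g.add((EX.Louvre, EX.isLocatedIn, EX.Paris))

# SPARQL query: which places has Bill visited?
for row in g.query("""
        PREFIX ex: <http://example.org/>
        SELECT ?place WHERE { ex:Bill ex:hasVisited ?place . }"""):
    print(row.place)  # -> http://example.org/EiffelTower
```

Run as-is, this prints the Eiffel Tower node for Bill; the point is that the same statements are readable to humans and queryable by machines.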
So, to say this once more: we started from flat files, and we have been moving into relational data models and the different kinds of databases that exist. Then the XML revolution started, giving us schemas that could serve different endpoints, though we were still using a lot of FTP or SFTP to move the data from point A to point B. Then in 2004 the definition of the semantic web was started by the W3C, and in 2013 knowledge graphs were created based on those semantic web languages. And of course it again started with the big techs: Google uses this technique and has its own graph implementation, which is practically openly available via APIs. If you are interested, you can look at the Google developer site: you can take any of the requests or queries you make on the Google pages, get a JSON-LD description out of them, and start using that in your own processes, automated systems and linkages.

And finally, why this is so important: when we talk the same language, we can look at it from the natural language processing (NLP) point of view, and we can start building distributed semantic representations, algorithms based on knowledge graphs, because it's easier for us humans to program this kind of thing when we have a knowledge graph available and semantics for the data. Wherever, and in whatever technology, the data is stored, the combination of the data can be done, and we can start utilizing artificial intelligence for analyzing the data, like Google and Facebook are doing. By the way, Facebook does it only for their own data, but Google is combining all the data on the internet using that same logic from their knowledge graph implementation.

Of course, ISO 20022 is gradually moving into this space as well, as a financial services standard that we use in a multitude of ways in the banks, and it is coming as a market practice standard by 2025 across banking in the payments domain, so all the data will be moved into structured data instead of the old MT messages, which were a little more unstructured. As for how this is evolving in the financial world overall with semantics, there is already the EDM Council, the Enterprise Data Management Council, working on this, and they have created the Financial Industry Business Ontology, FIBO if you want to call it that. FIBO is available there for you: you can go into the FIBO viewer on the internet and start browsing the material. It does not cover 100% of all financial services information, but many items are there, like business entities, which is really interesting to look into. The hyperlinks work quite well, so you can explore all these resources in this presentation afterwards by yourself.
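As a small aside on the JSON-LD descriptions just mentioned: a minimal sketch of what such a schema.org-style description looks like, with all values invented, might be the following.

```python
# A minimal, invented example of a schema.org JSON-LD description — the
# kind of linked-data snippet discussed on the Google developer site.
import json

org = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Bank",          # hypothetical entity
    "url": "https://example.com",
    "sameAs": ["https://en.wikipedia.org/wiki/Bank"],
}
print(json.dumps(org, indent=2))
```

The `@context` is what ties the plain field names to a shared vocabulary, so any consumer knows that `name` here means an organization's name in the schema.org sense.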
Okay, that was the theoretical part of the session; let's move into the practical part. In 2017 we created a community called Standardised Trust, and its aim is creating a common language for trade finance. We had a boot camp where people came in from different perspectives, from corporations; we had a discussion and we created a white paper, which is available on standardisedtrust.com for download — it's still a really valid paper. We found that there are real pain points we have to address. Already in 2017 we talked about standards, and actually about semantics, because at that time the EU-level e-invoice semantic model was just coming up; it was under discussion to become a regulation, which it now already is. So we copied that semantic model approach from there, and then of course made a standardized approach available for the team.

We now have about 300-plus members in the LinkedIn group, which we use mostly for market intel, and then we have the actual working group of 20-plus active members, with whom we have been putting our expertise, without any fees and without any budget, into guarantee definitions — not only bank guarantees, but guarantees overall. We can jump to the Standardised Trust home page and I can show you some basic facts. We have working groups here, and nowadays we work within working group number two. The latest material is in OneDrive; we do everything openly, so you do not have to pay anything — you just commit yourself to the team, and suddenly you are able to look into the semantic model draft as we work on it. We started with the 2013 version of the ISO 20022 data model, which we have been fine-tuning for our purposes, and now it's a living document that we work on here. We have a data set that describes the undertaking, in which we define all the necessary fields needed to make the undertaking available. It's quite a long description, and alongside it we also provide comments on how, for example, ICC rules like the URDG can be applied to each and every point. We have also made the code lists available, so if you want to see, for example, the different undertaking parties, they have been added to the lists. So you can treat this as a really living document, and you can also use it for any kind of use-case mapping: if you want to make a use case of a rental guarantee, all these code lists come from the Standardised Trust homepage. We have a resources page here, and all the guarantee code lists in the Excel sheet are dynamically linked from the homepage. We have tried to make this as modern as possible, with amendments easily made, so that we can agree on what we deliver and keep it easily available for anyone looking at this setup.

Also in the OneDrive, I can show you, we have lately started a JSON description as well: we have made a JSON schema from the Excel sheet, which I can show you here, so that it is available for the more technical people. This common model is voluntary and really openly available: whoever wants can apply it on whatever platform, vendor system or application of different guarantees you would like to implement. And of course we have also been exercising how that would look in a JSON payload, so there is a short example of a surety rental guarantee described as a JSON payload, so that whatever machines can understand and read it for the purposes of using it. In this team we will not go into any kind of application or security solutions or whatever; we focus on the practical side of the guarantee descriptions and continue on that.
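To illustrate the JSON-schema-plus-payload idea in miniature — note that these field names are a simplified sketch of my own, not the actual Standardised Trust working-group schema:

```python
# A hedged sketch of validating a guarantee payload against a JSON Schema,
# in the spirit of the artifacts described above. Field names are invented.
from jsonschema import validate  # pip install jsonschema

undertaking_schema = {
    "type": "object",
    "properties": {
        "undertakingType": {"type": "string",
                            "enum": ["demand-guarantee", "standby"]},
        "applicableRules": {"type": "string"},  # e.g. a URDG reference
        "amount": {
            "type": "object",
            "properties": {
                "value": {"type": "number"},
                "currency": {"type": "string", "pattern": "^[A-Z]{3}$"},
            },
            "required": ["value", "currency"],
        },
    },
    "required": ["undertakingType", "amount"],
}

payload = {
    "undertakingType": "demand-guarantee",
    "applicableRules": "URDG",
    "amount": {"value": 50000, "currency": "EUR"},
}
validate(instance=payload, schema=undertaking_schema)  # raises if invalid
```

The value of the schema is exactly what the talk describes: any platform or vendor can check mechanically that a guarantee payload conforms to the commonly agreed field definitions.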
We will gradually move now from guarantees also to documentary credits. For our small team it will take some time to finalize the guarantee descriptions, but we have a really active team and I'm really positive that we will be able to deliver the guarantees this year, and of course there will be life-cycle management after that, on how we compose it further. So that was the community story. If you are interested in joining Standardised Trust, just enroll yourselves in the LinkedIn community and we will easily approve you as members; and if you are able and interested in joining the actual working group, being active on the guarantee or documentary credit semantic model descriptions, please inform me and I will bring you into the team. The latest members of our team come from global trade cooperation and digital world services — a new cloud-based service for guarantees.

Then another practical case — I'm just checking my clock that I'm still on the timetable; yes. We made a proof of concept together with one of our customers, Wärtsilä in Finland. Wärtsilä is a manufacturer of different digital and physical goods for marine technology, for example; they also deliver power plants all over the world, and they have a multitude of digital services available as well. SEB is the bank that I'm representing here, and we were the financial sponsor of this proof of concept: we wanted to test secure corporate data sharing. We used the platform of Sitra, the Finnish Innovation Fund, and the IHAN testbed that I will talk about a little later. Then we had two vendors: Digital Living as the main vendor, the creator of the data economy Nexus, which is a data-sharing enabler platform, and Nixu, a global cybersecurity company doing a multitude of good things around cybersecurity.

What we actually did was a proof of concept for secure corporate data sharing, with a trade finance use case: a letter of credit, a small part of that process, where the advising bank, SEB, and the exporter, Wärtsilä, exercised data sharing on a digital content basis. We created, on the Sitra platform, digital identities for us and Wärtsilä, then identities for end users, linked them together, and then the end users could be responsible for the data sharing. Instead of sharing documents, as we usually do and still have to do in the letter of credit — where the advising bank delivers the requirements of the letter of credit to the exporter and the exporter responds with the documents — we exchanged everything as data sets. There are press releases and case studies available on that, but I will show you the detail now; just a second while I make sure the sound is included as well... now it's there, so we can enjoy a short three-minute video showing how this works in practice.

Yes, so that was the case in the short video; let's go back to the presentation. The key idea, like you saw in the video, is that we used open standards, even in the data where available, and on the security side OpenID logins and all that. We even had the possibility to go further into self-sovereign identities — I will talk about those as well — but we did this in a mock-up way, so we didn't need to go into the verifiable credentials way of sharing the consents in this case.
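Before moving on, the consent mechanism can be pictured roughly as in the sketch below. This is my own illustrative structure, not the actual IHAN or Nexus API; it only shows the principle that data access follows an explicit, revocable grant from the data owner to a named consumer for a named purpose.

```python
# An illustrative sketch (not the actual IHAN/Nexus API) of consent-based
# sharing: the data stays with its owner, and access is checked against
# a recorded, revocable consent.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Consent:
    data_owner: str      # e.g. the exporter's digital identity
    data_consumer: str   # e.g. the advising bank's digital identity
    data_product: str    # a registered, productized API
    purpose: str
    granted_at: datetime
    revoked: bool = False

consent = Consent(
    data_owner="did:example:exporter",          # hypothetical identifiers
    data_consumer="did:example:advising-bank",
    data_product="invoice-data-v1",
    purpose="letter-of-credit presentation",
    granted_at=datetime.now(timezone.utc),
)

def may_access(c: Consent, consumer: str, product: str) -> bool:
    """Data is served only while a matching, unrevoked consent exists."""
    return (not c.revoked
            and c.data_consumer == consumer
            and c.data_product == product)
```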
But the key element here was that we were able to run this case on the Sitra IHAN testbed, and I will show you some technological features of the IHAN testbed, which was the basis for sharing the data. The key element is that the data is not shared on the platform as such. It was really critical from the customer side that none of this critical information be shared on the actual platform; it can be shared only by the data owner with the data user, the consumer, through the consents they give. Practically all of the IHAN testbed architecture is also available here on the home pages, so you can dive into these links and see how the actual testbed works. We do productized APIs, meaning standard APIs that are registered on the platform; you can then link corporations and end users to those APIs, and they limit who can use which APIs for which purposes. The data products are based on actual standards as well, and practically everything is again available on GitHub. I will show you here the standards that are already available, and we can go into the invoice data, which I can show you in JSON format. This is not any standard invoice payload, but in the test we used all the standards that were available. I can even show the JSON-LD documentation on the internet: the JSON-LD linked-data description actually records, for each data element, which standard it is taken from. We used the EDM Council, we used Peppol, we used ISO 20022 and various other standards to base the fields on whatever standards already existed, without creating anything new where none was needed. Of course we also wanted to share, for example, Incoterms, and that was not yet available in any standard, but creating something new is also quite easy on the testbed's standards page: you can create the definitions in a knowledge graph way as well, so that the data is understandable by anyone who comes in as a participant.

One key element in the exercise is the rulebook of the Sitra fair data economy platform, and this rulebook is also available for anyone to look into, because it was produced by legal, technical and functional people as a template for any kind of contractual framework for corporate data sharing, so that the different participants in the network can join in a good way. When there is no regulation or other legal framework available, the participants of the network still have to agree on how the data will be shared, and this gives a platform and a template for that. I just heard that it has already been translated into Portuguese, because their national health system wanted to use it as a template in their system development.

This is one part of the larger development in data sharing, where the EU is trying to compete with the big techs — Amazon, Microsoft Azure and Google Cloud — so that we deal with federated data instead of putting all the data as such into a data set in cloud solutions. GAIA-X is the project name, and it is collaborating with the International Data Spaces Association; Sitra IHAN, which we used in the proof of concept, is part of this GAIA-X initiative on the more practical side. I even copied some facts, composed by the participants and Digital Living colleagues, into this kind of analysis of how these different platforms — the big tech platforms, GAIA-X and the IHAN testbed — can be compared with each other.
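Coming back to the JSON-LD documentation mentioned a moment ago, where each invoice field records which standard it was taken from: a rough sketch of that idea, using placeholder vocabulary URLs rather than the testbed's real definitions, looks like this.

```python
# A rough sketch of field-level standard provenance, JSON-LD style.
# The vocabulary URLs and term names are placeholders, not the testbed's
# actual definitions.
import json

invoice = {
    "@context": {
        "invoiceNumber": "https://example.org/peppol#InvoiceNumber",  # Peppol-derived
        "amountDue": "https://example.org/iso20022#AmountDue",        # ISO 20022-derived
        "counterparty": "https://example.org/fibo#BusinessEntity",    # FIBO/EDM Council-derived
        "incoterms": "https://example.org/custom#Incoterms",          # newly defined term
    },
    "invoiceNumber": "INV-2021-001",
    "amountDue": {"value": 5000, "currency": "EUR"},
    "incoterms": "CIF",
}
print(json.dumps(invoice, indent=2))
```

The point is that a consumer of the payload can follow each context entry back to the standard that defines the field, which is what makes the data set understandable across organizations.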
Like I said, the IHAN testbed is ready for testing; it has already been used for a multitude of test cases, and of course there will now be new cases as well, because it was quite successful in this proof-of-concept way of working.

Finally, the other enablers that we will need in this digitalization of trade finance, and digitalization overall. I remember that two weeks ago this same team was discussing the MLETR, which comes from the United Nations Commission on International Trade Law, and there have been really good developments since 2017, when the MLETR was released. Bahrain, Abu Dhabi Global Market and Singapore have been adopting it, or starting to adopt it, into their legal frameworks, setting their own goals for the years 2020–2022, and the G7 just last month agreed that yes, this will be the way forward in these big countries that are also dominant in global trade. That is really critical for the development overall and for the digitalization of trade finance.

And then we will need trust. This links into web 3.0 semantics as a really important part — a building block called trust that we have to solve, something that wasn't solved in the early days of the development of the internet. Good work is being done there, defining a trust layer over the internet through the Trust over IP (ToIP) Foundation, and other good standardization bodies are involved as well, because SSI identities are mostly run by the Sovrin Foundation, with the base code from Hyperledger Indy and Aries, which are used for these kinds of wallet-based solutions. That was also available and possible to implement in the proof of concept we did with Wärtsilä, but like I said, we didn't go that far yet; it will be available there if we need to implement it in a similar way.

So finally, a summary. I have used most of my time, but I wanted to conclude with a wish list and the building blocks. First, the wish list: what components will we need to make business digitalization happen in practice? It might be trade finance digitalization, it can be any other digitalization. We need a common legal framework, so that we have the capability to exchange not documents and papers but data sets instead, and the MLETR is a really good foundation for that internationally. We also need digital identities, so that I can interoperate with my digital identity in whatever country, with whatever service provider, so that I can identify myself to them: I can identify myself as a person in a company, companies can be identified digitally as well, and even things, so that we can be sure the things are exactly the things we are supposed to be connected with, and not any kind of bootlegs or something else. Then the business data standards should be openly available, and it's not only a question of the standard: the market practices, the harmonization of those business standards, should be available too. And then data sharing: we should be able to share the data in a secure way with digital consents, and there should be an openly available governance rulebook, so that all the participants of the data-sharing network can agree on how we will do it. Until there is perhaps some regulation or law available, we have to make a contractual agreement on how to make data sharing happen, because it is needed.
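On the self-sovereign identity building block in that wish list: for orientation, a verifiable credential in the W3C data model has roughly the following shape. The identifiers and claims here are invented, and our proof of concept only mocked this layer up rather than issuing real credentials.

```python
# For orientation only: the rough shape of a W3C verifiable credential.
# Identifiers and claims are invented.
import json

credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:issuing-bank",    # hypothetical DID
    "issuanceDate": "2021-06-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:exporter",        # hypothetical DID
        "role": "authorised data sharer",
    },
    # A real credential would end with a cryptographic "proof" section
    # signed by the issuer; omitted here.
}
print(json.dumps(credential, indent=2))
```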
Remember from the beginning of the presentation: the organizations that share data and participate in data-sharing platforms make better business than the others. And then, finally, everything should be openly available as development resources. There are many ways to compete, but the basic building blocks we need for digitalization should be openly available, with no fees, no partnerships, no VIP memberships — like we do in Standardised Trust. We should share our knowledge and expertise like many developers do nowadays via GitHub and the internet overall: if you have a question about JSON or JSON-LD or semantics, you will find an answer from a multitude of communities using them.

Then I finally made a conclusion with the building blocks on my very last slide. We have the UNCITRAL MLETR; Trust over IP; the Legal Entity Identifier — you can look into the verifiable credentials proof of concept that they did with Evernym, using the Sovrin foundation and the blockchain technology there; it's really fantastic what they did, and this will be the way of identifying organizations globally instead of using national-level identifiers. Then we need semantics and knowledge graphs, to refine our data to a level that can be used in artificial intelligence. We need secure corporate data sharing, and then we will need the rulebooks, especially in trade finance: uniform rules, as the ICC has always produced, and their transformation into the digital world as well. Then there is the DSI, the Digital Standards Initiative led by Oswald Kuyler, who has also talked to you here; that's a really critical initiative that makes these standards available for anyone willing to develop something on top of them — electronic bills of lading are really critical for various categories of trade finance follow-ups. And DCSA, the Digital Container Shipping Association, has started really good standardization work as well, of the kind I was talking about. Then we have the BIAN network, which has already made semantic APIs available, creating infrastructures for banking or financial services usable by any kind of payment provider, bank or financial institution — so as a banker, or as a developer for banks, I would really recommend looking into FIBO and BIAN together. And then we need collaboration, like we are doing today: we have to share our knowledge, share our skills, put them on the same table, and start discussing how to make these building blocks available — some are already there, some will be — and then put them together. When we have the legal framework, market practices and uniform rules equipped for the digital needs, then we will also be ready with the different platform, system and process applications. And now I will stop and leave some room for the discussion as well.

It was a really compelling presentation. I see Eugenio would love to ask you some questions, so I'll hand over to him. Thank you, Andrea. Hi Harri, it's really nice to meet you here, and thank you very much for the presentation, because I think it really goes to one of the core issues, at least in terms of operational interoperability and trying to harmonize the operation of different DLT finance networks. I would like to ask a question, taking as a reference the slide you showed with the pyramid, showing essentially the different data analysis
perspectives, or the perception from the market. In your opinion, what level of understanding of their own data do the different trade market participants actually have, and how could we, as a community, help the trade finance market elevate that understanding and reach the top? Thank you.

Good question, and it's quite funny — it's an experience from our proof of concept. When corporations share data between each other in trade finance, which is based on paper: papers and PDFs can be shared without any concerns, or let's say without major concerns. But when that document, for example an invoice, is not on paper and we compose it into a structured data set, suddenly we start discussing and being concerned: how can I be sure that when I share this invoice data with you, you use it only for this specific purpose? So yes, we start to understand the value of the data when we move away from paper — which is actually a printout of these facts, or partly also information, because it's a set of data already composed: who, where, when. So practically, many stakeholders in trade finance, and even in global trade, are still in this data-and-information area. Like I said, the shipping data is also there: I remember a DCSA presentation saying that unfortunately 95% of the documentation in shipping and logistics is still on paper, even though good work has been done on standardizing it as well. That comes from the fact that you cannot refer to data sets in possible disputes in any court. So we have a way to go to make semantics work, so that we can reach the knowledge part, and finally the wisdom part, of the data pyramid.

Okay, understood. It seems that there is still a long way to go, but essentially what we are doing is trying to reach the peak. Yes, definitely, and a lot has been happening lately. I started my trade finance journey from the payments domain, let's say 10 years ago, when it was so painful: no standards available, only the SWIFT network standards, and those available only to many really large corporations. And now, suddenly, in the last two or three years it has started to happen, through the ICC and other stakeholders. So we are on the right path now, and finally COVID also pushed us to make this happen in practice.

Are there any other questions? Yeah, just a question from myself, Harri. You know, I'm a long-term trade finance specialist, working with LCs — you know, dealing with Africa and the Middle East — and this is one of the core points: for handling the LC electronically, the basis is already there, but the ICC rules, the eUCP 2.0 — how do you integrate them in your work, how do you comply with the statements of the eUCP 2.0, and what was your experience with this new set of rules? I would remind everyone that so far it is very poorly used; it is more nominal than actually used, although, you know, there were several rules about this.

Yeah, and that is under construction and being updated as well with the URDTT, meaning that the ICC is doing this work on the Uniform Rules for Digital Trade Transactions — defining how the rules will apply to digital, data-based exchange, which is really good. And what we found in the proof of concept was also interesting, because the large corporate customer was also suggesting: when we move into data sharing, why should we wait for all the documents or data sets to be available from the corporate side before they are shared with the advising bank, if they
already have the invoices ready, with all the data available? Why can't they start using continuous presentation instead, meaning that they can send it to the bank, or give a consent to the bank so that the bank can access the data? So there will most probably be challenges — a challenge for the uniform rules — from moving into digital processes, because when we find that we have the new tools available and can, in practice, share the data in real time, as soon as it is ready, with the actual stakeholders, let's see: it will change the need for the rules as well. And of course, finally, when we move into MLETR-type implementations in the legal framework, that will again enable us to do new work — like ITFA, the International Trade and Forfaiting Association, and BAFT are doing now: they are promoting these digital negotiable instrument initiatives, also pushing the legal framework forward, so that electronic bills of lading are used for automated triggering of the electronic payment undertaking, so that automation comes along. And we have to apply this to the uniform rules as well.

There is, let's say, a question from Patrick here in the chat, and he states: as a professional focused on creating data standards to leverage within blockchain-enabled environments — hold on, I'm having problems with my PC — it seems to me that there is still too much fragmentation. Yeah, I can share, from the payments-domain experience of the ISO 20022 implementation, that exactly that happened, because the standard alone is not good enough. The market stakeholders have to agree on the standard — we have to agree on this information and knowledge level — so that we have a common market practice for how to use it, and semantics could really work as a good tool for doing that. Again, whatever technology and storage you are using, we should agree on the common language for how the standard is used. It's like spoken languages: we have a grammar for what I am saying to you right now, and that's why you understand it, coming from different perspectives and backgrounds. So it is really critical that we agree on the market practices as well, because that is the only way to avoid fragmentation. It was starting to happen in the ISO 20022 use cases too, in the early days, 2005–2006, before SEPA started to use the standard as well, so we agreed to set up a market practice group, including banks, vendors, end users and corporations together, to agree on those rules. And we have to do that as cooperative work together in the trade finance domain as well. Are you still hearing me?
I was muted and did not realise it had happened. I would love to expand the picture from trade finance, which is basically based on banks, to trade in general, where you have many actors in the picture, some of them of course institutional. How do you see the challenges in bringing them into the picture? Are they, as trade finance stands nowadays, fit enough to participate in this collaborative effort?

I forgot to show you one picture, a value network that we built in the Wärtsilä proof of concept, so I can show you here. This was a co-operation by Exxon World and Digital Living International; from the two proof-of-concept LC cases, they actually built up the value network. What is described here is one single letter of credit and all the stakeholders in that one letter of credit's documentation. Practically we have four layers: banking and finance — like you said, the banks are there; then the business side, buyer and seller agreeing an agreement and making all the invoices and other related documentation available; the logistics layer, which delivers the goods from the seller to the buyer; and then the manufacturing part, which produces the goods being delivered — even really mini-scale digital twins of the tunnel thrusters delivered by Wärtsilä, so that we can relate to the goods delivered, which is usually not needed in the LC case at all. And if you zoom in further, you can see that we have a knowledge graph behind it; I can show that as well, as the model we used in the proof of concept — here is the letter of credit value network described as a knowledge graph too. This visualizes the stakeholder value network, and then of course you also realize that it is not only the seller's bank and the buyer's bank that are involved in this global trade — that is only the trade finance part. When we add the logistics, shipping and manufacturing parts, all the suppliers and the customer side as well as the buyers, we get a totally new picture, which is the complexity of trade and of global trade overall. And that is exactly why we will need these rules and market practices built by collaboration, because otherwise there is a risk that we end up with separate logistics standards and manufacturing standards and business standards and trade finance standards. We have to be able to break the walls between the business layer and the banking layer — which we actually do, and I do that in my day-to-day work, making automation triggers happen with smart contracts: making the payments flow automatically when all the terms and conditions of the contracts are met, and so on. And we have to break the layer between the business and logistics sides, so that we can connect these value-network players and nodes closer to each other and automate the relations between them through standardization, making it happen in a secure way that cannot be compromised.
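As an illustrative aside on the automated triggering just described — in plain Python pseudologic, not an actual smart contract language — the pattern is simply that payment releases once every agreed condition has been recorded as fulfilled:

```python
# An illustrative sketch of condition-triggered payment, not a real
# on-chain contract. Condition names and amounts are invented.
from dataclasses import dataclass, field

@dataclass
class TradeContract:
    amount_eur: float
    conditions: dict = field(default_factory=lambda: {
        "goods_shipped": False,        # e.g. set by an e-bill-of-lading event
        "documents_presented": False,  # e.g. set by a data-set presentation
        "inspection_passed": False,
    })
    paid: bool = False

    def record_event(self, name: str) -> None:
        """Mark a condition fulfilled, then re-check the payment trigger."""
        self.conditions[name] = True
        if all(self.conditions.values()) and not self.paid:
            self.paid = True
            print(f"Releasing payment of {self.amount_eur} EUR")

contract = TradeContract(amount_eur=50000)
contract.record_event("goods_shipped")
contract.record_event("documents_presented")
contract.record_event("inspection_passed")  # -> triggers the payment
```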
Sorry, Mark here — but isn't that one of the biggest challenges that we have? I mean, it's interesting you have this slide here, "data is the new oil". There was an interesting quote during the week from an investor, who said data is not oil: it's a renewable resource that needs to be shared and reused, and we need to make sure that businesses make the most of the data. But again, the biggest part of it is actually securing access to that data, ensuring trust in the data, that it is protected from misuse. So I think this is one of the challenges we have in building these platforms and ecosystems: while there are not many barriers at a technical level, it's always the legal, compliance, security and trust side — making sure that, you know, the data is shared with the relevant people who need access to it, and only for a valid use case. And I do agree with you in the sense that we tend to treat paper differently from digital data. I mean, papers are left on desks and what have you, even though a lot of companies have security policies and what have you, but as soon as we go digital, the bar is lifted so much higher.

And I just noticed I didn't share the screen while I was talking to you just before you came in. Here is the value-network visualization of one single LC, with this banking layer, business layer and manufacturing layer, and finally the knowledge graph of one single LC showing the complexity of the stakeholders. So this is the power of the knowledge graph: we make complex data sets more understandable, because we have a subject, a predicate and an object, and then suddenly both humans and machines can understand them. So finally, the digital consents can be made and given by systems and processes as well, because we are moving into an automated world and we cannot expect that there is always some human being sharing the data in the proper way, giving a consent with a human touch. In the end we have to be able to program the consents as well, with the different rules that we as humans set, so that we can make these automated agreements available — by smart contracts, for example. So the tools are now available, or coming. Like I said, technology is usually not the problem; it's mainly how we get the market practices and the legal framework to approve what we are doing on the technology side.

Very good, very good. Any other further comments? Like I said, I will share the presentation with you, so you can deep dive into those links and look into more details, at a really practical level, of what is already possible to do. And from the proof of concept, we will now try to run a commercial experiment, looking into the possibility of finding some funding and participants to make it happen on a larger scale, so that we can start making that data sharing available according to the rules already available, and start exercising corporate data sharing in practice, in a secure manner.

Harri, Andrea, just one final consideration from my side — this is my suggestion. If we want to reach the final goal of having a common understanding and common analysis tools for the different kinds of data, for the different kinds of players involved in the trade industry in general — from banking to the shipping industry to logistics, to buyers and sellers — then, thinking about the process overall, maybe there is one topic we didn't discuss
here, which is very practical but maybe important as well in the process overall, and that is education. I mean not just in terms of the content related to the data, but from a practical perspective: how do we bring this information, this knowledge, to the people — to different people who have different resources, different education and different languages? I think, from a practical perspective, this can be a big barrier, and how can we, by our actions, try to avoid it and take the shortest pathway to reach the democratization and the education of this audience?

The best way of learning is doing it by practice; that was already said by the ancient Greek philosopher Aristotle. We tend not to have time to upskill ourselves because we are too busy with the existing processes. That's why being part of voluntary communities and upskilling ourselves by practicing is not that common nowadays — we are too busy — and it's really a big pity that we do not have time for adopting these new technologies and ways of working and doing things together. Yeah, I totally agree.

I guess that was it for today. Yeah, I think that was all. If nobody else would like to ask a question, I would love to end the meeting and thank Harri for this presentation. It was very insightful, a continuation of the meetings that we have had over the last months. So, on behalf of the SIG, thank you so much for joining us, Harri; it was a very good session. I'll see you all during the next ones; we'll keep trying to deliver good content. Thank you, everyone. Thanks for giving me this opportunity. Bye bye.