Thanks a lot, everybody, for attending this session. I'll be presenting for Informatica together with my colleague Hossam from the Middle East. My name is Remy van de Kleij; I'm a solution architect for Informatica, working out of the Benelux and Nordic regions. I'll give a quick introduction to how we help customers and operators get ready for an OSDU implementation and how we can support you along that journey. For those who don't know Informatica: we're a data management software vendor, focused on the common challenges we see our customers face, namely discovering the data they need, making it available where it needs to be (for example, in OSDU), making sure it's usable and findable, and then managing that entire landscape of applications, including OSDU.

When you implement a platform like OSDU, the initial expectation is often that we're going to move our data and our processes from our legacy applications, the Recalls and OpenWorks of the world, into OSDU, and then start working from there. Initially that is indeed a very common task to address. It does come with a number of challenges, because you'll have a few decades of history in those legacy systems. As you start moving these into OSDU, you'll want to address any data quality issues. We all know that not all of these systems have been maintained as well over time as they should have been, so there may be quality issues, inconsistencies, and duplicate data, and ideally you want to start OSDU as cleanly as possible. So initially you'll have to address those data quality challenges. As you go along, however, you realize that those legacy systems are not going away. You might be moving some of the processes, but it's going to be a gradual program; moving everything in one big bang is just too much impact.
That means some data, most often your key data items like wells and wellbores, will need to be synchronized back to those legacy systems to keep them consistent with OSDU. You can of course integrate and synchronize the data back into those legacy systems, but the challenge grows as the number of legacy applications increases. Some of the operators we've talked to have thirty-odd instances of OpenWorks that, hopefully, hold separate sets of wells, but not necessarily. Keeping those all in sync quickly becomes a question of how to handle the changes that happen on the left and on the right side. Who wins? What is the right value to use? How can we make sure we don't have any duplicates? And how can we make sure we don't end up with a massive spaghetti of integrations and ETL processes trying to keep all of this in sync and working? There is a real risk of ending up with spaghetti integration and introducing a lot of technical debt. A journey like OSDU is a multi-year program, so managing this properly is really important to make sure you don't get stuck somewhere halfway along. What we often see is that, besides the typical data integration and data quality functionality you need to connect these systems, there's a need for some kind of broker in the middle: a system that manages the changes coming from the left and the right, so that you have a single consistent view of those key data elements. You know where they're being managed, you can understand the differences between systems, and you can detect data quality issues and fix them at a single point.
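To make the "who wins" question concrete: a broker in the middle typically applies survivorship rules when versions of the same well arrive from several systems. The sketch below is a minimal illustration under assumed rules, not Informatica's actual MDM logic; the source names, trust ranking, and well attributes are invented for the example.

```python
from datetime import datetime

# Hypothetical trust ranking: a higher number wins an attribute conflict.
# In a real MDM tool this would be configurable, often per attribute.
SOURCE_TRUST = {"OSDU": 3, "OpenWorks": 2, "Recall": 1}

def merge_well_records(records):
    """Merge per-source versions of the same well into one golden record.

    Each record is a dict with 'source', 'updated' (ISO date), and
    arbitrary well attributes. Per attribute, the value from the most
    trusted source wins; ties fall back to the most recent update.
    """
    golden = {}
    attrs = {k for r in records for k in r if k not in ("source", "updated")}
    for attr in attrs:
        candidates = [r for r in records if r.get(attr) is not None]
        best = max(
            candidates,
            key=lambda r: (SOURCE_TRUST.get(r["source"], 0),
                           datetime.fromisoformat(r["updated"])),
        )
        golden[attr] = best[attr]
    return golden

wells = [
    {"source": "Recall", "updated": "2020-01-05",
     "name": "A-12", "spud_date": "2001-03-14"},
    {"source": "OpenWorks", "updated": "2021-06-01",
     "name": "A-12ST", "operator": "Acme"},
    {"source": "OSDU", "updated": "2022-02-10", "name": "A-12 ST1"},
]
print(merge_well_records(wells))
```

The point of the sketch is that conflict resolution becomes a declared, inspectable policy in one place, rather than logic buried inside dozens of point-to-point ETL jobs.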
Over time this allows a gradual decommissioning of legacy systems, or you can move them from being a master system into more of a slave role, where they simply receive data from OSDU and are no longer used to manage the actual key elements. So there are quite a lot of data management aspects to implementing a platform like OSDU: integrating data, understanding and improving its quality, managing the references between systems, translating values and structures, and keeping that single view of your key master data elements. These are the aspects we're trying to help operators with through our technology. I'll hand over to my colleague Hossam, who'll give a quick presentation of how we fit into the OSDU platform architecture and, if we have time, a quick demo of what it can look like in real life. So Hossam, you're going to take over the sharing? Yeah, I'll be sharing; can you hear my voice? Yeah, I can hear you, and I can see your screen.

Very good, thank you very much, Remy, and thanks to the attendees. Reflecting on many of the points Remy discussed: when we look at the OSDU platform journey, it is one that came about to tackle multiple aspects of liberating data across the landscape. That includes activities around governance, ingestion, and the ability to enrich the data, to furnish a strong foundation for streamlined analytical as well as operational insights within the business.
So when we look at the Informatica platform and how it accelerates this very lifecycle, it comes into the picture to boost the productivity of all of these different tasks, which furnishes a strong foundation for better adoption and, to a great extent, opens the gates to reusing the data and realizing its value. This becomes very visible not just in our data management components but also in the governance solutions, whether at a technical level or from a business perspective. There are always many processes and policies throughout the lifecycle that need to be documented well, so that understanding is consistent across the landscape and shared by all of the different business users. The quality issues are also a great pain point, something we've seen across many of the customers we've worked with, with a special case around how to cater for master data synchronization across the landscape. That is where master data management, or the 360 engagement solutions, come in. Just as Remy explained, they sit in the middle of the end-to-end landscape and open up enterprise-ready use of this data, not just to serve the OSDU platform, but to carry the insights hosted within OSDU into the other enterprise applications, be they ERP applications, asset management applications, and so on.
That leads us to a very short demo of how a well master comes into the picture to deliver this type of value and synchronization across the landscape. If we look at the interface of the master data management application, it is designed to deliver both business and engineering insights on the master records. In this case we're looking at the example of a well, with the master data attributes that define that well across the different touch points, whether those are source or target touch points that the OSDU platform will be using. You can see the information that gets synchronized across the landscape, ensuring consistency and removing duplicated data, whether that is in the attributes we're seeing, in the engineering insights, or in the hierarchies related to the well itself, which impact many different business processes. The solution also delivers an end-to-end, 360-degree view of the different relationships and hierarchies that a particular well has. All of this can be expanded to enrich activities undertaken by machine learning and AI components, but it's not restricted to analytics; it also delivers more enterprise-ready information around the master data assets.
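As an illustration of what synchronizing a golden well record into OSDU can look like, the sketch below builds a payload in the shape of an OSDU Well master-data record, of the kind that would be sent to the platform's Storage API. The kind string, ACL groups, legal tags, and partition name are placeholders that differ per deployment, and this is a hand-rolled sketch rather than Informatica's actual connector.

```python
import json

def build_well_record(partition, well):
    """Build a payload shaped like an OSDU Well master-data record.

    The ACL groups and legal tags below are placeholders; every OSDU
    deployment defines its own partition, entitlement groups, and tags.
    """
    return {
        "kind": "osdu:wks:master-data--Well:1.0.0",
        "acl": {
            "viewers": [f"data.default.viewers@{partition}.example.com"],
            "owners": [f"data.default.owners@{partition}.example.com"],
        },
        "legal": {
            "legaltags": [f"{partition}-well-master-tag"],
            "otherRelevantDataCountries": ["US"],
        },
        "data": {
            "FacilityName": well["name"],
            "FacilityID": well["id"],
        },
    }

record = build_well_record("opendes", {"name": "A-12 ST1", "id": "A12"})
# A real sync step would PUT a list of such records to the deployment's
# Storage API endpoint (api/storage/v2/records) with a bearer token.
print(json.dumps(record, indent=2))
```

Shaping the golden record once, centrally, is what lets the same mastered well be pushed consistently to OSDU and back out to the legacy systems.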
This is something we've seen to be very useful, especially when dealing with the equipment and materials that get reused within the context of the wells themselves, impacting supply chain business processes and the predictive asset maintenance use cases that many organizations are pursuing. So this is really an example of how the Informatica platform comes into the picture to deliver consensus around the master data, once these data assets have been integrated and taken through a lifecycle that delivers better consistency and standardization, making room to minimize implementation risk across the technical landscape, with the ability to support enterprise-level use of the information across the different technologies. This is something Informatica has been very well known for, given our quite agnostic approach to integrating with the different technologies within the upstream business, which significantly accelerates the use and adoption of the OSDU platform and some of the other solutions that many of our colleagues here on the call support. I'd love to leave some room for questions over the chat, so thank you very much for the attention and the intent listening from the rest of the community. These are our details: the email, the Twitter handle, and the LinkedIn profile, and the same for Remy. If you have any questions, now or at a later stage, we'd be more than happy to engage with all of you. Thank you very much.