Hello, my name is Martin Schüffler, Product Manager at ASAPIO. Our topic today is how to get SAP data into the Microsoft Azure cloud. There are various approaches, such as CDC tools on the database level or middleware-based approaches, and both have some flaws. CDC tools typically don't know the business logic or the context of a change happening inside the SAP system: they don't know about approvals that are still outstanding, and many of the values in an SAP system that you want are calculated rather than stored, so they cannot easily be grabbed by a database tool and have to be computed again in your target system. CDC tools also often require more expensive database licenses; for SAP HANA, at least, they need an enterprise license to run. The middleware-based approach typically relies on data services that SAP makes available, or that you have to implement yourself, in order to pull data from the SAP system. But you never really know when to pull, so this also has drawbacks. It is a very flexible approach, and middleware layers are powerful, but for getting data quickly from the SAP system to your cloud it is not always the number-one approach.

So we came up with a solution that sits directly on the SAP system: an ABAP-based add-on that is installed on the ABAP NetWeaver stack. It runs on older SAP ECC systems, on SAP S/4HANA systems, and on other NetWeaver-based systems, and it works the same way everywhere; we only require SAP Basis as a core component and dependency. It also doesn't really matter whether your SAP system runs on-premises, in a private cloud, or already on Azure. If it already runs on Azure, that is of course a big plus for integrating with the Azure cloud, but it is not a prerequisite.
The tool is also certified for the SAP RISE program, so you can use it in conjunction with that. On the Azure side, our tool provides a connector to different Azure services; mostly we connect to the event-based services, so Event Grid, Event Hubs, or Service Bus, and from there you can forward the data to different receivers inside your Azure landscape. We do that because the event-driven architecture pattern is very useful for publishing information from the SAP system as it happens, since there are typically multiple consumers. There might be an analytics consumer, but also an application, either a third-party application or a home-grown custom application in the cloud, that needs the same information. With the typical pub/sub approach, you communicate the information once and multiple consumers receive it. We will talk about a few more use cases, and in the demos you will see two very concrete examples of how to do that.

What we are currently working on together with Microsoft is a more direct connection to the new Microsoft Fabric suite of tools. The approach we are taking with Microsoft is a more direct connection between SAP and the data lake, so that the data lake, as the basis for most Fabric services, is filled directly and does not have to be filled through the event broker services.

So what does our tool offer? We have a long-existing integration add-on that has been on the market for almost 10 years. It was expanded with different use cases and different connectors, and it has evolved into a powerful extraction framework that offers configuration-only data extraction. That is where the no-code part comes in, and where you can keep the SAP mantra "keep your core clean" true.
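The pub/sub fan-out described above can be sketched in a few lines. This is a conceptual illustration only, not ASAPIO's implementation or an Azure SDK call: the broker class stands in for Event Grid / Event Hubs / Service Bus, and the topic and handler names are made up.

```python
from typing import Callable

class EventBroker:
    """Stand-in for an Azure event broker (Event Grid / Event Hubs / Service Bus)."""
    def __init__(self) -> None:
        self.subscribers: dict[str, list[Callable[[dict], None]]] = {}

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic: str, payload: dict) -> None:
        # The core of the pattern: one publish, every subscriber receives it.
        for handler in self.subscribers.get(topic, []):
            handler(payload)

received = []

broker = EventBroker()
# Two independent consumers of the same SAP change event:
broker.subscribe("sap.salesorder.changed", lambda e: received.append(("analytics", e["order"])))
broker.subscribe("sap.salesorder.changed", lambda e: received.append(("webapp", e["order"])))

# The SAP side publishes the change once; both consumers get it.
broker.publish("sap.salesorder.changed", {"order": "0000012345", "event": "CHANGED"})
```

Compared with polling middleware, the SAP system here does the sending exactly once per change, and adding a new consumer is purely a subscription on the Azure side.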
So you configure what data you want and how it should look in the message payload, and then you connect it to a trigger inside the ABAP application layer. That is where we can react directly when something happens in the SAP system: we can react immediately and send the changed data out immediately. But, as most projects need, there are also facilities for batch loads, which are very efficient; we will see numbers later from existing customers on how they utilize that. The nice thing about the add-on is that it uses the same configuration for the real-time incremental events and for the initial full loads. You need those at the beginning of a project, when you don't yet have events for the objects you want in your cloud system but need reference data, for example the last year of sales order data or the last two quarters of invoices. You can easily use the framework to do such batch loads and then keep the same configuration active for the incremental loads as they happen. Because we are on the SAP side, we know exactly how many resources are available inside the SAP system, and we use SAP server groups to make sure we don't overload the SAP system while still getting a well-performing load into the cloud.

So what are these possible triggers we have been talking about? For a lot of the business objects, SAP has what they internally call a business object as a technical object, and these business objects have events on them. A sales order has events on it, as do purchase orders, material masters, accounts, and invoices. Some events are active by default, some have to be activated through configuration, but they are all available inside your SAP system.
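The "same configuration for full loads and incremental events" point can be illustrated with a small sketch. The configuration structure, field names, and function signatures below are illustrative assumptions, not the add-on's real API; the idea is only that one extraction definition serves both paths.

```python
# One shared extraction configuration (illustrative, not the product format).
CONFIG = {
    "object": "SALES_ORDER",
    "fields": ["VBELN", "ERDAT", "NETWR"],  # which SAP fields to extract
}

def extract(record: dict, config: dict) -> dict:
    """Apply the shared configuration to one record."""
    return {f: record[f] for f in config["fields"]}

def full_load(records: list[dict], config: dict) -> list[dict]:
    """Initial batch load: push historical reference data once, in bulk."""
    return [extract(r, config) for r in records]

def on_change_event(record: dict, config: dict) -> dict:
    """Incremental path: triggered per change event, using the same config."""
    return extract(record, config)

history = [{"VBELN": "1", "ERDAT": "20240101", "NETWR": 100.0, "INTERNAL": 1}]
batch = full_load(history, CONFIG)                      # project start
delta = on_change_event(                                # later, per event
    {"VBELN": "2", "ERDAT": "20240301", "NETWR": 50.0, "INTERNAL": 2}, CONFIG)
```

Because both paths share `CONFIG`, the batch-loaded reference data and the later event-driven deltas are guaranteed to have the same shape on the Azure side.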
You can then hook up our framework to react on those events when they happen, and our configuration kicks in to send out the messages. To not keep all of this on the slides but go into more detail, I want to show one example that we have configured in our system. It sends sales order data to an Event Hub for distribution, and we have attached a Stream Analytics job that reads that data and posts it for consumption by a Power BI dashboard, so that you have an up-to-date view of what is happening with your sales data.

So let me take the opportunity to quickly switch into the SAP system and show you how that is configured in our tool. This is an SAP system in our network. As with all configuration inside an SAP system, we go to the IMG, where there is an entry for our integration add-on, and there you configure the connectivity. We have all kinds of services connected; I'm looking for the Azure services, and I know that we are sending to Azure Event Hubs, so it will be in that connection. We have basically two interfaces connected here; the one we are interested in is the sales order interface. For an interface like that, we always configure how data is extracted, how it will be sent, and what the triggers are.

Looking at the trigger first: this is done using SAP event linkages, where we connect a business object event. Here it is connected to what is called BUS2032, which is the sales order business object in an SAP system, and to its "changed" and "created" events. If these events happen, they trigger this interface. The interface is configured with an extraction based on our payload designer, where you decide which tables you need and which data you want; you can use tables and joins to combine them.
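The event-linkage mechanism just described can be sketched as a lookup table from (business object, event) to a receiver. This is a simplified illustration of the concept, not SAP's actual event linkage implementation; the handler and function names are invented for the example.

```python
fired = []

def sales_order_interface(object_key: str, event: str) -> None:
    """Stand-in for the configured outbound interface."""
    fired.append((object_key, event))

# (business object type, event) -> receiver, like an SAP event type linkage.
EVENT_LINKAGE = {
    ("BUS2032", "CHANGED"): sales_order_interface,
    ("BUS2032", "CREATED"): sales_order_interface,
}

def raise_event(obj_type: str, event: str, object_key: str) -> None:
    """Called by the application layer when a business object event occurs."""
    handler = EVENT_LINKAGE.get((obj_type, event))
    if handler:  # only linked events trigger an interface
        handler(object_key, event)

raise_event("BUS2032", "CREATED", "0000012345")   # linked: interface fires
raise_event("BUS2032", "DELETED", "0000012345")   # not linked: ignored
```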
First you define how the tables are linked together: will there always be data, or will data sometimes be missing? That is where you have the outer join choice. Once you have the tables, you can just add fields; we already did that for our demo example. On the left-hand side you see the more cryptic SAP field names that typical cloud developers don't know anything about, and you can map them to more readable, more descriptive field names in the payload that we generate. For the event services we typically generate a JSON payload, as that is the typical format used. You can do even more here: you can add a conversion class if you want coding logic for how a value is computed. In a simple example like this one, though, you can also just use the values as they are.

So if we now changed the sales order, an event would be triggered. I also want to show you how you would monitor this inside your SAP system; we have a monitoring transaction for that. Because we have so many connections, I will focus on the one we have been looking at, the Azure Event Hub. You then see a list of the calls that we made to our Event Hub in the last few hours, and there is also one sales order change that was triggered here. If it is activated in the configuration, you will also see the exact traces, that is, the exact message that was sent. Typically this will be off in a production environment, but if the cloud endpoint returns an error, we automatically turn it on, because the typical cloud errors we see are of a more temporary nature and are not easy to reproduce. So when an error occurs, it is always good to have the trace active, to see what was sent and what the response was, and to be better able to troubleshoot.
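The payload-designer step can be sketched as follows: join a header record to its item records (keeping the header even when items are missing, the "outer" choice), map the cryptic SAP field names to descriptive ones, and emit JSON. The field mapping below is a made-up example, not the product configuration; VBELN, KUNNR, MATNR, and KWMENG are real SAP field names for sales order number, customer, material, and order quantity.

```python
import json

# Illustrative mapping from SAP field names to readable payload names.
FIELD_MAP = {
    "VBELN": "salesOrderNumber",
    "KUNNR": "customer",
    "MATNR": "material",
    "KWMENG": "quantity",
}

def build_payload(header: dict, items: list[dict]) -> str:
    """Render one sales order as a JSON payload with mapped field names."""
    doc = {FIELD_MAP[k]: v for k, v in header.items() if k in FIELD_MAP}
    # Outer-join behavior: the header survives even with an empty item list.
    doc["items"] = [
        {FIELD_MAP[k]: v for k, v in item.items() if k in FIELD_MAP}
        for item in items
    ]
    return json.dumps(doc)

payload = build_payload(
    {"VBELN": "0000012345", "KUNNR": "ACME"},
    [{"MATNR": "MAT-001", "KWMENG": 10}],
)
```

A cloud consumer then sees `salesOrderNumber` and `quantity` instead of `VBELN` and `KWMENG`, which is exactly the readability point made above.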
From here you can also jump into the application log: all these entries are linked to the standard application log. So if you have some kind of integration that checks the application log, or links it to a Solution Manager, for example, you can use that as well, because if there was an error, we also post it to the application log. There are also facilities to reprocess messages. For a successful message that might not be very useful, but it can be done manually here: you just reprocess the message and it will be sent again. If there was an error, you typically have a job running that checks for failed calls and automatically reprocesses them in the background. Customers typically schedule that every few minutes, some every minute, so that failed calls are reprocessed as soon as possible.

So with this configuration we are sending the sales order data into our Power BI dashboard, which I also wanted to show; let me bring up my browser real quick. There it is. We built a simple dashboard that collects info from our sales orders, breaks it down by material number, for example, and keeps it updated with the latest data we send to the Event Hub. So this is one example where analytics is involved, and the Stream Analytics job is a good approach to take that further.

Another example I wanted to show, which we also built together with Microsoft, is an approval integration into Microsoft Teams for whenever a purchase order is created and sent for external approval in the SAP system. The configuration on the SAP side is very similar: we send the event to Event Grid, and from there a Logic App is triggered that publishes a message into a Teams channel. That is the part I wanted to show you real quick.
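The failed-call reprocessing job just described can be sketched like this. The queue, the send function, and its flag are illustrative assumptions standing in for the real HTTP call and the monitoring tables; the point is only the retry loop that a scheduled job runs every minute or every few minutes.

```python
failed_queue: list[dict] = []

def send_to_cloud(message: dict, endpoint_up: bool) -> bool:
    """Stand-in for the HTTP call to the Azure endpoint."""
    return endpoint_up

def send_or_queue(message: dict, endpoint_up: bool) -> None:
    """Normal send path: park the message if the endpoint errors out."""
    if not send_to_cloud(message, endpoint_up):
        failed_queue.append(message)  # also logged to the application log

def reprocess_job(endpoint_up: bool) -> None:
    """The scheduled background job: retry everything that failed."""
    still_failing = []
    while failed_queue:
        msg = failed_queue.pop(0)
        if not send_to_cloud(msg, endpoint_up):
            still_failing.append(msg)  # keep for the next run
    failed_queue.extend(still_failing)

send_or_queue({"order": "0000012345"}, endpoint_up=False)  # temporary outage
reprocess_job(endpoint_up=True)                            # next run succeeds
```

Since the cloud errors seen in practice are mostly temporary, this retry-on-a-schedule pattern usually clears the queue without manual intervention.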
I triggered one earlier, just before the call. We have a Teams channel, called "Azure demo" here, that gets an adaptive card from that Logic App. In the adaptive card you see that a purchase order was created; it has one item here, and you see information on that item, the quantity, and the price. You can now approve or reject, and when you click one of these buttons, it goes back to the Logic App, which posts the decision back to the SAP system so that the SAP system can update the purchase order accordingly.

So that's it for the very quick demonstration. Later I will have a slide where you can book a dedicated demo meeting that lasts a bit longer, with more time to go into the details. But I also wanted to bring up some numbers from existing customers, so that you can see that real customers are using these integrations and what kind of throughput they actually have. For example, there is a procurement use case where a customer fills a web application, a dashboard for their procurement team, to see which purchase orders have been placed, whether the goods receipts have already been booked, whether the invoices have been posted; all of that is consolidated into a web frontend. And these are their typical daily numbers, not peak load but just day-to-day business: they are sending 200,000 updates to purchase orders every day and 1.2 million purchase requisitions, and these are all combined on the Azure side for the web application and also for some analytics integration.
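The Teams approval card just shown could look roughly like the sketch below: an Adaptive Card with the purchase order facts and Approve/Reject submit actions whose data the Logic App posts back so SAP can update the order. The card content, helper name, and the `decision` field are examples I made up for illustration; only the card element types (`TextBlock`, `FactSet`, `Action.Submit`) are standard Adaptive Card schema.

```python
def build_approval_card(po_number: str, item: str, qty: int, price: str) -> dict:
    """Build an Adaptive Card dict for a purchase order approval request."""
    return {
        "type": "AdaptiveCard",
        "version": "1.4",
        "body": [
            {"type": "TextBlock", "weight": "Bolder",
             "text": f"Purchase order {po_number} created"},
            {"type": "FactSet", "facts": [
                {"title": "Item", "value": item},
                {"title": "Quantity", "value": str(qty)},
                {"title": "Price", "value": price},
            ]},
        ],
        # Each button submits its data back; the Logic App forwards the
        # decision to the SAP system to update the purchase order.
        "actions": [
            {"type": "Action.Submit", "title": "Approve",
             "data": {"po": po_number, "decision": "APPROVED"}},
            {"type": "Action.Submit", "title": "Reject",
             "data": {"po": po_number, "decision": "REJECTED"}},
        ],
    }

card = build_approval_card("4500000123", "Laptop", 5, "999.00 EUR")
```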
Other customers have tried out what the throughput can be and pushed it to the limits to see what is really possible. Of course it always depends on your SAP environment: how big your SAP installation is, how many application servers you have, what kind of server group you used for the tests. All of that plays into these numbers, but as you can see, quite impressive numbers are possible with the solution. We have been benchmarked against other tools and typically come out ahead. We always thought CDC tools running directly on the database would be faster, but it seems they are not necessarily. So you are welcome to do a POC together with us to see what the numbers are in your system and how they compare to these.

Now I want to take a few minutes to broaden the use case space a little, going from the detailed "how did we do it" of these demos to what other customers are doing with the tool. There are quite a few retail customers; they have a lot of pain points with their SAP systems right now around getting information out into their cloud systems, where they typically have some kind of online shop. They need to update stock levels, and have to be up to date on what is really sold and what they can still sell, so that these different systems stay in sync. That is also a typical place where plain data replication falls down a little: the available-to-promise quantity, which is very important for knowing your real stock level, is not just a number on the database but a computed number that depends on your configuration and on different data sources inside your SAP system. It takes into account your sales orders, potentially your deliveries, your reservations, and your purchase orders; various things play into the number, and there is no single field on the database that reflects it, so it is quite hard to get with just database replication.

What is also really key for these customers is shipping information: when a delivery is created, when the deliveries are planned, when they go out, when the goods issue happens. These things are integrated into the cloud system, typically with logistics partners that do the actual delivery to the customer, so that tracking is more real-time for these sales. They also typically have an analytics use case attached, so it is very typical that they use the event-driven approach, sending out the message once and having multiple consumers. But they also have all kinds of other use cases, some of which might be inbound, like getting product pricing from competitors through some cloud application and feeding it back into the SAP system for pricing update decisions.

There are also other customers with more of the procurement use cases we saw, where it is important to send out information when purchase orders are created, when goods receipts happen, and so on. Other very common use cases are in the area of maintenance and services: if you have equipment that has to be serviced, you want to know where it is, what the life cycle of the equipment is, where it is currently located if it has been moved, and when the next service is due, and you want these things to go out immediately into planning systems. Another example is publishing production orders into shop floor systems: production planning typically happens in the SAP system, but there may be other systems doing the fine-grained planning on the shop floor, so they have to be kept in sync. Existing solutions are typically not real-time; they don't get the information when it changes inside the SAP system but on some other schedule. These are use cases where this approach can be really helpful.

We have over 200 customers globally using our tool. Not all of them are direct customers of ours, because we also license the tool and the framework to SAP: we are part of two SAP standard products, the SAP Fieldglass integration and the connector for the SAP Event Mesh from ECC and S/4. Both of these SAP products use our framework with different connectors, so a lot of customers happily use it. As you might have guessed from the numbers we have shown, they are typically the big customers with larger use cases, where a lot of data is transferred and existing approaches might fall down more easily. We also have a lot of customers switching to the event-driven approach from a middleware-based approach because it reduces load on the SAP system: there is not much polling going on, we push out the data instead.

We also have a partnership with Microsoft; we are in close exchange with them on how to improve the connectors and what to do next. There is also an entry on the Microsoft Azure Marketplace, and as you have seen in the beginning, we are in talks to support Microsoft Fabric, which is pretty new, to integrate that with your SAP system as well. There is documentation available publicly; even if you don't have a contract with us, you can go there and check what the configuration looks like and what you have to do. There are also some demo videos up already; I put the links here in the slides as well so you can double-check, and this webinar will also be published there.

And there are of course ways to contact us for real-life demos or for more in-depth technical sessions. There is a web form you can fill out to find and book your demo session, typically either with myself or with my colleague Benedict, directly through the website. You can also contact us by email, of course, if no matching slot is available, maybe because your time zone is not reflected in our booking system or simply no slot is free. And if you have more questions regarding licensing or more sales-related topics, you can contact my colleague Florian; he also has a booking tool where you can book meetings with him. Thank you so much for attending the webinar, and have a good day.