Next up, I wanted to bring up the Mendix team to show us a few things about the work they're doing. We're going to have Nick Ford and Johan den Haan come up and show us what they're working on at Mendix. Come on up. Welcome to Cloud Foundry Summit. Yeah, great to be here. I know this is not your first one, but we're excited to have Mendix back and showing us the work you've been doing. Tell us a little about what you've done in the last year. You were on stage last year with some of your customers, but you've done a lot in the last 12 months, or as Molly said, 213 days. What does that look like for Mendix, and what's been going on? Yeah, so indeed, as you're referring to, we've been involved in the Cloud Foundry Foundation for quite a while; we joined it early on. And I think the reason we did that is that we share the philosophy behind the platform. If you look at Cloud Foundry, it's all about abstraction and automation for the deployment and operation of applications. And Mendix, with low code, is also all about abstraction and automation, but for application development. We give our users visual models with which they can define their applications, and then we turn those models into working applications automatically. Over the last 12 months we've seen a big uptake in the number of applications created on top of our public cloud, which runs on Cloud Foundry on AWS. We now have 40,000 applications running on Cloud Foundry in our public cloud across six regions, and half of those were created over the last 12 months. So 20,000 apps in the last 12 months. Yeah, all added to our public cloud. On top of that, we've signed some great strategic partnerships with IBM and SAP. You can now buy Mendix via IBM or via SAP and run on their clouds. That's also thanks to Cloud Foundry: you have this interoperability, so you can run these apps on any cloud.
And we actually started these relationships at previous Cloud Foundry Summits. That's what we're here for, to bring everybody together. Exactly. So let's look at an example. Yeah, let's see what Nick's got going over here. OK, hi everyone. What I want to do is talk you through a chatbot. Everyone loves a chatbot, right? So we're going to take this chatbot and make some changes to it. It's simple. We've trained it using the IBM conversational UI, so that's what's running the bot behind the scenes, and I've trained it with some simple dialogs. It's asking me if I want to take an image and scan it. What I'm going to do is allow the chatbot to take an image, scan that image, and give me some response back. Right now I've built the entry level for that, so I can take a picture of an individual. Take a picture of myself. Once I've got that, I hit OK, and it pops up inside the chatbot with the image. We have a button at the bottom that doesn't actually do anything right now. So I want to talk to you about how you can use low-code platforms to rapidly build capabilities into, in this case, a chatbot. Let's switch from here into the Mendix desktop modeler. This is where we build applications in low code, and we have some domain-specific languages, as you'd expect: the ability to build data models (there's a data model behind the facial recognition), WYSIWYG page editors for building out pages with drag and drop, and complex logic. In actual fact, this chatbot is a little bit more than I'm demonstrating today; it's a health chatbot that manages glucose for newly diagnosed diabetics. This is a microflow that manages that interface. It visually allows you to check streams of data coming into the app. This is how we build complex logic in Mendix. And this is the conversational UI.
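The trained dialog Nick describes (the bot asks whether you want to scan an image, then prompts for a picture) can be sketched as a minimal rule-based turn handler. This is a hedged illustration only: it does not use the actual IBM conversational service running behind the demo, and all names here (`detect_intent`, `handle_turn`, the keyword rules) are hypothetical.

```python
# Minimal, hypothetical sketch of the chatbot's scan-an-image dialog.
# This stands in for the IBM conversational service running behind
# the scenes; intent matching here is simple keyword rules.

def detect_intent(message: str) -> str:
    """Classify a user message into one of the bot's trained intents."""
    text = message.lower().strip()
    if "scan" in text or "image" in text or "picture" in text:
        return "scan_image"
    if text in ("yes", "y", "ok"):
        return "confirm"
    return "unknown"

def handle_turn(message: str, state: dict) -> str:
    """One dialog turn: returns the bot's reply and updates state."""
    intent = detect_intent(message)
    if intent == "scan_image":
        state["awaiting_confirm"] = True
        return "Would you like to take a picture and scan it?"
    if intent == "confirm" and state.get("awaiting_confirm"):
        state["awaiting_confirm"] = False
        return "Great, please take a picture."
    return "Sorry, I didn't understand that."

if __name__ == "__main__":
    state = {}
    print(handle_turn("Can you scan an image?", state))
    print(handle_turn("yes", state))
```

A real conversational service would use a trained intent classifier rather than keyword rules, but the turn-taking shape (intent, state, reply) is the same.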
So this is the screen we just saw rendered on the mobile device, now inside the desktop modeler, and we have a button here called Recognize. What I want to do is extend that button with some logic, some low-code logic in this case. In the interest of time, I've created a simple microflow. It has an entry point, it has an endpoint, and it takes in a parameter. That parameter is our chat message. I'm going to take that parameter and do something with it: push it out to an external service. In this case, we're going to use an AWS image recognition service. To do that, I head off to the Mendix app store, where there are thousands of reusable components, connectors, complete applications, and templates to bootstrap your application. I'm going to search for AWS, click on the AWS image recognition service, and download it into the desktop modeling environment. What this does is extend the fabric of the modeling environment itself. I'm not an experienced developer when it comes to integrating with REST or other external services, but I am trained on Mendix. So I can go back into my microflow, and if I look at the toolbox for a microflow and search for AWS, you'll see I have some new connectors I can start to use. The one I want is "detect faces and labels". We'll drop that onto the microflow itself. All I need to do to work with it is configure it. It takes in three parameters: I need to authenticate with the service, and I need to pass it an image, an image that it can recognize and give me data back about. So let's edit this. Let's pass in the chat message; the chat message is actually generalized as an image as well. And it takes a couple of other parameters: an AWS access key for authentication, and an AWS secret access key. There we go.
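The "detect faces and labels" connector configured above wraps an AWS image recognition call behind the scenes; with AWS Rekognition, the direct equivalent would be a `boto3` call such as `client.detect_faces(Image=..., Attributes=["ALL"])`. As a rough sketch of what comes back and how the chatbot's reply could be derived from it, here is a small parser over the shape of Rekognition's `DetectFaces` response. The helper name and reply wording are illustrative, not the connector's actual output.

```python
# Illustrative parser for an AWS Rekognition DetectFaces response.
# In the demo, a Mendix connector makes the service call; here we only
# show the response shape and how a chat reply could be derived from it.

def summarize_faces(response: dict) -> str:
    """Turn a DetectFaces-style response into a chat reply."""
    faces = response.get("FaceDetails", [])
    if not faces:
        return "I couldn't find a face in that image."
    age = faces[0]["AgeRange"]  # Rekognition returns an estimated range
    return (f"You appear to be human, aged between "
            f"{age['Low']} and {age['High']} years.")

# Example response, shaped like Rekognition's output with Attributes=['ALL']
sample = {
    "FaceDetails": [{
        "AgeRange": {"Low": 40, "High": 60},
        "Gender": {"Value": "Male", "Confidence": 99.1},
    }]
}
print(summarize_faces(sample))
```

The access key and secret access key passed to the connector correspond to the standard AWS credentials the underlying client would be configured with.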
That's done, and now we're nearly finished. The final thing I want to do is write the data that comes back from that recognition to my database. To speed things up, I've created a subflow, that is, a microflow that does that for me. So let's open that up. I want to make a call to a subflow, which is a microflow I've already created; I'll select it from my list. It's called "create chatbot". It automatically knows what parameters it takes, and we'll map them in from the microflow I have right now. So I'm done: I should now be able to call an external service from my chatbot. What I want to do next is push this into Cloud Foundry, a one-click deploy into our public cloud, which is Cloud Foundry running on AWS. The way I do that is just hit the button, tell it which environment I want to deploy to, and away it goes. While it's doing that, and it's going to take a couple of minutes, I'm going to hand back over to Johan to talk a little more about the complexity of apps. So we've seen how you can quickly change an application visually and then deploy it straight into Cloud Foundry. What you often hear about these kinds of tools, or visual programming in general, is that people think they're just for simple or toy applications. But actually, we're doing this for large enterprise customers, hence the number of applications running on our platform. This is an example of a hospital management system with multiple microservices that emit events and listen to events, and these microservices are completely cloud-native and scale on Cloud Foundry. Let's look at an example dashboard application to give you a flavor of a different type of app you can build using Mendix. This dashboard shows all the information, the events coming into the system, just to give you an example. And I think I've talked long enough now that the application is actually deployed, so let's switch back to Nick.
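The one-click deploy Nick uses hides a standard Cloud Foundry push. For anyone deploying against their own Cloud Foundry instead, the equivalent starting point is an app manifest; the sketch below is a hypothetical `manifest.yml`, not Mendix's actual deployment configuration, and the app name, memory, and buildpack are placeholders.

```yaml
# Hypothetical manifest.yml for pushing an app to Cloud Foundry.
# All values are illustrative placeholders.
applications:
- name: chatbot-demo
  memory: 1G
  instances: 1
  buildpack: java_buildpack
```

With a manifest in place, targeting a private Cloud Foundry is roughly `cf api <endpoint>`, `cf login`, then `cf push` from the app directory; the Mendix one-click button presumably automates the equivalent steps against its public cloud.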
Okay, so let's reopen the application itself. This time it should refresh the app, present us with our chatbot again, and we should be able to take an image, have it call out to the recognition service, and inspect that image. So we tell it that we want to inspect an image. Let's let the conversational UI catch up. I'm going to say yes, I do want to scan an image, and I'm going to take a picture of myself. I love doing this bit. So let's take a picture of me, hit OK, go back to our recognition service, and see what happens. Okay, so it's processing the image. Let's hope it's kind with its response. It's now writing this back... so it thinks I'm a human. Oh, you're human. That's good, I was nervous there for a second. And it reckons I'm somewhere between 40 and 60 years old. It's going easy on the age, Nick. Okay, so that's just an example of the kind of thing you can do really rapidly with low code and smart applications. Okay. Yeah, so that's how you can quickly build applications. One final note: you can try this out yourselves. Go to mendix.com and download the environment to build applications. As I said, we have native integrations with IBM and SAP, so you can deploy on top of those clouds or on our public cloud, which runs the open-source Cloud Foundry. But you can also just enter your own Cloud Foundry details and deploy straight into your private environment, build applications quickly, and even enable your business users to take part in building applications. That's awesome. So that is a great use case of what low code, no code can bring to the table with Cloud Foundry. Exactly, it's a great combination. It's a great marriage. Yes. Thank you so much, Johan and Nick. Thank you. Thank you. And it's been proven to us that you're human. Indeed.