theCUBE presents UiPath Forward 5, brought to you by UiPath.

Welcome back to UiPath Forward 5, you're watching theCUBE's wall-to-wall coverage. This is day one, Dave Vellante, with my co-host, Dave Nicholson. We're taking RPA to intelligent automation. We're going from point tools to platforms. Neeraj Mathur is here, he's the director of intelligent automation at VMware. Yes, VMware, we're not going to talk about vSphere, Aria, well, maybe we are, but he's joined by Thomas Stocker, who's the principal product manager at UiPath. And we're going to talk about testing automation, automating the testing process. It's a big new vector in the whole RPA and automation space. Gentlemen, welcome to theCUBE, good to see you.

Thank you very much. Thank you.

So, Neeraj, as we were saying, Dave and I, you know, VMware was really half our lives for a long time, but we're going to flip it a little bit, talk about some of the inside baseball, talk about your role and how you're applying automation at VMware.

Absolutely. As part of really running the intelligent automation program at VMware, we have quite a mature COE; for the last four to five years we've been doing this automation across the enterprise. What we have really done is, across over 45 different business functions, we've automated quite a lot of different processes and tasks. So, as part of my role, I'm responsible for making sure that we are bringing in the best practices, making sure that we are ready to scale across the enterprise, but at the same time, how quickly we are able to deliver the value of this automation to our business users as well.

And Thomas, as a product manager, you know the product and the market inside and out, you know the competition, you know the pricing, you know how customers are using it, you know all the features. What's your main area of focus?

The main area is the UiPath Test Suite. For my role, it's RPA testing.
So, meaning testing RPA workflows themselves. And the reason is, RPA has matured over the last three years, we see that, and it has adopted a lot of best practices from the software development area. What we see is that RPA now becomes business critical. It's part of the main core business processes and operations, and testing it just makes sense. You have to continuously monitor and continuously test your automation to make sure it does not break in production.

Okay, and you have a specific product for this? Is it a feature, is it a module?

So, RPA testing is covered by the UiPath Test Suite. As the name suggests, it's a suite of products, and it's actually part of the existing platform. We use Orchestrator, which is the distribution engine. We use Studio, which is our IDE to create automation. On top of that, we built a new component called UiPath Test Manager. And this is a kind of analytics and management platform where you have oversight of what happened, what went wrong, and what the reason was for an automation to break.

Okay, and so Neeraj, you're testing your robot code. Correct. Right? Correct. You're looking for what? Governance, security, quality, efficiency, what are the things you're looking for?

It's actually all of those, but our main goal in starting this was twofold, right? We were really looking at how we deliver at speed, with a quality which we can maintain and sustain for a longer period. So, to improve our quality of delivery at a speed of delivery which we can sustain. The way we look at testing automation is not just as an independent entity. We look at this as a pipeline of continuous improvement for us, right? What the industry calls a CI/CD pipeline.
Testing automation is one of the key components of that, but the way we were able to deliver on speed is to really have that end-to-end automation done, from development to production, using that pipeline, and our testing is one piece of that. And the way we were able to improve on the quality of delivery is to have an automated way of doing the code reviews and an automated way of doing the testing using this platform as well, and then go through it end to end for that purpose.

Thomas, when I hear testing robots, I don't care if it's code or actual robots, it's terrifying, it's terrifying.

Okay, great.

You have some test suite that says, whoa, yeah, we've looked at the code.

Why is that terrifying?

It's terrifying because if you have to let it interact with actual live systems in some way, the only way to know if it's going to break something is either you let it loose, or you have some sort of sandbox where, I mean, what do you do? Are you taking clones of environments and running actual tests against them? I mean, it's like testing disaster recovery in the old days.

Imagine, imagine. So, we are actually not running any testing in the live production environment. The way we built this is to do the testing in a separate test environment, using very specific test data from the business, which we call a golden copy of that test data, because we want to use that data for months and years to come. So, we're not touching any production environments or anything.

All right, because you can imagine... Yeah, absolutely. Yeah, it's like, oh yeah, we've created a robot. It changes baby diapers. Let's go ahead and test it on these babies. Yeah, that's right. I don't think so.

Does it matter if there's a delta between the test data and the production data? How big is that delta? How do you manage that?

It does matter. And that's where that whole angle comes in of how much you can realistically test, right?
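As an editorial aside, the "golden copy" approach Neeraj describes can be sketched roughly as follows. This is a minimal, hypothetical Python example (none of these names or data come from VMware's actual bots): a curated business dataset is reused across test runs, the workflow under test only ever sees a copy, and production is never touched.

```python
# Minimal sketch (hypothetical names): testing a bot's workflow in a sandbox
# using a "golden copy" of business test data, never production data.
import copy

# The golden copy is curated once with the business and reused for years.
GOLDEN_INVOICES = [
    {"id": "INV-001", "amount": 1200.00, "status": "open"},
    {"id": "INV-002", "amount": 350.50, "status": "open"},
]

def process_invoices(invoices):
    """Stand-in for the RPA workflow under test: closes open invoices."""
    return [dict(inv, status="closed") for inv in invoices]

def test_invoice_bot_against_golden_copy():
    # Always run against a deep copy so the golden data stays pristine.
    data = copy.deepcopy(GOLDEN_INVOICES)
    results = process_invoices(data)
    assert all(r["status"] == "closed" for r in results)
    # The golden copy itself is untouched and reusable next run.
    assert GOLDEN_INVOICES[0]["status"] == "open"

test_invoice_bot_against_golden_copy()
```

The design point is the one Neeraj makes next: the golden data can only approximate production, so the remaining delta is an accepted, managed risk.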
So, there are cases, even in ours, where the production data might be slightly different from the test data itself. So, the whole effort goes into making sure that the test data we are preparing here is as close to the production data as possible. It may not be 100% close, but that's the sort of boundary or risk you may have to take.

Okay, so you're snapshotting that, moving it over, a little vMotion. Yeah. Yeah.

Okay, so, do you do this for citizen developers as well, or is it you guys, the center of excellence, writing all the bots?

No, right now we are doing it only for the unattended, COE-driven bots at this point in time.

What are your thoughts on the future? Because I can see some really sloppy citizen coders. Yeah.

Yeah, so as part of the governance we are trying to build for our citizen developers, there is a really similar consideration there as well. But we have not gone that far yet to build that sort of automation.

Neeraj, just if we talk about testing, what's the business impact been? And I'm interested in the overall platform, but specifically for the testing: when did you start implementing that, and what has been the business benefit?

So the benefit is really in the speed of delivery, which means that we are able to deliver more projects and more automation as well. Since we adopted this, the improvement in our speed is around 15%, right? So 15% better speed than previously. What we have also seen is that the success rate of our transactions in the production environment has gone to 96%, which, again, has a direct implication for the business, in that no manual exception handling or manual intervention is required for those failure scenarios.

So 15% better speed at what? At implementing the bots and actually writing code?

End to end, yes.
So from building the code, to testing that code, being able to approve it, and then deploying it into the production environment after testing it, this really has improved by 15%.

Okay, and what business processes outside of testing have you attacked with the platform? Can you talk to that?

The business processes outside of testing? You mean the ones which we are not testing ourselves?

Yeah, so just the UiPath platform, is it exclusively for testing?

This testing is exclusively for the UiPath bots we have built, right? So we have some 400-plus UiPath bot automations, and it's meant exclusively for those.

But are you using UiPath in any other ways? No, not at all. Okay, okay, interesting. So you started with testing?

No, we started by building the bots. So we already had roughly 400 bots in production. When the testing automation came along, that's when we started looking at it, and now we're building out that whole testing platform.

What are those other bots doing? Let me ask it that way.

There's quite a lot. We have many bots in finance, in order management, HR, legal, IT; there are a lot of automations out there. As I'm saying, there are more than 400 automations out there. So it's across the enterprise.

Thomas, both of you have a view on this, but Thomas's view is probably wider across other instances. What are the most common things revealed in tests that indicate something needs to be fixed? So think of a test failure, an error. What are the most common things that happen?

So when we started building our product, we conducted a survey among our customers, and without surprise, the main reason why automation breaks is change. And the problem here is that RPA is a controlled process, a controlled workflow, but it runs in an uncontrollable environment. So typically RPA is developed by a COE.
Those are business and automation experts, but they operate in an environment that's driven by new patches and new application changes rolled out by IT. And that's the main challenge here. You cannot control that. And so far, if you do not proactively test, what happens is you catch an issue in production when it already breaks, right? That's reactive, and that leads to maintenance, to unplanned maintenance, actually. And that was the goal right from the start for the Test Suite: to support our customers here and move over to proactive maintenance, meaning testing beforehand and finding those issues before they hit production.

So I'm, oh good, please. Yeah, yeah, yeah.

So I'm still not clear on, so you just gave a perfect example, changes in the environment. Yeah. Those changes are happening in the production environment. The robot that was happily doing its automation stuff before, everyone was happy with it. Change happens, robot breaks. Okay. You're saying you test before changes are implemented to see if those changes will break the robot. Yeah. Okay. How do you expose those changes that are going to be in a production environment to the robot? You must have a, is that part of the test environment? Does that mean that you have to have, what, fully running instances of, like, an ERP system? You know, a clone of an environment? How do you test that without having the live robot against the production environment?

I think there's no big difference to standard software testing. Okay. The interesting thing is that the change actually happens earlier. You are affected by it on the production side, but the change happens on the IT side or on the dev side. So you typically would test in a test environment that's similar to your production environment, or probably, in IT, in a pre-prod environment. And the test itself is simply running the workflow that you want to test, but mocking away any dependencies you don't want to invoke.
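As an aside, the pattern Thomas is describing here, mock away the side effects you must not trigger, then verify the result you expect, is standard software testing practice, and a minimal Python sketch might look like this. The workflow and function names are hypothetical, purely for illustration; the "send a letter to a customer" dependency is the example Thomas uses.

```python
# A minimal sketch (hypothetical names): run the workflow in a test
# environment, mock away the dependency with real-world side effects
# (mailing a customer), and verify the expected result.
from unittest import mock

def send_letter(customer_id, text):
    """Real dependency: would actually mail a customer."""
    raise RuntimeError("must never run in a test environment")

def dunning_workflow(customer_id, balance, mailer=send_letter):
    """Stand-in for the bot: sends a payment reminder when a balance is due."""
    if balance > 0:
        mailer(customer_id, f"Please pay {balance:.2f}")
        return "reminder_sent"
    return "nothing_due"

def test_dunning_workflow():
    mailer = mock.Mock()  # the dependency is mocked away
    result = dunning_workflow("C-42", 99.90, mailer=mailer)
    assert result == "reminder_sent"   # verify the expected outcome
    mailer.assert_called_once()        # the letter went only to the mock

test_dunning_workflow()
```

If the assertion fails after an application change, you get a failed test result and can fix and redeploy before anything breaks in production, which is the proactive loop described above.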
You don't want to send a letter to a customer in a test environment, right? And then you verify that the result is what you actually expect. And as soon as this is not the case, you will be notified, you will have a failed result, and you can act before it breaks. So you can fix it, redeploy to production, and you should be good.

The main emphasis at VMware is testing your bots, correct?

Testing your bots, yes.

Can I apply this to testing other software code?

Yeah, technically you can, and Thomas can speak better than me on that, to any software for that matter, but we have really not explored that aspect of it.

You guys have pretty good coders. But good engineers at VMware. But no, seriously, Thomas, what's that market looking like? Is that taking off? Are you applying this capability, or are customers applying it, for more broadly testing software?

Absolutely. So our goal was: we want to test RPA and the applications it relies on. So that includes RPA testing as well as application testing. The main difference is that typical functional application testing is black-box testing, so you don't know the inner implementation of the application. And it works out pretty well. The big opportunity we have is not isolated testing, not isolated RPA; what we talk about is convergence of automation. So what we offer our customers is one automation platform. You don't create automation redundantly in different departments; you create it once, perhaps for testing, and then you reuse it for RPA. That suddenly helps your test engineers move from a pure cost center to a value center.

How unique is this capability in the industry relative to your competition? And what capabilities do you have that are differentiators from the folks that we all know you're competing with?

So the big advantage is the power of the entire platform that we have with UiPath. We didn't start from scratch. We have that great automation layer.
We have that great distribution layer. We have all the AI capabilities that so far were used for RPA. We can reuse them, repurpose them for testing, and that really differentiates us from the competition.

Thomas, I detect a hint of an accent. Is it German, or?

It's actually Austrian.

It's Austrian, well, so. You know, don't compare us with Germans. I understand. High German, is that the proper, is that what's spoken in Austria? Yes, it is.

So, point being? Point being, exactly, as I drift off. Point being, generally German is considered to be a very, very precise language with very specific words. It's very easy to be confused about the difference between two things: automation testing and automating testing. Yes. Because in this case, what you are testing are automations. That's what you're talking about. You're not talking about the automation of testing, correct?

Well, we talk about... And that's got to be confusing when you go to translate that into... But isn't it both? ...50 other languages. Is it both?

It actually is both. And there's something we're exploring right now which is even the next step, the next layer, which is autonomous testing. So far, you had an expert, an automation expert, creating the automation once, and it would be rerun over and over again. What we are now exploring, together with a university, is autonomous testing, meaning a bot explores your application, tests it, and finds issues completely autonomously. So, autonomous testing of automation.

It's getting more and more complicated. It's getting clearer by the minute. Sorry about that.

All right, Neeraj, last question: where do you want to take this? What's your vision for VMware in the context of automation?

Sure, so I think the first and foremost thing for us is to really make it more mainstream for our automation developers themselves. What I mean by that is... so there is a shift now in how we engage with our business users and SMEs.
And as I said previously, they used to test it manually. Now the conversation changes to: hey, can you tell us what test cases you want? What do you want us to test in an automated manner? Can you give us the test data for that, so that we can keep on testing in a continuous manner for the months and years to come, right? The other part of the conversation that changes is: hey, it used to take eight weeks for us to build, but now it's going to take nine weeks, because we're going to spend an extra week just to automate that testing as well. But it's going to help you in the long run, and that's the conversation. So, to really make it much more mainstream. And then, out of all these kinds of automations and bots we are building, we're not looking to have test automation for every single bot. So we need a way to choose where the value is. Is it the quarter-end processing one? Is it the most business-critical one? Or is it the one where we are expecting frequent changes, right? That's where the value of the testing is. So, really bring that in as a part of our whole process and then, you know, go from there.

You're still fine doing that, great. Guys, thanks so much. This has been a really interesting conversation. I've been waiting to talk to a real-life customer about testing and automation testing. Appreciate your time.

Thank you very much. Thanks for having us.

All right, thank you for watching. Keep it right there. Dave Nicholson and I will be back right after this short break. This is day one of theCUBE's coverage of UiPath Forward 5. Be right back after this short break.
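Editor's note: the "autonomous testing" idea Thomas mentions, a bot that explores an application and finds issues on its own, is closely related to what the testing literature calls monkey testing or random exploration. A very rough, self-contained sketch of that idea, with an entirely made-up toy application (nothing here reflects UiPath's actual research), might look like this:

```python
# Rough sketch of autonomous exploration (monkey testing): a bot randomly
# exercises an application's actions and records any that raise errors.
# The application and all names here are hypothetical.
import random

class ToyApp:
    """Stand-in application with a deliberate bug in one action."""
    def __init__(self):
        self.items = []
    def add(self):
        self.items.append(len(self.items))
    def remove(self):
        self.items.pop()  # bug: crashes when the list is empty

def explore(app, steps=100, seed=0):
    """Randomly exercise the app's actions; return the failures found."""
    rng = random.Random(seed)  # seeded so a run is reproducible
    actions = [app.add, app.remove]
    failures = []
    for _ in range(steps):
        action = rng.choice(actions)
        try:
            action()
        except Exception as exc:
            failures.append((action.__name__, repr(exc)))
    return failures

failures = explore(ToyApp())
# With enough random steps, the empty-list bug in `remove` will
# usually surface in `failures` without anyone scripting a test case.
```

Real autonomous testing would explore a live UI rather than a toy object model, but the core loop, choose an action, execute it, watch for breakage, is the same.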