Hello, everyone. We are pleased to have Kevin, the general manager of TestProject, share his experience on how to overcome five common obstacles in adopting open source automation. I'm sure all of us can relate to this topic. We'd like to thank TestProject for sponsoring this session and having Kevin join us today all the way from New York. Just a reminder: to the right of your screen you will see a Discuss button. Please use the Q&A section inside it to ask your questions. Without further delay, over to you, Kevin. Thanks, Sahil. First, before we get started, I wanted to say thank you for having us. I'm going to go ahead here and share my screen and make sure that everyone's able to see it. Alright, so as Sahil mentioned, we're going to talk about a topic today that we're quite interested in, which is open source test automation. Obviously everyone here, this being a Selenium conference, is interested in one way or another in open source test automation. So that's what we'll be talking about. A little bit about myself, and I'll make this quick since I know we only have about 15 minutes here. I'm the general manager for TestProject at a company called Tricentis. TestProject was acquired by Tricentis in 2019. We are a free automation solution provided to the community at no cost. It's built on top of Selenium and Appium, so if you're already using open source automation, I think you'll find it a nice tool to check out. Feel free to come by our booth and get a demo if you want to see this free product. I've been in the software testing market myself for about eight years and I really love open source testing. It's something I've been taking the lead on at Tricentis. I know Tricentis historically maybe wasn't thought of as a company that supported open source initiatives in a meaningful way, but we've certainly been doing a lot more of that recently. We acquired TestProject.
We've also acquired an open source load testing solution, Flood IO, and a company called SpecFlow in the BDD space, and we ran the 2020 State of Open Source Testing survey, which was one of the largest open source testing surveys ever. We'll review some of the results here as we go through the presentation and talk through some of the challenges that companies seem to be having today with open source. So you'll see here a nice little graphic from our survey. One of the first questions that we asked, which is very relevant to this, is: how important is an open source functional testing tool (things like Selenium, Appium, et cetera) to your functional testing process? And between the people who said it was very important and important, 92% of those we surveyed said essentially that they wanted or needed an open source tool for functional testing, which is why we say that open source is on fire. But we found that open source isn't only for experts. We were really surprised to see that when we surveyed the companies that were doing open source testing, the average tenure as a tester was about nine years. You'll see that on either end, we have experts with 15-plus years in the space doing test automation, and we also have people just getting started, with under five years in the field. So it's really a wide mix between people who have been doing test automation their whole career and people who are just getting started. One of the things we wanted to dispel in our survey was the idea that people are only using open source testing tools because they're free. We did ask people what they enjoy about using open source testing tools, and close to 40% said that cost was a major driver, a major factor, in them adopting the tools.
But a lot of other factors came into play, and when you sum these up, they're much larger than the responses about cost. Things like community support, integration with other tools, and ease of customization become really big reasons for people to adopt open source tools, and things people look for when they adopt an open source automation solution. So all in all, this is where we are today, and you can see the great numbers in this conference and in some of these other open source testing communities. There are at least 500,000 people we know of actively using open source testing tools, and that's just what we can see on LinkedIn; I think the number might even be 5x that. So I'd be confident in saying there are millions of people using open source tools for testing, and it's easy to see why when you look at some of the numbers from the survey we conducted. But it's not without challenges. Open source is great, but all test automation tools come with challenges, and open source comes with a specific set of them. One, we see that the technical skills required to create these tests are often very hard to come by. Two, the scripts built in open source tools can become very difficult to maintain. Three, test automation teams become siloed, with open source testing tools not really contributing to collaboration. Four, a lot of people have really slow-running test automation suites; sometimes their automated tests actually take longer to run than a manual test. And five, there's a lack of visibility into test results: fragmented test reporting and an inability to communicate the results of these open source testing tools up to the management levels.
So let's start with the first one: why do we see this challenge around a lack of the right skills to create tests in these open source automation tools? The number one thing is that the people creating these tests are oftentimes QA or testers, and very rarely developers. We see a lot of people saying developers should write all the tests; sure, that sounds great, but in reality it's really the testers writing the tests, and oftentimes they don't have the right skills or haven't been trained up. A lot of these people might be migrating from more of a manual testing approach to an automated testing approach, and that's where we see training and skills become one of the biggest impediments to functional testing with open source tools, based on our survey results. So what can you do about that? We recommend checking out recorders. There are a lot of great free recorders on the market, so no cost to you, at least from a license perspective. A lot of people are probably familiar with Katalon, and there's obviously Selenium IDE, which has been really overhauled and relaunched in the last few years. But there's also TestProject's recorder, so feel free to come by our booth; different strokes for different folks, I would say. One of the differences, for example: if you're using Python or C#, you may find that TestProject is a better recording solution for you, because it allows you to export into those languages and continue to learn how to code on top of the recording you've created with a scriptless recorder. So definitely check out these options.
They're all compatible with Selenium and built on top of it, so they should be great tools as you transition from a scriptless recording approach into more of a coding approach with Selenium. Now, we talked about test maintenance. This picture here depicts what being a test automation expert oftentimes looks like: you're constantly playing whack-a-mole with tests that keep failing for different reasons, a lot of times because a locator or element cannot be found when running a particular test. This is a common thing, and it might happen unexpectedly when a developer changes, say, the CSS properties or the ID of a particular element on the page. We've found that one approach that helps solve this common challenge in open source tools like Selenium is adopting a page object model. This essentially abstracts things at the page level and creates a reusable model, so that when things do inevitably change, it's easy to go to one place, update that locator, and have it change across all of the tests that interact with that page. Everything gets abstracted up from a separate, unorganized set of locators to a structure where we have pages, the pages have actions, and the actions map to elements, and that bit of abstraction really helps with test maintenance. This is common in a lot of test frameworks, and it's also built into free tools like TestProject and Katalon that are built on top of Selenium.
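To make the page object idea concrete, here is a minimal sketch in Python. The `FakeDriver` class is a stub standing in for a real Selenium WebDriver (so the structure is runnable without a browser); the locators and page name are hypothetical. In practice you would pass in `webdriver.Chrome()` and use real locators.

```python
class FakeDriver:
    """Stub that records actions instead of driving a real browser.
    Swap in a real Selenium WebDriver in an actual test suite."""
    def __init__(self):
        self.actions = []

    def find_element(self, by, locator):
        self.actions.append((by, locator))
        return self  # stand-in for a WebElement

    def send_keys(self, text):
        self.actions.append(("send_keys", text))

    def click(self):
        self.actions.append(("click", None))


class LoginPage:
    """All locators for the login page live in one place. When a
    developer changes an ID or CSS property, only these three lines
    need updating, and every test using this page picks up the fix."""
    USERNAME = ("id", "username")
    PASSWORD = ("id", "password")
    SUBMIT = ("css selector", "button[type=submit]")

    def __init__(self, driver):
        self.driver = driver

    def login(self, username, password):
        # The page exposes an action; tests never touch raw locators.
        self.driver.find_element(*self.USERNAME).send_keys(username)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()


page = LoginPage(FakeDriver())
page.login("alice", "s3cret")
```

A test script then reads as a sequence of page actions (`page.login(...)`) rather than a list of raw locators, which is exactly the abstraction described above.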
Another thing that can help with maintenance, and with reuse in general, is making sure that you build tests that are as reusable and as small as possible. So if you can build something like a login that can be reused across a number of different tests, you can essentially nest these tests within one another. This makes building out tests quickly very easy, and it makes the maintenance process simple as well. For example, if the URL of the site you're testing happens to change, rather than having to go and find where that is in every single test, if you've used the same login process consistently across every end-to-end scenario, it'll be a lot easier to make those updates. Likewise, if you parameterize and create variables for things like the site URL, username, and password, you can quickly reuse these tests in negative testing scenarios, like an unsuccessful login, and get the same benefits of not having to maintain two separate sets of steps, since they're essentially the same steps with different data or different variables. A lot of this comes down to taking a parameterized or test-data-driven approach: you wire up things like your URL, username, and password, pull those from a data source, and define them as parameters. This way you can easily create a number of different scenarios to do negative testing, test with different types of users with different permissions, and so on. One thing we do see a lot of people run into is that as they start to automate their tests, the automated tests actually run more slowly than their manual tests. A lot of the reason behind that is that they're not taking advantage of things like parallelization, which is essentially running automated tests in parallel.
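The data-driven pattern above can be sketched in a few lines. Here `attempt_login` is a stub standing in for the real, reusable login test against your application (the credentials and table are made up for illustration); the point is that one routine plus a parameter table covers both positive and negative scenarios.

```python
def attempt_login(username, password):
    # Stub: in a real suite this would drive the login page through
    # Selenium. One valid credential pair illustrates the pattern.
    return username == "alice" and password == "s3cret"


# Parameter table: (username, password, expected_result). The same
# reusable steps cover the happy path and negative scenarios; in
# practice these rows would come from a data source (CSV, DB, etc.).
scenarios = [
    ("alice", "s3cret", True),   # successful login
    ("alice", "wrong", False),   # bad password
    ("", "", False),             # empty credentials
]

# Run every scenario through the single shared login routine.
results = [attempt_login(u, p) == expected for u, p, expected in scenarios]
```

If the login flow or the site URL changes, only `attempt_login` needs updating; the scenario table stays untouched.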
This is an especially good approach when you're doing cross-browser testing: running the same scenario across multiple browsers at once, rather than in series, one after another. One way to really speed up the test automation process and get results more quickly is to run these tests in parallel, in different threads at the same time. A lot of companies, especially among our TestProject users, are taking advantage of browser and device farms to do this. These farms host machines and mobile devices for you that are ready at any time to run a test, and they execute the tests in parallel so you get much faster execution and results back to your developers. The most common ones we're seeing in our customer base are Sauce Labs and BrowserStack, and a lot of companies are also building out their own Selenium Grids, which is essentially a test cloud or a browser and device farm that they host themselves. All of these solutions work great; it just comes down to your security needs, what type of budget you have, and how much time you have to maintain the infrastructure. Hosting your own Selenium Grid can be a great option, but it is obviously going to require maintenance from you when servers go down or when new browser or operating system versions come out; it can be a good option, though, for companies that need a very high level of security. The hosted solutions run in the cloud, and many companies aren't yet ready to adopt the idea of executing tests in the cloud. There are different considerations to make here, but regardless of your needs, we definitely recommend, where possible, running parallel tests in a device or browser farm like one of these.
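The series-versus-parallel difference is easy to see with a small sketch using Python's standard thread pool. `run_scenario` is a stub; in a real suite each worker would open a Remote WebDriver session against a grid or device farm, and the browser list is illustrative.

```python
import time
from concurrent.futures import ThreadPoolExecutor

BROWSERS = ["chrome", "firefox", "edge"]


def run_scenario(browser):
    # Stand-in for real test execution time on one browser.
    time.sleep(0.2)
    return (browser, "passed")


start = time.monotonic()
# One worker thread per browser: the same scenario runs on all
# browsers at the same time instead of one after another.
with ThreadPoolExecutor(max_workers=len(BROWSERS)) as pool:
    results = dict(pool.map(run_scenario, BROWSERS))
elapsed = time.monotonic() - start
# In series these three runs would take about 0.6s of sleep time;
# in parallel the wall-clock time is roughly the longest single run.
```

Test frameworks give you the same effect declaratively (for example, TestNG's parallel suites or pytest-xdist), with the browser farm supplying the worker machines.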
Finally, we see that reporting is a major gap in test automation success. A lot of these tools will give you kind of generic reports, showing just the pass and fail results of the various tests in your suite, but you really want more than that, and most importantly, you want one consolidated report that shows results across all the different open source test automation tools you might be using. Make sure you're measuring how fast your tests run; you want that always getting faster, not slower, because slow tests slow down your ability to get feedback to your developers. You want to measure flakiness, which is the predictability of the results from executing the tests: are they consistent, and do the tests give you the right result? For example, when something works, does the test tell you that it passed, or does it sometimes give you a false positive and tell you that it failed? Coverage: how many of your business functions are covered by automated tests? Then it comes down to actually finding bugs; this is where you might want your reports to pull data from tools like Jira. You want to figure out how many new bugs you're finding. I think a lot of the value of test automation is catching and preventing new bugs, rather than just finding the same old bugs that pop up when an environment isn't properly configured or something common continues to go wrong again. Impact, I think, is a good one to tie back to the business: when you're doing your testing and you find that a function doesn't work, go one level beyond that.
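One simple way to put a number on flakiness, sketched below under the assumption that your reporting tool can export a pass/fail history per test across runs of the same build (the test names and history data here are made up): a test counts as flaky if its outcome flips between runs even though nothing changed.

```python
# Hypothetical run history: test name -> pass/fail result per run,
# all against the same application build.
history = {
    "test_login":    ["pass", "pass", "pass", "pass"],
    "test_checkout": ["pass", "fail", "pass", "fail"],  # flaky
    "test_search":   ["fail", "fail", "fail", "fail"],  # consistent failure
}


def is_flaky(results):
    # A test is flaky if its outcome flips at least once across runs
    # of the same build; a consistent failure is a real bug (or a
    # broken test), not flakiness.
    return any(a != b for a, b in zip(results, results[1:]))


flaky = [name for name, runs in history.items() if is_flaky(runs)]
flakiness_rate = len(flaky) / len(history)
```

Tracking this rate over time tells you whether your suite is becoming more trustworthy; a consolidated report would show it alongside speed, coverage, and bug counts.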
It's really crucial to know in the test report if, for example, the checkout process is not working, because that means when we go live there's a risk that nobody is able to check out and the company collects no revenue. It's an obvious example of the kind of data you would want to have available in your reports. So, just to wrap up here, take a look at some of these recommendations in terms of tools. Regardless of whether you want to use an open source tool or a commercial tool, four basic things you would want to consider are: a page object model; shared test and page element storage; the ability to create tests in a way that supports new and non-technical users; and test parallelization, which oftentimes comes with reporting. There are a number of different options: Selenium, Appium, obviously, TestNG, GitHub for storing your tests (you can also store your tests on a network storage device), creating your tests with Selenium IDE or tools like Katalon, and running your tests on BrowserStack, Sauce Labs, or Selenium Grid. But also feel free to come by our booth and check out TestProject; we cover a number of these areas as well and would love to help you take your Selenium and Appium testing to the next level. So, thanks for the time. I'm going to go ahead and stop sharing my screen so I can switch back here and check out any questions that came in while I was presenting. All right, let's roll through here. We can probably open for questions now. The only question I'm seeing here is about combinatoronics or scenario developer. I'm not familiar with those tools; I would have to look them up, but feel free to come by our booth. I'll be there for the next hour and a half or so, and I would love to learn more about those tools.
I think that was Mendoza who asked that question. So, I see a couple of questions from the audience; in the meantime, we can just run through them. Oh, yeah, we're getting some more in the Q&A section now too. All right. So one question: is TestProject going to be open source forever? Yes, our SDK will be open source forever, and the product itself, since it's hosted in the cloud as a free product, comes with a free-forever promise, which you can check out in more detail on our website. We'd also be happy to talk about that at the booth. A question about whether TestProject supports BDD: yes, using our SDK, we have a number of customers who are using TestProject with frameworks like Cucumber and SpecFlow, so we'd be happy to talk about that a bit more at the booth. And can you explain a little bit more about flakiness? Yeah, flakiness is the idea that when you run a test one time it might pass, and the next time it might fail, but nothing's actually changed in the application. It's a common thing that happens when elements and locators change, for example. The way that TestProject gets around that is we actually have a self-healing capability powered by AI, and that's brand new; we just released it, so anyone who's interested in building less flaky tests should definitely come by and check out a demo at our booth of the free self-healing in our new recorder that we released last week. That looks like it in the Q&A section. Let's see: can TestProject be implemented in a company's cloud, or does it have to be in your cloud? It's in the TestProject cloud for the moment; that's part of how we're able to deliver it for free. We are exploring options where, if people wanted to pay, they could deploy it in their own environment, but we haven't figured that out yet. Let's see, what happened to Protractor? Is it dead?
I don't know if it's dead, but I think Cypress is becoming a very popular tool for replacing Protractor for people testing JavaScript front-end frameworks like React. Apart from TestNG, is there any other tool that helps with parallel execution? I think Selenium Grid is the main one we're seeing in terms of helping people with on-premise parallel execution. Do you see keyword testing as being reliable? I think we do; that's something we'd love to show you if you want to come by our booth. And how much priority should be given to UI testing versus API testing? That's a tough question; it kind of depends on your skill set, what your app does, and how it's architected. I know we're a couple of minutes over, and I see a lot more questions coming through. Sahil, do we want to keep going with answering these, or can I answer these in the chat as we go on? Yeah, I think we can direct all the other questions to the booth, if that's okay. Yes, if we can somehow just send those over there, I'm happy to go over to the booth and start answering them. Yeah, so just to conclude, first of all, thank you Kevin for sharing your experience with us. For everyone, Kevin will be in the sponsor booth, so if you want to catch him with further queries, he'll be happy to help. And of course, again, a big shout-out to TestProject for sponsoring this talk. There are more than 1,000 people from many countries in this event right now, so it's actually a great opportunity to network. I'll encourage everyone to go into the lobby and VIP lounges to find the speakers and ask them questions, and not just the speakers, other attendees too, just to find out what's going on. And yeah, thanks all for attending this session. The slide handouts are already uploaded to the session, so you can grab them. Have a great evening and a great rest of the conference. Thank you. Thank you.