Welcome to Fedora, welcome to Nest. This presentation is about QA, and I'm going to talk a little bit about how you can get started with QA. Since I'm talking to a recording at this point, I'll be taking all the questions at the end. Note that this presentation has been made recording-friendly, so even after the recording is over and it's live on YouTube, you can still access all the links; you don't have to write them down, because they have been built into the presentation, as you will see. A little bit about me: I'm Sumantro. This presentation was supposed to be given together with a colleague of mine, who may be joining later. In a nutshell, we as part of Fedora QA take care of a lot of testing in Fedora, and "we make Fedora better" is the motto we swear by.

Now, what does testing entail for anyone who is new to Fedora and new to open source, someone who is maybe distro hopping but wants to know a bit more about Fedora? Testing in Fedora entails a lot of things, and I'm going to cover a couple of them here in more detail, so that everybody can understand what's going on and how they can participate in test days and everything else. Fedora releases every six months, so with every new release we get a lot of changes. As you probably saw in Matthew's presentation yesterday, Fedora gets a lot of new features every release, and these features need to be tested so that we don't run into regressions, bugs, or showstopper situations. That's where our first priority lies: we try to test all the new features and change sets that come along with each new release.
Moving on: every time there is a new release, you get a version bump of a lot of packages, done by packagers or landing during a mass rebuild. All of these packages are supposed to be tested by the QA team and by the people who help the QA team, who we call the Heroes of Fedora, because they really are the heroes who help make Fedora the best operating system out there. We also pay special attention to the critical path packages, for example GNOME Settings, the entirety of Nautilus, and things as critical as the kernel. We make sure that these packages, and any bugs related to them, are taken very seriously; we debate them during our QA meetings, and there is a blocker bug meeting run specifically to make sure we don't ignore those bugs. That's part of our job.

The next thing is that we run a lot of test days on a regular cadence. Test days are something I'm going to cover in detail in the slides to come, but a test day is essentially a whole day of testing dedicated to a specific feature; that's exactly its aim. Other than that, we do something called release validation. Again, release validation is something I'm going to cover as we go, but it's all about taking an ISO, a compose per se, and running a lot of test cases against it to see how it works and whether it passes everything. That's validating the ISO, or release validation for short. We also write a lot of test cases, both manual and automated. We use Wikitcms, which uses the Fedora wiki as a test case management system, and we have openQA, our automated testing system, which we use for testing releases and, nowadays, updates as well.
Finally, we love to develop tools that make life easier for QA contributors. Our team has been participating in developing things like the Packager Dashboard; the previous talk may have covered a bit of what the Packager Dashboard is and how it works. We develop and maintain the testdays app, we have Testcloud, we have the QA dashboard, which I'm going to cover in this talk, and we maintain the backends of all of these good things, and many more. To put it briefly, the QA team is a very small team of people who do a lot of things, and we are very thankful for the passionate contributors across the globe who help us run the show; it really rests on everyone's shoulders.

Here is exactly how you, as a person, can get started with Fedora QA, in five simple steps. All of you who have been at Nest should already have a FAS account. If you don't know, the FAS account is the account we maintain as your identity inside Fedora: all your contributions get mapped to your FAS account, and you become part of FAS groups, which reflect the functional areas you contribute to. Then, we use mailing lists a lot. We use the QA list, and we use the test list, which is the list where you come as a contributor to introduce yourself: who you are, where you're from, and what your interests are. If you're interested in containers or CoreOS, you would come along and say, "Hey, my name is such-and-such, I'm from such-and-such place, and I'm interested in testing containers." We would then reply to that email with something like, "Well, we have a container test day running, or a CoreOS test day coming up; here are a bunch of test cases we're trying to develop for CoreOS, and we hope you can pitch in." All of these discussions happen over the mailing list.
So it's very important, if you want to get involved, that you join the mailing list, where all kinds of discussions happen. Beyond that, there are two other ways you can get started right away. The first is the QA meeting, held every Monday. There we talk about general topics, the status of whichever release is current, F35 for instance, and a little bit about what is going on in Fedora; if there's a bug we need to discuss, that happens there too. We also cover test days, which, again, are events where the community comes together to test a certain feature on one specific day, and we talk about upcoming and ongoing test days. So if you attend a QA meeting, you get, in a nutshell, everything that happened during the whole week. We usually do not skip those meetings, but when it comes to two or three meetings in a row with nothing solid to talk about, we still refer to the mailing list as the source of truth where all the discussion happens.

Finally, there is something we organize every three-odd months: an onboarding call. An onboarding call is like this video call, but lengthy, about one and a half hours, where I sit together with another QA member and we demo every single tool and every single process. If you're one of those who is just getting started, the onboarding calls are the best place to begin, so look out for them. I would suggest going to Pagure, at pagure.io/fedora-qa, where you will find an onboarding call ticket with the latest timings and what's happening. So those are the five simple steps, and knowing this much puts you in a position to figure out what you might be interested in doing.
That's exactly what we're going to go over next. Now, we test with friends; testing something without a bunch of people helping you out here and there is kind of boring. So we hang out on IRC, we refer to a bunch of other places, and we do this testing together with the community. That gets us a lot of actionable feedback on the compose, very upfront feedback for the developers. There are three areas where you can get started today if you want to.

The first is release validation. Release validation is for folks who can run a VM: they download the latest compose, and then all they have to do is validate that compose against certain test cases. These test cases map to what we call the release criteria. If a test case is broken and doesn't work on a particular release, it violates a criterion, and we call the underlying bug a blocker bug; that's how we move a release from Rawhide to Beta to Final. That's the process we follow. So we take, let's say, today's latest compose, which might become tomorrow's Beta, we pick it up as soon as it's built, and we start QA-ing it. Now suppose the network is not working, the Wi-Fi is broken, or it won't connect to your SSID. That violates a criterion, so we discuss it in the QA blocker bug meeting, or we propose it as a blocker bug and put it in the Blocker Bug app, after filing it in Bugzilla, of course, and then we keep validating and tracking it. So this path is for people who have some time and some hardware resources to work with.
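To make that flow concrete, here is a small illustrative Python sketch of the logic just described: test cases map onto release criteria, and a failing test case on a criterion is what gets proposed as a blocker. Every name in it (the criteria, the test case IDs, the function) is hypothetical; the real process runs through the wiki test matrices, Bugzilla, and the Blocker Bug app, not through code like this.

```python
# Hypothetical sketch: release criteria tied to the test cases
# that exercise them. All names are made up for illustration.
CRITERIA = {
    "basic-networking": ["Testcase_wifi_connect", "Testcase_wired_dhcp"],
    "installer-boots": ["Testcase_boot_default_install"],
}

def proposed_blockers(results):
    """results maps a test case name to 'pass' or 'fail'.

    Returns the criteria violated by at least one failing test case;
    the bugs behind those failures would be proposed as blockers."""
    return [
        criterion
        for criterion, cases in CRITERIA.items()
        if any(results.get(case) == "fail" for case in cases)
    ]

# Wi-Fi refusing to connect violates the networking criterion, so the
# bug gets filed in Bugzilla and proposed for the blocker bug meeting.
print(proposed_blockers({"Testcase_wifi_connect": "fail",
                         "Testcase_boot_default_install": "pass"}))
# -> ['basic-networking']
```

The real release criteria are much richer (Basic, Beta, and Final each have their own pages), but this pass/fail-to-criterion mapping is the core idea.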
Now, for someone who is getting started and has very little time, the next two things are a better fit. The first is updates testing, which is simple. Every time we release a new version of the operating system, the previous versions that are not yet end-of-life still need updates, and these updates need to be tested. That testing is tracked by a web dashboard called Bodhi, where you can give packages karma: positive karma meaning they work, negative meaning they don't. That's exactly what we call updates testing.

Test days, on the other hand, are special events. They don't have a fixed cadence; they mostly track whatever new comes in from the change sets, although we have some regular test days as well. We take all of these and put them into a time frame of, say, one day, two days, or even a week; for the kernel, you might find something like a kernel test week. We test that exact thing for those days and give direct user feedback straight back to the developer: hey, these features are working as expected, and these are not. All of this is tracked using the testdays app. So if you are getting started, you will probably see a Community Blog post or a Fedora Magazine post announcing a test day, and that's the easiest route to get involved. All of these QR codes point to the right wiki pages, so whenever you want to refer back after the recording, scan them, save them to Pocket if you need to, and use them whenever you like. Release validation, updates testing, and test days are the consumption side of QA: the activities most community members usually take part in.
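As a rough mental model of how karma drives updates testing, here is a toy Python sketch. Treat the +3/-3 thresholds as assumptions: they are common defaults, but in real Bodhi they are configured per update, and the actual push logic involves more than karma alone (time in testing, gating, and so on).

```python
# Toy model of Bodhi-style karma, for illustration only.
# Thresholds are assumed defaults; real updates configure their own.
STABLE_KARMA = 3      # enough positive feedback to push to stable
UNSTABLE_KARMA = -3   # enough negative feedback to unpush

def update_status(votes):
    """votes: list of +1 ('works for me') / -1 ('broken') from testers."""
    karma = sum(votes)
    if karma >= STABLE_KARMA:
        return "push to stable"
    if karma <= UNSTABLE_KARMA:
        return "unpush: update is broken"
    return "keep in updates-testing"

print(update_status([+1, +1, +1]))  # three testers confirm it works
print(update_status([+1, -1]))      # mixed feedback, needs more testing
```

So a single tester installing a package from updates-testing and leaving honest karma genuinely moves an update along; that is the whole appeal of this contribution path.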
But the most interesting part is writing test cases, and that is the need of the hour, to be very precise. If you are someone who is familiar with Linux and wants to write test cases for us, for packages, we would love to have you on board. The easiest way is to follow the SOP we have in place, the SOP for test case creation; again, the QR code will point you to the right wiki page. There are two parts to this. The first is writing package test cases. This matters when, for example, you have been using a particular package or feature for a really long time, but when you go to Bodhi to give that package karma, you see no test case attached. That's exactly what we want to eliminate: we want contributors to write and maintain some of these package test cases in the form of wiki pages. We cover how to do that in depth in the onboarding sessions, but in short it follows the SOP, a very basic wiki template that we ask contributors to use to create test cases for the packages they use.

Often, if someone is very new to test cases and doesn't quite know how to write one, or isn't familiar with the wiki, we ask them to update existing test cases instead. We have a lot of test cases that stop making sense after a couple of releases because they have gone a bit stale, so it is very important that we keep doing what we call wiki gardening. The whole idea is that we need to update these test cases before we run a test day or something bigger like release validation. In all cases, we go through the existing test cases and update anything that seems off, doesn't work, or feels dated.
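For a feel of what the SOP's template looks like, here is a skeleton of a package test case. The field names follow the QA/Test_Case wiki template as I understand it, but the package and the steps are made-up placeholders:

```wiki
{{QA/Test_Case
|description=Verify that examplepkg can open a plain text file.
|setup=Install the package: sudo dnf install examplepkg
|actions=
# Create a file: echo "hello" > /tmp/hello.txt
# Run: examplepkg /tmp/hello.txt
|results=
# The application starts and displays the contents of the file.
}}
```

Once a test case like this exists, it can be linked from the relevant package so testers in Bodhi have concrete steps to follow when leaving karma.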
One extension of the test cases that we did very recently: most of us have, or try to have, a dual-monitor setup these days, but our test cases were not set up to block on bugs that only show up when you are using dual monitors. So we went ahead and extended a particular test case to cover that class of bugs. Now, if a reporter finds something not working on a dual display, it can be proposed as a blocker bug, because we added that as part of the criteria: if this test case fails, the corresponding criterion is violated, and we call the bug a blocker. That's the kind of thing we keep doing, and it's one of the ways you can participate and help drive the next generation of test case creation.

Now, Fedora has always been a very, very vast project, and by vast I mean it keeps coming out with emerging technologies and platforms that we have never actually tested in real time before. When they come out, it becomes really important to rely heavily on contributors, and the QA team helps a lot of contributors test, or rather maintain, these new and emerging platforms. We have Fedora CoreOS and Silverblue, which are emerging, and we have three official editions: Workstation, Server, and IoT. Workstation, Server, and IoT are our primary blocking deliverables, so if something goes wrong in any architecture of Workstation, Server, or IoT, it can block the release to some extent, of course. We hold discussions around these blockers in the Blocker Bug app, a separate app, again developed by the QA team, to help contributors file blocker bugs and keep track of them. Now, coming back to emerging platforms.
CoreOS has always been very interesting because it converges containers and a lot of other things at a time when containers are proliferating. So if anyone is interested in containers and wants to help test some of this, reach out to me or to the QA test list, and we will help you get sorted, or rather started, with CoreOS testing. So far we have mostly been running Silverblue and CoreOS test days, which is a starting point, but once either of these gets to edition status, we will want a blocker process and release validation for each of them, which in turn means we will want more contributors and more testers. That's the kind of thing we keep doing release cycle after release cycle, it never stops, and the fun is always part of it.

Now, to tell a little story: when I got onboarded to the QA team, there was already a goal of bringing in a lot of contributors. The only problem is that our data is spread out over multiple places. The QA team has to look at Bugzilla reports, at emails on the various contributor lists, and probably at other lists like the desktop list and the kernel list if something has gone wrong in a specific new version. We look in every other place to gather information, and it becomes hard. And if you are someone who comes back to do QA once every three or four months, or even less often, it becomes really hard to understand how the process is working, and staying on top of all the information that comes in during a release cycle is hard too.
To make that easy, the QA team decided, or rather brainstormed, to create a dashboard that gives you a neat, one-stop view of how to get started with QA, without individually looking into every single thing, a Bugzilla report here, an email there, then Bodhi, and having to remember so many places, because that gets confusing. So here's a sneak peek at the QA dashboard. A lot is in place and more is yet to come; it hasn't been released yet, but more stuff is coming in. Let's get started.

This is Lukáš Brabec, and I'm a Software Quality Engineer at Red Hat. I was mentoring Manisha Kania, who worked on the Fedora QA landing page during her Outreachy internship. The Fedora QA landing page is a web application consisting of two parts. The first is the QA dashboard, essentially an aggregator of various QA-related information from all over the Fedora infrastructure. The information is presented in the form of widgets; currently we have these four: a timeline of the current release cycle, a nice way to see the scheduled dates of branching, freezes, and public releases; meetings and test days this week, so you don't have to look them up elsewhere; blocker and freeze exception statistics, to see whether we are okay to go; and meeting minutes, an easy way to check your action items. We plan to add more widgets, such as critical path health, test matrix coverage, the latest openQA failures, and links to the latest Fedora composes.

The second part of the application is the contribution wizard. At the moment we have a lot of helpful documents, tools, and processes, but they can be hard to reach and understand, and we believe that is a major obstacle in bootstrapping new members of the community. This is where Manisha contributed the most. The initial proof of concept was a bit cluttered and unrefined; over the course of her internship, Manisha helped us create a nice-looking, intuitive wizard. In the first step, you are presented with a choice.
Do you have hours, or do you have days? Let's say you have a few spare hours and want to help: the fastest way to contribute is to participate and vote with Bodhi karma. Some of the steps provide help in the form of a modal window. Let's go back using the breadcrumbs and choose release validation. For absolute beginners, we created an introductory course on testing Fedora. If you have tested Fedora before, choose an area of validation; working on those test cases will boost the coverage the most. Clicking on a test case opens a modal window with the detailed steps. If you have days to spare instead, the tasks you will see in this part are all about programming and fixing issues in various projects. First, you choose a programming language from the list. Let's say you know and like JavaScript: click on it, and you will be presented with a list of projects that, based on their language statistics, use JavaScript in the code base. Choose a project based on its description, or pick one you are already familiar with. In the final step, you will see a list of issues picked out by the project's owners; each link opens a new tab that leads you to the issue, where you can learn more from the description and the discussion.

Soon, the landing page will be available at qa.fedoraproject.org. Currently, the landing page is in English only; once the code has stabilized, we plan on adding translations, and if you want to contribute one, you can do so. Lastly, I want to thank Manisha for choosing Fedora QA for her Outreachy internship, and thank you all for attending this lightning talk. If you are part of a working or interest group, we would be stoked if you came to us with more ideas for reasonable onboarding tasks we could add.

So that's the QA dashboard. Now it's time for any of you to go ahead and ask questions. The way we work is that the more questions you ask, the easier it is for us to explain things and guide you to a particular place where you can get started.
So I will switch back to Hopin for more questions. Thanks, Andy, and thanks, everybody; I hope you loved the presentation. It's an honor to work with all of you in the community. Thanks, Troy; thanks, Daniel. If you have any questions, let me know, you know where to find us. All right, that's all from my side. If you still have questions, I'll be online here.