So, thanks everybody for coming out to the first Pittsburgh QA event. I am going to completely bore you for the next 15 minutes on regulated environments and testing thereof. My name is Jared Bill and I'm a test engineer at Omnyx, and we do have some QA positions available. So you'll get a quick little preview of what it's like to work in a regulated environment, and a little bit about Omnyx. If you're interested, let's talk afterwards.

Okay, so what are you going to walk away with in the next, well, 12 to 15 minutes? We'll see how long I go. I'm going to give you a quick intro about Omnyx: what Omnyx does and what we build. We're going to talk a little bit about regulatory bodies and how they come into play with our products. I'm going to give you a little bit of background about what a design history file is. We're going to talk about verification and validation. You probably get the theme at this point, but hold questions until the end. We'll have a little roundtable discussion.

All right. So what does Omnyx do? Well, to answer that question, we really need to talk about pathology. So what is pathology? It's the science of the causes and effects of diseases, especially the branch of medicine that deals with laboratory examination of samples of body tissue for diagnostic or forensic purposes. So what does that really mean? You see the person there looking through the microscope, and this is basically what a pathologist does on a day-to-day basis. They take microscope slides, they look at tissue at a very high resolution, and they look for patterns, anomalies, things like that, and that ultimately leads to a diagnosis. Sounds fantastic, right? In practice, it's not as clean and pretty as what you see in the top left corner. In fact, this is what it really looks like. Down here, you have stacks of cases of patient slides. These are tissue taken from a patient.
This is all the paperwork associated with those cases, and this is what it looks like on the doctor's desk. And I don't know about you, but for me, that's a little bit scary. If I'm being reviewed for a deadly illness, cancer or anything of that nature, you'd hope that it's a little more organized than stacks of cases and paper, but that's actually the way it happens. In the modern world, we can do a little bit better than that, right? So that's where Omnyx comes into play.

Omnyx creates digital pathology solutions. We build medical devices, which are scanners that ingest microscope slides. Those slides are then scanned at a very high resolution. This top scanner is the product that I work on right now. It's called the VL120, and it accepts 120 slides. The VL4 accepts four slides, and that's our predecessor device. The other part of our whole solution is software to present this information. On the right screen, you see two slides being compared side by side with different stains, and on the left monitor, you see all the case information about that patient, where the doctor can review the slide, make notes, and then send that information back so that the patient can be diagnosed.

So that's great. What is involved with building medical devices? One of the very first things that you run up against is regulatory bodies. In the context of a medical device, there are some primary players. Here I'm calling out three different regions, but there are many of these. In the US, we're primarily dealing with the FDA, and when you submit for permission to sell, what you're really seeking is either pre-market approval, a PMA, or a 510(k). So what does that mean?
A PMA is basically: you have a new idea and you want to get it out into the field for the first time, so you present all this evidence, all this data, you submit it to the FDA, they review it, and they say, okay, your product is safe for use. The 510(k) is slightly different. This is where you compare yourself to an existing technology and basically prove that you're no worse than that product, that you're equivalent.

If we move over to Canada, you're primarily dealing with Health Canada, and you are seeking a Health Canada license. In Europe, there are a few different paths: you can do a self-certification, you can get a third-party audit or a notified body review, or you can go through a design dossier review. Now, the route that you pick here is highly dependent on what classification your medical device is. In general, there are three classifications for medical devices: one, two, and three. One is the least risk, three is the highest risk. Just to give you a quick example of that, a class one medical device might be something like a toothbrush. Very low risk that someone's going to be harmed using a toothbrush, so it's classified in the lower end. On the other hand, a pacemaker, something that is implanted in your body and could shock the crap out of you, is a class three device.

To make this even more confusing, a single device can be in different classes depending on where you're submitting. For example, in the United States, the FDA has reviewed our information and decided we're a class three device, whereas in Canada and Europe, we're a class two device. So this is all fairly subjective, and it's really based on the comfort level of the regulatory bodies that are reviewing you. Oh, by the way, in Europe, you're looking to achieve a CE mark.
You may have seen that symbol on some of your products at home, and it basically means that the product is safe to sell in Europe.

All right, with that said, I am going to give you a very boring slide of definitions now, but I think it really frames up the idea here. When we talk about a design history file (I realize that's really hard to read up there), a design history file is really the design controls around a product. The FDA does not prescribe a process that a company has to use. What it does say is: these are the things that we want to have from you so that we can properly review your device. So it's completely the onus of the company to provide all of that evidence in the right manner, and this is kind of the layout. If you imagine these as folders, each folder would have subsections. Let's just go down through them real quick. Design inputs are nothing more than requirements, and design outputs are the specifications that trace back to those inputs. Design review is a cross-functional review of the implementation. Design verification provides objective evidence that what you said you were going to build in the inputs is actually built correctly. Design validation verifies that the user needs are met. Then you can transfer your design to manufacturing through design transfer, and once it's transferred, any changes to the design are documented.

I want to point something out here. Like at most companies, your requirements are really the control point for how much work, or how much churn, your development team is going to have. We'll get into that a little bit more here, but real quick, I'm just going to throw out a feeler: what does this remind you guys of? Hey, I was expecting to have to give you a hint. Very good.
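To make the traceability idea concrete, here's a minimal sketch of the kind of check design controls imply: every design output should trace back to a design input, and every input should be covered by some output. All the IDs, requirement text, and structures here are hypothetical illustrations, not Omnyx's actual tooling.

```python
# Illustrative design inputs (requirements) and design outputs (specifications).
# IDs and wording are made up for this example.
design_inputs = {
    "DI-001": "Scanner shall accept up to 120 slides per run",
    "DI-002": "Scanned images shall be stored losslessly",
}

design_outputs = {
    "DO-101": {"spec": "120-slot slide hopper", "traces_to": ["DI-001"]},
    "DO-102": {"spec": "Lossless image writer", "traces_to": ["DI-002"]},
}

def untraced_outputs(inputs, outputs):
    """Return output IDs whose trace links point at unknown inputs."""
    return [oid for oid, out in outputs.items()
            if not all(t in inputs for t in out["traces_to"])]

def uncovered_inputs(inputs, outputs):
    """Return input IDs that no design output traces back to."""
    covered = {t for out in outputs.values() for t in out["traces_to"]}
    return [iid for iid in inputs if iid not in covered]

print(untraced_outputs(design_inputs, design_outputs))  # []
print(uncovered_inputs(design_inputs, design_outputs))  # []
```

In practice this linkage usually lives in a requirements-management tool rather than hand-rolled scripts, but the invariant being checked is the same one an auditor walks through the DHF looking for.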
Yeah, so this is very much set up to work well with long-running, government-style projects, things like that. In the modern age that we're in, we all want to be lean and agile, right? We want to release quickly and iterate fast. So we end up having a problem here, because the way that the regulatory bodies want us to operate really collides with the way that we want to operate.

Okay, so what does this look like? I'm glad you asked. On a day-to-day basis, when we talk about verification, there are really two buckets it falls into. On the left, we have what I'll call normal verification. On the right, we have capital-V Verification, which is very formal. On the left, we operate as a scrum team in an agile way. This is where our teams are writing test cases and doing exploratory testing. We're collecting data. We're doing studies. And we're really throwing everything we have at this product to make sure that we feel confident that it's ready for production use. This is really about our organization gaining confidence that what we've built is working correctly.

When we move into formal verification, we really want everything to be lined up and working. The formal verification run is not really intended to uncover bugs. We're expecting, when the verification runs, that everything passes; it's kind of a formality. All the records that are created (this is an example; I know you can't read the text, you're not supposed to) are test cases that are laid out, and this is what gets generated as objective evidence. This goes into the design history file. This is the paperwork that an auditor would review if they came in and did an audit. It basically shows that you have test cases that map back to all of your inputs, or requirements.
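As a rough sketch of what one of those objective-evidence records carries, here is a hypothetical test case record: each executed step has an expected and actual result, the case is tied to the design input it verifies, and the whole thing carries an executor and a date. The field names and data are illustrative assumptions, not Omnyx's actual record format.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TestStep:
    """One executed step: what was done, what was expected, what happened."""
    action: str
    expected: str
    actual: str = ""
    passed: bool = False

@dataclass
class TestCaseRecord:
    """A formal verification record tying a test case to a requirement."""
    case_id: str
    requirement_id: str   # the design input this case verifies
    executed_by: str
    executed_on: date
    steps: list = field(default_factory=list)

    def verdict(self) -> str:
        # The case passes only if every step passed.
        return "PASS" if self.steps and all(s.passed for s in self.steps) else "FAIL"

# Hypothetical example record.
record = TestCaseRecord(
    case_id="TC-042",
    requirement_id="DI-001",
    executed_by="J. Bill",
    executed_on=date(2015, 6, 1),
    steps=[TestStep("Load 120 slides", "All slides ingested",
                    actual="All slides ingested", passed=True)],
)
print(record.verdict())  # PASS
```

The point of the structure is the trace: an auditor can pick any requirement ID and follow it to the signed, dated records that verify it.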
So what does this look like when we do this formal run? We have golden data sets: all the data is predefined or built up into these big golden data sets. During a test run, we don't want to have to create a user account or a patient record. All that stuff is established for us already, so that we can narrow in and test just the single design input requirement that is of interest. There are predefined built environments, and then you get into production-equivalent or production-intent devices. We want to be executing these test cases on the closest thing we have to a production device. We wouldn't want to run our formal verification on an old prototype scanner.

And finally, you have formal defects. When you're running a verification protocol, if you do encounter a defect, it's not the end of the world, right? We expect that; that's why we do testing. But what does happen is that it's looked at very closely. It gets in front of cross-functional teams. They deliberate on how serious it is, and then they make a group decision on how to move forward. So you have your opportunity up front, as a development team being lean and agile, to work all these bugs out. But if you don't find something and it ends up being found during a formal verification run, it's not the end of the world, but it's a little bit tougher.

Okay. So, this is my last slide, by the way. When we talk about validation, validation is just a little bit different. Validation isn't really set up to be, for lack of a better word, a paper exercise. This is really about getting our feelers out to our users, trying to see: did we build the right thing? In this picture, you see a user, a pathologist (a model), using our device. On the right side, he's doing a diagnosis of tissue. On the left side, he has the case up. When you do this sort of run, you're looking for a lot of different things.
One of those things is human factors. If I'm building a large device that has a hole just big enough for a finger to fit into, I want to make sure it doesn't chop a finger off when someone puts a finger in there, because users are going to do whatever users can do. So you want to be smart and intelligent about the way you design something.

Performance can be a factor. One of the big things that we have right now is, well, believe it or not, let me take a step back real quick. The microscope itself has been around since the late 1500s, and it was first used for recorded scientific observation around 1665. Obviously we've gotten better at building microscopes, but the technology largely hasn't changed since then. And one of the nice things about using a microscope today is that it's very fast. A doctor can take a slide, put it under the microscope, look at it, and then, boom, grab the next one and move on. So if a doctor was trying to keep up with his normal workflow and it took two hours to load a slide or a case, obviously that's a problem.

Usability. Very similar to human factors, but when I think about usability, I think about whether the design is intuitive. Are you using standard icons? If you have your own custom icons, do they make sense? When a new user looks at that icon, do they basically know what they're getting themselves into by clicking that button?

Repeatability. When you run these validation sessions, you want to do it in a controlled environment, so that if you were to run the same validation session again, you'd largely receive the same results. You don't want it to be biased in any way.

And all this leads up to the classic meme. I'm sure all the good developers and testers here in the audience are familiar with it, but for anybody who is not, I'll run through it very quickly.
On the left-hand side, you see how the customer explained it; the next one over is how the project leader understood it; the next one over is how the analyst designed it; then how the programmer wrote it; and finally, what the customer really wanted. So we want to make sure that we're building the tire swing the customer really wanted.

Okay, so, thanks a lot everybody. And, like I said, we do have at least two testing positions open. If you're interested in learning more about Omnyx, come and find me after the talk. And I will recommend you so that I can get a bonus. Thank you.