So I am standing between you and Vita, who's going to show you all about database access. He's got the cool stuff. We'll see how long we go. Can you hear back there? No? Kind of? Maybe I have to be louder. There we go. Let me ask you a question right up front: is there anyone here who's a data scientist? Not one. This falls in the category of "I need a doctor in the room." Have you heard that phrase before? I need a doctor, a PhD in data science. Nobody? You've got a PhD? Okay, all right, he's getting there. So if I fall down on this presentation, you've got to jump on stage. That's how this works. Because let me tell you right away: I am not a data scientist. I'm going to be pretending to be one, on TV or on camera or on stage. But we're going to show you what it means to work with data scientists and how they become part of the application workflow. We have a very cool workshop that we're not going to run as a workshop, but I'm going to walk you through it so you can see the resulting application. It's an object detection application, and the cool thing is, after we show you the demonstration, you can do this on your own. You can go back home and try it yourself. It's a very clever little application: an object detector that can tell the difference between dogs and cats. That was actually the key breakthrough. I forget how many eons ago, but it was actually not that long ago, probably about ten years, when a person with a PhD doing their data science thing figured out that if they trained a machine learning model by feeding it enough dogs and enough cats, the AI could determine dog versus cat. That was the major breakthrough, so we use it as a demo because it's kind of fun. You might think that sounds easy, a human can tell a dog versus a cat, but now a computer can tell dog versus cat too. They're both furry. They both have four legs.
They both sometimes have tails. So how could it tell? It actually took a lot of effort. And this is an open source thing; by the way, everything I'm showing you is free and open source, so you can try it yourself. We don't actually have a lot of slides, but let me just tell you a couple of little stories. The way I got into this AI/ML world, again, I'm not a data scientist. We have them at Red Hat, and we had one actually train a system for us, basically train a model that looked for vibration patterns. If you work in the manufacturing world, where people make things with big fans and conveyor belts and all those machines, you know that vibration is the leading indicator of potential machine failure. This is also true of your car, by the way. If you've been around cars long enough and it starts doing the thing, you take it to the mechanic, because something is about to fall off or break: a belt's going to break, an axle's going to break, a bearing is probably burnt out. Often it's those little bearings, the bearings in the wheel, the bearings in a fan. So we did a huge presentation where we trained a model with the accelerometers in people's phones. Earlier today, when we had you shake the phone to drive that application, well, we had done that before, and we actually looked for certain patterns in the shapes. In other words, we could tell if you were doing this or doing this, and then we got to the point where we could tell you were doing this. We trained a model to do that, and then we decided to have fun with it.
We could tell if you were doing this, right, like seriously, we had people doing their Travolta move. In this case, what the application did was: vibrations matching that pattern caused damage to a machine, and then we had a little group of repairmen who would come out and work on the machine, because that's what you would do in a manufacturing sense, repair people go out and fix whatever the damage is. So that was an example of machine learning, an example of using a data scientist to help train a model. And then we also did this one, which is based on the demo we're going to show you today. This was an object detection demonstration, and we gamified it in a very clever way, because what we said was, like a Where's Waldo: find the horse. People in the audience had to get their camera and kind of isolate the horse, and if they got a good shot of the horse with the camera on their smartphone, they got points for the right match. And I'll tell you, there was one thing we had to do that was actually fairly clever: the model we trained for this had to ignore false positives. That's how we got it really well gamified, so people could score good points and win. There were a lot of objects in the scene that the model just ignored. We wanted to make sure that the things it should find, it found, and the things it should ignore, it ignored. Because by default, what you'll see when we show you the raw model, it just tries to guess at everything: flower, bear, man, you know? It's just trying everything, and that's the model we have today. But this is exactly the same kind of idea, and this of course is our keynote demo from a few years ago. So if you have access to the slide deck, the bit.ly AI/ML notebook link, you can get access to those videos.
Those are both videos, but we're going to show you a live demo today. Now, this is my workflow diagram. This is how the data scientist lives amongst all of us application developers and IT ops people. Remember earlier we were talking about a platform engineer and all those types of folks? The data scientist is part of this too, as is a data engineer. The way this normally goes is that business leadership says: we want to solve a business problem. We want some form of trained model that processes mortgages. So when a mortgage application comes in, instead of having a human review it, we want a trained model to review that mortgage and quickly accept it or quickly reject it, or maybe send it on to a real human. You can decide whether humans are involved or not. And by the way, you probably do want humans still involved, because a model can have bias. Keep that in mind: models have something known as bias, meaning the people who trained it used a data set that puts bias right into the decision making. So you might have a mortgage application model that rejects people it should not, people who are legally protected in your society. This actually happened in the US; they realized a model was doing illegal activity, but it was the bias of the scientists putting their data in and training the model. You want to be careful of things like that. Another thing that's super important to understand is that a model is trained on a chunk of data from a period of time. So, let's take this in order: the business leadership sets the goal, we want to train a model. The data engineer gathers the data: I have to pull data from this system, this mainframe, this database; I've got to go to the data warehouse, the data lake, maybe build a new data lake.
They've got to get all the data scrubbed and ready. Then the data scientist takes all that data. Typically you give them a bunch of CSV files; you basically go to S3 or a big old hard drive or something and just give them a big pile of data, and they go digging through it, and they figure out how to write their code to train their model. And what's funny about that data, if you think about it, is that it's a point in time. They're basically saying: I'm looking over the last six months of data to train my model. That's a point in time, because six months from now, the data they had is six months old. So that's another important thing to understand: the thing can drift. Not the model; the model hasn't changed, version 1.0 of the model is still version 1.0 of the code. But the data set has slightly shifted over time. Therefore you need to monitor the model in production: is it scoring more poorly? What you'll see when we show it to you is the difference between an AI/ML model, a machine learning model, and code. Normally, if you write code, and many people here write code of some sort... okay, yeah, a lot of you. If you write code, you say: if the sky is blue, then do this. If the account balance is over 200 dollars, then do that. You write imperative code, and it evaluates to something definite: equal to something, less than something, greater than something. A model is not that way. You give it your parameters and it comes back and says: sky is blue-ish, 82 percent. It gives you a percentage of what it thinks the answer is. And that means you have to change the way you write your regular application code around it. 82 percent might be okay; we'll approve that mortgage at 82 percent. But what happens as the data changes over time, maybe the customer base changes over time? Maybe now it's 81, and then six months from
now it's 80 percent, and six months after that it's 79, and somebody goes: whoa, this algorithm is a little too poor now. Let's train that model again. That's what the feedback loop here is for: retrained models. You do need to be monitoring your models in production. Some people call it model drift, but the model hasn't changed; the data stream has changed, and now the model is just not as good as it once was. Look for that percentage; that's the key element. When we were doing that object detection game, we were looking to see if you got the horse at greater than 70 percent. That's what our code checked before it said: yep, you scored the horse, if that's the object we wanted you looking for. You'll see that when we get into the demo. So just keep in mind that there's the data engineer, the data scientist, and the machine learning engineer. They're really dealing with the packaging of the models themselves; the models are special artifacts. And then the app devs basically throw some REST code around it. You might just wrap it with a Python wrapper, a Flask app on Gunicorn, and then that's just a REST endpoint. Everyone else in the world just uses a REST call, or maybe it's a Kafka message that comes in and triggers it; we found all kinds of clever ways to trigger the model. But that's what the app developer does. And of course your operations folks are part of this, your platform engineering folks are part of this, SREs are part of this too, because this system might run on a big old OpenShift cluster like we're going to run today, might run on a bunch of VMs, might run on the cloud. All right, so that's the workflow; that's important to understand, and this is the thing we're going to go play with. So here's what we're going to do. This, by the way, is a workshop, so let me walk you through it. And, where to go, where to go here.
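To make that "wrap it with a Python wrapper" step concrete, here is a minimal Flask sketch. This is an illustration, not the workshop's actual code: the route paths, the `fake_predict` stub, and the 0.5 cutoff are my own assumptions, and in the real app `predict_fn` would call the TensorFlow model.

```python
from flask import Flask, jsonify, request

def create_app(predict_fn, min_score=0.5):
    """Wrap any predict function with a tiny REST API."""
    app = Flask(__name__)

    @app.route("/status")
    def status():
        # Health check: the "status okay" endpoint you see in the demo.
        return jsonify({"status": "ok"})

    @app.route("/predictions", methods=["POST"])
    def predictions():
        payload = request.get_json(force=True)
        results = predict_fn(payload["image"])
        # Drop low-confidence guesses before answering the caller.
        kept = [r for r in results if r["score"] >= min_score]
        return jsonify({"detections": kept})

    return app

def fake_predict(image_data):
    # Stand-in for the real TensorFlow model call.
    return [{"label": "Dog", "score": 0.84}, {"label": "Cat", "score": 0.12}]

def serve():
    # In production this would sit behind Gunicorn; dev server shown here.
    create_app(fake_predict).run(host="0.0.0.0", port=8080)
```

A caller then just POSTs an image payload and reads back the detections with their confidence scores, so the rest of the application never touches TensorFlow directly.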
Okay, all right. I'm trying to think for a second; I haven't slept much, did I mention that? So what we're going to do is just walk you through it. I've already got it loaded, so we're not going to see it soup to nuts, if you will. But basically, you need an OpenShift Data Science install, and it's actually part of our free online sandbox. You've heard about the sandbox today; you can just go there and it's there. So here's the sandbox, here's the topology view. Look right here at this icon: you'll see Red Hat OpenShift Data Science. It's under this icon, kind of hidden over to the side here, and this is, again, an entirely free environment. When you click on that, it's going to open up this little window here. It does know that I've launched a Jupyter notebook already. But you can see right here, when I click over to data science mode, I have things like the NVIDIA GPU add-on; I have Kafka streams, you saw that in Jay's presentation earlier; and OpenVINO is interesting, it allows you to run things that would otherwise need GPUs on Intel hardware. OpenVINO comes from Intel and helps you run accelerated workloads on Intel CPUs. And then there's the Jupyter notebook, which is the important part here; that's the interesting part. If you click on that, it wants you to launch a notebook. We can see I already have a notebook up and running, so let's just go into that notebook, and this is what it looks like. The Jupyter notebook is an IDE, a Python-based IDE that a data scientist would use. It is their development tool, and it's browser-based by default, so you can run it in a pod very easily. That's the cool part. Does anyone here support Jupyter notebook users at this point, from an operations perspective? A couple of you. Are you doing that with virtual machines, or as containers? Containers for you, and VMs for you.
Okay, and that's common: you would set up a Jupyter notebook server on VMs and the data scientists could use it, or you can set it up in containers. In this case it's all running in a container. But this is the tool they're going to use, and it's how they like to work. You can also run a notebook directly on your laptop; if you're a student, that's how you'd normally do it. But if you're in an enterprise environment, you might want to give them virtual machines, you might want to give them a container, and you can also give them access to GPUs inside the cluster if you want. This one doesn't have any, but everything works fine; it's just a little slower. And here you can see there's a little bit of code. You basically just edit the code like you would in Python, and I'm not really a Python person, but I can pretend to be one. I can do things like, you know, hello world, and hit the little play button right here, see that one, and it executes that block of code. So one thing to get your head around here: it's block by block. It's almost like line by line, or block by block, and you execute block one before you get down to block two. If I say execute block two here, you can see it prints some text. But you couldn't execute this block until that block had been executed, because this block, the one that prints some text, depends on a function declared up here in that block. That can get a little funny, because the user can click down two or three steps and execute the wrong block; I do that to myself. You can also put Markdown in here, which is kind of cool. Let's see if I can get that to edit. I can come here and say, I guess I can call this plot hash stuff, take that out, and this is just Markdown, right?
You just put it in: so this is the third level, and this is the fourth level, and this nicely allows you to document things inside the environment. All right, hit play, there we go. So you can have your documentation inline, and you will see that when you work with data scientists, they like doing that. They're like, okay, here's what this block of code means, here's some documentation, here's what this plot shows. So this is the Jupyter notebook environment. It should be noted there are some other elements here that are important. That's your Git: how do I import from Git? You can see that I already have a Git repo imported here, and you can see my master branch there. So you can interact with a Git repo, and while apparently data scientists don't like doing that, you should try to teach them to, because things in version control are helpful. And I don't only say that because some of the folks I've talked to are like, yeah, data scientists may not check anything into Git, it's just not how they think; it's on their laptop, it's all good there. Well, that might be a million dollars' worth of code on their laptop, so you might want to get it checked into a repository of some sort. And then look here; this is where it gets interesting. So far, that's just how the notebook works, but now you want to explore the data. Apparently the process they like to use, and again, I'm not a data scientist, but when I work with them, they're like: give us the data, tell us what the parameters of the business case and the problem space are, and give us a week. This has literally happened to me multiple times with the folks I work with at Red Hat: okay, here's what I want. I want the scavenger hunt; I call that one the scavenger hunt demo.
You have to find the objects on the page. And in the case of the vibration sensing demo, we actually refer to that as the dancing demo, because there are all these dance moves we have people do in it, and if you see the video there might be a moment where you actually see a thousand people doing dance moves in the big audience we had there. We had a 1,200 to 1,300 person audience running that little thing on their phones. But you know, we gave the data scientists those kinds of parameters. And in the case of the mobile phone one, we actually had to build a little system right up front, a little system that was nothing more than accelerometer capture, which took all the data off the accelerometer and shoved it into big JSON chunks. And then we literally handed the guy a bunch of JSON: okay, here's data, right? And he basically just had to go: okay, what movement were you doing with this JSON? What movement were you doing here, or here? So about 30 to 40 of us just sat there and did this stuff all day, you know what I mean, or even this one, and he eventually figured out the patterns in the data set to train a model. And then there was battleship. You guys know Battleship, where you put little pegs on a ship? You go for B5, I go for C2, you go for B6, I go for C3, I go for B7, and you go:
oh, you sank my battleship, right, that game. Well, we trained an AI model to play the game. That was another one of our demos, and again, that was just another case where we kept playing enough, we captured the data, kept handing it to our data scientist, and over time he figured out how to beat us. He could beat us pretty well by the time we'd done this for a couple of months. The reason I'm making that point is that this is an iterative process. Sometimes you've got to gather the data, give them the data, they iterate to get a model, you test the model, it's not good enough, you send it back, and you go back and forth. It's just like a piece of code from that perspective. Okay, so you can see where it says exploratory: they will be prototyping in here. They'll be running things related to TensorFlow; they'll be running things related to S3, right here. S3 might be where you store your data so you can hand it to them. You could hand it to them on a flash drive, and if they were a student, that's probably what you'd do, but if they're an enterprise employee, you've got to put it someplace, and sometimes it's large, right, it's a large data set. S3 is a nice place to chunk it up in the cloud, and they can download it from there, or you might just have some sort of internal storage. And let's see here; I'm just running my little cells. Notice also, when I hit run you'll see a little asterisk there; watch that closely when I hit run. That means it's executing, and see, it downloaded this two-dogs JPEG. If you watch when I click it, you'll see it execute, then quickly finish. This is literally running it through TensorFlow and the TensorFlow model. You see the matrix being built here; it's trying to identify what's in that image, and you can see the model, specifically the model directory, listed here.
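That S3 handoff could look something like the sketch below. The function names, the CSV filter, and the bucket layout are assumptions for illustration; real use needs boto3 installed and AWS (or S3-compatible) credentials configured.

```python
import os

def local_path_for_key(key, dest_dir="data"):
    """Map an S3 object key to a local file path, keeping the key's layout."""
    return os.path.join(dest_dir, *key.split("/"))

def download_dataset(bucket, prefix, dest_dir="data"):
    """Download every CSV under a prefix (needs boto3 and credentials)."""
    import boto3  # imported here so the sketch loads even without boto3
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            if not obj["Key"].endswith(".csv"):
                continue  # only pull the CSV pile the data scientist asked for
            path = local_path_for_key(obj["Key"], dest_dir)
            os.makedirs(os.path.dirname(path), exist_ok=True)
            s3.download_file(bucket, obj["Key"], path)
```

From the notebook, the data scientist would just call `download_dataset("my-bucket", "mortgages/2023/")` and start digging through the files locally.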
Okay, and again, I'm not a data scientist; I don't dig into the models. The cool thing about this model is that it's completely free and open source, so all you've got to do is put the wrapper around it and you can use it. That's why I like this one: I don't have to be a data scientist, and I can come in here and play with it. Oh, the kernel, something died, the kernel's restarting. You'll notice that too in Jupyter notebooks: it has the concept of a kernel. Let's see if it restarts and comes back to life for me. Not the Linux kernel; it's just what it calls its internal runtime environment. It's okay, look, it righted itself there; I think we're okay again. That happens from time to time; I've noticed it before. And let's go here, and let's go here. That one's still running, see, with the little asterisk; that's how you know it's thinking. So that one right there got done. Okay, so there it found and labeled. That's the thing with object detection: it's trying to find things and label them. Someone has trained the model with labels, meaning this is a dog, DOG, that's the label, or whatever else it might be trying to find. And of course it can take a long time to train these models, depending on what you're feeding them. In the case of images, though, people worked really hard to open source all these image-trained models. So let's see here. And there we go. And so it says DOG. Oh, it's too blurry there.
We'll see it better later, but it says DOG, 84 percent. Let's see here. And here. Still too blurry there. Okay, but that's basically the interaction model: work with the data, train the model, write the different logic. There might be logic that looks like regular logic in addition to these models we're dealing with here, but the whole goal is to get to a prediction algorithm, in other words a model that you keep feeding training data. And by the way, this is a training environment. You basically get it to the point where you think it's trained well enough to give you quality answers. This model is actually specifically optimized for speed, though, so it does not give you great accuracy, because we want it to respond quickly. That's another thing that's interesting about these technologies: if you don't have a GPU, because in many cases it runs on regular Intel CPUs, you will find that when you hit it with a request, it might take two seconds to respond. It can have very high latency. Latency is one of those things you've got to factor into the equation, because a two-second response may be too long for your application. In the case of that vibration sensing game, where we were jamming accelerometer data in from a thousand phones concurrently, all through a Kafka setup just like we did this morning, we were analyzing the movements, and a two-second response was killing us. It was just killing the usability of the application; it was taking too damn long. But you work on that; that's another thing you iterate on. You make it faster, make it more performant; you might have to make it less accurate to get better performance, and that's what's been done in this particular model. You can see right here, if you're a Python person, any Python people in the room? Yeah, so pip install requirements.txt, you guys got that one, right?
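Since latency matters that much, it's worth measuring it explicitly around every model call. A minimal sketch follows; `fake_model` is my stand-in placeholder, not the demo's code, and in practice `predict_fn` would be the TensorFlow inference call.

```python
import time

def timed_predict(predict_fn, image):
    """Run one prediction and report wall-clock latency in milliseconds."""
    start = time.perf_counter()
    result = predict_fn(image)
    latency_ms = (time.perf_counter() - start) * 1000.0
    return result, latency_ms

def fake_model(image):
    # Stand-in for the real CPU-bound TensorFlow call.
    return [("Dog", 0.84)]

result, ms = timed_predict(fake_model, b"fake image bytes")
print(f"answer={result} latency={ms:.1f} ms")
```

Logging that number per request is what tells you whether a two-second response is creeping in, and whether it's time to trade some accuracy for speed.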
That's pretty standard stuff, and you can see the different packages that have been pulled in here. Flask, of course, is how we get a little HTTP endpoint to wrap this with, and you can see there's matplotlib as an example. And TensorBoard. What else is here? google-auth; there are a lot of interesting things in there, but it is a standard kind of TensorFlow setup. And let's go here and try this one. Do I have my other image? Let's look over here. Yeah, okay, so: two dogs. Actually, let's change this out. Let's try the cow instead of the puppers; let's see if we can actually change the code. I have this cow image over here. Let's see if I got the right name; that might be a PNG, no, JPG, there we go. Okay, so let's re-execute this block, since I changed the name of the image it needs to pull, and let's run; see the import numpy there. And, let's see, oh, I've got to change it in multiple places. No puppers. Okay, let's see if we got this right. It's thinking about it; it's thinking, see, it's thinking, that's what that means. How long is it going to think? And again, depending on what you've done, it can be a little slow sometimes, but that again is part of the iterative process: you work on that if you want it to be more performant or less latent; you get to make those choices. Come on now, I know I changed your image on you, but it shouldn't freak you out that much. There are other ways to change those things, but I figured I'd give it a quick hack here to see if it would find my cow instead of my dogs. No, no, no. Actually, I do have an image here; this is actually my real dog.
You can see what it did here: dog, 81 percent. This is me taking a picture of the dog before I left home, kind of thing. Dog, 81 percent, that's the high score, and my dog is very funny looking. At one point I took a picture and it said 51 percent dog, 49 percent cat. I don't have that funny looking a dog. And boy, this one is not going to finish; we'll just leave that and come back to it in a second. So this is the Flask application. Let's see if the Flask application is up, and we'll run a test against it real fast. Let's see here. Okay, all right, so this was using the cow one. Come on now. Yep, yep, yep. And here, okay, but this is the process, right? If you're an application developer, you would be interacting with your code, and as a data scientist you'd be interacting with this code. So, okay, get size deprecated; okay, I understand that, no worries there. This one now, it's still running, so there's something going on in my notebook that's a little slow. Did I do it? There it is. All right, there's my cow, and again, we can't quite see it there, but we'll see it later. There's my cow: it found the cow, and it also found some grass, things like that. And I can pick different images. Let's go in here and change it one more time. Where to go, right up here. There was this one that someone just sent me while I was here with you guys, and I thought it was funny, and since Jay talked about the shirt: there's actually a shirt that says Kubernetes but has my face on it, and there are some people who were, god, I can't type today, wearing it, and they sent me a photo. I think I typed it incorrectly. That's the image right there. I can just upload images by hitting this little upload button and picking stuff off my hard drive. I don't know, is there something interesting to see here?
Not really, but I uploaded this one. So let's run that, and actually, let's just run them all; there's another button here that says restart and run the whole thing. Let's try that one. You can see it going through the asterisks there. It queues them up and executes: execute there, there it goes, there, there. When it turns back into a number, that's when you know it's finished executing. Come on now. So before, you saw it had blue jeans. It isn't always sure, but you've got to look at the percentage, and that was 12 percent blue jeans. So you have to say: no, 12 percent on blue jeans, discard it. But here we go: it didn't actually pick up my face as a human face, but it picked up their faces as human faces. Okay, so here's what's cool about that. This is the process you go through to test the model and build a model. Once you have a model, it's actually pretty straightforward: all you've got to do is deploy it, assuming you have the application developer wrap it with an application endpoint, and that's what we did here. In this case, what you saw earlier with the dev console: you just come in here and say import from Git, and you import it and run the application. That's what I did earlier, and that's what's running right here. It has an endpoint on it, so if I open that up, okay, see, it says status okay. Status okay, because it's the back end; it's the wrapper for the back end and the model itself. There's a front end to this based on Node.js, and if I open up that URL, let's see how it does here. There we go. All right, so look at that: glasses, 50 percent; human hair; human face; man, 34 percent; man, 69 percent. But again, this was trained to be fast, so it's just trying to be fast. Let's try this now. Okay, let me see if it gets my cat right here. It's not my cat, just one I found on the internet. Let's see. Yep, oh, look at that: cat, 60 percent.
I'm going to pick that up off my phone, taking a picture of a picture on my phone. Right, let's try this. Let's try this here. Let's try our dog. Yeah, huh. All right, try again. There's our dog; catch the reflection right. Whoop. It's a puppy. Let's see: dog, 51 percent, or human face, 17 percent. But think about that for a second. You see what I mean: it's not a perfect thing. It always gives you the answer along with a percentage of how confident it is in that answer. This is important, because if you're writing code around this in your system, and we did that for all those demonstrations I talked about, you then have to decide, as the programmer who received the model and wrapped the REST endpoint around it: 17 percent, I'm going to discard that answer, I'm going to ignore it. That's how you filter out those kinds of false positives: okay, 17 percent is too far away from anything I would act on from a business logic standpoint; I won't approve that mortgage. And maybe 51 percent is also too low. Again, this model was trained to be quick; it's relatively speedy, so it's going to give you a quick hit. You can train models so that they're really going to be closer to the answer you want, and then you get 80 percent kinds of answers, 90 percent kinds of answers, and that might be what you need for your use case. You can see right here, it also found me and other things in the background: clothing, 22 percent, you know. So you'd have to decide: what is a good answer, and what is a bad answer?
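That "decide what's a good answer" step is ordinary application code around the model's confidence scores. Here's a sketch under assumed cutoffs; the 0.70 and 0.30 values are made up for illustration, like the thresholds the talk mentions, and you'd pick whatever your business logic needs.

```python
def decide(detections, accept_at=0.70, reject_below=0.30):
    """Turn (label, confidence) pairs into business decisions.

    Scores at or above accept_at are taken as-is, scores below
    reject_below are discarded as false positives, and anything
    in between gets kicked over to a real human to review.
    """
    decisions = []
    for label, score in detections:
        if score >= accept_at:
            decisions.append((label, "accept"))
        elif score < reject_below:
            decisions.append((label, "reject"))
        else:
            decisions.append((label, "human-review"))
    return decisions
```

So a 17 percent human face gets rejected outright, an 84 percent dog is accepted, and a 51 percent dog lands in the review queue instead of being trusted blindly.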
And I think that's an important thing to understand if you are a programmer. When you work with data scientists on a project like this, you have to communicate with them. They're giving you answers, and you may decide those answers aren't quite good enough. Then you also have to decide, in your workflow, if the answer is kind of weak, when do you kick it over to a real human to review, like you did before you tried to use an AI/ML model? Let a human look at it. And the same applies whether it's the chatbot, the mortgage approver, the image classifier, which is essentially what this is, the battleship player, or the vibration detector. All these things are nice little models. And what I'd like you to think about here, and we're running pretty close on time, is that all of this is documented in what we call the object detection workshop. I'll give you the URL, but you can see it basically walks you through the whole process, including how to find it in the sandbox. The instructions aren't perfect; we probably have some bugs in here. You'll see right here it says start a TensorFlow notebook, pick default; it actually says small now on that side. Also, if you're in the notebook and things get a little bit wonky, you can come over here and restart the notebook from the hub control panel. You might have to stop it and restart it, so let's try that. We've already run it.
So let's stop it and then kick it off again. Here you might see: kick it off with a Small container size. That's all you get in the sandbox environment. But you can also add environment variables. Part of the workshop, by the way, goes on to connect it to Kafka. I don't have it set up right now, but with Kafka it'll start grabbing frames. So you can use your phone and go scan, scan, scan, and it's grabbing frames from the video stream, sending them over, and scoring them. Kafka acts as the streaming engine that pumps each frame to the model, which gives you the answer, and the answer comes back telling you what it found. So you would use your environment variables to connect to a database, a regular database or Kafka, and Python connects to those also. The data scientists could be drawing from those data sets as they do their work; you'll see more about that in Vita's presentation as he talks about how to get access to data. But here, let me just pick TensorFlow again and hit Start Server. "Failed to create notebook, please try again later." So sometimes I've had this problem on the sandbox; it does seem to fail. This is a free environment, okay, it is a free environment, so I have seen it fail a few times. Let's see here. Maybe because it thought I had an environment variable already started there; apparently that's what it was. So, you know, watch out for that. Let's see, I'm curious whether it goes through. Notice the PVC attached. It does try to remember your data for a short period of time, so that matters if you do have data loaded in a PVC. PVC stands for persistent volume claim in Kubernetes, a persistent volume.
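The Kafka wiring just described could look roughly like this. The environment variable names, topic name, message schema, and the use of the kafka-python client are all assumptions for illustration, not the workshop's actual code:

```python
import json
import os

# Connection details come from environment variables, as in the notebook.
# These variable and topic names are made up for illustration.
BOOTSTRAP = os.environ.get("KAFKA_BOOTSTRAP_SERVERS", "localhost:9092")
TOPIC = os.environ.get("KAFKA_TOPIC", "video-frames")

def frame_message(frame_id, jpeg_bytes):
    """Build the payload for one captured frame (hypothetical schema)."""
    return json.dumps({"frame_id": frame_id, "size": len(jpeg_bytes)}).encode()

def send_frames(frames):
    """Stream captured frames to Kafka so the model service can score them."""
    from kafka import KafkaProducer  # assumes kafka-python is installed
    producer = KafkaProducer(bootstrap_servers=BOOTSTRAP)
    for i, frame in enumerate(frames):
        producer.send(TOPIC, frame_message(i, frame))
    producer.flush()
```

A consumer on the other side would pull each frame off the topic, call the model's REST endpoint, and publish the scored result back.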
That's how you get non-ephemeral storage in a Kubernetes cluster. And a Jupyter notebook wants non-ephemeral storage, because if the data scientist has been collecting data and working on it, they don't want that data to disappear. In this notebook, though, you do see that it pulls what it needs from S3 every time, just to be on the safe side. So let that run. What else is important here? Oh, object detection, we talked about that already. And this, by the way, is where the model sits: object detection REST, that's what I imported. Again, it's based on the Open Images V4 MobileNet model; that's over here, right, you can see it's on TensorFlow Hub. That's where we got the object detection machine learning model from. We didn't make this; we just used it. There are a lot of great open source models out there you might have fun with, like this one. And you can see here's the Open Images data set, and it talks about some of the things these models can do, like detection boxes and things like that. There's a lot happening in the object detection world, so you can have fun with it like we did and gamify it. Let's see what else, what I did with the demo you saw earlier. Oh, actually, let's try this real quick, I should have said this. You guys don't want to just see me taking pictures, right? You want to take pictures. Let's see if that'll work. Yeah, here, try it. Let's see if you can blow up my system; it probably will blow it up. All right, I've got to come in at a different angle here. There we go. Loading, loading, loading, and iOS is going to ask, am I allowed to talk to the camera? Yes, I am. All right, let me turn this around. Yeah, look at that. We got the selfie: human face, 52 percent; man, 30 percent this time. Did it work for you guys?
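About those detection boxes: detection models commonly report boxes as normalized coordinates, which you scale to pixels before drawing them on the image. The `[ymin, xmin, ymax, xmax]` ordering below is an assumption based on common TensorFlow detection-model conventions, so check the actual model's documentation on TensorFlow Hub:

```python
# Convert one normalized detection box to pixel coordinates so it can
# be drawn on the captured frame. Ordering [ymin, xmin, ymax, xmax]
# is assumed, per common TensorFlow conventions.

def to_pixels(box, width, height):
    """Scale a normalized [ymin, xmin, ymax, xmax] box to pixel coords."""
    ymin, xmin, ymax, xmax = box
    return (int(xmin * width), int(ymin * height),
            int(xmax * width), int(ymax * height))

# A box covering the middle of a 640x480 frame:
print(to_pixels([0.25, 0.25, 0.75, 0.75], 640, 480))
# -> (160, 120, 480, 360)
```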
Look at that. And we're all running on a single pod in a free environment, so you guys are banging on the system pretty hard right now. Clothing, 19 percent. I got "man, 49 percent" that time. Okay, but you can see that workshop is available to you; it's all out there. Here are the URLs you want to have. And we're basically out of time, so let me finish this up. You can see this is what we did: the RHODS, Red Hat OpenShift Data Science, workshop for object detection. There's also a license plate detector, which is kind of cool; it recognizes license plates off cars. Mostly UK cars, though. So I don't know if it'll work on Indian cars, and it doesn't really work on American cars, because it was trained on British cars. If you have British license plates, it does a nice job. That one is actually an interesting one; it's for environmental-related activity. Basically, we're looking at cars as they travel past traffic cameras, that's kind of the point, and maybe you have people that are commuting too much. All these things, by the way, you've heard about somewhat today. Then we have all these tutorials out there: Tekton, Argo, which is what I showed you in the previous session, Knative, Quarkus, OpenShift, raw Kubernetes, raw containers. These tutorials are available for you to go try; they're GitHub repos. And yeah, they might not be completely up to date, because we haven't had someone look at them in a while, but do check them out. You can also issue pull requests against them, and these are the ones we're working on today, this one in particular. So do give that a try, and then you can run it on your own and impress your friends, right, or your spouse, with object detection. Just make sure, if it's, you know, let's say your wife, that you don't go, "Oh, 42 percent man." I'm not kidding, I saw this happen one time.
It didn't go so well. Yeah, and it will do that, by the way, so just watch what it does there, and maybe wrap it with a little additional logic. Okay, well, thank you for your time. Let me go ahead and wrap up. "Yeah, I just have one question. Whatever you showed right now, is this available on OpenShift Local as well?" On OpenShift Local? It should work; I don't know. The Jupyter notebook capability is part of what's called Open Data Hub, and we'd have to figure out whether Open Data Hub runs on OpenShift Local. The actual product, Red Hat OpenShift Data Science, which is part of the sandbox, does not run on OpenShift Local; it's not certified for that environment. And it does take a little beef, right? These notebooks are not cheap and easy; you've got to give them some cores, you've got to give them some memory. "Well, we have a bunch of on-prem machines right now, so I was just wondering. We are using exactly such a Jupyter controller." It should work, yeah, it should work, but I don't know; it's a beefy thing. Give it a try. But when it comes to OCP, OpenShift Container Platform on your own premises, it is coming as a certified, supported solution there. Yeah, thank you.