Chris: Good morning, good afternoon, good evening, and welcome to a special edition of the Data Services office hour. I'm Chris Short, executive producer of OpenShift.tv, and I'm here with the one and only Karan Singh, who is going to be demonstrating smart cities, or green cities — however you want to phrase it, call it what you want. We have a lot of that going on here in Detroit. Detroit, Michigan is where I'm from, and there's a lot of activity in the autonomous-corridor space between downtown Detroit and downtown Ann Arbor, where the University of Michigan is. So this topic is near and dear to my heart. I'll hand it over to Karan to introduce himself. Please take it away, Karan.

Karan: Hey, thanks for this, and thanks for having me here. Hi everyone, I'm Karan Singh at Red Hat. Same as Chris, there's a lot going on in my city, Bhopal — I'm joining from Bhopal, India right now. We're going to talk about smart cities and green cities today, and we'll show you how you can build a pattern using open source technologies and adopt it in a use case. The use case we've chosen for the day is live capture of images from vehicles and detecting the number plates in real time — and the entire data chain from that point on. That should be pretty interesting, and I hope you'll find it useful. The whole idea is that this is a demo of a fictional scenario we've built up. It's all running on OpenShift: OpenShift is the core engine, we're using lots of open source technologies on top of it, and we're building patterns so that you can learn from them and adopt them in your own apps, in your production use cases. That's the whole intention of this mini demo. Cool — so let me get a few things set up here.
I'll share my screen and give you a glance at what we're going to talk about and where our story is going to evolve today. Chris, just verify that you can see it... okay, fantastic. I promise I won't bore you with lots and lots of presentation slides — there are so many that I need to pick which ones to show. By the way, I'll give you the URL of the GitHub repository we're using to store all the code, all the OpenShift YAML files, sample datasets, and automation, including documentation you can look through and poke around in.

So this demo goes like this: at each tolling station we have OpenShift deployed at the edge, and we do inference at the edge. We have a model trained on detecting cars and extracting the number plate, and at the same time it extracts the characters of the plate. Once the model has detected a number plate on a car, it sends that onto a Kafka topic running on the edge. So far this is all happening at the edge, but we need the data to come to the core as well, because this is a hybrid setup — we need to extend the scenario. The data moves off the edge Kafka topic using MirrorMaker: AMQ Streams — Kafka, that is — has a nice feature called MirrorMaker, which can move messages from one Kafka cluster to another. Your takeaway on this is: this is one pattern. You will see a pattern here.
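To make the edge-side piece concrete, here is a minimal sketch of the kind of JSON event the edge LPR service might publish to its local Kafka topic. The field names, topic name, and structure are my assumptions for illustration, not the demo's actual schema:

```python
import json
import time

def make_plate_event(station_id, plate, confidence, image_id):
    """Build the JSON event the edge inference service could publish
    to the local Kafka topic after the LPR model detects a plate.
    Field names here are illustrative, not the demo's real schema."""
    return {
        "station_id": station_id,
        "license_plate": plate,
        "confidence": round(confidence, 3),
        "image_id": image_id,
        "detected_at": int(time.time() * 1000),  # epoch milliseconds
    }

# In the real demo a Kafka producer would send this, e.g. with
# kafka-python:
#   producer.send("lpr-detections", json.dumps(event).encode("utf-8"))
# Here we only serialize it to show the wire format.
event = make_plate_event("A13", "KA01AB1234", 0.9234, "img-0001")
payload = json.dumps(event)
```

MirrorMaker then replicates the topic carrying these payloads from the edge cluster to the core cluster, which is the whole edge-to-core pattern Karan describes next.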
This is an event-driven, loosely coupled scenario where you have edge and core and you want to move data from the edge to the core. That's the main pattern you would deploy in your apps: Kafka MirrorMaker makes it happen, and messages move from the edge to the core. Once a message is on the core — the message is basically a JSON string describing the car — at this point you don't know much about the car, because we've only detected the number. The rest of the processing happens in the core data center. There we have multiple pieces of business logic working on that detected number plate. For example: who is the owner of this car? In real time we can go talk to a database running on OpenShift — it could be MongoDB; we don't have MongoDB here, just a relational database — which tells us, okay, this car belongs to, let's say, Chris, and this is the car Chris was driving at this time. We put a timestamp on the event, so the event gets enriched with more data on the core. We're enriching data that was generated at the edge and moved to the core.

Another example, which I like: there's an alert service which says, okay, I was looking for this car — the car was lost or stolen somewhere. If a number plate coming in on the Kafka topic matches the watchlist, we immediately send out alerts to the relevant organization.

The third thing is storing this for the longer term on a low-cost object storage solution, so it can be retrieved and used for data analysis and dashboarding in real time. We use S3 as the standard way to store data on object storage — OpenShift Data Foundation, backed by Ceph, is the underlying technology that stores it. Once the data is there, you can use tools like Trino. Trino is the open source project, and the downstream product comes from Starburst — I think they changed the name recently, from Presto, so don't get confused if I say Trino or Presto; it's the same tool, and Trino is the open source version. Trino is a distributed SQL engine. The way it works is that from the Trino interface you write SQL statements, and in the backend the engine runs distributed SQL queries: whether the data lives in an RDBMS, in MongoDB, or in an S3 object storage system like ODF, from a single interface you can query it all with standard SQL. That's very powerful — you don't need three different tools to pull data out of different database engines and object storage; Trino does it for you. And the object storage itself is fantastic: it's S3-compliant, so there's no problem at all for Trino to go talk to ODF's S3 endpoint and pull the data out.

Once we have the SQL engine ready, we can query it from the CLI if you're a fan of that, but most people prefer to put a dashboarding or reporting system in front. It could be Tableau, or open source like Apache Superset. You write an ad hoc query in Superset — I'll show you this, it's pretty interesting.
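Going back to the core-side processing for a moment: here is a minimal, illustrative sketch of the enrichment and alert logic described above, with in-memory dicts standing in for the Postgres owner table and the stolen-vehicle watchlist. The names and fields are assumptions, not the demo's actual code:

```python
import datetime

# Stand-ins for the owner database and the stolen-vehicle watchlist;
# in the demo these would live in Postgres on the core cluster.
OWNERS = {"KA01AB1234": {"owner": "Chris", "model": "Toyota Land Cruiser"}}
WATCHLIST = {"MH12XY9999"}

def enrich(event):
    """Join an edge detection with owner data and stamp the core
    processing time, as the core business-logic service would."""
    record = dict(event)
    record.update(OWNERS.get(event["license_plate"],
                             {"owner": "unknown", "model": "unknown"}))
    record["processed_at"] = datetime.datetime.utcnow().isoformat()
    return record

def needs_alert(event):
    """Alert service: fire when a detected plate is on the watchlist."""
    return event["license_plate"] in WATCHLIST

rec = enrich({"license_plate": "KA01AB1234", "station_id": "A13"})
```

The real services would consume these events from the mirrored Kafka topic and publish the enriched records back out; this only shows the shape of the transformation.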
It goes to Presto, and Presto pulls the data out of S3. That's pretty fantastic. We also have an operations dashboard, so people can see what's going on.

So that's what we're going to talk about. Again, I'd emphasize the patterns here: there are a lot of patterns built into this demo. Like inferencing at the edge, because that topic is really getting attention — "I want to do machine learning and AI inferencing at the edge, help me do it." Well, this is a way you can have it. And once you've done the inferencing, you need to move the data to the core for long-term preservation. It could also be an MLOps scenario: you are continuously getting new images, and if the model is not able to detect an image, you want to keep that undetected image, train the model in the backend later, and then push the model back out to the edge location. This all runs in containers on OpenShift — and, for that matter, on Kubernetes — which we believe is the right way to do edge, and to have these MLOps pipelines and data engineering pieces tied together.

So, Chris, does this make sense? Anything you want to ask me at this point, or should I explain any of these sections in more detail? Chris, can you hear me? All right, Chris is having internet problems right now. I hope the streaming system is still going — I hope so. Okay, fantastic. So while Chris is busy fixing his cable internet connection, I'll take this moment to walk you through my GitHub repository, because that's the core thing here. My colleagues Dio and Kyle, who are on this call, built this with me.
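The MLOps loop Karan just mentioned — keeping frames the model could not confidently read, so they can be labelled and used for retraining before a new model is shipped back to the edge — can start as a simple routing decision at the edge. This is a sketch under assumed names and a made-up confidence threshold:

```python
def route_detection(event, threshold=0.8):
    """Decide where a detection goes: confident results flow onto the
    normal event stream; low-confidence frames are set aside for
    labelling and retraining, after which an updated model would be
    pushed back to the edge. Threshold and topic names are invented."""
    if event["confidence"] >= threshold:
        return "lpr-detections"        # normal event stream to the core
    return "retraining-candidates"     # saved for the MLOps pipeline
```

In a real pipeline the "retraining-candidates" destination could be a Kafka topic or an object storage bucket that the training jobs read from.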
I guess they're tuning in as well. The three of us spent a lot of time on this and built this demo. The way to use the GitHub repository is that it will hold multiple patterns. This is the pattern we've worked on — the smart-city pattern. You go into the smart-city directory and you'll find full documentation you can use for deploying each and every component of this demo. It's pretty comprehensive: it shows you how to deploy this entire smart-city, green-city project on your OpenShift environments. It could also be deployed on plain Kubernetes, but you'd need to port it and make some changes on your side — that shouldn't be too bad, so it should work equally well on Kubernetes. It has, though, been tested many times — dozens of times, I'd guess — on OpenShift, so we're pretty sure it works well there.

There's a directory called deploy, which has all the YAML files you need to deploy each and every component of this demo: creating a Secret or a ConfigMap, a Deployment, a Service — the standard Kubernetes and OpenShift objects. You deploy these components together, and in tandem they work. Once you have things up and running... Chris, are you back?

Chris: I am back, I think — I hope. Yeah, I can see and hear you. Comcast is the best internet provider out there, let me tell you. It's the second time this week something like this has happened, so I apologize, everybody. Really, I do.

Karan: Starlink is a solution now.
Chris: Yeah, it's about to be — I'm far enough north that I can get it now. But anyway.

Karan: So yeah, I was just going through the deploy directory, which has all the YAML files. The crux and the core is inside the source directory, where you'll see the entire Python code we've built: the business logic, how it connects to the PostgreSQL database and stores data into it, how it initializes the Kafka listener, how it moves data onto the Kafka bus — all the Kafka consumer and producer configuration. All the code is in this repository, so feel free to try it out. Let me know if you have any issues — just create issues on this repo if you're stuck somewhere; happy to help.

Before going further, I'll show you my OpenShift environment so you can see it. This is a running environment, in a namespace called smart-city, and we have all the components deployed: the database, the LPR service, the image server, and the image generator. The generator is there because obviously we don't have real cameras at the moment, so it just generates random car images, and that generated data is fed to the inference engine, which detects the plate number. This is the LPR service — license plate recognition — doing the magic. So yeah, everything is deployed at the moment. We also have object storage buckets here; I'll go look at the object bucket claims. We have two buckets.
I'm using two buckets for this. And we're using one interesting pattern here that's very important for the listeners to see — let me come over here for a minute: moving data from Kafka onto object storage. For that we're using an open source tool called Secor. Secor originally came from Pinterest engineering; it's an open source project, and we thought it would be a very good fit, because that's exactly why they built it — moving data off Kafka for long-term retention. Kafka is just a buffer, right? A temporary buffer. You're not going to store ten years of data on Kafka; that doesn't make any sense. It's a streaming bus: you put messages through the pipe and off they go. But this data is critical for lots of reasons, so there had to be a way to keep it.

Another angle we considered was: why not move the data from Kafka into a SQL database, an RDBMS? But when we talked it through — that's not really what an RDBMS is for, right? Traditionally you don't dump lots of archival data into an RDBMS. You'd want a data warehousing solution for that kind of thing if you're storing petabytes for long-term retention. So object storage is the natural choice — it's the data-lake pattern: dump your data into a data lake fronted by the S3 interface. Ceph object storage is our data lake, and Secor is a nifty tool that, based on my filters, listens on a Kafka topic, moves the data off it, and serializes it into a format like ORC or Parquet.
It then writes that out as an object into an object storage bucket. And once the data is in the bucket: number one, it's readily available to you; it's cheaper, because object storage is typically not very expensive; and lots of apps and tools know how to access object storage these days. So this is a pattern people are using in production, and that's why we used it here in a realistic scenario: a combination of Kafka, Secor, and S3. Let me come back to my OpenShift console — these are my OBCs, object bucket claims, buckets created on ODF object storage.

So yeah, that's pretty much everything deployed. Now I'll show you what the end result looks like. We've built this dashboard on Superset. This is a dashboard for real-time reporting — for example, how much toll has been collected. Let me refresh it; I've been talking for half an hour, so we should get new numbers. Yes — busy day in London!

Chris: It's a busy day in London. Look at this — almost 10,000 vehicles.

Karan: Almost 10,000 vehicles have passed through these stations, out of which we've collected about 43,000 pounds of toll fees. These are all made-up numbers, but you get the idea. And look at this pollution fee: vehicles that are very old, emitting lots of carbon into the environment — we're charging them extra. It's a nice dashboard, and, as I told you, we have multiple stations, so we can sort by station ID. We built these panels in Superset; you can do that too — it's pretty amazing, and simple to do. For example, station 5201 is seeing 22 percent of the total traffic.
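Stepping back to the Kafka-to-object-storage piece just described: here is a rough, self-contained sketch of the Secor-style pattern — treat Kafka as a buffer, batch messages, and write them out as date-partitioned objects. The key layout mimics typical Kafka-to-S3 output but is purely illustrative, and a plain dict stands in for the bucket:

```python
import json

def object_key(topic, partition_date, seq):
    """Build a date-partitioned object key, similar in spirit to how
    Secor lays out Kafka messages on S3 (layout is illustrative)."""
    return f"{topic}/dt={partition_date}/part-{seq:04d}.json"

class BatchUploader:
    """Buffer Kafka messages and flush them to object storage in
    batches: the Kafka-is-a-buffer, S3-is-the-archive pattern."""
    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.buffer = []
        self.objects = {}  # stand-in for the S3 bucket
        self.seq = 0

    def consume(self, message, date):
        self.buffer.append(message)
        if len(self.buffer) >= self.batch_size:
            key = object_key("lpr-detections", date, self.seq)
            # Real code would PUT this via an S3 client; we store it
            # in a dict so the sketch runs without any services.
            self.objects[key] = json.dumps(self.buffer)
            self.buffer = []
            self.seq += 1

uploader = BatchUploader(batch_size=2)
for i in range(4):
    uploader.consume({"plate": f"P{i}"}, "2021-06-01")
```

Secor itself also handles offsets, exactly-once uploads, and ORC/Parquet serialization; this sketch only shows the buffering-and-partitioning idea.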
These are the kinds of metrics you can get in real time using this kind of pattern-based solution. Which means you know where you might need to put more traffic police, or change the timing on the lights, that kind of thing.

Chris: Yeah, the whole nine yards.

Karan: Right, these are the kinds of things one can do. Or: who are the top 20 customers? This gentleman here has to pay about 4,315 pounds this month — that's his monthly bill — and at the same time you can see what he's driving, and here's the address. This is all fake data — fake numbers and fake addresses.

Chris: Thank you. I wasn't going to ask on air, but I was really hopeful that was fake data, and it looked fake based on the license plates, so I was somewhat comfortable with it.

Karan: Yes — these are open source, staged car images we chose, because obviously we can't go and install real cameras; nobody would allow us to do that.

Chris: Not for you, at least.

Karan: So if anyone from London's smart-city team — or any city — is listening to this, contact me; I can help you build this dashboard for real in your city. And here's the pollution fee view; let me scroll back down. Okay, so Stan King — this is his license plate number — is driving a vehicle from 2009, which means his engine is emitting a lot of carbon, so he has to pay an extra pollution fee of 2,800 pounds. You get the idea, right?
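The fee logic behind those dashboard numbers could be as simple as a base toll plus a surcharge for older, higher-emission vehicles. All the rates and the cutoff year below are invented for illustration, just like the demo's numbers:

```python
def toll_fee(vehicle_year, base_fee=4.50, cutoff_year=2010,
             pollution_surcharge=2.00):
    """Per-pass toll: vehicles older than the cutoff year pay an
    extra pollution surcharge. Rates are made up for illustration."""
    fee = base_fee
    if vehicle_year < cutoff_year:
        fee += pollution_surcharge  # older engine, higher emissions
    return fee
```

In the demo this kind of calculation runs as a service on the core, consuming enriched detection events; per-vehicle monthly bills like the ones on the dashboard are just these per-pass fees aggregated by owner.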
And there's a distribution of vehicle types in the dataset we've been using: the Toyota Land Cruiser is the most popular one going through, and the BMW 6 Series is just two percent of the population. This is just to show you the kinds of things you can do once you have enough data captured from these places. Using these patterns, you can build this kind of solution. And this is just a smart-city example — you could put this into an industrial environment, say a manufacturing hub or a car factory, and apply the same use-case-driven patterns, using open source technologies on OpenShift, pretty neatly.

So that was Superset. Next I'm going to show you another cool demo — a dashboard showing live data. This should cover at least the last three hours... okay, my instance was down; that's why it doesn't have much data.

Chris: You said your instance was only three hours old, so — yeah.

Karan: Right. I didn't want to keep spending money, so I had powered off all my instances — this demo was working fantastically, but obviously running it costs money — and then powered everything back up before the stream. And look at this: it's all persistent. This is the power of OpenShift. This morning I powered my OpenShift environment back on just once, and OpenShift brought up all the components, all the services, including my data — everything, without a glitch.
So, on to the next dashboard. What you see here is the same London zone, and these are all the places where we have cameras installed and vehicles coming through the stations. This is the last detected vehicle number, and this is the actual image from the vehicle — we're detecting the number plate in real time; you can see it changing. You'll also notice it's not always accurate, because this model is not perfectly accurate. Look at this one — not very accurate. But this is what happens in production: you will not have built a model that's accurate on day one. It's a continuous learning process. You'll get new datasets, you have to retrain, and you have to build an MLOps pipeline so that, once you have the retrained model, you can deploy it back — ship it back to the edge location so your edge inferencing engine gets updated with the new model. You need that kind of tooling at your disposal so you can do all of this without the complexity of things going down. That's the beauty of OpenShift: it gives us all the tools to do this.

Chris: Beautiful. I think what's amazing is just the fact that you have it set up so the data starts flowing in whenever you start your services, and everything just picks up and goes. That is very handy. If I'm a researcher and somebody says, "go train this model," I need to be able to have that infrastructure, and the same services and the same data and the same everything, available to me — and you can do that with this. Spin up your own cluster.
You'll have everything: you'll see the model with all the detection data, and you can clean that up — whether you're someone tasked with cleaning up the data, or somebody tasked with answering "why is there an increase in traffic in this area at this time of year, or time of day?" You can go investigate that with actual, real-life tooling. You can drill down into the data in real time and find whatever you might be looking for. It could be increased foot traffic; it could be anything from a car broken down in the center lane on up. That happens, right? Everything happens. Humans, machinery, cities — it's a total wild card. You never know what's going to happen; there are just too many moving pieces, literally.

Karan: Literally correct, yes. And the thing that fascinates me here is the edge angle, because people really are deploying these kinds of things at the edge, and edge is getting very popular across the industry — financial services, automotive, e-commerce, retail. All the industry verticals have some different form of edge, and they want to deploy technology there. And you need tools so you can seamlessly ship your app — your core, basically — out to the edge, so you can get these kinds of insights from the data you're collecting. Focusing on the right tooling, using the right tool for the job, is very important.

So, this is just to show you a graph of vehicles per second and how it's trending.
I spent some time looking at our heatmap-style chart here, and there's another cool graph — I should shout out to my colleagues, who were hacking away at their dashboarding skills; I was watching them wondering what on earth they were doing, and then they came up with this. And these numbers are real numbers. This is an operations dashboard: in real time you're monitoring how much CPU each service is using. Look at this one — at the edge it's using just 0.5 CPU.

Chris: That's a minuscule amount of processing. We could put sensors everywhere, I guess, right? That's not even a full CPU of usage right there.

Karan: That's right. So — OpenShift on Raspberry Pi should come soon, right? That idea has legs.

Chris: Yeah, I mean, we're working on single-node instances for exactly this reason — that's part and parcel of the work on single-node instances for OpenShift — and as time goes on we'll get better with them. But right now you could probably deploy this in a very small package and have everything you need right there at the edge.

Karan: Yes. And I love this one: the smart-city inference API. This is the API where we have our model deployed. In real time, a camera — or a simulation of a camera — is sending an image to my inference API; look at this red dot. Then the data goes back to my Kafka, and from Kafka we use MirrorMaker to send the data to my core. Once the data is on the core, the number plate goes to Secor, and Secor stores it in my Ceph.
We don't have arrows and dots drawn for this part, but then you have Superset, which goes and queries the Postgres database and the Ceph object storage to build that nice dashboard. So that's the whole journey, from the detection until you see a report — and it's all happening using open source on OpenShift.

Chris: That's awesome.

Karan: Another variant of this: we said, okay, the CPU view is good, but let's have some fun and build the memory view too. I'll show you that as well — it's much the same, just different numbers. This is my memory consumption. Look at this MirrorMaker: it's consuming about 260 megabytes of memory to move messages from edge to core, because it's very busy.

So yeah, that's my dashboard. There's another interesting open source tool we used, too: Kafdrop.

Chris: Kafdrop — so I get to learn new tools.

Karan: Yeah. When we were deploying this, we were always going to the CLI, connecting to the topic, and running commands. We love that, but when you're in a hurry you want one thing that gives you everything, because you need to focus on what you're building. So this is Kafdrop — you can get it from GitHub — and in real time it shows you the messages coming in on your topic. I'm connected to my core topic here — this is the URL from OpenShift, and lpr is my topic — and you can see my event, which was detected at the edge.
Look at this one: timestamp, vehicle number plate, detection successful — and it was detected at station A13. So we're getting this data from the edge, and now that the data is in Kafka, we can do whatever we want with it. That's another little pattern you can use. Event-driven systems using Kafka are pretty cool, and they're flexible. So let's move on.

Another thing I'll show you, which is very interesting, is the capability of Trino — sorry, not Superset: Starburst's Presto, that is, Trino — to write SQL queries that go and talk to two different destinations where data is stored. Remember, in the slide I mentioned that you have data stored in your RDBMS database and data stored on S3 — ODF's S3. (Okay, my CPU is running really hot right now, 70 percent — something is eating it up. Anyway.) So Presto can run distributed queries against our database and against ODF S3, and I'm going to show you what that looks like. This is a simple distributed query I'm going to run: give me the timestamp, license plate, vehicle model, and owner. It's written so that it picks up data via Hive. Hive here is a big-data component — a metastore — which holds information about the tables and the partition definitions: on which S3 bucket the actual data of a table is stored.
Using this Hive metastore, the query goes and reads my ODF S3 — OpenShift Data Foundation, with the S3 component we're using here — and the same query also goes and does a left join against the Postgres database. So, again, take a moment to understand this: from a single SQL query, I'm looking for records that live in two different places and getting them back in one result. This is very powerful, because in the real world I'm very sure you'll have multiple different data destinations — it could be PostgreSQL, it could be MongoDB, it could be object storage, or whatever else you might have. You need a single mechanism, and SQL has been around for the last 40 years, so people know how to write it. I'm going to run this — it takes a few seconds... and boom, you have the data. You're querying this in real time.

If you want to look at how Presto did this, there's a dashboard for Presto as well. I'll filter to the finished queries — this one finished just a few seconds ago, so here's our query. It ran because we launched it from Superset: Superset gives the instructions to Trino, and Trino goes and crawls the data out of the RDBMS and S3. And you can see the plan. Presto is an engine for big-data queries, so it has its own mechanism for building query plans and finding the most optimal plan to grab the data.

So, that's mostly it. Let me come back to the story where we started.
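For reference, the federated query just described might look roughly like the following. The catalog, schema, and table names are hypothetical — a "hive" catalog would front the ODF S3 bucket via the Hive metastore, and a "postgresql" catalog the owners database:

```python
import textwrap

# Hypothetical Trino catalogs/tables, not the demo's actual names.
FEDERATED_QUERY = textwrap.dedent("""\
    SELECT e.event_ts, e.license_plate, o.vehicle_model, o.owner_name
    FROM hive.smartcity.lpr_events AS e
    LEFT JOIN postgresql.public.vehicle_owners AS o
      ON e.license_plate = o.license_plate
""")

# With the trino Python client, one would submit it roughly like:
#   conn = trino.dbapi.connect(host="trino.example", port=8080, user="demo")
#   rows = conn.cursor().execute(FEDERATED_QUERY).fetchall()
# Here we only build the statement, since no cluster is assumed.
```

The point of the example is the two catalog prefixes in one statement: Trino resolves each side against a different backing store and performs the join itself.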
So, again — you might say, "Okay, Karan, this isn't something you've deployed in the real world, or for a customer, so how is it relevant to me?" The takeaway for all the listeners and watchers of this live stream is: we want you to think about how you can use these patterns in your use cases, in your apps.

Chris: I mean, this is a powerful pattern — a model of models, for lack of a better term. I see this and think that video stream could be anything coming in, right? It could be test results in a hospital. It could be a vaccination site — tracking what inventory comes in and out, how many people you've vaccinated, the whole nine yards. There are a lot of applications for this — factory floors, you name it. It can apply to cities, but if you look at a city, there are all kinds of microcosms within that city this could apply to.

Karan: Exactly, exactly. That's the idea — because when we go in and talk to developers and data scientists and customers, they say, "We really like Kafka, and we have AMQ Streams — what should we do with it?" Well, you can do this with it.

Chris: That's the whole idea. I remember learning Kafka and messaging queues way back whenever they came out — two-thousand-something — and just envisioning the possibilities: oh, I could actually just throw it on a bus and have something else pick it up. That's amazing. And to be able to say, hey, we're going to have this small deployment out there collecting data — wherever it may be, whatever kind of data — with Kafka running on it, streaming the data back to the core so we can analyze and process it better —
There are so many possibilities with that, and then you put OpenShift together with Kafka and it's like, okay, what can't you do? That's almost the feeling I have: what isn't possible? Well, let's figure out how to make it possible. That's kind of what this demo is, and it's really, really cool. This is awesome stuff.

Yeah, I'm very excited working on this project, because this is just a small thing we've built up, but we can always plug in more, for example OpenShift Serverless. In real time we're calculating the fee, right? That's the calculation. So you could make that serverless, because late at night there will be less traffic flowing around, so you can scale those containers down and have them instantly wake up once there's a new message on the Kafka topic. That's another pattern, a serverless pattern. We don't have it here, but it's another pattern you could pick up to reduce cost.

One interesting dialogue we've been having with other people is about persistence: okay, how can I persist data? In here we're actually using two forms of data persistence: block storage and object storage. Typically, as a developer, you need some kind of persistence layer in your OpenShift environment, and for that OpenShift Data Foundation is the ultimate choice, because I love the technology it's built upon: it is Ceph, a more than ten-year-old technology, and it works super well. And because Postgres is a database, right? A database needs storage. Yeah, otherwise what happens?
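The "calculating the fee in real time" step Karan mentions could be a small handler consuming plate-detection events. Here is a minimal sketch in Python; the event shape, the rates, and the `calculate_fee` name are all hypothetical, not the demo's actual code:

```python
from datetime import datetime

# Hypothetical per-entry toll rates by vehicle class.
RATES = {"car": 2.50, "truck": 6.00}

def calculate_fee(event: dict) -> float:
    """Compute the toll fee for one plate-detection event.

    `event` is assumed to look like the JSON a detector might put on
    a Kafka topic, e.g. {"plate": "MP04AB1234", "vehicle_class": "car",
    "detected_at": "2021-06-01T23:15:00"}.
    """
    base = RATES.get(event.get("vehicle_class", "car"), RATES["car"])
    hour = datetime.fromisoformat(event["detected_at"]).hour
    # Off-peak discount at night, when traffic (and compute load) is low;
    # this is the window where a serverless consumer could scale to zero.
    if hour >= 22 or hour < 6:
        base *= 0.5
    return round(base, 2)

if __name__ == "__main__":
    evt = {"plate": "MP04AB1234", "vehicle_class": "car",
           "detected_at": "2021-06-01T23:15:00"}
    print(calculate_fee(evt))  # night-time car: 2.50 * 0.5 = 1.25
```

Wrapped in a Knative (OpenShift Serverless) eventing sink, a function like this would wake only when a message arrives on the topic.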
Persistence matters. Persistence matters a lot, right? It powers my entire OpenShift environment, but if there were no persistence here, I'd be figuring out how to recover my data. So yeah, we've been using block storage, persistent volume claims from OpenShift Data Foundation, and then buckets. And this is very, very interesting, Chris; I'm pretty sure it's been discussed previously on your channel. Through the OpenShift console you can request object buckets: as a developer, you can request an object bucket and you will get a bucket, and you will get credentials to that bucket. That's what we're doing here. We've created two buckets: the first bucket is for the data set, we're storing the data set there, and the second bucket is the one Secor is dumping data onto.

So again, this is a powerful construct. As a developer, you don't care; you just write your code with boto3 and say, hey, please give me a bucket, and OpenShift, using an ObjectBucketClaim, which is a native construct in OpenShift, will provision an object bucket on OpenShift Data Foundation and hand you a bucket. Just use that bucket; it's your persistent storage for everything.

I mean, yeah, OpenShift Data Foundation is an amazing platform of tools. And if you're looking for the fast, easy way to stand all of this up, right now we have something in tech preview.
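The bucket request Karan describes is a single ObjectBucketClaim resource. A minimal sketch, with a hypothetical claim name (the storage class shown is the one OpenShift Data Foundation's NooBaa component typically provides):

```yaml
apiVersion: objectbucket.io/v1alpha1
kind: ObjectBucketClaim
metadata:
  name: plates-dataset
spec:
  generateBucketName: plates-dataset
  storageClassName: openshift-storage.noobaa.io
```

Once the claim is bound, the operator creates a ConfigMap and a Secret with the same name, carrying the endpoint, bucket name, and AWS-style access keys. Application code, for example a boto3 S3 client, can consume those as environment variables without knowing anything about the underlying storage.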
Keep in mind that it's in tech preview when you're using this. You can use what's called the Assisted Installer. Basically, it gives you an ISO, but that ISO can be configured with ODF and virtualization and everything, and all you have to do is give it hardware and boot the ISO, and it will install a full cluster with all the bells and whistles, everything needed for this demo other than some bits and pieces here and there and the data itself.

Oh yeah, did you know it could do that, the Assisted Installer? I was just poking around in it yesterday. I was rebuilding my cluster, as I do, because I break it all the time; the channel helps with that. I saw it and thought, let me try this, and there was ODF, there was virtualization, there was everything. I was like, dang, this is really powerful. So it's in tech preview; check it out. Go to cloud.redhat.com, or openshift.com/try — I dropped the link in the chat stream — and look for the Assisted Installer. You can check a couple of boxes, get an ISO, and you'll be up and running in less than an hour, basically; it just depends on how fast your infrastructure is.

Yeah, I'm going to do that, of course. It was really just click, click, click, download, and okay, here are some VMs on a virtualization instance. OpenShift is really moving so fast.

It really does. It moves faster than I think people realize, and our What's New briefings are just rich with content. I've been using OpenShift almost daily, I daily-drive OpenShift, but even I didn't know about this one new thing. OpenShift is really, really moving.

Yeah, it's moving at a pace that's hard to keep up with, even from my perspective of having people come on every day to show me new things. Definitely. And speaking of new things...
I also have one more full collection of things to show the audience here. You see these three things: Superset, Grafana, and Starburst. These come with ODH. ODH is another open source project, Open Data Hub, which is a collection of tools that help you do data engineering, analytics, machine learning, things like that, all running on top of OpenShift. It's very flexible and helps you deploy these things very easily using an OpenShift operator. I'll really quickly show you. I already have it installed, and it's highly configurable; there are so many components to this operator. I'll go to my installed operators: Open Data Hub is the operator I've installed here, and if you go to Open Data Hub and then to the KfDef, the YAML looks like this. Here you can define which components you need. You might not need all the components that ship with Open Data Hub, right? You choose the components you need in this YAML file. So I'll say, okay, I need Superset, and I need Grafana, and, and this is the magic bit here, Chris, I want Presto, which is my SQL engine. And this is how simple it is to use object storage with Presto: okay, Presto, please go to this endpoint, get your credentials from these two Secrets; this is the bucket name, this is the storage class. Look at that, this is how simple it is to configure Presto to use the S3 storage on ODF. And once you have this KfDef applied, the Open Data Hub operator will deploy all the relevant components in your namespace, and it also gives you a nice dashboard of its own from which you can jump to the right tool.
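Component selection in Open Data Hub is done by listing applications in the KfDef custom resource. The sketch below is illustrative only: the exact component names and manifest paths vary between odh-manifests releases, so check that repository for the real values before applying anything like this.

```yaml
apiVersion: kfdef.apps.kubeflow.org/v1
kind: KfDef
metadata:
  name: opendatahub
spec:
  applications:
    # Pick only the components you need; each entry maps to a
    # kustomize path in the odh-manifests repository.
    - name: superset
      kustomizeConfig:
        repoRef:
          name: manifests
          path: superset
    - name: grafana
      kustomizeConfig:
        repoRef:
          name: manifests
          path: grafana/grafana
  repos:
    - name: manifests
      uri: https://github.com/opendatahub-io/odh-manifests/tarball/master
```

Removing an entry from `spec.applications` is all it takes to leave a component out of the deployment.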
So, the ODH dashboard. ODH, again, is Open Data Hub, an open source project, and we have a downstream product for it, OpenShift Data Science, which is a collection of more matured tools. Right now I'm just using a couple of components from Open Data Hub; you can go explore everything it does. But yeah, here is Superset and here is Grafana, and there are so many components in here. Just choose which ones you want to deploy on OpenShift. It's simple.

That's beautiful. Right, so this tooling will help you build the right foundation for your MLOps journey, for your data engineering journey. You need the right tools. That's what we do at Red Hat: we provide the right tools to all the developers so they can build amazing things, like what I've done here.

Yeah, my CPU was working super hard, as I was about to say: you're rendering some stuff, and Zoom is taking up a lot, and then your browser on top... Actually, what app is that? I kind of need that. Oh, this one? This is Stats. It's open source too. Oh really? Okay. Yeah, I love open source. The name of the app is Stats. Pretty nice. All right, I'll check it out right now.

All right, so again, guys, here is the entire code base. I invite you to try this out on your OpenShift environments. If you're running this on plain Kubernetes, you'll need to do some work and change a few things, but otherwise the code base should work; the Python files should run just fine.
You just need to adjust a few things: you need to deploy Rook on Kubernetes, because Rook provides the block storage and object storage, and then you'll obviously have to modify some YAMLs. But I'm pretty sure this will also work on Kubernetes; you just need to make some changes. So hit up this URL. There's nice documentation here, which we're also in the process of streamlining. We intentionally want it to be comprehensive so that you can learn how we built this. It's not a secret sauce, right, not a black box. We want you to work through all of it, and I'm pretty sure by the end of the deployment you'll have picked up a lot of patterns you can use in your apps and, you know, make us proud.

That's awesome. Thank you so much for this demo, and I apologize for my technical difficulties; I feel like I missed a little bit.

You can always go and watch the recording.

Yeah, that's true. I will, and I'll definitely check out this repo more; I shared it out for everybody to see. So yeah, this was an awesome demo, and a mind-opening kind of thing for me. I hope the audience had a similar experience despite the technical difficulties. This is amazing, Karan. This is awesome.

Yeah, so I look forward to seeing how you pick up these patterns and build your own apps. Again, this is just a demo to showcase the power of OpenShift and all of the open source components we've used here, and how we've deployed them. We could have just had a single OpenShift cluster at the core, boom, we're done, demo completed.
But that would not have been very realistic, and we wanted something people can relate to, because edge is really the hot topic right now. People want to do it and they want to know how they can do it. Like, Chris, we were discussing a few minutes back: this is not taking lots of CPU or memory. There's a misconception people have: oh, edge means I need lots of computing power at the edge. No, you don't.

Oh, it's just a sliver, right? Yeah. That's really powerful. So if folks have questions or want to get in touch with you specifically, is there a way they can do that? Do you have a Twitter or anything?

Yeah, I have Twitter, and I'm active on LinkedIn as well. My Twitter handle, let me put it here.

I know, I think I follow you on Twitter. There we go. Yeah.

Okay, and let me find my LinkedIn as well; I need to put that up too. Hold on, let me find the right window.

You can do it. I believe in you. It's okay.

There, that's the LinkedIn. Now I've got it. Yeah.

So yeah, I look forward to hearing from you. If you guys have anything for me, if you want to learn more or need any kind of help with this, do talk to me and let us know in the comment section of whatever platform you're watching on, whether you have a question or maybe there's another interesting thing or pattern we're missing here. You're developers, right? We are developers too, so we each see things from a different point of view. You might have something interesting that I'd learn from you and maybe incorporate into my next demo. Because that's what we're going to do.
We want to build more demos, more use cases, ones that should stick, that should stay in your mind, so you can use these things while building your next project.

It's all about continuous learning, right? Just because you can't use this right now doesn't mean the demo didn't help. If it stays in your brain, you can pull it out later down the line, and it might save you a ton of time. You might not need this whole thing right now, but it's all there waiting for you to use, and you know where to go get it.

Again, a great demo, a fantastic demo, Karan. Thank you so much for coming on, and thank you, audience, for watching. Coming up later today we have DevNation: The Show with my friend Sebastian Blanc, and then after that, well, not immediately after, later this afternoon on the channel, we have GitOps Guide to the Galaxy; we'll be talking about Helm GitOps workflows. So when in doubt, check out the calendar; it'll link off to wherever you need to go to watch the content. We'll see you next time, and thank you again, Karan. This was great; I look forward to seeing more later.

All right, thank you guys, and thank you, people.