I'm Keeb Chaudhary, co-founder and CTO of Buzz. As I was describing, we have a platform called PowerAI, and what it does, basically, is grid inspections. As we all know, grid infrastructure is failing. It's outdated. A lot of problems are happening, especially with utilities like PG&E, and the infrastructure is causing a lot of damage and disasters, such as wildfires. I'll highlight that. But before jumping into that, I just wanted to highlight: what is the solution? How did we launch? So we launched out of Stanford University in 2017. I was part of the Civil Engineering program, Atmosphere/Energy. I met my co-founder, over here, in 2017. We decided to incorporate the company and commercialize the product we were developing. And then we spent two years deploying the technology and working on our models. And then we got funded by venture capitalists in Silicon Valley who believed in the vision that infrastructure can be improved, that there could be technology to help our utilities make their processes more efficient, because we all know they really need that. We were featured in PowerGrid and other major utility publications, T&D World, Energy Central, the Wall Street Journal, Forbes. But our main mission is to revolutionize and disrupt this industry, which is facing a lot of problems that impact society. And speaking of the biggest problem: as we all know, wildfires are becoming the new norm. "Fire season" is a phrase that's become pretty common, which is sad. One of the reasons wildfires happen, as we all know, is failed grid infrastructure. Here are some statistics. The Camp Fire, which was one of the biggest fires in the history of this country, was caused by a broken C-hook. C-hooks are basically the joints that hold the insulators onto these high-voltage transmission lines. Before I jump into that: I used the word "transmission."
I asked for a show of hands from people who are aware of transmission, distribution, and substations, who have a background in power systems or energy. So I'll try to keep the conversation away from a lot of power-systems lingo. But yes: transmission lines are high-voltage lines, and we're talking about 250,000 volts. Distribution is low voltage. So the Camp Fire was caused by these C-hooks that were broken, and that led to $9 million in losses and 85 dead. By 2018, PG&E had already identified 146 structures that needed replacement, but they were not able to get to maintenance and repair fast enough: 3,700 unanswered maintenance requests in 2018 before the major fires happened. And because of that, the fires happened. And again, the Camp and Woolsey fires produced 5.1 million tons of carbon dioxide, so this has a big impact on carbon emissions as well. So this is the big problem we're trying to solve. What's happening now is that utilities are on their feet, capturing a lot of data out in the field. They're flying drones and helicopters around these lines to capture data. For example, PG&E was capturing 5 million images; now they're capturing 10 million images on an annual basis. These utilities have incorporated drone programs, they have helicopters, they have vehicles, and they capture a huge amount of visual data. But the problem remains that once that data comes back to them, it takes months of manual analysis time to make sense of it and then send out a repair or replacement order. And during that time frame, things like the Camp Fire happen: the infrastructure gets damaged, and at worst you get a wildfire situation. So that's what we're trying to solve over here. First of all, it's a slow process: millions of visual data points coming back for a single utility take months of manual analysis time. It's unscalable.
At that volume of data coming back to them, it becomes humanly impossible to go through every single image and make sense of it. And because of that delay, there's high risk: sectors losing $170 billion due to power outages, blackouts, power shutoffs, and mass disasters. That's where we come in with our solution, PowerAI. We wanted to name it Empower AI, but stuck with PowerAI. It basically does data management, visual inspections, anomaly detection, and analytics. A lot of heavy words, but I'll get into what those mean. First, we start with data management. Millions of data points are coming back to PG&E, and they don't know where to store this data. It's just sitting on hard drives. We have even seen utilities use VHS tapes, which is just mind-boggling. They sent those VHS tapes to us, saying, we don't know how to make sense of this. What we try to do is bring technology to them: secure cloud storage and cloud data management. That's one function of our platform. The second is, once that data is managed in one single location, they now know where this data is coming from and where these regions are. A lot of these regions are remote areas, and they don't even know how to access them. So we do asset tracking, which is mapping out all the different structures, all the poles, all the towers out in the field, to centimeter-level accuracy. Now they know where the assets and components are located. Then, once the assets are mapped, we get into the actual AI: our proprietary models, which are based on computer vision technology. What they do is process this data. They churn all of this imagery through the models and make sense of it.
And by sense, I mean they'll detect whether an insulator is damaged, whether a conductor is damaged, whether there's overheating in the line, and generate all these results back for the utilities. And the last step is sending these results to the utilities in a prioritized manner, because we don't want results sitting on our server with no one actually going out in the field to handle repair and replacement. We want to send them to the work order system. And what we do with that system is prioritize. What I mean by prioritize is that some faults are emergencies, like the C-hook damage we saw just now that caused a major wildfire. That's an emergency kind of situation: go out in the field and repair it as soon as possible. Versus, oh, we have some rusting happening; it's not that bad a deal, touch it in six months. So prioritization is a big factor. To describe our process in a flowchart kind of way: aerial data is captured by drones and helicopters, millions of images on an annual basis. It comes back to us; we ingest that data scalably and process it; we add predictions that are reviewed with a tool called Human in the Loop, which I'll get into in the later slides. We track feedback from subject matter experts. Again, our team is very technology focused, with systems backgrounds, but we're talking about linemen and field technicians, people who are out in the field looking at these assets with decades of experience. We take their feedback to make the models better, more personalized, more robust. Then visualization and analytics: showing it in a neat format so that the stakeholders can actually understand where the problems are happening. Reporting: generating inspection reports so that they can be prioritized as work order tickets. And the last step is work order tickets.
Getting a ticket means: okay, insulator X is damaged in area Y, go out in the field and repair it right now. Now, getting into the technicalities of the algorithms we have. Again, our whole technology is based on proprietary computer vision and machine vision models, so a lot of machine learning and AI systems that we have deployed. The kinds of algorithms we're using: I don't want to get into too much detail, because that's a computer science topic, but first, image clustering. By image clustering we mean: let's say a drone goes out in the field to a wooden pole or a steel pole, collects 10 to 15 images of it, and then does that for 30,000 poles. So we have a lot of images coming to us, and we want to cluster them to the specific poles. That's what the clustering algorithm does: it uses geospatial, or GPS, information and clusters them together. Segmentation: we use segmentation so the computer vision model can pick out areas where there is greenery or dry brush, and that is mainly used for vegetation encroachment, which is again a big problem as a fire hazard. Let's say there are tree branches on the line; that's a fire hazard situation. So we segment out those portions using computer vision, because the machine sees that specific pattern in the greenery or dry brush and separates it from the power line and other objects in the image, and then estimates how far that branch is from the power line. And lastly, anomaly detection, which is mainly for electrical components: detecting insulators, conductors, and other kinds of electrical components, and the damage on them. As I said, we started commercially deploying in 2019, and we're still training the models continuously. That's the power of machine learning: it learns.
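The image-clustering idea described above can be sketched roughly as follows. This is a minimal, hypothetical illustration, not Buzz's actual code: it greedily groups images to a pole when their GPS position falls within an assumed 25-meter radius of an existing cluster.

```python
# Hypothetical sketch of GPS-based image clustering: assign each aerial
# image to the pole it belongs to by greedy distance thresholding.
# Function names and the 25 m radius are illustrative assumptions.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def cluster_images(images, radius_m=25.0):
    """Greedily attach each image to the nearest cluster centroid;
    start a new cluster (a new pole) when none is within radius_m."""
    clusters = []  # each cluster: {"centroid": (lat, lon), "images": [...]}
    for img in images:
        lat, lon = img["lat"], img["lon"]
        best, best_d = None, radius_m
        for c in clusters:
            d = haversine_m(lat, lon, *c["centroid"])
            if d < best_d:
                best, best_d = c, d
        if best is None:
            clusters.append({"centroid": (lat, lon), "images": [img]})
        else:
            best["images"].append(img)
    return clusters
```

In practice a production system would likely use a proper spatial clustering algorithm (e.g. DBSCAN with a haversine metric), but the idea is the same: GPS metadata, not pixels, decides which pole an image belongs to.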
But we have to get a lot of training data, and this training data consists of proprietary images from the field that utilities don't really share with anyone. So we have to go out and partner with them. And now we have, I would say, over a hundred thousand images of visual data as training data. We take them, store them, and then label them. So we manually have to label thousands of images: go through them and draw boxes around the objects, as you can see over here. That's called bounding box annotation. Once that is done, it goes to our algorithms that do pre-processing of the imagery: adjust the brightness, correct the blurriness, those kinds of things. And then, using that imagery and those labels, we teach the machine to learn from it, just like a human being learns, like a five-year-old: okay, this is an apple, it's green, or this is a branch. It learns what an insulator looks like, what the specific characteristics of an insulator are, what the specific characteristics of a power line are, and it improves over time. And then we deploy it. Model deployment is a big factor at scale. We're talking about millions of images coming into our system on a regular basis, so we want to keep the system robust and secure. Model monitoring is a big factor too: how is the model performing? Penalize it where it's making mistakes and reward it where it's doing well, just like a feedback system. Then active learning, which I'll get into in the next few slides, and then model branching. That's something we borrowed from tech companies like Facebook and Instagram, because we got to know from colleagues there what kind of system they're using, which they probably shouldn't have discussed, so I won't name my friends. But basically, what they do is map customer profiles using a branch-and-trunk model kind of system.
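The pre-processing step just mentioned (brightness adjustment and blur screening) can be sketched in a simplified form. This is an assumption-laden illustration: images are plain 2D lists of 0-255 grayscale values, and the blur threshold is made up, not a value from the talk.

```python
# Illustrative pre-processing sketch for grayscale images represented as
# 2D lists of 0-255 pixel values. The target brightness and blur
# threshold are assumptions for demonstration only.

def mean_brightness(img):
    pixels = [p for row in img for p in row]
    return sum(pixels) / len(pixels)

def adjust_brightness(img, target=128.0):
    """Shift all pixels so the mean brightness hits the target, clamped to 0-255."""
    shift = target - mean_brightness(img)
    return [[min(255, max(0, int(p + shift))) for p in row] for row in img]

def laplacian_variance(img):
    """Variance of a 4-neighbor Laplacian; low variance suggests a blurry image."""
    vals = []
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            lap = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]
                   - 4 * img[y][x])
            vals.append(lap)
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

def is_too_blurry(img, threshold=50.0):
    """Flag images whose edges are too weak to be useful for inspection."""
    return laplacian_variance(img) < threshold
```

Variance-of-Laplacian is a common blur heuristic (sharp edges produce large second derivatives); a real pipeline would run the equivalent on full-resolution imagery with a library like OpenCV.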
What that means is that there's a trunk, and there are branches coming out of it, and each branch represents a model for a customer. So for my Instagram profile, they have a trunk model that gets retrained, let's say, every two months, but they also have a branch model coming out of it that represents me as a customer and gets retrained on a regular basis. For Instagram, it's hourly; for us, it's weekly. So that brings in the personalization effect. To highlight some examples: I named a lot of electrical components, insulators, conductors, those kinds of things, so here's what they look like. Start with these two: these are dampers. What they do is absorb vibrations in the power line. Because of the electromagnetic energy in high-voltage power lines, the lines vibrate, and dampers stop those vibrations so the lines don't interact with each other. This is what a damper looks like when it's damaged, and when it's damaged, the lines start vibrating again and can clash. Another example: this one is from a helicopter, a very zoomed-out image. One of the challenges we have to battle is that there's no standardization in image capture. So we'll get pretty zoomed-in images, and then we'll get this. We had to train the models to identify what a good image is versus an image from a bad angle. We still get high-accuracy results on this one. And then we see images like this, which is super zoomed out. What we were looking for in this image was vegetation: as you can see, there's a line and there's heavy vegetation, and our systems are able to detect that as well. The last four are insulators that are damaged, which, as you can clearly see, our systems are able to capture. And before I get into that: I've named insulators a lot. One of the purposes of insulators is to separate out the high-voltage lines.
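The trunk-and-branch scheme described above can be made concrete with a small sketch. The "model" here is a deliberate stand-in (a dict of per-class score offsets rather than a neural network), and all names and update schedules are illustrative assumptions.

```python
# Conceptual sketch of the trunk-and-branch setup: a shared trunk model
# is retrained infrequently on pooled data, while each customer gets a
# branch copy fine-tuned on that customer's own feedback on a faster
# schedule. The "weights" dict is a toy stand-in for real model weights.
import copy

class TrunkModel:
    def __init__(self):
        self.weights = {"insulator": 0.0, "damper": 0.0, "vegetation": 0.0}

    def retrain(self, global_feedback):
        """Infrequent retrain (e.g. every two months) on pooled data."""
        for cls, delta in global_feedback.items():
            self.weights[cls] += delta

class BranchModel:
    def __init__(self, trunk):
        # each branch starts from a copy of the trunk's current weights
        self.weights = copy.deepcopy(trunk.weights)

    def fine_tune(self, customer_feedback):
        """Frequent per-customer update (weekly in the talk's example)."""
        for cls, delta in customer_feedback.items():
            self.weights[cls] += delta

trunk = TrunkModel()
branch_a = BranchModel(trunk)            # customer A's personalized model
branch_a.fine_tune({"vegetation": 0.5})  # A's feedback emphasizes vegetation
```

The key property the sketch shows: fine-tuning a branch personalizes that customer's model without touching the shared trunk.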
So as you can see, transmission lines have longer insulators, and distribution lines, the ones you see around town, have smaller insulators. The next process is called human in the loop. This tool was, in a way, an idea given to us by one of our customers, because they got really scared when we approached them. We said, hey, we have AI, and they said, just hold on over here: are you here to take our jobs? And we said, no, we're here to work with you; this is another tool in your tool shed. And they said that they wanted to be included. These are linemen with something like 30 years of experience in the field. They go out with their binoculars, look at the lines every single day, and take notes. And we said, hey, we want to help you, we want to make your lives much easier. So we introduced this tool in our platform. What it does is that the AI makes predictions and generates recommendations like this one: there's a chip in the insulator over here; it's broken. Now the linemen, field technicians, or engineers can go into the platform and provide their feedback. Again, we don't say that our AI is 100% accurate, because no AI is 100% accurate. That's where the subject matter expert feedback comes in. This is analogous to the healthcare sector: there's a lot of subject matter expertise involved there as well, especially for lung X-ray scans, where you have surgeons looking at the scans and providing feedback for the AI to learn from. We replicated that over here. All that feedback goes to a backend system, and then we retrain the AI continuously. We take that feedback and turn it into action. And that system is called active learning: it's continuously learning from humans and then providing that value back to humans. That brings in the personalization and makes the AI better over time.
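The human-in-the-loop flow above can be sketched as a simple feedback queue. This is a rough illustration under assumptions of my own (class names, batch size): the model's predictions are reviewed by an expert, disagreements become new training examples, and a retraining run fires once enough corrections accumulate.

```python
# Rough sketch of a human-in-the-loop / active-learning cycle: expert
# corrections are queued, and retraining is triggered once the queue
# reaches a batch size. Names and the batch size are illustrative.
class FeedbackLoop:
    def __init__(self, retrain_batch=100):
        self.queue = []               # pending expert corrections
        self.retrain_batch = retrain_batch
        self.retrain_count = 0        # how many retraining runs have fired

    def review(self, prediction, expert_label):
        """Record the expert's verdict; only disagreements become training data."""
        if prediction != expert_label:
            self.queue.append({"predicted": prediction, "correct": expert_label})
        if len(self.queue) >= self.retrain_batch:
            self.retrain()

    def retrain(self):
        # placeholder for kicking off a fine-tuning job on queued corrections
        self.retrain_count += 1
        self.queue.clear()
```

Only corrections enter the queue, which matches the point of active learning: the model asks humans for labels exactly where it is wrong or uncertain.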
Just for some statistics: using this system, we've seen accuracy ramp up from 65% to 95% within three weeks. And the last thing is, we want to make sense of these results; we don't want them sitting on a server. So we generate reports for people who don't necessarily understand the technology. And we're talking about some of the people in the utility space; honestly, we just had an interaction with someone who was still using Windows 7. So we want to make it easy for them to understand. We make the reports digestible for people who are electrical engineers but don't have a background in software or AI, and we generate reports and dashboards that are easy to digest. Then we can export them as PDF inspection reports or CSV spreadsheets, and all of those go to work order systems. A work order system is where a utility receives various kinds of work order requests, meaning: okay, insulator Y is damaged at this location; prioritize it and go out and do maintenance and repair. That's the whole purpose of the system. Now imagine them getting 10,000 requests in a single day with only a handful of field technicians. It becomes impossible for them to go out in the field and repair everything. We take that burden from them: we use AI and machine learning to prioritize the various kinds of faults, so they know which top 10 they have to go out and handle now, and which ones they can tackle in the next few days. We also integrate with geospatial systems, using GIS to overlay the data so they know where these locations are. So that was mainly for lines: transmission lines and distribution lines, the more linear infrastructure. Now we are also deploying our products at substations. That's a new product, a new use case that has come to us. And what really happens at substations is, there's a lot of theft.
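The fault-prioritization step described above boils down to scoring and sorting. Here is a minimal sketch under assumptions of my own: the severity table, the fire-risk bonus, and the fault-type names are made up for illustration, not values from the talk.

```python
# Illustrative work-order prioritization: score each detected fault by an
# assumed severity table plus a fire-risk bump, then return the top-N for
# immediate dispatch. All weights here are demonstration values.
SEVERITY = {
    "broken_c_hook": 10,           # emergency: the Camp Fire failure mode
    "damaged_insulator": 8,
    "vegetation_encroachment": 7,
    "rust": 2,                     # can usually wait months
}

def prioritize(faults, top_n=10):
    """faults: list of {"type": str, "high_fire_risk_area": bool (optional)}."""
    def score(f):
        s = SEVERITY.get(f["type"], 1)
        if f.get("high_fire_risk_area"):
            s += 5  # bump faults located in fire-prone regions
        return s
    return sorted(faults, key=score, reverse=True)[:top_n]
```

With 10,000 daily requests, this kind of ranking is what turns a flood of detections into a short, actionable dispatch list.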
There is a lot of intrusion, a lot. And this may be surprising, because it was surprising for me too: people break into substations and steal copper wiring, which sells at a very high price. We've been told that a lot of people have gotten electrocuted doing that. So one task is detecting people who are injured at substations, if they were able to intrude, were trying to steal, and got electrocuted. These substations are high-voltage substations with transformers, high-energy environments. So what we did was bring the power of surveillance into this. How we did it: substations already have cameras deployed. We deployed an AI system on top of those cameras to do 24/7 surveillance: checking whether the people coming into the substation are authorized, so intrusion detection; watching for heat events, seeing if smoke or fire is starting, using thermal imagery for that (I have an example I'll show later); detecting thermal heat signatures on the substation assets, transformers, reactors, all those electrical components, and damage to them; and, if there is damage to an electrical component or injured personnel lying out in the field, detecting that as well. Then logging all these anomalies and sending alerts to specific personnel so they can take action in the field. And here are a few examples of what it looks like. We have two cameras deployed at a substation: one visible spectrum, one thermal infrared. The visible camera looks for physical damage on the insulators or conductors, as you can see, or rusting, those kinds of problems. The thermal camera looks at heat signatures: if any component is overheating beyond its thresholds, it sends an alert and records it. And then again, it all goes back to the learning system and action is taken.
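The thermal-monitoring rule just described is essentially a per-asset-type threshold check. A minimal sketch, with the caveat that the threshold values below are invented for illustration and are not real operating limits:

```python
# Hypothetical sketch of substation thermal alerting: each asset type
# gets an assumed overheat threshold, and any reading beyond it raises
# an alert record. Thresholds here are illustrative only.
THRESHOLDS_C = {"transformer": 95.0, "reactor": 90.0, "insulator": 80.0}

def check_thermal(readings):
    """readings: list of {"asset": str, "type": str, "temp_c": float}.
    Returns an alert for any reading above its asset type's threshold."""
    alerts = []
    for r in readings:
        limit = THRESHOLDS_C.get(r["type"])
        if limit is not None and r["temp_c"] > limit:
            alerts.append({
                "asset": r["asset"],
                "temp_c": r["temp_c"],
                "exceeded_by": round(r["temp_c"] - limit, 1),
            })
    return alerts
```

In the deployed system the temperatures would come from the thermal camera's per-pixel readings mapped onto detected components; the alerting logic on top is this simple.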
So all in all, what we're trying to do is make the operations and processes of utilities much more efficient. Utilities are still very behind in terms of technology adoption. We have talked to utilities that have sent us VHS tapes, as I described. We've talked to utilities where people go out in the field every day, walk the line with binoculars, look up, and take notes. We have even heard of utility infrastructure being shot at just as target practice; that happened in southern Texas. These are things we're trying to get away from with the power of remote sensing, drones, and machine learning: making everything faster, more efficient, and more accurate. Imagine you're an engineer sitting at a computer. You get a supply of 15,000 images every day, and you have to go through them every single day and look at them. It's not a good job to do. So we take that away from them and let them be repurposed to the things they're supposed to do, which is repairs and replacements, because these are electrical engineers; they're not supposed to be looking at images eight hours a day. So we make that process more accurate, and we do it inexpensively, because we're an automated system; that's the power of AI. We have a lineman shortage right now in this country, and the rate for a lineman right now is $250 an hour; with our AI, that drops to $3 an hour. So there's a huge incentive for utilities that adopt this. And then actionable: again, we don't want all these results sitting on a server. We want them to become actions, so people can go out in the field to repair and replace in time, so that disasters don't happen. So I just wanted to highlight some of the projects we have done while continuously working with utilities.
One of the utilities we worked with, and are still working with, is Newfoundland Power, based in Canada. They get extreme snowstorms in the winters in the province of Newfoundland, say five to six months of heavy snow; they said six feet of snow at the peak of the season. So a lot of problems happen on their infrastructure. What we did was develop a system for them. We analyzed 5,000 distribution poles and 500 transmission structures for them, around 45,000 images, and found the hotspot areas where problems were concentrated. We implemented the human-in-the-loop workflow, and this is where we saw the accuracy jump from 65% to 95% within three weeks. From their side, we have 25 inspectors, linemen, and field technicians on our platform continuously, and they're still using it. We also provide asset tracking capabilities, and this was a big one for them. When we started working with their team, we got a spreadsheet full of pole numbers and their GPS coordinates. Then, when our drone team (a drone partner up in Canada) went out in the field, they said, there's no pole over there. Half of the coordinates were off by miles; the whole sheet was outdated, and the GPS coordinates were long gone, so they had to drive a lot. We saved them a lot of time by updating all of that. And we got to know that the sheet had last been updated 15 years ago. So we did the update, and now it's continuously kept up to date, which is a big advantage for them. Another big utility we're working with is National Grid, in the New England area. What we did for them was their distribution poles; as you can see, here are some examples: finding anomalies on insulators, conductors, wooden poles, any kind of rust problem. We're primarily working with their helicopter team, and they're trying to get better-resolution, higher-zoom cameras into their vehicles.
But that's another area getting a lot of technology infusion: a lot of utility teams are adopting drones. Drones are becoming much more commoditized; they're cheaper and smaller. Why would someone fly a helicopter, with the risk of crashing into the line and causing a lot of problems? For a statistic, there are around six to seven helicopter crashes every year, and being a lineman is, I would say, around the fifth deadliest kind of job. Another utility, Ameren, is based in the Midwest. They collect a million, now two million, images every year. We analyzed around 2,000 transmission structures for them and detected around 3,500 anomalies. Here are a few examples; these are all insulator damage. You can see these are heavily damaged insulators, and they can cause sparking on the line. And again, there's dry vegetation around them; that's another big problem. Sparking can cause the line to go down: a power outage, a blackout of some kind. And whenever there's a blackout on a high-voltage transmission line, everything downstream goes bad as well, and utilities lose a lot of money. So we're trying to prevent that as well. Then, moving to more substation use cases: this is another team, in the UK. They were walking around the substation with DSLR cameras, and what they were trying to detect was damage to their structures. As you can see, there are concrete footings that are damaged, with cracks forming. What that can lead to is a whole structure falling down and causing a lot of damage. So we detected that with our AI as well. Another big one: this is the New York Power Authority. They're the biggest state utility in the country, and we're working with them. Their plan is to become fully digital by 2030. I would say it's a pretty aggressive plan, but we're trying to help them with it: infusing a lot of software, AI, and machine learning into their systems. And this is what we actually do.
We deploy our software and AI on the cameras at substations, continuously checking the health of the electrical components. Then we have the thermal side; this is also at a substation, looking at heat signatures. Whenever there's a high-heat event, we detect it early on, monitor it, and send out reports or alerts accordingly. And then we have this: a system already deployed at a substation, as you can see. This is a video from one of the live feeds, with the AI already running on top of the video. We're getting 24/7 video; this is a lower frame rate, so you'll see some lag. Basically, it's running on top of the feed and detecting any kind of damaged component. As you can see, it's mainly looking at insulators right now, but there are other components too. We have two video feeds coming to us: one in this format, the visible spectrum, and the other thermal, which was the image before this one; that's the one detecting heat signatures. That's all I had. Should we take questions? We're also hiring, so anyone who's interested in being part of the mission, come talk to us.

If I'm not wrong, in the introduction I read that you went through the university in energy engineering. What led you, being an energy engineer, to focus on machine learning and such?

Yeah, so I was in a lab for some time, and they were using a lot of machine learning tools, so I got involved in a project there. We bought a drone and were trying to fly it; I went with Dan, over here, to a site at Atom Bay to test out some drones. How it started was, I don't come from a software background; my background is in civil and environmental engineering. And then I took a lot of courses at Stanford, obviously, like everyone else who does machine learning: CS229 and those conventional courses.
But the vision was to apply that kind of technology to the energy sector, because that was the turning point: we wanted to bring more emerging technologies into this field. And it was through projects that I got involved. Initially, before Buzz focused on power lines, we were trying to do wind turbine inspections, another application of computer vision. Wind turbines get a lot of damage from high winds, structural damage, so we were trying to do that. The history of Buzz is that I took a course called Entrepreneurship in Civil and Environmental Engineering, and that's where I met my co-founder. I think it's called something else now, but it's a launchpad course. In my final quarter at Stanford, I took that course. We had this idea of using drones for wind turbines and building some machine learning models; I had taken some courses and wanted to try my hand at deploying them. Soon after the course, we realized we wanted to incorporate and make this commercial. But once we started talking to utilities, and we talked to around 35 to 37 of them, all of them had the same narrative: why don't you apply this technology to power lines? Because we're facing a lot of problems over here, and it's a much bigger market. And this was before the wildfires started, so they knew something was going to happen; they just didn't have the resources to actually mitigate it. So they kind of gave us the hint, and some of the big California utilities actually told us, why don't you apply this technology to the power sector? So that's when we pivoted: we didn't do wind turbine inspections; we focused on power lines. But it was mainly through projects, then the courses I took at Stanford, and then wanting to apply that in industry, deploying products at scale.

Thank you so much for your presentation. I can actually relate; I also have a background in civil engineering.
I have done damage inspection using computer vision on railways in Japan, and I wanted to understand more about the models you build. I guess the problem is multi-class classification, and so far I've heard you talk about power lines, insulators, dampers. How many objects, or how many classes, are you working with there?

Yeah, so I'll start with the classes. We have, I'd say, 30 major ones that cover transmission, distribution, and substations. So we're talking about transformers, reactors, insulators, dampers, as you saw, and conductors, which are the actual lines. Then the structural damage: wooden pole damage, rot on the wooden poles, cracks; then rust and corrosion; then vegetation, obviously. So we have around 30, and we're continuously adding more as we get more and more data. The challenging part of this field is that every utility has its own inspection criteria or routine. For example, a utility in California would prioritize vegetation, because they have fire hazard situations, whereas for a utility in New York, with a lot of snowstorms, corrosion and rusting are the big problems. So we have to cater to every single one, and that requires a lot of time working with the utility, customizing the solutions, making them personalized to the specific utility. That's one of the challenges we have to work with. But we are seeing more standardization and more regulations being followed by utilities in terms of taxonomy guidelines. And yes, it's a multi-class classification problem; we do object and anomaly detection. We don't use off-the-shelf models, because we tried that: Vision API from Google, Rekognition from AWS; they don't really work in this space. That's why you don't see Google becoming the next big utility company. They've tried it, and it just doesn't work.
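The per-utility customization described in the answer above, where the same fault classes carry different priorities for different utilities, can be sketched as a weighting profile. Utility names, weights, and class names here are all made up for illustration.

```python
# Sketch of per-utility inspection priorities: the same fault taxonomy,
# weighted differently per customer. All names and weights are invented
# for demonstration, not real customer configurations.
PROFILES = {
    # a fire-prone western utility cares most about vegetation
    "california_utility": {"vegetation_encroachment": 3.0, "rust": 1.0},
    # a snowstorm-heavy northeastern utility cares most about corrosion
    "northeast_utility":  {"vegetation_encroachment": 1.0, "rust": 3.0},
}

def weighted_score(utility, fault_type, base_score):
    """Scale a fault's base severity by the utility's priority profile;
    unknown utilities or classes fall back to a neutral weight of 1.0."""
    weight = PROFILES.get(utility, {}).get(fault_type, 1.0)
    return base_score * weight
```

The same detection thus ranks very differently depending on who the customer is, which is exactly the personalization the branch models are meant to capture.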
So we had to build our own models. We do have a conventional backbone, you know, ResNet and those kinds of architectures, but we have our own layers and our own processes. That's why I didn't go in depth over there, because it's kind of a trade secret for us, and proprietary: the way we label the images is different from everyone else, the way we pre-process the images is totally different, and the way we train and layer the models is just totally different. But yes, at the end of the day, we're doing convolutional neural networks.

Just a follow-up: in the slide, you mentioned all the images had bounding boxes, but you're actually doing segmentation, right? So pixel by pixel. And when you do the active learning part, the human-in-the-loop part, do you ask the operator to correct the bounding boxes or actually go pixel by pixel? I was curious about that.

Yeah, so some of them will just draw boxes, because, you know, after they've seen a hundred images they get lazy and go, oh, we'll just draw a box anywhere. We've seen someone draw a box around a human being and label it an insulator, so, wow. But we ask them to go pixel by pixel, so polygons, as tight as they can make them. And if not, then we have our own labelers too. But as long as we can get free work from them, as a business that's much better for us. Got it, thank you.

You mentioned that the New York utility wants to go fully automated and digital by 2030, and that that's an aggressive timeline. I'm curious what that looks like beyond this automated detection. What else do they need when they say that?

Yeah, so they have multiple visions, or plans. The first is to become fully digital, and they're aggressive about it. What's digital? Making all their processes digitized. They have a lot of paper-based processes right now, and one example is this.
So they'll go out in the field, write everything down on a notepad, bring it back, and build another project report, which is a paper trail. That's really inefficient, because let's say five years down the line they want to go back and see the inspection; you have to go through tons of paperwork. So that's just not efficient. Another big one is drawings at substations. They have to manually draw the circuit components and wires at every substation, and that takes a lot of time. So they want to digitize that too, and then use computer vision on it so it automatically draws everything. Kind of like OCR, optical character recognition, the way your paper documents are all digitized now. So those kinds of things. That's on the digital side.

And then they have aggressive goals for decarbonization as well. They want to electrify everything, so they're putting heavy emphasis on hydro. That's another goal. But on the digital side, machine learning and AI are a big portion of it, because they want to automate a lot of digital processes. So digital is mainly software-based, and then on top of that they want to infuse machine learning and AI into it. They have a lot of use cases for that. The prime use case is ours, which is power line inspection. They have substation inspections, which are very manual and very paper-heavy. Then another one: they have turbines underground that generate electricity, and whenever a turbine is damaged, it generates a lot of steam. So in the city of New York you'll see steam coming out of the manholes, and if it's a lot, it actually means something is going wrong under the road in the turbine. So early detection of that. We're building a system for that too: deploying on the city of New York's cameras, deploying our AI system to detect heavy steam exhaust and then sending out an alert accordingly. So that's another use case. Cool.
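The steam-alert use case just described could be sketched, at its simplest, as thresholding per-camera steam scores and alerting only on sustained heavy steam. Camera IDs, scores, thresholds, and the alert format here are all illustrative assumptions, not the deployed system:

```python
# Hypothetical sketch: each city camera feeds a stream of steam scores
# (from any detector, 0.0-1.0); sustained high scores trigger an alert.

def steam_alerts(readings, threshold=0.7, min_consecutive=3):
    """readings: {camera_id: [score, ...]} -> camera_ids with sustained heavy steam."""
    alerts = []
    for camera, scores in readings.items():
        run = 0
        for s in scores:
            run = run + 1 if s >= threshold else 0
            if run >= min_consecutive:   # heavy steam in 3 consecutive frames
                alerts.append(camera)
                break
    return alerts

readings = {
    "manhole_cam_14": [0.2, 0.8, 0.9, 0.85, 0.9],   # sustained heavy steam -> alert
    "manhole_cam_07": [0.9, 0.1, 0.8, 0.2, 0.75],   # intermittent -> no alert
}
```

Requiring several consecutive high readings (rather than any single spike) is one common way to avoid alerting on a passing burst of normal steam.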
And then the last question is, so, I don't know anything about this, but I imagine when you're selling to a utility, if they have to get software approved by the PUC or something, is it hard? Are they willing to pay? How does that work?

Yeah, I mean, utilities, in my opinion and a lot of people's opinion, are the hardest and toughest customers to work with. It takes forever to sell to them. Just to give you context, a sales cycle for utilities is 12 to 24 months, start to finish. It takes forever because they're just heavily regulated. They have to follow certain standards, they have to follow certain processes. And these are problems that impact society. This is not an app that someone uses where, if anything goes wrong, you can just roll it back and deploy it again. If anything goes wrong in the prediction, let's say we miss the prediction on an insulator and that insulator falls down on dry branches and causes a fire, then you're on the hook for that. These are heavy physics problems and heavy engineering problems. So they take their time running a pilot program, running demonstrations, making sure everything's validated, everything's strong and in place, before they go out and fully deploy it at scale. That's the effort they spend money and time on, and that's one of the reasons the sales cycles are long. So yes, it takes time, but it should take time, because these are impactful problems. Oh, thank you.

For the vegetation management problem, how often do you see utilities using LiDAR instead of cameras? And second, why do they still use helicopters?

Yeah, so a lot of utilities have started using LiDAR. LiDAR sensors have become really small, the size of this, and can fit on a drone easily.
We're also starting to see a lot of LiDAR data come to us, so that's our next progression: building models for LiDAR, taking the 2D world and making it 3D, because that's much more accurate, especially for vegetation. So that's definitely on our product roadmap and we're building toward it. A lot of utilities are using LiDAR, and that's the next step in the digital process. But also, from two-dimensional images, we have an algorithm that estimates depth in the image. It's not as accurate as LiDAR, but it can detect depth: let's say there's a pole here and there's vegetation behind it, it can tell that the vegetation is behind the pole and not overlapping it. Those kinds of things. So that's on the 2D imagery side.

And then in terms of helicopters, a lot of utilities are decommissioning them. New York Power Authority is one that decommissioned all its helicopters last year, because they had eight people die in a helicopter crash around 2019 and a huge line went down. So that was one of the reasons they had to decommission them. PG&E has had their fair share of helicopter crashes as well, which is a big problem. So that's why a lot of utilities are adopting a lot more drones. Most major utilities, the investor-owned utilities, have in-house drone programs now; they have drone pilots and a fleet of drones, starting from DJI and now Skydio and so on. And a big part of why it's getting pushed is that drones are cheaper now. You can buy a drone for $200 with a really good camera on it. So that's one of the reasons.

Yes, about the human-in-the-loop workflow, I was hoping you could clarify. So the drones and helicopters take similar batches of images. Do those AI programs actually get feedback from the human? I was confused about what the workflow was from AI to human to AI.

Yeah, it's kind of bi-directional.
So the first-pass filter is the AI. Think of it as a recommendation system. It gives a suggestion: hey, in these 10,000 images, there are 10 insulators that are damaged. Now the person can go in and just look at the 10 images where those insulators are damaged, review whether each one is okay or not okay, and that feedback is tracked from the human in the loop. Where the model is wrong, it gets penalized: that image goes back into the system and it's retrained on it. So it learns again from that image, from the feedback the person provided, and the next time it sees an image like that, it recognizes the fault accurately. So it's bi-directional: the AI feeds the human, the human feeds back to the AI, and then it feeds back to the human again. That's the cycle.

Just a quick question, and I'm not sure if this is at the level of your company, but are you also thinking about utilizing satellite images?

Yeah, we tried it. Satellite is an interesting one. The open-source satellite data, which is free, is not good; it's total BS. A lot of times there's cloud cover, and you cannot penetrate that unless you have a technology called SAR, which is expensive data; for SAR sensors you have to pay money. But it's also about resolution. We're even detecting pins called cotter pins. All these electrical components are held together by pins, right? If those pins are missing, the components will fall down. So we're even detecting whether those cotter pins are there or not, and they're about this big. Satellite imagery cannot detect that.

Kind of two questions. First, I was wondering how you're doing the validation. For example, your model finds all these things that are off or anomalous, and then somebody goes out into the field and sees what is actually out there.
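The review-and-retrain cycle described a moment ago could be sketched roughly like this. All image names, labels, scores, and thresholds are illustrative assumptions, not the company's pipeline:

```python
# Hypothetical sketch of the human-in-the-loop cycle: the model flags a handful
# of images, a reviewer accepts or rejects each flag, and rejected flags are
# queued for retraining (the "penalize and retrain" step described above).

def triage(predictions, threshold=0.8):
    """First pass: keep only high-confidence damage flags for human review."""
    return [p for p in predictions if p["score"] >= threshold]

def collect_feedback(flagged, reviewer):
    """Reviewer confirms or rejects each flag; rejected items go to the training queue."""
    retrain_queue = []
    for item in flagged:
        if not reviewer(item):           # human says the model was wrong
            retrain_queue.append(item)   # feed this image back into training
    return retrain_queue

preds = [
    {"image": "img1.jpg", "label": "damaged_insulator", "score": 0.95},
    {"image": "img2.jpg", "label": "damaged_insulator", "score": 0.40},
    {"image": "img3.jpg", "label": "damaged_insulator", "score": 0.91},
]
flagged = triage(preds)   # only 2 of 3 images are shown to the human
queue = collect_feedback(flagged, lambda item: item["image"] != "img3.jpg")
```

The point of the triage step is the one made in the answer: the human reviews 10 suggestions instead of 10,000 raw images, and each correction becomes new training signal.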
And you compare what you tagged versus the problems that actually exist, so you don't miss problems. And then secondly, maybe I have some confusion about the process you talked about: the human in the loop, and how linemen wanted to be involved because they were afraid of losing their jobs. But you also said your technology reduces the cost of a lineman from $250 an hour to $3 an hour. So when you're talking to a company, I can imagine the politics, and not wanting bad press around it; people don't want to lay off a bunch of their employees. I'm not saying pro or against anything, but it was something interesting I was thinking about: a lot of companies are automating tasks nowadays, and how do you sell it? At a higher level the pitch is "this will reduce your overhead," but the employees are told they're going to be included somehow. I was wondering about that.

Yeah, that's a great point. I'll start with the second question, because that holds much more weight. In terms of the stakeholders we're talking to, the CIOs and CXOs and VPs, if we bring them a solution and say, hey, we'll save you $50 million every year, they don't really care whether there are job cuts or not; at that level it's more about efficiency. But our tool is actually a tool for linemen. The real job of linemen is to go out and do repair and replacement. You see them in the cherry-picker trucks doing repairs, because that's what they're skilled for; they're electrical engineers, field technicians. But now, since there's so much data coming in, they're the ones sitting in front of a computer eight hours a day, five days a week, sifting through the images and making sense of them. So they don't really have time to go out in the field, and that causes a lot of delay.
So what we're trying to do is reprioritize their work toward what's supposed to be their actual work. We're taking away the staring-at-the-screen part: we let the software make sense of the images, and then have the linemen review the results and take action on them. So it's not like we're cutting the job. This was never their job to start with; it's something they have to do now, and it's causing delay. In terms of deployment and reprioritization of work, that's a big one, because linemen also want to be out in the field doing the job they're supposed to do. They're still getting paid the same, but instead of spending time on the screen work plus the field work, with all the delay that adds, they just do the job they're supposed to do.

If they're still getting paid that amount of money, how do you save the company so much money? I'd imagine there are linemen on the computer and linemen in the field, so you're cutting the linemen that are on the computer?

Yes, in the sense that from the start, they were not supposed to be on the computer; their purpose is in the field, doing maintenance. Right now the company is paying for the delay in maintenance, because the linemen have to manually go through images first. So a company like PG&E is spending $50 million every year on the computer work plus the maintenance. We're saying we take away that computer cost and make it more efficient, while the linemen actually do their job. So instead of $50 million, they're spending 25.

But to be clear, you are cutting linemen for the company, right?

No, no. We're saying they can do the job more efficiently; we're reprioritizing the work. Yeah, yeah.
So the linemen are still doing their job, which is out in the field. Imagine this: I'm a lineman, and 10,000 images come back to me. I spend, let's say, a week going through them, and then the next seven days going out in the field. So now I'm spending 14 days on a job that should have been seven days from the start; they should have just been out in the field doing maintenance. We're taking away that first seven days, which they don't really want to do anyway, and making it much more efficient. They still interact with the system, because this is a digital tool, but their jobs are not getting cut; they're just doing the actual job they're supposed to do.

So are they able to do more maintenance then? Have you done the analysis on what the throughput is?

Yeah, yeah. So there's a huge backlog right now. Whenever we approach a utility and look at their work order system, there's a backlog of six months, and we're reducing that backlog by half or even more. Because again, if linemen don't know where to go, they can't go. Now they know where to go and what to look at, so it reduces their backlog and basically makes their job easier. Does that answer the question?

I feel like we're still... I just wonder how you're able to cut costs so much if no one is getting cut. That was my main question.

Because we're saving time, and time has money associated with it. Previously they were doing a job in 14 days. So the savings come from the time saved? Yeah.

I have a question more on the business side. When you said that drones are cheap, I thought, yeah, drones are cheap, and with AI now, anybody can just rent a GPU on AWS and start building models.
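The cost argument from the lineman example above can be worked through as simple arithmetic, using the figures from the conversation (a 14-day job cut to 7 days, and the claimed $50M annual spend halved):

```python
# Worked arithmetic for the lineman example: cutting the manual image-review
# week drops the per-job time from 14 days to 7, and the annual spend scales
# with time. Figures are the ones quoted in the conversation, used illustratively.

days_manual_review = 7       # lineman sifting images at a screen
days_field_work = 7          # the actual repair work
total_before = days_manual_review + days_field_work    # 14 days per job

days_automated_review = 0    # software does the first pass instead
total_after = days_automated_review + days_field_work  # 7 days per job

annual_spend_before = 50_000_000
# if spend scales with time per job, halving the time halves the spend
annual_spend_after = annual_spend_before * total_after / total_before
```

This is the "time has money associated with it" point: the savings come from eliminating delay, not from eliminating the field workforce.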
So I just Googled what kind of competition you have, and there are many other initiatives; even Huawei is doing inspection of power lines now. So in the future, or right now, what kind of additional benefit or competitive advantage do you have? How do you plan to stay ahead? I'm curious.

Yeah, so our models have the highest accuracy right now compared to the rest of the players, and this has been validated by our utility clients. When we say accuracy, we mean precision. On average we're at 90%, and our closest competitor is at 35%. So we're way ahead. That's one thing. The other thing is we're using this technology for various other use cases, substations being a prime one. We're also now deploying it on gas pipeline inspections for methane leakage, which is a huge problem: early detection of methane leakage before it becomes a huge issue. That also relates to climate and carbon emissions, so we're trying to address that as well. So we're taking this and applying it to different use cases, and that's one of the things about our models: our AI is really flexible, it can be retrained and recalibrated really fast for different use cases, as long as they're in the energy infrastructure space. So that's another one. And the last one is we're solving a prioritization problem, workflow prioritization, which is one of the things we talked about: making linemen's operations more efficient, which a lot of our competitors are not solving. We're using machine learning for that as well. Okay, any other questions?

How resilient are the cameras to weather issues?

So these cameras are really powerful.
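A quick illustration of the metric named in the answer above, since "when we say accuracy, we mean precision": of everything the model flags as damaged, precision is the fraction that really is damaged. The counts below are made up to match the quoted percentages:

```python
# Precision = true positives / (true positives + false positives):
# of all damage flags the model raises, how many are correct.

def precision(true_positives, false_positives):
    return true_positives / (true_positives + false_positives)

# e.g. 90 correct flags out of 100 raised -> 0.90 (the figure quoted above);
# 35 correct out of 100 -> 0.35 (the competitor figure quoted above).
```

Precision is the natural headline metric here because every false flag sends a lineman to inspect a healthy component; recall (missed damage) matters too, but is not the figure quoted in the answer.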
The ones utilities are using now are high-zoom, high-resolution; we've even seen hundred-megapixel cameras that can zoom in on really small targets, especially the ones deployed on drones. In terms of weather, it's a big factor, so we've made our dataset varied accordingly. And that's another competitive advantage: we don't only have data from bright sunlight or sunny days. We've had data come in with snow in the background, with rain, with scenes where a storm had just happened and everything was destroyed in its path. We've had data at nighttime. So our AI is calibrated to deal with different environments accordingly. One of the things we have put in place is single-shot learning, which is learning from a small volume of objects: as you know, there are many more healthy components out in the field than damaged ones, so we have to balance that out. We developed certain algorithms to focus on the components and look away from the noise, which is the background environment. And now, since we have a good enough dataset, we're generating our own data, which is called synthetic data generation. We're using generative adversarial networks for that, which is another topic, but we're generating data from data. In that data we can simulate different conditions and different environments, and put much more noise into the data, so the models are much more reliable, much more robust in their functioning.

Any studies on different locations? Yes, as I said, we've had locations in the Midwest, data with snow in the background, data with rain, and all those kinds of things.
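On the imbalance point above (far more healthy components than damaged ones in field data), one standard countermeasure, offered here as an assumption rather than the company's actual method, is inverse-frequency class weighting in the training loss:

```python
# Hypothetical sketch: rarer classes get proportionally larger loss weights,
# so a 95%-healthy dataset doesn't teach the model to ignore damage.

from collections import Counter

def inverse_frequency_weights(labels):
    """Weight each class by total / (num_classes * class_count)."""
    counts = Counter(labels)
    total = len(labels)
    return {cls: total / (len(counts) * n) for cls, n in counts.items()}

labels = ["healthy"] * 95 + ["damaged"] * 5
weights = inverse_frequency_weights(labels)
# "damaged" examples end up weighted 19x more heavily than "healthy" ones
```

Synthetic data generation, as described in the answer, attacks the same imbalance from the data side instead of the loss side; in practice the two are often combined.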
Have you heard talks about ignition prediction for wildfires, like vegetation ignition prediction? I've seen some papers on predicting that, and I imagine you could do it with your data.

Yeah, yeah. That's something utilities have actually asked us about. It's more about looking at metrics; one of them is NDVI. So looking at the vegetation index and the moisture content in the vegetation, and then predicting, okay, when can we expect this vegetation to become combustible, basically. So that's one of the things we're looking at on the vegetation side. We're using thermal and infrared cameras on drones to look at vegetation, calculate the NDVI indices, and then correlate that to the actual dryness coefficient. So that's something we're looking at.

One more question? Yeah, one more question. Do you have access to non-image data to predict impending faults?

Yes. We're starting to get a lot of non-visual data as well. As you know, a lot of the problems happen outside the line, and it's correlated: whatever happens outside the line is related to what's happening inside, and vice versa. Let's say there's heavy load; say in Palo Alto, California, there are 100 EVs getting charged at the same time. That puts a lot of load on the transformer, it heats up, and it might blow up. So we're looking at those conditions as well: fluctuations in load, as more renewable assets and EV charging stations come online, and mapping and monitoring that too. Again, the grid is kind of the backbone of everything, and more uncertain things are getting added to it, and a lot of climate change effects are happening as well.
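NDVI, the vegetation index mentioned in the answer above, has a simple per-pixel formula: the normalized difference of the near-infrared and red bands. The reflectance values and interpretation thresholds below are illustrative only:

```python
# NDVI = (NIR - Red) / (NIR + Red), in [-1, 1].
# Healthy vegetation reflects strongly in near-infrared, so NDVI is high;
# dry or stressed vegetation trends toward lower values, which is what makes
# the index useful as an input to combustibility prediction.

def ndvi(nir, red):
    if nir + red == 0:
        return 0.0   # avoid dividing by zero on dark pixels
    return (nir - red) / (nir + red)

healthy = ndvi(nir=0.5, red=0.08)   # high: vigorous, moist vegetation
dry = ndvi(nir=0.3, red=0.2)        # low: drier vegetation, higher fire risk
```

In practice this would run over whole multispectral rasters rather than single pixels, and, as the answer notes, the index would then be correlated with a dryness coefficient rather than used alone.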
We're starting to collect more data on the non-visual side: line data, AMI data, load and voltage fluctuation data, environmental data like temperature and humidity, and then mapping that to the visual data itself and building predictive models. We're starting to do that since we're getting much more historical data. It will take some time, I'd say two to three years, to build accurate predictive models, but that's where we want to go: so we can say, okay, this asset is going to blow up, before it actually blows up. The endgame scenario is to simulate the environment: take the physical reality, bring it into the digital world, and then simulate different scenarios digitally. That's the solution.

I'm curious about the cameras mounted on the drones. At first I thought it was just RGB, but you can get NDVI with near-infrared, and I think you also mentioned thermal. So how many channels, or how does it work?

Yeah, our drone partners are using multi-channel, multi-spectral cameras. They're using RGB, thermal, and infrared. Companies like DJI with its Zenmuse line, and all these companies, have multi-spectral cameras now, and they're also using LiDAR at the same time. So not only multi-spectral, but also going into the LiDAR space. Each drone has all those capabilities. And these are massive, industrial-sized drones, about this big. I'm not allowed to touch them. Any more questions?

Okay, one last one if possible. You mentioned the appeal to a utility is that it saves money and also prevents bad things from happening that they can get in trouble for. But if a utility gets paid a rate of return based on how much they spend, they probably won't want to save costs, right? So how do they think about that? Do they care? How do you pitch it to them?

So the incentives... I mean, utilities are an interesting organization.
One of the things about utilities is that if something costs them too much, they'll just hike up the rates they charge us, and you won't even notice; that's how they make it up. But that's changing now. A lot of utilities are deploying and adopting emerging technologies that can actually make their processes more efficient, like ours, and there are a lot more companies helping utilities. What that does is this: utilities now have to meet their ESG goals, their carbon emission goals, and those kinds of things. So the money they save, they invest in innovative technologies. They have their innovation arms; PG&E has one, and so do all these major utilities, and the money they're saving they're putting into research and innovative technologies that can help them decarbonize as soon as possible.

What are the big carbon emissions from a utility? Natural gas generation. Oh, so generators are included in this transmission thing? I thought they were separate.

Yeah, so there's generation, where the electricity is produced; that's where your coal plants and natural gas are, and now renewables are in the mix too. Then once the electricity is produced, it goes down the line and you have to transmit it. That's where high-voltage transmission comes in: 230,000 to 250,000 volts going to a substation, which steps it down using transformers.

But are they the same company, the transmission line and the generator? Not necessarily. It can be. So I guess I'm just trying to understand: does a utility that only operates the transmission lines have a carbon footprint?

I mean, there's the electricity that we all need, and that electricity is produced at a generation facility, which could be owned by utilities. PG&E, for example, and the Power Authority produce their own electricity.
So they have a lot of hydropower generation facilities. They generate electricity, and a lot of the time their plants are also natural gas-based, and that has a footprint. And even if the utility doesn't own the generation facility in some cases and only owns the distribution, they're still indirectly involved: they're consumers and conduits of that electricity, and they have their own goals as well. Cool, thank you. Okay. Thank you very much. Congratulations. Cool.