Hey, welcome back, everybody. Jeff Frick here with theCUBE. We're in our Palo Alto studios for a CUBE Conversation. We're talking about startups today, which we don't often get to do, but it's really one of the more exciting things that we get to do, because that's what keeps Silicon Valley, Silicon Valley. And this next new company is playing in a very hot space, which is edge. Everybody's all about cloud; the next big move is edge, especially with the Internet of Things and the Industrial Internet of Things. And so we're really happy to welcome Edgeworks here, fresh off the announcement of the new company and their funding. We've got both founders. We have Farah Papayu Anu, and she is the president, and Kilton Hopkins, the CEO, both of Edgeworks. Welcome. Thank you. Thanks for having us. So for those of us that aren't familiar, give us kind of the quick 101 on Edgeworks. So I've been looking at the space. I was a venture capitalist before I joined up with Kilton, and I've been looking at edge computing for a long time, because it just made intuitive sense to me. You're looking at all these devices that are not just devices, but compute platforms that are generating all this data. Well, how are we going to address all that data if you think about sending it all back to the cloud: latency, bandwidth, and cost? You talk about breaking the internet? This is what's going to break the internet, not Kim Kardashian's photo, right? So how do you solve that problem? If you think about autonomous vehicles, for example, these are now computers on wheels. They're not just a transportation mechanism. If they're generating all this data and they need to interact with each other and make decisions in near real time, how are they going to do that if they have to send all that data back to the cloud? So that's where I came across Kilton's company, or actually the technology that he built, and we formed the company together.
I looked at everything, and the technology that he developed was leaps and bounds beyond anything anyone else had come up with to date. So Kilton, how did you start on that project? Yeah, so this actually goes way back, to about 2010. Back in Chicago, I was looking at what architecture is going to allow us to do the types of processing that's really expensive and do it closer to where the data is. So architecture was in the back of my mind. When I came to the Bay Area, I jumped in with the city of San Francisco as an IoT advisor, and everywhere I looked, I saw the same problems. Nobody was doing secure processing at the edge in any kind of way that was manageable. So I started to solve it. Then years later, after I did some deployments myself and after seeing how the stuff was working, I finally arrived at an architecture where I thought, okay, this thing's passing all of these trials, and now I think we've got this pretty well nailed. So I basically got into it before the terms fog and edge computing were being thrown around and said, this is what has to happen. And then of course it turns out that the world catches up, and now of course there are terms for it and everyone's talking about the edge. So it's an interesting problem, right? It's the same old problem that we've been having forever, which is: do you move the data to the compute, or do you move the compute to the data? And then we've had these other things happening, with suddenly this huge swell of data flow, and that's even before we even add the IoT contribution to the data flow. Luckily, networks are getting faster, with 5G around the corner.
Chips are getting faster and cheaper, and memory's getting cheaper and faster, and then we had the development of the cloud and really the hyper-growth of the public cloud. But that still doesn't help you with these low-latency applications that you have to execute on the edge. And obviously we talk to GE a lot, and everyone loves to talk about turbines and harsh conditions and nasty weather. It's not this pristine data center. How do you put compute, and how much compute do you put, at the edge, and how do you manage that data flow? What can you deal with there? What do you have to send up? And then of course, you know, there's a pesky thing called physics, and latency, which just prohibits, as you said, the ability to get stuff up to some compute and get it back in time to do something about it. So what is the approach you guys have taken? What's a little bit different about what you've built with Edgeworks? Sure. So in most cases, people think about the edge as if it must lead into the cloud. They say, how can I pre-process the data? Maybe curtail some of the bandwidth volume that I need in order to send data up to the cloud. But that doesn't actually solve the problem. You'll never get rid of cloud latency if you're just sending smaller packages. And in addition, you have done nothing to address the security issues of the edge if you're just trying to package data, maybe reduce it a bit, and send it to the cloud. So what's different about us is that with us, you can use the cloud, but you don't have to. We're completely at the edge. So you can run software with Edgeworks that stays within the four walls of a factory if you so choose, and no data will ever leave the building. And that is a stark difference from the approaches that have been taken to date, which have been tied to the cloud with a "but we do a little at the edge" attitude. It's like, come on, this is real edge. Right, right.
And so is it a software layer that sits on top of whatever kind of BIOS and firmware are on a lot of these dumb sensors? Is that kind of the idea? Yeah, exactly. It sits above the BIOS level, and it sits above the firmware. It creates an application runtime. So it allows developers to write applications that are containerized. We run containers at the edge, which allows our developers to run applications they've already developed for the cloud and to write new applications, but they don't have to learn an entirely new framework or an entirely new SDK. They can write using tools that they already know: Java, C#, C++, Python. If you can write that language, we can run it at the edge, which again allows people to use skill sets that they already know. They don't have to learn specialized skill sets for the edge. Why should they have to do that, you know? Right. And good for you guys to get Stacey Higginbotham to write a nice article about the company long before you launched, which is good. But I thought she had a really interesting breakdown of edge computing, and she broke it down into four layers. The device and the sensors, as you said, as dumb as they can be, right? You want a lot of these things. Then this gateway layer that collects the data, you know, some level of compute close to the edge, not necessarily in the camera or in any of these sensors, but close. And then of course, a connection back to the cloud. So do you guys run in the sensor, or probably more likely in that gateway layer? Or, with some of the early customers you're talking to, are they putting in these little micro data centers? I mean, how are you actually seeing the stuff deployed in the field at scale? So we actually gave Stacey that four-layer chart, because we were trying to explain it to people who didn't understand what was there. And again, people refer to all these different layers at the edge.
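To make the "write ordinary code, run it at the edge" idea concrete, here is a minimal sketch of the kind of containerized edge application being described. All names here are hypothetical, not Edgeworks APIs: it's plain Python of the sort a cloud developer already writes, processing sensor readings locally and emitting only a tiny summary.

```python
# Hypothetical sketch: an ordinary Python app of the kind that could be
# containerized and pushed to an edge node. No special edge SDK is assumed;
# the point is that the same cloud skill set applies at the edge.
import json
import statistics

def process_window(readings, threshold=75.0):
    """Summarize a window of temperature readings and flag anomalies locally."""
    mean = statistics.mean(readings)
    alert = mean > threshold
    # Only this small JSON summary would ever need to leave the device.
    return json.dumps({"mean_temp": round(mean, 2), "alert": alert})

if __name__ == "__main__":
    window = [71.2, 73.5, 78.9, 80.1]
    print(process_window(window))
```

Packaged in a container, a script like this runs where the data is generated; the raw readings never traverse the network.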
We actually think that the layer right above the sensors is the most difficult to solve for. And the reason why we don't want to run at the sensor level is because sensors are becoming more and more commoditized. A customer would rather have a thousand dumb sensors, where they could get more and more data, than have, like, 10 really, really smart sensors where they could run compute on them. So unless there are special circumstances, like in the case of a camera, where we're actually working with a camera that has GPU capability so it can actually run at the edge, we like to run a level up from there. And there's a couple of reasons for that. One is that if you run on the devices themselves, you can't really aggregate across devices. A temperature sensor cannot aggregate a pressure sensor's data. You need to sit at the layer above. Also, we're able to serve as a broker between low-level protocols like Wi-Fi and Bluetooth and higher-level TCP/IP, right? Which you also cannot do at the sensor level. If you were to run at the sensor, you basically have to do what Amazon does, which is device-to-cloud, which doesn't really afford you the capability of running real software at the edge. Right. So when you're out there, let's just take the camera; we talked a little bit before we turned the cameras on about surveillance and surveillance cameras. I mean, where are those gateways, and where's the power and the connectivity to that gateway? What do you see in some of these early examples? So for cameras, you've got basically two choices. Either the camera is a dumb camera that puts a video feed to some kind of a compute box that's nearby, or it's on a wired network or a wireless network that's private to it. So for in-building cameras that are already in place, that are analog, you can put a box in the building that can take the feeds. But the better option even than that is to have smart cameras.
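The aggregation argument above — that a temperature sensor can't fuse a pressure sensor's data, so the fusion has to happen a layer up — can be sketched in a few lines. This is a hypothetical illustration, not Edgeworks code: readings arriving over different transports are normalized into one schema at the gateway layer.

```python
# Hypothetical sketch of gateway-layer aggregation: readings from unlike
# sensors (temperature over BLE, pressure over Wi-Fi) are normalized into one
# snapshot -- something no individual sensor can do for its neighbors.

def normalize(raw):
    """Map a transport-specific payload into a common record schema."""
    if raw["transport"] == "ble" and raw["kind"] == "temperature":
        # e.g. BLE payload is tenths of a degree Celsius
        return {"metric": "temp_c", "value": raw["payload"] / 10.0}
    if raw["transport"] == "wifi" and raw["kind"] == "pressure":
        return {"metric": "pressure_kpa", "value": raw["payload"]}
    raise ValueError("unknown sensor payload")

def aggregate(raw_readings):
    """Fuse heterogeneous readings into one snapshot keyed by metric."""
    snapshot = {}
    for raw in raw_readings:
        rec = normalize(raw)
        snapshot[rec["metric"]] = rec["value"]
    return snapshot

readings = [
    {"id": "t1", "transport": "ble", "kind": "temperature", "payload": 215},
    {"id": "p1", "transport": "wifi", "kind": "pressure", "payload": 101.3},
]
print(aggregate(readings))  # one fused record, built at the gateway
```

The same spot in the stack is where the protocol brokering happens: the gateway speaks BLE and Wi-Fi downward and TCP/IP upward.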
So a brand-new greenfield deploy would probably have smart cameras that have the ability to do the AI processing right there in the module. So the answer is: somewhere you have a feed of sensor data, whether it be video, audio, or just, like, temperature time-series data, and then it hits a point where you're still on the edge but you can do compute. Sometimes they're in the same unit, sometimes they're a little spread out, sometimes they're over wireless. That first layer up is where we sit, no matter how the compute is done. Okay. And I'm just curious, on some of the early use cases, how do people see the opportunity now to have software-driven IoT devices, separate from the actual firmware in the sensor? I mean, what is that going to enable them to do that they're excited about, that they couldn't do before? Yeah, so if you think about the older model: how can I make this device get its sensor readings and somehow communicate that data? And I'm going to write low-level code, probably C code or whatever, to operate that, and how often do I poll the sensor? You're really thinking about just, geez, I just need to get this data somewhere to make it usable. And when you use us, you think, okay, I have streams of data. What would I do if I wanted to run software right where the data is? I can increase my sampling frequency. I can do everything we were going to do in the cloud, but do it right there, for free once it's deployed. There's no bandwidth cost. So it opens up the thinking: we're now running software at the edge instead of running firmware just to move the data upstream. You stop moving the data and you start moving the applications, and that's the world-changer for everybody. Plus, you can use the same skill sets you have for the cloud. And up until now, programming IoT devices has been a matter of saying, oh, if I know how to work with the GPIO pins and I can write C, maybe I can make it work.
Now you say: I know Python, and I know how to do data analytics with Python. I can just move that into the sensor module, if it's smart enough, or into the gateway right there. And I can pretty much push my code into the factory instead of waiting for the factory to wire the data to me. We actually have a customer right now that's doing real-time surveillance at the edge. They have smart-city deployments, and they're looking at border control as an example. What they want to be able to do is put these cameras out there and say, well, I've detected something on the maritime border here. Is it a whale? Is it debris? Or is it a boat full of refugees, or a boat full of pirates, or a boat full of migrants? Before, what they would have to do is say, okay, as an edge device, maybe the basic level of processing I can run is to compress that video data and send some of it back, right? And then do the analysis back there. Well, that's not really going to be that helpful, because if I have to send it back to some cloud and do some analysis, by the time I recognize what's out there, it's too late. What we can do now with our software capability, because we have our platform running on these cameras, is deploy software that says, okay, I can detect right there, right at the edge, what we're seeing. And I don't have to send back video data, which I don't really want to do anyway; that's really heavy on bandwidth, latency, and cost. I can just send back text data and say, well, I've actually detected something. So let's take some sort of action on it: say the next camera should be able to pick it up, or send a notification that we need to address it back here. If I'm sending textual data back, because I've already done that processing right there, then I can run thousands of cameras out there at the edge, versus just 10 or 12, because of the amount of cost and latency.
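The camera story above boils down to one trade: run the model next to the lens and ship a few dozen bytes of text instead of a few hundred kilobytes of video. A hypothetical sketch, with a stand-in classifier in place of a real on-camera model (none of these names come from Edgeworks):

```python
# Hypothetical sketch of the maritime-camera use case: classify the frame at
# the edge, then send only a small text event upstream -- never the raw video.
import json

def classify(frame):
    """Stand-in for on-device inference; a real camera would run a model here."""
    return {"label": "boat", "confidence": 0.93}

def to_event(camera_id, frame):
    """Turn a raw frame into a tiny text event for the upstream network."""
    det = classify(frame)
    return json.dumps({"camera": camera_id, **det})

frame = bytes(640 * 480)  # pretend raw grayscale frame, ~300 KB
event = to_event("maritime-07", frame)
print(f"{event} ({len(event)} bytes vs {len(frame)} bytes of video)")
```

That size ratio is the whole economics of the "thousands of cameras versus 10 or 12" claim: the backhaul carries events, not footage.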
And then the customer can decide, well, you know what? I want to add another application that does target tracking of certain individual terrorists. Okay, well, it's easy for me to deploy that software, because our platform's already running; I just push it out there to the edge. Oh, you know what? I'm able to train the model at the edge, and I can actually do better detection; I can go from 80% to 90%. Well, I can just push that out and do an upgrade right there at the edge, as opposed to going out there and flashing the board, or sending out some sort of firmware upgrade. So it allows a lot of flexibility that we couldn't have before. Right. Well, I was going to ask you, now you've got a pile of money, which is exciting, and congratulations. Thank you. I was going to ask kind of where you're going to focus your go-to-market, you know, within any particular vertical or any specific horizontal application. But it sounds like we've used cameras now three or four times in the last three or four questions, so I'm guessing that's a good kind of early-adopter market for you guys. That one's been a strong one for us, yes. We've had some real success with telcos. Another use case where we've seen some real traction is being able to detect quality-of-service issues on Wi-Fi routers. So that's one that we're looking at as well. That's had some adoption. Oil and gas has been pretty strong for us as well. So it seems to be quite a horizontal play for us, and we're excited about the opportunity. All right. Well, thanks for coming on and telling the story, and congratulations on your funding and launching the company and bringing it to reality. Great. All right, thanks. He's Kilton, she's Farah, I'm Jeff. You're watching theCUBE. Thanks for watching. We'll see you next time.