Hey, hello, everybody. Welcome back to the ThinkTech Hawaii studios for another episode of Security Matters. Today we're going to try a little deep dive for you with, I think we called it, New Compute for a New Era. We have Sean Holliday with us today. He's the Senior Director of Business Development for Vision at Blaze. Blaze has built a new architecture for AI computing, and we're going to get into the why, then we're going to get into the what, and we'll work our way through this. Sean, I really appreciate you taking time out to join me today. I know Gary and your team are always working hard, so taking some time out to share with the world is a good thing. I appreciate it.

I really appreciate the time, Andrew, and thank you for inviting me on the show.

No worries. Well, let our audience get introduced to you a little bit. Maybe take them through your background, as much as you'd care to share, and then we'll get going.

That sounds good. So again, my name is Sean Holliday, and I've been with Blaze a little over two years. Blaze is a fast-growing startup; I'd actually argue we're not a startup anymore, as we're moving into full commercial production with our second-generation products. I come from a long history in IoT, wearables, and mobile devices, the majority of my career at Intel Corporation. So I'm excited to be here and talk about the market opportunity for AI specifically, and how our technology and products fit into that space.

Okay, we are going to get there. First, I noticed something on your LinkedIn profile, and I've got to bring this up: it was like AI, IoT, and bicycles. So tell us a little bit about that, and then we'll get into Blaze.

Yeah, well, obviously number one is AI, because that's just the buzzword of the market, plus it's an area I'm deeply excited about with Blaze, obviously. IoT is where I've spent most of my career; at Intel, I helped build up some of the original IoT strategy back in the 2010 to 2012 timeframe. And then bicycles.
I'm all about bicycles. I occasionally race, but I also founded and run a nonprofit that provides free bicycles to low-income and homeless people in various regions around the world.

Man, I love that. I love that story. I love how our tech community gives back; it's one of the hallmarks of our industry that I really appreciate. So thank you for that. All right, so let's talk about AI. A lot of people, I think, overuse the word, personally. Give us what your introduction to AI was, and then where it is today from your perspective. I'm one of these "it's still machine learning" sort of guys, right, and a lot of ML gets, I think, recharacterized as AI, rather than as just one form of AI. But anyway, talk us through what it means from your perspective, and where it's at today.

Yeah, I sure will. Well, number one, obviously, especially if you're in an invested technology community, AI is a definite buzzword today, and rightfully so. If you look at some of the analysts out there, and everyone has their numbers, by 2025 in our area it's a $150 billion business opportunity. That's the reason we're investing where we are and why we've developed our architecture: to be a key player in this space. But if you look at the applications of AI, it's going to touch every business around the globe, whether you're farming and doing agriculture, building gaming machines, or using it in medical or finance. The application of machine learning and deep learning to reduce operational costs, increase efficiency, and increase customer satisfaction is all going to be driven by AI technologies and the capabilities they offer.

So we're talking about using data to maybe drive behavior, data to drive industrial decision making, data to speed up an assembly line, or data to slow it down, maybe.
You know, without human intervention. When does that data truly become intelligence, in your perspective?

It really depends. If you look at the early days of AI, and a lot of the deployments today, everything's done in the cloud. Amazon AWS, Google, and Microsoft have all done a phenomenal job of building up that infrastructure, but you're shipping all of your data to the cloud, making a decision, and then sending it back. The major transition we're seeing in AI today is moving those decisions to the point of data acquisition. So in reference to IoT, that's the device that is collecting data, whether it's images, video, or vibration data off of an industrial machine, trying to detect maybe a bearing going out so you can do preventive maintenance and not shut down a production line, for example. That's really what's driving a lot of the AI. And the big decision these days is: where do you put that compute? Do you put it in the camera, around the camera, in the networking devices, in the servers, or in the cloud? And the answer is yes to all of those, really depending on what you're trying to accomplish, the timeframes involved, and the business models involved.

Do you think, from the architectural perspective, we'll be able to dynamically reallocate that intelligence as it grows? I'll give you an example of my question. Let's say I'm using just a camera to look for someone in a room of 10 people, and let's just say the camera's got enough power on board to do that. But now there are 1,000 people in that same room. Wow, so I've got to push that query, or that workload, to, let's say, a server in the back room. And then let's say, all of a sudden, it was just the maintenance crew, then it was the football field, now it's the whole stadium, you know, 70,000 people, and I'm trying to find this thing, and I've got to push that to the cloud.
Is there anything in the architecture today that lets us do that dynamically and say, hey, I'm overloaded, I need more compute power, let's shift this workload, or something like that? Is that happening, or is it possible, or am I crazy?

It's happening already, and as I mentioned earlier, there are really four key areas where you can put your technology, from the point of data acquisition all the way back to the cloud, and all four of those key categories can be working simultaneously. I'll give you a great example. We're working with customers that are deploying our technology in an industrial setting. They're tracking the movement and position of people around machines. They're tracking the movement and position of those machines, making sure that unsafe environments aren't created by people putting their hands where they shouldn't be. At the same time, they're tracking the widgets coming off that machine, maybe for quality. These things evolve over time. There are situations where you need to look at something later, so you may ship that data back to the cloud and make a decision overnight, versus the things that need to happen immediately, like somebody's hand going in the way of the machine, where you need to make an immediate decision. So those types of models are happening simultaneously. And you can take that exact same architecture, vary the applications, and maybe look at load balancing, as you mentioned, in the evening in a smart city application. This is another application where we're working with customers: technology would be deployed on the street corner, tracking automobiles, reading license plates, tracking people.
Well, during rush hour you probably need more compute power; you need to add more processing to detect all the various objects. At three o'clock in the morning, you probably don't need to burn up your local power. But also playing into these decisions of going to the cloud, or maybe just distribution, is: what is your connectivity? Are you over 5G? Do you have the bandwidth? Do you have an operational model where it's very costly to constantly upload that data? That also plays into the decision. So the answer to that is yes, but it really comes down to the business model, and the timing of when decisions need to be made.

And you brought up bandwidth. Is it still a constraint today? When we say data, that's a broad term, right? So when we're collecting, you know, 10,000 points a second, or whatever it may be, is that a true issue? Is that why it's maybe better to do things at the edge, because there's just not time to take all this data back, process it, and send it back down for action?

Well, again, it really depends on the application. Let's say, again, in an industrial setting where you've got functional safety, you have to hit that magic number of making a decision in under 100 milliseconds. If you look at your network topology, you're not going to send something up to the cloud, run it through a model, and send it back down within that timeframe. So there, it's very important to make that decision at the point of data acquisition, because it's critical; it's functional safety. You can apply that same methodology to, say, mobility or autonomous vehicles: autonomous delivery machines need to make everything at the point of data acquisition. You're not going to drive a car with the cloud, for example.
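The tiering logic described here, where the decision deadline and the link budget, not preference, pick the compute tier, can be sketched in a few lines. Everything below is illustrative: the tier names, latency numbers, and function are assumptions made for the sketch, not Blaze's implementation.

```python
# Hypothetical sketch of the "where does the compute go?" decision.
# Typical one-way response times (ms) for each compute tier; invented numbers.
TIER_LATENCY_MS = {
    "device": 5,      # inference on the camera/sensor itself
    "edge": 20,       # a server in the back room
    "cloud": 150,     # round trip to a cloud region
}

def place_workload(deadline_ms: float, uplink_ok: bool) -> str:
    """Pick the farthest tier that can still answer within the deadline.

    Farther tiers are tried first, since they usually offer more compute
    for less local power; functional safety falls through to the device.
    """
    for tier in ("cloud", "edge", "device"):
        if tier == "cloud" and not uplink_ok:
            continue  # no bandwidth or cost budget to ship data upstream
        if TIER_LATENCY_MS[tier] <= deadline_ms:
            return tier
    return "device"  # decide at the point of data acquisition

# A functional-safety interlock (under 100 ms) can never ride on the cloud:
print(place_workload(deadline_ms=100, uplink_ok=True))     # prints: edge
# Overnight quality analytics can:
print(place_workload(deadline_ms=60_000, uplink_ok=True))  # prints: cloud
```

With these sketch numbers, the sub-100 ms safety decision lands at the edge or on the device, while the overnight workload rides to the cloud, mirroring the simultaneous models described above.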
I don't think we want to try that at this point in time. So I guess, to take it to the physical security world: this device, let's just say a regular BMS door switch, actually senses that the door opened or closed immediately and sends that signal downstream. So with IoT and AI compute, what we're talking about is everything happening out there that quickly, and then the decision that has been programmed, or a series of decisions or outputs that are required, comes right out of that device itself and goes to the receiving actor, or whatever it may be.

Okay, yeah, you just gave a great example; let me play upon this door example. One of the number one use cases, especially in retail and business in the world of COVID today: a lot of businesses have existing building access systems, but they only want people coming into the building who are conforming to mask requirements, if that's required in your region. So they need the ability, at the snap of a finger, to read your credentials but also scan you: are you wearing a mask, yes or no? If you're wearing a mask, the door opens; if you aren't, it gives you an indication: put a mask on, and we'll happily let you into our building. So those types of things are being combined. The AI piece would be the mask detection; standard IoT would be the open-the-door, close-the-door type of decision.

Yeah, so a little bit of a combination of the old world and the new world. You know, I've seen Tanmay Bakshi speak, I'm sure you know of him, the whiz kid; he's really doing a lot of healthcare-driven stuff, and he's given some classes on how easy it is to stand up and code for AI. And let me ask you a question: how far along are we? Are we 1% of the way? Have we just scratched the surface? Do we really know what we're capable of with artificial intelligence?
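The door example can be made concrete as a tiny decision function: the legacy access system still validates the badge (the plain IoT piece), while the AI model's mask verdict gates the door too. This sketch is hypothetical; the badge list, function names, and the 0.8 confidence threshold are illustrative assumptions, not any vendor's access-control API.

```python
# Hedged sketch: combine classic badge access with an AI mask verdict.
AUTHORIZED_BADGES = {"B-1001", "B-1002"}  # stand-in for the badge backend

def door_decision(badge_id: str, mask_confidence: float,
                  threshold: float = 0.8) -> str:
    """Return 'open' only when the badge is valid AND a mask is detected.

    mask_confidence stands in for the output of a mask-detection model
    scoring the camera frame at the door.
    """
    if badge_id not in AUTHORIZED_BADGES:
        return "deny"                    # classic access control says no
    if mask_confidence < threshold:
        return "please put on a mask"    # AI layer says no, with guidance
    return "open"
```

A valid badge with a high mask score opens the door; a valid badge with a low score gets the "put a mask on" indication instead of a flat denial, matching the flow described above.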
I think we're just scratching the surface. AI technology has been around for decades, but we're just now getting to the point, via next-generation technology and tool sets being available, of making it more pervasive and easy for people to program. For example, at Blaze we have standard SDK development tools, so you can program in C/C++ and use TensorFlow, PyTorch, and Caffe types of AI models. But there's a whole other segment of the market that's going to explode, and these are business owners who maybe don't have data scientists on board but need to use AI. We've developed software called AI Studio that allows you, pretty much with click and drag via a GUI interface, to create a complete AI application without writing a single line of code. A big part of this industry is: how do we take the complexity that takes data scientists and transition it in such a way that the average guy like myself (I can write Hello World; that's about the extent of my programming experience) can develop an AI application and put it into a real-world scenario?

And is the thinking that the people who do the job are the ones who understand what they need it to do? Is that why we want to give them that sort of power? Is that open source, by the way? Is the tooling available, or is it something that you sell?

It's part of our overall software environment that we offer with our hardware products. Our business model is selling devices. We've developed a core architecture that has gone into silicon; that silicon goes onto various types of board products that plug into servers, or that stand alone for embedded IoT applications. This software enables you to build those types of models in what we call a codeless environment for our products.

Wow. Yeah, I've seen that there's a lot of code sort of out there.
It's amazing to me the variety of types of code I see listed that can be written for AI. Is that specific to the type of data people are gathering? Or are they gathering from some environment where using a certain type of code just makes it easier to interact with, like a legacy system that has the data they're trying to grab or give it to?

Well, at the application level you may have to build your applications to conform to specific interfaces, depending on the environment you're in, and that will tell you specifically what tools to use. But for the AI development model it's pretty typical: your TensorFlow, PyTorch, or Caffe in terms of traditional AI models. Then you may go into an OpenVX or OpenCL or a C or C++ environment, and a lot of applications are being written in Python today, for example. These are all capabilities that our software supports, which gives you a lot of flexibility going forward. The other big part of AI is the types of models you use. Detection models, classification models, and recognition models are all different, and some of those may be optimized in a specific language or environment. Which is to say: hey, I'm going to use this model because there's a lot of work I can gather from GitHub and others for TensorFlow-based models, or for PyTorch or Caffe models, for example.

I got you. I see. Okay. Yeah, interesting. So for all my parents out there: get your kids studying these languages, because it's going to be what they need to know. Sean, we'll be right back. We're going to take a break, we'll pay some bills, and we'll be back in about one minute with Sean Holliday. Stick around.

Hey, aloha. Welcome back to Security Matters. We're with Sean Holliday today of Blaze, and we are getting ready to dive a little bit deeper. We've been talking about AI, talking about the why of AI, talking about the what of AI.
Sean, you were just mentioning a little bit about the different types of workflows that are perhaps found if you want to build an application, maybe with facial recognition: there's a lot of data available, and it's in a certain type of programming language already, so that makes your adoption easier, or easier to get to. Let's go down under the hood just a little bit. With AI, specifically, Blaze, and I think others, have been trying to leverage hardware along with the software to deliver these capabilities that we all see but are not capable of building ourselves. Talk to us a little bit about what was lacking when the founders of Blaze, I think maybe ten years ago or so, saw where we'd be today with something like what Blaze has to offer.

Okay, great question, and this really goes to the heart of why we're doing what we're doing with our architecture. If you look back, all of us around our age, so to say, who went through university in technology were brought up on x86, and we eventually learned GPU architectures and so forth. These are phenomenal general-purpose architectures; obviously they fueled the PC revolution, servers, the build-out of the internet, and everything else. They're multi-purpose, running fairly generic types of workloads. But when you get into machine learning and deep learning, you get into running very specific types of math that isn't necessarily tailored to run efficiently on a general-purpose CPU or GPU. They definitely can run it; obviously NVIDIA has done an amazing job building out and applying their products toward AI, but the efficiency isn't there. There's a power penalty that you're going to pay.
There's also a price penalty you're going to pay. What really drove our architecture was looking at the various ways we could apply it, especially in the automotive area. If you look at companies that are invested in this, like Daimler for example, their biggest challenge going forward in next-generation autonomous and EV vehicles is: how do I maintain the performance I need for AI, for detecting what's going on around my car and doing drive-path planning, without having 3,000 watts of compute in the trunk of my car? So it's all about compute efficiency: what is my performance per watt, per dollar? And that's where our architecture comes in, where we're seeing anywhere from 10x up to 60x performance enhancement over some of these more traditional architectures, like an Intel x86 or an NVIDIA GPU architecture.

And that energy expenditure, that heat load: we in Hawaii pay like 43 cents a kilowatt-hour, so for data center work here, adding a server to a space is going to create heat, and then you've got to pay to cool it; there's a lot of cost there. Is that an issue with AI compute in data centers as well? Do they want to bring that cost down for power and cooling?

Well, definitely. Look at the big players, like an AWS or Google or Microsoft; they're paying tens of millions of dollars a day or a week in just electricity costs. Wow. So they're constantly looking at next-generation technologies that will lower the power, because it's going to save their bottom line. And so whether it's a very low-cost, low-performance IoT device sitting out on the edge, or a high-performance server sitting in the cloud, power is always a concern. Looking at next-generation architectures that can do the same workload or more at a much lower power budget is a key driver in many of our discussions with customers today.

That's awesome.
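The "performance per watt, per dollar" metric Sean mentions is simple arithmetic, and a quick sketch shows why it can reshuffle rankings between a big general-purpose part and a small efficient one. The TOPS, wattage, and price figures below are invented purely for illustration; they are not Blaze, Intel, or NVIDIA specifications.

```python
# "Performance per watt per dollar" as plain arithmetic; all numbers invented.
def perf_per_watt_per_dollar(tops: float, watts: float, dollars: float) -> float:
    """Throughput (TOPS) normalized by both power draw and unit price."""
    return tops / (watts * dollars)

# A hypothetical big GPU card vs. a hypothetical small edge accelerator:
gpu = perf_per_watt_per_dollar(tops=100, watts=300, dollars=1500)
edge_asic = perf_per_watt_per_dollar(tops=16, watts=7, dollars=300)

# The raw-TOPS loser wins decisively once power and price are in the metric:
print(round(edge_asic / gpu, 1))  # prints: 34.3
```

The point of the sketch is the shape of the comparison, not the specific multiple: a device with a sixth of the raw throughput can come out far ahead once every watt has to be bought twice, once to run it and once to cool it.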
Is it simply because the particular workload is specific? My only reference would be, like, my Windows computer can do all kinds of different things, but of course there's a higher cost, or it's slower, whatever it may be. If it only did one thing, for example a certain type of workload, it could probably be cheaper and run faster. Is that kind of the idea?

Yes, but, you know, our architecture, what we call the graph streaming processing architecture, is still 100% programmable. It's just done in such a way that our internal pipelining of data, our utilization of memory, and our utilization of the math units make it much more efficient at running that same workload that may run on an x86 or an NVIDIA GPU, for example. So again, that's how we focus. And this gets into another area in the market: fixed-function devices that maybe do two or three things really well, but if you need to do that fourth thing, they fall flat on performance or power, for example. So if you look at what we put together, we focused on how we can provide the best performance while maintaining the most flexibility in a product. We bring those two worlds together, which gives our customers the ability to support many different types of networks, as well as the flexibility to upgrade and make changes in the future.

It sounds like you're right at the very heart of what this next evolution is going to be, there in the hardware layer. Where are you seeing the earliest adoption, or I'll say the best acceleration, or both? Who's calling you up and beating down your door every day saying, I need those chips now?
Well, you know, it's a good problem to have, and I like to say we have a problem because there are so many applications. As I mentioned when we first started out, AI is going to touch every type of business around the globe. And we're seeing big uptake: industrial is really moving forward. But let me go back and preface this with vision processing. If we look at what the analysts say for 2025, about 70 to 75% of the total compute in the network is going to be vision processing, whether that's fixed images or real-time video streams, because you're trying to detect things, you're trying to recognize things, you're trying to classify things so you can make a decision. So the transition to doing more visual-based compute is going to be immense, and that's really where AI steps in today. So where are we seeing the applications? Mobility is one area. Automotive is all over AI technology in terms of moving to next-generation ADAS, or advanced driver assistance systems; the likes of Tesla, for example, in terms of autonomous or almost-autonomous driving, is a big application. We're seeing retail: the next generation of retail centers, again especially in the world of COVID-19 and the pandemic, is about how you utilize the technology to ensure a safe shopping environment. Do people have masks on? Are customers doing social distancing? Are there unhealthy situations happening within our retail center that we could change, based on what AI is detecting, or shoplifting, or placement of products, all those types of things? Smart city is another big area. Again, we see municipalities trying to do more with less, utilizing the cameras they already have around their city streets to make intersections safer, or to turn tolling on and off very easily at different times of the day, just because you can do license plate recognition on multiple lanes simultaneously using AI technology.
These are just some of the areas where we're seeing application. Agriculture is another one. Imagine a big John Deere tractor going down the field with multiple cameras, looking at each individual plant. Based off of the model it's trained on, it'll say: does this plant need fertilizer A, B, C, or D, or insecticide 1, 2, 3, or 4? And it only applies what that particular plant needs, all based off the AI technology. So a really cool usage model, for sure.

Okay, I love the agricultural one; yield is a question the whole world's facing. I saw one yesterday: a young man had coded it up, and he was running it on his phone, but he could tell when a guy was getting ready to steal a base in baseball. And he said it was pretty accurate. So I thought that was a pretty interesting application, maybe for the gamblers in Vegas, you never know.

Well, I think the gaming industry probably embraces AI because it can help them, but at the same time, AI in the hands of maybe the person on the other side of the machine gives them some sort of advantage, right?

It's so interesting. In security, particularly in my industry, this applied industry: what part do we have to play? We're deployment guys; we're fairly applied, right? The technology is built for us; we just go out, we install it, we configure it. What's the role for our industry, do you see? Are we going to just be the delivery guys for the technology, or do we have a greater role to play?

Again, I'm coming from a background of many years in IoT, and security was always a big aspect of it. Especially when you have all these devices hanging off of networks all over the place, how do you secure that device? How do you know that somebody hasn't broken into that device and is utilizing it in an improper way? So the security industry is going to play a huge part in continuing to do that: securing the AI devices so people can't modify the models.
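One minimal way to detect that a model has been modified is to pin a cryptographic fingerprint of the trained weights at deployment time and re-check it before each load. This is a generic standard-library sketch, not Blaze's security mechanism; a real deployment would sign the digest with a key rather than merely compare it, and the byte strings here are stand-ins for a weights file.

```python
# Hedged sketch: catch a "very slight modification" to model weights by
# comparing a SHA-256 digest pinned when the model shipped.
import hashlib

def fingerprint(weights: bytes) -> str:
    """SHA-256 digest of the serialized weights blob."""
    return hashlib.sha256(weights).hexdigest()

deployed = b"\x00\x12\x34\x56"   # stand-in for the shipped weights file
pinned = fingerprint(deployed)   # recorded at deployment time

tampered = b"\x00\x12\x35\x56"   # one flipped byte, one flipped decision

assert fingerprint(deployed) == pinned   # intact model passes the check
assert fingerprint(tampered) != pinned   # silent edit is caught before load
```

Because any single-bit change produces a completely different digest, even the subtle weight edit described below, the dog that gets classified as a cat, fails the check before the model ever runs.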
So imagine that in AI: when you train a model, a big portion of it is creating what we call weights, and those weights help you determine, is this a dog, is it a cat, is it a bird, just to use a simple example. What if somebody hacked into a system and made a very slight modification that changed that decision? That could wreak havoc on an AI network, especially in a safety scenario. So the good work that the security field is doing, the blacklisting and whitelisting technologies, the encryption technologies that protect the AI applications running on hardware, is going to remain the same, but as AI proliferates around the globe, it's going to become even more important for security to be an integral part of this. And I think, just as we saw security become part of media when media went digital, integrated via conditional access and other technologies, we're going to see the same requirement for AI: how do you secure that model? How do you know that the image you're looking at is coming from the device you should be monitoring, and not something inserted by a hacker who fooled the device? Those types of things are going to be very, very important.

Yeah, I absolutely love that. I've definitely done some reading on messing with the input to AI or ML models, and when you can corrupt that input, you can obviously create strange outcomes. So, we have under a minute or so left. Closing thoughts, a challenge for the industry, or whatever you'd like to share?

Well, I think, you know, if you're not involved in AI today, look at how it's going to impact your business. If you're a business owner watching this podcast, start talking with your colleagues, start talking with consultants, and learn how AI can benefit you from your OpEx and your CapEx standpoint.
If you're a developer, if you're in the software space, you have to get involved in AI today: learn the coding, learn the various frameworks, learn how it can be integrated into new technology. And of course, if you're looking for an AI solution, we highly recommend you take a look at Blaze. Our architecture is novel, it's new, it's a different way of doing things, and it shows very strong promise in meeting those new performance and price-point metrics that we're starting to see.

This is awesome. Sean, thank you so much for joining us for that super-informative episode. Check out Blaze if you're in the game, people. Take care, everybody out there.