Hey, welcome back everybody. Jeff Frick here with theCUBE. We're at the Innovation in Motion mapping and navigation event, put on by the AutoTech Council and hosted by Western Digital. It's a really interesting thing. As with most industries, once you get into it a little bit, the layers of complexity, the level of detail, and the things you've never even thought about are pretty outstanding, even just around navigation for autonomous vehicles. We're really excited to have our next guest, Jeff Wondery. He's a marketing manager at Velodyne, and you probably know Velodyne as the maker of the LiDAR, those spinning little jobbers on top of all the self-driving cars, and a really key component in getting autonomous vehicles to where they are today. So first off, Jeff, welcome.

Well, thank you, thank you for having me.

Yeah, so that was a really interesting panel you just got off of, talking about different ways to collect the "where am I" information for the vehicle: whether it's a map, where you pull position off the map and ask how accurate that is, versus a LiDAR system, which right now is really the premier system we see on all those cars, versus some of the newer approaches around optics. Interesting different ways to try to solve the same problem, probably none of them perfect, and all of them able to leverage some of the other technologies.

Right, and as we discussed earlier, there are currently three technologies in the running: laser, radar, and cameras. So part of the question ends up being, okay, where is the innovation occurring? Where are the opportunities for cost reduction? And what is the application? From that perspective, I think there's a lot still to happen within laser, and that's what's playing out in the marketplace right now.
And there was another conversation about not thinking of the map, and where you are on it, as a single static entity. Really there are multiple layers of granularity depending on what the application is that you, or the vehicle, is trying to execute at that moment, so you start to think of it as a complex, multi-layered representation of space, as opposed to just "I'm here."

Right. A common way I frame the typical questions I get is that there's a static component, which is the buildings, the road, the lane markings. Those typically don't change very often, or they're slow to change. But there are dynamic things as well, and those can range from an animal crossing the road, to a person, to a pothole, to weather conditions. Those are all dynamic things that still have to be accounted for, and they change your driving habits. I'm from Minnesota, so if you get a lot of snow, you slow down, and you slow down immediately. How you drive changes based on what's going on around you, and it's the same thing with night driving.

Right, and the other thing that keeps coming up over and over again is Moore's law, not literally but contextually: as innovation moves forward and more of these sensors get built, you get much more production scale, which drives all kinds of efficiencies, and the price comes down. You were asked an interesting question in the panel, that this means we're going to have not one LiDAR but four, and you said it really is going to open up an opportunity to rethink how these devices are deployed and how they work within an integrated system.

Yeah, and one thing I was also trying to talk about is that there's the idealized case, when we get to full autonomy and everyone is driving autonomous vehicles.
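The layered-map idea discussed above, a slow-changing static layer plus short-lived dynamic observations, can be sketched in code. This is purely illustrative; the class and field names are hypothetical and not drawn from Velodyne or the panel:

```python
from dataclasses import dataclass, field

@dataclass
class MapLayers:
    # Static layer: buildings, roads, lane markings; rarely updated.
    static: dict = field(default_factory=dict)
    # Dynamic layer: transient (timestamp, kind, position) observations.
    dynamic: list = field(default_factory=list)

    def add_observation(self, t, kind, pos):
        self.dynamic.append((t, kind, pos))

    def current_hazards(self, now, horizon=5.0):
        # Dynamic entries older than `horizon` seconds are dropped rather
        # than baked into the map: a deer seen a minute ago is no longer
        # a constraint, while the lane markings persist in `static`.
        return [o for o in self.dynamic if now - o[0] <= horizon]

m = MapLayers(static={"lane_width_m": 3.7})
m.add_observation(100.0, "pothole", (12.0, 3.1))
m.add_observation(90.0, "deer", (40.0, -2.0))
print(m.current_hazards(now=103.0))  # only the recent pothole survives
```

The design point is simply that the two layers have different lifetimes: the static layer is authored once and revised slowly, while the dynamic layer expires on its own.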
But as we go from zero to a hundred percent, a lot of things are going to change. There are some common things happening within laser: the general trend on the innovation side is to move to solid state, eliminate moving parts, and drive the cost down. As those products get released, you see more adoption occur, so as you go from zero to 10 to 20 percent, those things will catch up and move along in tandem.

Right. I'm curious about your interactions with some of the larger car manufacturers, in terms of the rate of adoption and interest in autonomy. If you can share any stories: is it going faster than you expected? Slower? Are there hurdles you thought were there that everyone just blew right past, and surprises that popped up that you never thought about? I don't think people see how fast this train is coming.

Right, so there are a couple of things that are general knowledge at this point. All the automobile manufacturers have said, yes, we want to go to autonomous vehicles, with targets ranging from 2019 to 2021 for having at least something out there in the short term. That doesn't mean all vehicles will be autonomous in 2021, but we're going to start to see that trend. So there's a lot of investment, a lot of money, a lot of smart people working in that area. It also means other applications are becoming autonomous as well. Those have been a slower trend, but a lot of it is being driven by what's going on in automotive because of the scale, and to your point about Moore's law, once you have that volume, it drives a lot more applications.

Right, right.
And then there's the speed of networking, with 5G coming down the road, and really more two-way communication of data between the devices: not just sending data, but also receiving data back in some aggregated way to retrain the algorithms and make them smarter. So I'm curious if you can share how the data relationship, in something that I presume, maybe wrongly, was really a one-way flow of sensor data into a system, now starts to open up, where you can actually push data back into the sensors and the systems to adapt their behavior and make them better, faster, or cheaper.

I think there's a common push and pull that occurs here. The general thing is that there are requirements people are asking for, either longer range or less cost on the sensor itself, because the main function of a sensor is independent of the technology behind it.

Right.

At the same time, people do want to communicate with the sensor, so there are elements of that they also want to grow, and there are sensor-specific diagnostics they want to be able to read out.

Right.

But generally, it's about what the sensor can provide, and typically that's more data coming out, which then pushes back on the control side: the storage, and how the data is being processed as well.

Right, right. And have you seen much growth in other applications for the data coming off the LiDAR?

The main thing I try to point out is that usually you're mapping an area first, and then you're navigating later on. You can't really do an autonomous vehicle until you've mapped the area first. It doesn't make a whole lot of sense to just start navigating willy-nilly; it's dangerous.

Right, right, because the system isn't quite that fast on its own, depending on how fast you're going down the road.
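The "map first, navigate later" point can be illustrated with a toy localization sketch: the vehicle scores candidate positions against a prior landmark map built on an earlier mapping pass. Everything here is a one-dimensional, hypothetical simplification; real systems run full scan matching (e.g. ICP) over LiDAR point clouds:

```python
def score(pose, scan, landmark_map, tol=0.5):
    # Count scan points that land within `tol` meters of a known landmark
    # once shifted by the candidate pose (a 1-D x offset, for brevity).
    hits = 0
    for px in scan:
        world_x = px + pose
        if any(abs(world_x - lm) <= tol for lm in landmark_map):
            hits += 1
    return hits

# Prior map built on an earlier pass: landmark x-positions in world frame.
landmark_map = [0.0, 10.0, 20.0]

# Live scan: landmark positions measured relative to the vehicle,
# which is actually sitting near x = 5.
scan = [-5.1, 4.9, 15.2]

# Localize by picking the candidate pose that best explains the scan.
best = max([0.0, 5.0, 10.0], key=lambda p: score(p, scan, landmark_map))
print(best)  # → 5.0
```

Without the prior `landmark_map` there is nothing to score against, which is the interviewee's point: navigation only becomes well-posed after the mapping pass.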
Which begs the question, because that was another one of the debates that came up on one of the other panels: is it primarily map-driven, where you've got a baseline of data as you progress down a road and you know where you are relative to, as you said, the stationary things, or is it primarily intelligence derived from the vehicle, based on its own sensors, telling it where it's at and really handling the dynamic stuff? And obviously you need both, right, to be successful.

Right, and since it's a chicken-or-egg type of environment, the two do loop together, and they make a nice loop. Over time, that loop will get bigger and bigger and more robust.

Right, right, super. So, exciting times for you guys. Anything special coming up at Velodyne that we should be aware of?

I think the main thing is that we continue to invest and we continue to develop new products. That's fun for us, and watching what those applications look like, and being a part of that, is fun as well.

Well, you guys are in a great position, so congratulations to you and the team.

Yeah, thank you.

All right, he's Jeff Wondery, I'm Jeff Frick, you're watching theCUBE. We are at the Innovation in Motion mapping and navigation event. Thanks for watching.