Hey, welcome back everybody. Jeff Frick here with theCUBE. We just got out of our ride in the Phantom Auto Lincoln, which is not really an autonomous car. It's kind of an autonomous car, but the real autonomy was Ben driving back here from the office. So we wanted to catch up with Ben. He's also a software engineer. When he's not driving, he's fixing bugs and writing code, but really happy to have him. Ben Schuchman, he's a software engineer, but more importantly, the driver of the Phantom Auto Lincoln. So Ben, great job out there.

Thank you.

I'm really curious, what does it feel like? I mean, we can see the gear behind us. You're basically using high-end video-game Logitech gear in terms of the pure hardware, but what does it feel like driving this thing?

You know, driving remotely is actually very different from driving a car normally. And I know it might sound obvious, but there are a lot of things we take for granted driving a car that we don't have access to when driving remotely. For example, you don't actually feel the momentum shifts that are happening in the vehicle, so you don't know how hard you're braking, or you might have a different depth perception because the optics on the cameras warp the image in different ways. All these things add up to a completely different driving experience. So a big part of it is we have systems in place to feed information to the remote operator so that they have that sensory feedback, such as the momentum shifts inside of the vehicle. We give that visually to the operator through our system, and that allows them to create a smoother driving experience for our passengers.

So is that through some type of accelerometer that shows you've got so much momentum forward or backward? How do you do that visually?

Exactly, and that's just one of a large number of systems we have in place.

Right, right. So we're in the Lincoln. Why the Lincoln?
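The momentum feedback Ben describes, turning accelerometer readings into a visual cue for the remote operator, could be sketched roughly like this. Phantom Auto's actual pipeline is not public; the class, the 0.5 g scale, and the text-gauge rendering below are purely illustrative assumptions.

```python
# Minimal sketch: turn raw longitudinal acceleration into a visual
# momentum cue for a remote operator. Thresholds and rendering are
# illustrative assumptions, not Phantom Auto's real system.

from dataclasses import dataclass

@dataclass
class AccelSample:
    ax: float  # longitudinal acceleration, m/s^2 (positive = speeding up)

def momentum_bar(sample: AccelSample, width: int = 11) -> str:
    """Render a text gauge: '#' marks how hard the car is accelerating
    or braking, clamped to roughly +/- 0.5 g."""
    g = 9.81
    frac = max(-1.0, min(1.0, sample.ax / (0.5 * g)))  # normalize to [-1, 1]
    center = width // 2
    pos = center + round(frac * center)
    return "".join("#" if i == pos else "-" for i in range(width))

# Hard braking pushes the marker left; coasting sits in the middle.
print(momentum_bar(AccelSample(ax=-4.9)))  # full braking
print(momentum_bar(AccelSample(ax=0.0)))   # coasting
```

A real implementation would render this graphically in the operator console and smooth the raw IMU signal, but the idea is the same: the operator sees the momentum shift they cannot physically feel.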
Lincoln MKS, I believe, right?

MKZ, right.

MKZ, excuse me.

So we are driving a 2017 Lincoln MKZ, and the reason this vehicle is so good for autonomous vehicle development is that a lot of the driving functions, steering, gas, and brakes, are enabled through a system called drive-by-wire. That means an electronic signal goes over the CAN bus and initiates these movements in the vehicle electronically. And because it's an electronic signal, we can create an artificial electronic signal, inject it into the place in the vehicle where that information is processed, and artificially move the steering wheel, the brakes, or the gas that way.

He said an interesting thing before we went out on the drive. Most newer cars, a lot of them have adaptive cruise control, so it's got the sensors out front, the sensors in the back, it'll slow down. And also lane assist, but usually that just gives you a little shimmy in your hand if you're drifting. But you're saying you can use those systems to basically go turn to turn and have full control of the gas.

Absolutely. I've seen some open source projects and a lot of other companies aiming to create autonomous vehicles out of older vehicles that are already on the road. I think their idea is that if a vehicle already has cruise control and, let's say, lane assist, those things are actuators, drive-by-wire enabled components, inside of the vehicle. And they access those drive-by-wire enabled components to turn your car into an autonomous car. Theoretically, if you have a car with lane assist or cruise control, you might be able to turn that into an autonomous car.

Right, interesting. But then the piece you talked about that's so unique and not as universal yet is putting it into gear, whether that's a mechanical or an electronic process, and braking.
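The injection idea Ben describes, encoding a command as an electronic frame on the CAN bus, can be sketched with plain byte packing. The arbitration ID and payload layout below are made up for illustration; the real MKZ drive-by-wire interface uses its own vendor-specific message definitions that are not reproduced here.

```python
# Hedged sketch: drive-by-wire commands ride on the CAN bus as short
# binary frames. The ID and payload layout are hypothetical.

import struct

STEER_CMD_ID = 0x064  # hypothetical arbitration ID for a steering command

def encode_steer_cmd(angle_deg: float, enable: bool) -> bytes:
    """Pack a steering command into an 8-byte CAN payload:
    little-endian int16 angle in tenths of a degree (clamped to ~470
    degrees of hand-wheel lock), a flag byte, then padding."""
    tenths = max(-4700, min(4700, round(angle_deg * 10)))
    return struct.pack("<hB5x", tenths, 1 if enable else 0)

def decode_steer_cmd(payload: bytes) -> tuple[float, bool]:
    tenths, flag = struct.unpack("<hB5x", payload)
    return tenths / 10.0, bool(flag)

frame = encode_steer_cmd(-15.5, enable=True)
assert len(frame) == 8  # classic CAN payloads are at most 8 bytes
print(decode_steer_cmd(frame))  # (-15.5, True)
```

Sending such a frame onto a physical bus would go through a CAN interface library and adapter hardware; the point here is just that the signal is ordinary bytes, which is what makes artificial injection possible.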
You said, and I'm surprised to hear it, that in a lot of cars shifting is still a mechanical thing, not an electronic thing.

I've seen videos of OEMs trying to turn their cars into remotely controlled cars, and in order to do that, they have to install lots of linear actuators to, say, force the gear shifter into a different gear manually and mechanically. But we have an electronic system, and that allows us to control it very easily. However, we are trying to integrate with lots of different vehicles for our research and development, and because of that, on some of our new vehicle platforms, we're installing actuators just so we have, say, a better understanding of controlling SUVs as opposed to controlling a coupe.

Right, right. So that's a whole other kind of interesting feedback loop. Every car is different. We get used to our own car, but you get in a rental car, you get in a pickup truck, you get something you're not used to driving, and usually you're not smooth on the brake, you're not smooth on the gas. In those instances, how's it going to work for Phantom when you've got an operator, let's just say he's operating for UPS, just to pick somebody, and they've got four different types of trucks, six different types of trucks? How do you address the differences in the vehicle, the differences in the characteristics, and how does the operator manage that back here?

That's a really great question. Absolutely. Phantom Auto is trying to be a vehicle-agnostic system that will provide safety solutions for edge-case scenarios for many different autonomous vehicle companies. And so regardless of the kind of vehicle that a company might utilize, we have to be able to control that vehicle smoothly and safely.
Now what's really interesting is, say we have an operator operating a golf cart and then he switches over to a large truck. The handling of that vehicle is going to be completely different, and when he, say, slams on the accelerator, it's going to be a jarring difference, and the passengers are going to get really rattled, because their operator might be moving their vehicle as if he's never driven it before. So we want to reduce the variability between different kinds of vehicles, and to do that, we're building systems so that we can be agnostic in the input and the output between the operator and the vehicle. The same input that the operator puts into the console will be translated into a constant output regardless of the vehicle. So turning the steering wheel will be constant over a wide range of different kinds of vehicles.

So are you defining what is a smooth turn, what is a sharp turn, what is a loose turn, so that you can have a baseline definition of what's a good turn versus a crappy turn, and then be able to translate that to the big giant truck, where you need to go out further to whip around, versus a smaller vehicle, where you can be tighter to the curb on the right-hand side?

And maybe a big part of it in the future will be individual preference for every single operator, because if we have that translation system, we can adjust the parameters so the operator has as comfortable an operating experience as possible. Maybe they fine-tune the parameters to what they specifically like.

Right. So you've been doing this for a little while. You're like the primary test driver here. So as you've been doing it, and it's probably hard to answer because you're updating code all the time as you find bugs, do you ever kind of cross into feeling like you're in the car?
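The vehicle-agnostic translation Ben describes, where the same operator input yields the same road behavior regardless of vehicle, could look something like the sketch below. The profile fields and numbers are illustrative assumptions; real per-vehicle calibration would cover throttle, braking, and dynamics as well as steering.

```python
# Sketch: normalize the operator's wheel input, then translate it
# through a per-vehicle profile so the same input produces the same
# road-wheel angle on any vehicle. All values are assumptions.

from dataclasses import dataclass

@dataclass
class VehicleProfile:
    name: str
    max_road_wheel_deg: float  # steering lock at the tires
    steering_ratio: float      # hand-wheel degrees per road-wheel degree

def operator_to_handwheel(normalized: float, profile: VehicleProfile) -> float:
    """Map a normalized operator input (-1 full left .. 1 full right)
    to the hand-wheel angle this specific vehicle needs."""
    normalized = max(-1.0, min(1.0, normalized))
    road_wheel = normalized * profile.max_road_wheel_deg
    return road_wheel * profile.steering_ratio

golf_cart = VehicleProfile("golf cart", max_road_wheel_deg=35, steering_ratio=10)
truck = VehicleProfile("truck", max_road_wheel_deg=35, steering_ratio=24)

# The same half-turn of operator input commands the same 17.5-degree
# road-wheel angle on both, via different hand-wheel angles.
print(operator_to_handwheel(0.5, golf_cart))  # 175.0
print(operator_to_handwheel(0.5, truck))      # 420.0
```

Per-operator preference, as mentioned, would just be another layer of tunable parameters (e.g. input sensitivity curves) applied before this translation.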
I mean, do you get to a comfort level in terms of the mechanics and the feedback loops where it starts to feel like you're sitting in the vehicle?

As I'm developing the system, I'm testing it and seeing exactly the information that I need in order to create that safe and smooth driving experience. I'm looking at what's difficult for me as a remote operator, or what information I'm lacking that I think would help me drive the vehicle better, and then I go back and develop those things. And from working with, now, multiple operators, and seeing their feedback and the evolution of their abilities to drive this car, we have developed a process that ensures the training of the operator is up to par with our quality standards.

Right. Now you pulled a really tricky maneuver when we went through the gas station on our demo drive. We were coming out, we needed to cross two lanes of traffic to get into the dual left-hand turn lane, and we were waiting for stuff to get by to make our maneuver, and this stupid person turned right and kind of cut us off and cut off that lane. In a normal situation, I could look at that driver face to face, make eye contact: do you see me? Are you good? Okay, I'm going to make that pull. You made that maneuver. Obviously you could see that driver, but they could not see your eyes. So what other kind of clues are you looking at? How does the system work? Because that wasn't a dangerous maneuver, we weren't moving fast or anything, but it was a pretty tricky driving maneuver. Certainly a great example of kind of an edge case.

Absolutely. I think that really highlights exactly the edge cases that are giving the autonomous vehicle industry such headaches. In those kinds of situations, there are basically social cues that you're looking for in other drivers: say you are trying to inch your way into a lane and the other guy is kind of creeping forward also, to the point that you guys are playing chicken.
How does an autonomous vehicle process these kinds of social interactions? We like to call it a delicate dance, where every single person is kind of moving slowly, and it's really difficult to process because you also have to think about what the other driver is doing. In those kinds of situations, I think that is a really strong suit for us, because we have that human driver in the loop able to drive the vehicle through those situations.

But clearly it was not a wide-open, easy-to-ascertain situation, right? It was a little bit more sophisticated in terms of the inputs.

One of the fascinating things about edge-case scenarios is that we can only fathom so many of them, but in the future, when we have more autonomous vehicles driving around on the road, just the reality of having autonomous vehicles interacting with other autonomous vehicles might create new edge-case scenarios that don't exist yet. And because the current industry leaders can't even predict the future of edge-case scenarios, we have to have systems in place that will be able to solve any kind of problem. And a human's cognitive ability to process information on the fly, we think that's the hidden key to making autonomous vehicles a reality.

Yeah, that's fascinating. So are you documenting these edge cases, either by case or by kind of situation, to find out which are the most likely, which are the ones that you can write software against, versus those where there's just no software, you need a person?

You got it. When a customer calls on us and needs our help in an edge-case scenario, we want to collect as much data as possible, because that customer can really utilize that information. That is the elusive edge case that everyone's looking for, and we have the most data on those things.

Right.
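The edge-case cataloging Ben describes could be sketched as a simple event log with a frequency tally, so the most common scenarios surface as the first candidates for automation. The field names and scenario labels below are assumptions, not Phantom Auto's actual schema.

```python
# Sketch: log remote-operator interventions and tally scenarios so the
# most frequent edge cases can be identified. Schema is hypothetical.

from dataclasses import dataclass, field
from collections import Counter

@dataclass
class EdgeCaseEvent:
    scenario: str     # e.g. "blocked lane merge"
    location: str
    resolved_by: str  # "remote operator" or "onboard software"

@dataclass
class EdgeCaseLog:
    events: list = field(default_factory=list)

    def record(self, event: EdgeCaseEvent) -> None:
        self.events.append(event)

    def top_scenarios(self, n: int = 3):
        """Most common scenarios: prime candidates to write software against."""
        return Counter(e.scenario for e in self.events).most_common(n)

log = EdgeCaseLog()
log.record(EdgeCaseEvent("blocked lane merge", "Mountain View", "remote operator"))
log.record(EdgeCaseEvent("blocked lane merge", "Las Vegas", "remote operator"))
log.record(EdgeCaseEvent("hand-signal detour", "Las Vegas", "remote operator"))
print(log.top_scenarios(1))  # [('blocked lane merge', 2)]
```

In practice each event would carry sensor recordings and video, which is the data the customers described above would actually use.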
And then just the last thing to close: so you guys just finished up CES. You had a great show, and ironically, you were driving the cars from here in Mountain View down in rainy, nasty, flood-ridden Las Vegas.

Absolutely, yeah.

So how was that experience, being that much further away? I would imagine there's a little bit of a latency challenge because of just pure speed of light and distance, but was it different than when you're just doing your test drives around here?

It felt very similar, and that's a really interesting thing, because I was driving in places in Vegas that I've never been before. And so a big part of the driver's, the operator's, training is that they have to get acclimated with the environments, or the geofenced locations, that they might be working in, so that they can understand the lay of the land in those areas.

Yeah.

And we were driving in Vegas at more than a 500-mile distance, and I felt I had a really clear connection. I had really clear control over the vehicle, and I was able to move it very smoothly, I think, through the Vegas strip, and that was really something.

All right, well, Ben, I won't keep you any longer. You've got to write some more code; we ran into some more edge cases today. So thanks for sharing the ride and taking care of us, appreciate it.

Thank you.

All right, he's Ben, I'm Jeff, you're watching theCUBE. We're at Phantom Auto in Mountain View, California. Thanks for watching, we'll catch you next time.
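A quick back-of-the-envelope check on the speed-of-light point from the Vegas discussion: over roughly 500 miles, raw propagation delay is small compared to typical video-encoding and network latency. The fiber-speed figure below is a standard rough approximation; the actual route and network overhead are not known from this conversation.

```python
# Back-of-the-envelope: light-in-fiber propagation delay over the
# ~500-mile Mountain View to Las Vegas distance discussed above.

def propagation_delay_ms(miles: float, round_trip: bool = True) -> float:
    """One-way or round-trip propagation delay in milliseconds,
    assuming light in fiber travels at roughly 2/3 of c."""
    c_fiber_mps = 2.0e8                 # ~speed of light in fiber, m/s
    meters = miles * 1609.34            # miles to meters
    one_way_s = meters / c_fiber_mps
    return (2 * one_way_s if round_trip else one_way_s) * 1000

# ~8 ms round trip: distance alone adds little to end-to-end latency.
print(round(propagation_delay_ms(500), 2))
```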