I like to drive, but there are also times when I'm tired and don't want to drive, and sometimes it's a chore. Sometimes you like to go out to dinner and have a drink, and you want to be safe on the road. So having a safe technology that can provide mobility is a real key to us moving forward as a society.

Hi, this is your host Swapnil Bhartiya, and welcome to Let's Talk. Today we have with us Ryan Smith, Senior Vice President of Solutions at Oxa. Ryan, it's great to have you on the show.

It's great to be here. Thanks for having me today. I'm really excited about this.

It's my pleasure to host you here today, and this is the first time you and I are talking. So I would love to know a bit about the company. How old is the company? What do you folks do? What specific problem are you solving, and in which industry?

Oxa is a software development company based in Oxford, UK, with offices in both Canada and the US. We're about nine years old as of the end of September, and we're a spin-out from the Robotics Institute at Oxford University. Our mission is to change the way the Earth moves by providing self-driving software and services that unlock the power of autonomy for industry in any vehicle, in any environment, whether that be multi-passenger mobility on-road with shuttles, operations monitoring in solar farms, or logistics handling in yard movement.

Which industries are you looking at? Some of these use cases are very, very industrial, and most of us don't even hear about them.

That's a really good point, and I'm glad you bring it up, because there's a lot of focus on robotaxis when you talk about autonomous vehicles, and that's something we've shifted away from with our business model and our approach.
We really focus on our primary product, which is universal autonomy. It applies to any vehicle, in any domain, in any kind of weather. It's a really flexible architecture that we've built, called the Compose architecture, and it allows us to be agnostic to the hardware, from the vehicle through to the sensors. We're able to put this on any kind of vehicle to suit a customer's need, customizing Compose specifically for that need. So you'll see our software powering vehicles that work in solar farms with zero occupancy, doing monitoring for hotspots and albedo monitoring. You'll see our vehicles in airports like Heathrow doing logistics and baggage handling. And you'll see us doing multi-passenger movement with shuttles with our new partner Beep, and we'll be doing that on-road in multiple places around the US.

When we look at autonomous vehicles, don't companies build their own whole platform? A good example could be Tesla, where they build the software stack and the hardware stack. How is your approach different from the other platforms that we are aware of?

We're different in that we're not a complete vertical. We're a B2B company, and we're horizontal across multiple verticals. We enable the customer not to be beholden to specific hardware or a specific set of solutions that can solve one problem but may have deficiencies in other areas. We enable the customer to provide autonomy across mixed fleets. Let's take the example of Beep. They may have different types of shuttles, from six-to-eight-passenger up to twelve-to-sixteen-passenger. Those are fundamentally different vehicles, but our software can run on both of them and make both of them self-driving. So they don't have to be beholden to a certain software package for a certain vehicle.
We can enable that mixed fleet and give them the flexibility to work with one single provider across all of their vehicles.

When we look at an autonomous vehicle, I want to understand the whole software stack. Where do you folks come into the picture?

Universal autonomy, from Oxa's point of view, is a software platform, and it's very much like Android: it provides the OS layer of the whole software stack. You can build different applications on top of it that are customized per domain, but fundamentally, at the core, we're the OS that operates on the hardware. And we're agnostic to the hardware, so Android is not a bad comparison. It's a really nice one, because Android is flexible in its hardware and allows other users to build applications on top of it, and we provide open APIs to all of our software so that you can integrate very easily. We showed this with some fleet management systems with our partner Ocado: we were able to adapt their fleet management system to our software, so they can use their fleet management to control their vehicles operating our software in their environments.

When we look at Oxa's platform, or at autonomous vehicles in general, a lot is going on there. These are not remote-controlled vehicles; they are making decisions as they operate. And as you gave some examples, these are different terrains as well. The form factor is totally different in an airport, on a tarmac where you're getting luggage. A lot of sensory data is coming in from cameras, lasers, all kinds of sensors, and then there's some kind of AI making decisions. So what is the scope of Oxa's technologies when we look at full-fledged autonomous vehicles?

That's a great question.
Oxa Driver, which is our fundamental platform that goes onto the vehicle and holds all of the AI and the technology to operate self-driving vehicles, is built on a multi-sensor-modality platform. Really, we're interested in a couple of questions here. Fundamentally, it's "where am I?", which is the localization question, and "what's around me?", which is the perception question. We cover the full stack from perception to tracking to prediction to planning and control, as well as localization. And we do this, like I said, through multimodal sensors. We use cameras, we use lasers, we use radar, and where available we will use GPS, but fundamentally we built the autonomy system to be infrastructure-free. We don't rely on GPS, and we don't rely on any external infrastructure being there to support our vehicles. They are fully independent when they operate. And we chose those three modes of sensing technology to be complementary to each other. The use of radar in both perception and localization gives us superhuman powers in dusty environments, in snow, and in heavy rain, letting us see through some of those permeable obscurants that a camera or a laser might get stuck on. Cameras and lasers, on the other hand, work really well during the daytime and are fundamental for the rich information they provide in an HD-mapping or HD-view scenario, as you highlighted. So I think what we bring to the table as a differentiator, through the multimodal sensing, the complementary aspects of the sensors we've chosen, and the algorithms we use to process the data coming in, is an extra level of safety and an extra level of perception and localization added to the vehicles operating on the road.
Can you also talk about whether there are industries where these autonomous vehicles are past the early evaluation phase, where they are not only fully in production but where operations rely heavily on them?

One that comes to mind right away is mining. That usually happens in very remote locations, in dangerous, dusty, and dull operations where you're performing the same loop over and over again. We've seen applications of autonomy there for probably the last 20 years, driving 350-ton ultra-class haul trucks down to the bottom of open-pit mines and back up, moving material and giving us the ore and the elements that we need to run the world.

What industries do you see as almost the next step for mass adoption of these autonomous vehicles?

Right now I really see the mass adoption coming in multi-person transport on road, through shuttling, reducing congestion within urban areas and cities, and moving people around in that form.

What kind of AI technologies are you leveraging? And what are your thoughts on generative AI, especially in the autonomous vehicle space?

That's a really great question. I think AI, generative AI, and machine learning are really important to help us understand, make complex decisions, and do semantic reasoning while operating on road in complex environments, especially with self-driving. But you have to use them very carefully. What we do is a fundamentally physics-based approach, and we sprinkle the AI and machine learning in where they really help and give us the most bang for the buck. We're not an end-to-end ML framework; we use it only in certain areas, fusing early and fusing late in our system, to get the benefits where it adds the most value. When you come to generative AI, there's a point here that I'm really excited about in our development: a product we have called MetaDriver.
It's basically a simulator on steroids. It allows us to generate scenes and scenarios and expand our data so we can explore all of the edge cases and scenarios that may come up but that you're just not going to see in normal driving. Even if you drive a billion miles, you're just not going to see these. We're able to generate many, many different scenarios to test our autonomy in the most challenging situations, to make sure that we're providing the safest and most secure autonomy system to operate on the road.

Now, let's talk about your launch in the US and your partnership with Beep. You mentioned that briefly earlier, but I want to understand the scope of it. What does it mean for Oxa, as well as for the US market?

This is a super exciting partnership. We're about to launch passenger shuttles with Beep in Florida. These are autonomous electric vehicles, multi-passenger shuttles, driven by Oxa, and it's just a really exciting thing. We've got some testing going on right now at the SunTrax facility in Auburndale, Florida, and we expect to see these vehicles on the road very soon.

Of course, there is a lot of discussion about autonomous cars and self-driving cars, especially in the consumer space. It's still like, hey, they're not fully self-driving yet; you still have a driver in the seat. What kind of future do you see for autonomous vehicles in general public life? I'm not talking about warehouses or some industrial setup. What kind of progress are you seeing? Are you pragmatic about it, or do you think it has limited use cases there?

I've never met a city planner or a municipality that said, please give me more cars. We want to reduce congestion. We want to eliminate the single-vehicle, single-driver scenario. We want to empower mobility. And at Oxa, we want to change the way the world moves. That's really our mission statement.
And so what I really see building up here is multi-passenger mobility as the game changer for autonomy going forward. You will definitely have those things on the side, the single car, the robotaxi, that are emerging and on the edge. But solving the drive-anywhere, with-any-vehicle, at-any-time problem is really challenging, and that's a long way off. What we are really good at right now is enabling the drive-somewhere problem: driving a bus route or a shuttle route in a specific urban area, providing mobility to people who need it in a safe and reliable fashion.

One last question, more of a visionary, philosophical question than a technological one: why autonomous vehicles at all? We are good at driving, and our brains can multitask, so why should we even care? What value does it bring to our lives?

I think there's a safety aspect that weighs really heavily here. I like to drive, but there are also times when I'm tired and don't want to drive, and sometimes it's a chore. Sometimes you like to go out to dinner and have a drink, and you want to be safe on the road. So having a safe technology that can provide mobility is a real key to us moving forward as a society.

Yeah. And, I mean, as you said, I also like driving. I have a sim-racing setup in my workshop where I spend a lot of time. But if you're traveling eight or nine hours, or commuting for hours every day, we could save those hours. And the whole family can sit in the back and enjoy the ride while the car drives itself. We should not be wasting our valuable lives just driving from one place to another. When we walk, we look at the scenery and enjoy it, but when we drive we are stressed on the highway, just looking at the road. So safety is there, and of course, better use of your time. Imagine how many hours we spend.
So I also see that as the future, but we need the more realistic and pragmatic technology that you folks are building. So thanks not only for building those technologies, but also for sharing that story. I would love to talk with you folks again, because I do see that a lot of work is going on there, and we will be talking about that here as well. So Ryan, really, thank you.

Thanks for your time today, too. I really appreciate this. This was a great interview, and I really look forward to speaking with you again and sharing the successes we're going to have with Beep on the road very soon.