on the topic. Very excited about this stuff. So please go ahead, Arjo. So thank you, Mark. Hello, everyone. I'm Arjo from Open Robotics, and today I'm going to be talking about maritime robot simulation, focusing on some challenges, perspectives, and some new features we've added to our simulation platform, Gazebo. A bit more about us: we're a company based out of Silicon Valley. We were originally the Open Source Robotics Foundation, and we're now a privately held company with 50 employees, a bunch of engineers spread across three continents. We have founders of and key contributors to some of the world's most widely used robot software, including ROS, Gazebo, and now Open-RMF. So that's roughly our product range. I'm introducing the company because we'll have, I think, two other speakers from it in these sessions, and Catherine from our company will be discussing open ethics later on. So without further ado, let me dive into the agenda. As I said, I work on Gazebo, which is our simulation platform, so today I'm going to introduce Gazebo; many of you who work in robotics probably already have some idea of it. Then we'll discuss what makes maritime robotics different, since this talk is about maritime robotics, and some recent developments in Gazebo Sim that support it. Finally, we'll go through a case study where we actually used Gazebo to simulate a real-life robot built by the Monterey Bay Aquarium Research Institute: a long-range underwater vehicle used for science sampling missions in the Pacific Ocean. A bit about Gazebo: Gazebo started out as a project at the University of Southern California, and eventually it was adopted by Willow Garage for integration with ROS.
Then Open Robotics took over the development of Gazebo in 2012. Around 2019, we felt it was about time Gazebo had a rewrite. It started as a refactor but eventually turned into quite a different monster, and so we created Ignition Gazebo, which we're now in the middle of rebranding back to Gazebo Sim. In the top left-hand corner here is a screenshot of the old Gazebo, and the new Gazebo is in the bottom right-hand corner, with a lot of new features including swappable physics engines, swappable rendering engines, and much more extensibility. With that aside, let me talk a bit about maritime robotics first. Maritime robotics refers to underwater and surface vehicles. My talk will be mostly focused on underwater, but a lot of what works underwater also translates to the surface. A few things make it different. The physics is different, right? You have things like ocean currents. You have to account for buoyancy, which is what hydrostatics is about; hydrodynamics, which governs how the vehicle moves; and environmental effects, as I mentioned, ocean currents, wind, waves, and things like lift and drag. Lift is something aerial robotics cares about, but you don't really see it much in ground robotics. All of this makes the physics quite different from what you would need to simulate for a ground-based robot. Apart from that, we have issues with communications in the maritime world. If you're underwater, the water blocks radio waves, so you can't just use Wi-Fi or traditional 4G networks underwater; you have to use other systems. Often we resort to acoustics, optical communication, or tethered communication: acoustics using sound, optical using light, and tethered using a wire, which obviously has some issues with range and how the vehicle can move.
The other challenge is localization underwater. No radio means no GPS. You also won't have very good static features: as in the picture on the right-hand side, underwater you've got plants and animals, which don't make very good static features, and sometimes the water quality is very, very poor. So we have to resort to other techniques. Some of these include Doppler velocity logs, which we'll talk about more in a moment, and acoustic transponders. As I mentioned before, acoustics is a huge part of maritime simulation, so I'm going to keep saying acoustics over and over again throughout this talk. Then there's perception: the sensors we use underwater are also different. You've got imaging sonars, where the image you get is something like a medical ultrasound, a fuzzy image; the photo you see on the top right here is an imaging sonar image. You've got Doppler velocity logs, which is this device down here; this is actually used for positioning. It emits a sound wave and waits for an echo, and when it receives the echo, it measures the amount of Doppler shift, the frequency shift, which tells you how quickly you are moving. Together with depth information and inertial measurement units, that can help you localize. Traditional sensors like cameras and LiDARs also exist, but cameras often have very poor visibility underwater with few features, and underwater LiDARs, while they exist, are very, very expensive. I mean, imaging sonars are themselves expensive, but underwater LiDARs are even more expensive than that. So those are some of the perception challenges. Environmentally, unlike traditional robot simulators, where we're looking at mostly 2D or 2.5D terrain, we're looking at very three-dimensional, dynamic terrain.
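As a back-of-envelope illustration of the Doppler principle a DVL relies on (this is not Gazebo code; the 600 kHz carrier and the single-beam simplification are my own assumptions), the frequency shift of the seabed echo is roughly 2·v·f0/c for along-beam speed v, with the factor of 2 accounting for the out-and-back trip:

```python
# Illustrative single-beam Doppler relation; real DVLs use four angled
# beams and considerably more signal processing.

SPEED_OF_SOUND = 1500.0  # m/s, a typical value for seawater

def doppler_shift(velocity, f0, c=SPEED_OF_SOUND):
    """Frequency shift (Hz) of the seabed echo for a transducer moving
    at `velocity` m/s along the beam (factor 2: out and back)."""
    return 2.0 * velocity * f0 / c

def velocity_from_shift(delta_f, f0, c=SPEED_OF_SOUND):
    """Invert the relation: recover along-beam velocity from the shift."""
    return c * delta_f / (2.0 * f0)

f0 = 600e3                           # assumed 600 kHz carrier
df = doppler_shift(1.5, f0)          # vehicle moving at 1.5 m/s
print(df)                            # 1200.0 Hz
print(velocity_from_shift(df, f0))   # recovers 1.5 m/s
```

The shift is tiny relative to the carrier, which is part of why DVL processing is hard in practice.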
So here you can see some screenshots from our software where we're visualizing data. This is science data provided by the Monterey Bay Aquarium Research Institute: things like temperature and salinity. Why do we care about these? Well, this customer that I'm going to talk about later does a lot of research into environmental conditions and ecology, but beyond ecology we actually need all this information because things like the speed of sound change with temperature, and knowing ocean temperatures also helps predict water flow and weather; there are a lot of reasons why we need it. There's also the fact that the world is round. Most robot simulators, including Gazebo, assume that the world is flat, and I mean, we do live in a flat world, if anyone tells you otherwise they're lying, but apparently the maritime world believes the world is round. So we have to account for the roundness of the world in our simulation and apply the appropriate distortions to the data we have. The thing is, underwater simulation is actually a big field, because underwater robotics is quite commonly used in industry: oil and gas, science, navies, and many others use underwater robots for exploration, for maintenance, for laying cables, and for repair work on underwater infrastructure like oil rigs. So there's actually a huge drive for this, and hence there are quite a few simulators built for underwater robotics. This is just a small selection of the more prominent ones in the community. One of them is the UUV Simulator, which was based off Gazebo 9, I think. Project DAVE, which forks the UUV Simulator, is also partially maintained by us and a group at the Naval Postgraduate School in the U.S.; its focus is more on manipulation. But there are others, like UWSim and Virtual RobotX.
Again, UWSim uses its own engine, but Virtual RobotX is also based off Gazebo. Now, the latest trend is that we're bringing a lot of these features into our mainline Gazebo Sim, so you don't need to rely on other repos or download extra software just to support this use case. Gazebo Sim, as I mentioned, is our rewrite of the old Gazebo, so a lot of things had to be rewritten, and I'll be focusing a lot on Gazebo Sim itself. As I mentioned, this is a very active field; the maritime robotics community is growing. In the last year we've added many features to Gazebo, and this year is also going to be very exciting: we'll be adding many more, and we'll discuss some of them in depth. We've also created a new ROS Maritime Working Group, where people interested in underwater or surface robotics can come, join, and discuss the problems in the field. We've had two meetings so far, on a monthly schedule that we'll keep following. But let's talk a bit about the recently developed features. This slide lists them; it's pretty text heavy, but the idea is that we've developed quite a few maritime-specific features. Now I'm going to transition a bit. I've just talked about Gazebo Sim; I'd now like to dive deep into a case study where we actually used Gazebo to create a digital twin of a real-life underwater vehicle. The vehicle in front of you, built by the Monterey Bay Aquarium Research Institute, is the Long-Range AUV, which is used for survey and science missions. This vehicle runs its own controllers, and the folks at the Monterey Bay Aquarium Research Institute wanted to hook those controllers into a simulated rig. They chose Gazebo because it's extensible, it allows them to iterate quickly, and it has support for multi-robot systems.
So we can do a lot of very interesting things with it. Obviously, the reason they want to simulate this is that testing is quite expensive. You have to test in the deep ocean, so you have to bring the vehicle out to the ocean and drop it in. You have to charter a ship, which can cost hundreds of thousands to millions of dollars just to do a test, and the vehicle itself costs a similar order of magnitude because of the sensors on board. They had been using their own simulator, but they felt it wasn't as extensible as Gazebo, so they came to us and asked us to build something for them. We'll be going through some of the features we introduced for this, but I'll just introduce the vehicle first; it's a rather interesting one. The folks at Monterey Bay provided us with this diagram. Basically, you've got several systems here that affect the actuation of the robot. You've got a shiftable battery pack, which causes the robot to tilt. You've got an emergency drop weight: in an emergency you drop it, and the vehicle surfaces. You've got a variable buoyancy system, which basically tunes how buoyant the robot is; it's combined with the shiftable battery pack to adjust for inconsistencies in the mass distribution. Then there's an elevator and rudder for controlling direction, and finally a propeller for pushing it forward. We also have a bunch of sensors, shown as the green and blue dots on the diagram: the green referring to things already worked on in the past year, the blue referring to things we're working on right now. So acoustic communications is already in, and we're working on a Doppler velocity log, which measures the speed at which the vehicle is moving.
The rest are mostly science sensors specific to the Monterey Bay Aquarium Research Institute, but a lot of research institutes building underwater vehicles have similar sensors they care about that give data at a certain position. So here you have a CTD, which measures conductivity, temperature, and depth, and a fluorometer for measuring chlorophyll, but you could also have things like DNA samplers placed in the bulkhead. So we need something that can do positional lookup. Let's take apart what we actually have to do to simulate this first. Gazebo Sim is great because you can extend it using plugins, and here's an example snippet of how you bring a plugin in: you just add this XML. This slide may look scary, but basically we're talking about hydrodynamics and drag here. We're simulating the whole equation, which is called Fossen's equation. I'm not going to go into the exact details, but the basic idea is that this is what we use to approximate the behavior of the water: it damps how the vehicle moves. We've introduced a plugin upstream for this work, so you can actually use it. There are some parameters you have to fill in; these are engineering parameters that people in the maritime domain are familiar with, numbers that describe how the vehicle's drag should behave in different orientations. We also added a thruster plugin: you just give your propeller diameter, the fluid density, and the thrust coefficient, which again is an engineering parameter, and add it as a plugin. We also have a plugin for buoyancy. This is a world-level plugin, so it applies to all objects in the world.
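For flavor, here is a rough sketch of what wiring the hydrodynamics and thruster plugins just described into a model's SDF can look like. Treat it as a hedged reconstruction: the plugin filenames and element names follow my recollection of the Ignition-era plugins and may differ in your Gazebo version, and every coefficient value below is an illustrative placeholder, not a measured parameter.

```xml
<!-- Hedged sketch only: names per the Ignition Fortress era; values are
     placeholders, not real engineering parameters. -->
<plugin filename="ignition-gazebo-hydrodynamics-system"
        name="ignition::gazebo::systems::Hydrodynamics">
  <link_name>base_link</link_name>
  <!-- Fossen-style added-mass and damping coefficients, normally
       measured or estimated for the specific hull -->
  <xDotU>-4.9</xDotU>
  <yDotV>-126.3</yDotV>
  <zDotW>-126.3</zDotW>
  <xUabsU>-6.2</xUabsU>
  <!-- ...remaining coefficients omitted... -->
</plugin>

<plugin filename="ignition-gazebo-thruster-system"
        name="ignition::gazebo::systems::Thruster">
  <joint_name>propeller_joint</joint_name>
  <!-- thrust = Ct * rho * n^2 * D^4, so only three numbers are needed -->
  <thrust_coefficient>0.004</thrust_coefficient>
  <fluid_density>1025</fluid_density>
  <propeller_diameter>0.2</propeller_diameter>
</plugin>
```

The appeal of this design is that the drag and thrust models live entirely in data: swapping in a different hull or propeller means editing numbers, not recompiling code.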
We had a uniform buoyancy plugin for quite some time, but we also introduced what's known as graded buoyancy for systems like the ocean, where you have an interface: above the surface is air, so you don't have much fluid density, about 1 kg per cubic meter, and below is water, at about 1,000 kg per cubic meter. You can even do multiple layers, which is useful in the maritime use case because the density of water actually changes as you go up and down with salinity and temperature. So having separate densities can be a useful feature for various groups. Buoyancy is probably the most important force here, because it creates what's called a righting moment, and this is what makes your vehicle stable. The center of buoyancy, which is the center of volume, and the center of gravity should be at two different spots. For an underwater vehicle, the center of buoyancy should be above the center of gravity, so it's B above G, and this creates a turning moment: if you push the vehicle to one side, it restores itself upright. If you have it the opposite way, it creates a lot of oscillations and your vehicle will be very unstable; it'll capsize. So this is something we actually needed to simulate. For most systems it's pretty simple, because you have a fixed object and that's all, but in ours it's a bit annoying: the center of gravity can be shifted by the shiftable battery pack, and the variable buoyancy system can also adjust the buoyancy of the system. In this problem, small errors matter; the mass distribution is super important, and we were given a diagram like this. We don't actually have the accurate volume measurements and CAD data that we need.
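As a toy numeric illustration (not Gazebo code) of why B above G is stable: the buoyant force acting at the center of buoyancy produces a torque about the center of gravity that opposes the roll when B sits above G, and reinforces the roll when B sits below. The volume and offsets here are invented for the example.

```python
import math

# Toy righting-moment calculation for a fully submerged, neutrally
# buoyant body whose center of buoyancy (B) sits a distance `bg`
# above its center of gravity (G).

RHO = 1000.0      # water density, kg/m^3
GRAVITY = 9.81    # m/s^2

def righting_moment(volume, bg, roll):
    """Restoring torque (N*m) about G when the body rolls by `roll` rad.
    Positive bg (B above G) yields a torque opposing the roll."""
    buoyant_force = RHO * GRAVITY * volume
    return -buoyant_force * bg * math.sin(roll)

vol = 0.3  # displaced volume, m^3 (illustrative)
# B above G: torque opposes a positive roll -> the vehicle rights itself.
print(righting_moment(vol, bg=0.05, roll=0.2) < 0)   # True
# B below G: torque reinforces the roll -> the vehicle capsizes.
print(righting_moment(vol, bg=-0.05, roll=0.2) > 0)  # True
```

This also shows why a shiftable battery pack complicates things: moving mass moves G, which changes `bg` and therefore the stability margin at runtime.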
For instance, this front head actually gets flooded with water, which means that even though in the CAD it's, say, 12 kg, in real life when it's submerged it could be something like 30 kg, and we don't have data on how much water comes in. So we actually have to back-calculate things, and we had to automate this process. We created a modeling tool which takes in the properties of the vehicle, the offsets we need, and the fluid density, and calculates the desired buoyancy parameters. This might seem a bit strange, but it's necessary to make sure the vehicle is controllable. And keep in mind, this simulation is supposed to run at 100 times real-time speed, so small errors get very amplified in the system. So we had to automate this and build it up. But the question now is: okay, we've built the system with different plugins and different buoyancy systems, so how do we actually validate the physics behavior? We want simulation in the loop: MBARI's controller is running, and Gazebo is running, and the controller should believe it's living on its own vehicle; there shouldn't be too much change between the controller on the real vehicle and in simulation. The only change is the drivers: drivers that interface with the electronics versus drivers that interface with Gazebo. And the physics fidelity should be good enough that the control system can be used in the simulator and in real life. One of the issues was validating this, so we came up with a test strategy: we had unit tests and integration tests. Unit tests cover simple things about the vehicle: if the thruster turns on, the vehicle should move forward; if the mass shifter moves, the vehicle should pitch. This is done independently, without the controller.
And then we have integration tests, which do the same things but with the controller. These are there to debug issues with things like sign conventions and units, because two different teams might not communicate these very well: they say forward is on the X axis, while for us it was actually a different axis, and they were using the NED convention, North-East-Down, while we were using the ENU convention, East-North-Up. This can lead to a lot of confusion, which is why we have both kinds of tests in place. Testing this stuff was initially quite hard, but we now have something called test fixtures in Ignition that makes it easy to write tests. So now you can write a unit test very simply for your robot or your system using these test fixtures, and here you can see an example of the API. You tell the test fixture that you have this SDF file, which is your robot description, and the world description comes from the SDF. You register a post-update callback, which is run every time a simulation step completes, and in the post-update you can query things like: what is my current velocity? What is my current acceleration? Then you finalize the fixture. For the mass shifter, for instance, we would test: what is my pitch right now? Am I pitching too much? Is my pitch stable, or am I oscillating like crazy? This is used to validate the accuracy of the system. Finally, you call Run on the fixture's server, which says, okay, let's run the test, and you can have assertions throughout.
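The flow just described, load an SDF world, register a post-update callback, run, and assert, can be caricatured in a few lines of plain Python. This is a toy stand-in, not the real Gazebo C++ TestFixture API; the class, the world filename, and the trivial pitch dynamics are all invented for illustration.

```python
# Toy re-implementation of the test-fixture pattern: a callback fires
# after every simulation step, and assertions run on what it observed.

class ToyFixture:
    def __init__(self, world):
        self.world = world          # stand-in for the SDF world file
        self.post_update = None
        self.pitch = 0.0            # toy state: vehicle pitch, rad

    def on_post_update(self, cb):
        self.post_update = cb       # called after each completed step

    def run(self, iterations):
        for step in range(iterations):
            # toy dynamics: mass shifter drives pitch toward 0.1 rad
            self.pitch += 0.1 * (0.1 - self.pitch)
            if self.post_update:
                self.post_update(step, self.pitch)

pitches = []
fixture = ToyFixture("vehicle_world.sdf")   # hypothetical world file
fixture.on_post_update(lambda step, pitch: pitches.append(pitch))
fixture.run(200)

# Assertions like these validate the physics after the run:
assert abs(pitches[-1] - 0.1) < 1e-3          # pitch converged
assert all(p <= 0.1 + 1e-9 for p in pitches)  # no overshoot/oscillation
print("pitch settled at", round(pitches[-1], 4))
```

The point of the pattern is that the test never touches simulator internals mid-step; it only observes state the post-update callback exposes, which is what makes these tests easy to run headless.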
The really cool thing is that we can load this up and run it headless in GitHub Actions. So whenever we make a change to our simulator, all the physics gets tested against their controllers, and we validate whether the simulator is actually working. Right now, if you go to our repository, there's a big X because of something that broke last week, but this has been a huge time saver and has been very useful in helping us debug integration issues. Now, I've talked specifically about the one robot, but they're really interested in using many of these robots for coordinated missions. They will sometimes deploy up to 20 robots for coordinated missions where they search different areas, talk to each other, and coordinate behaviors, and they might even want to introduce different types of robots into the simulator. So we actually developed an acoustic communication simulator that builds on top of Gazebo. Unlike traditional wireless comms, acoustic communication is very slow and very low bandwidth, limited by the speed of sound. Unlike radio waves, which travel at the speed of light, acoustics takes time: if the other vehicle is 100 meters away, it can take tens of milliseconds just to get the data across because of physical limitations, and it can reach seconds in the kilometer-scale worlds that the folks at MBARI work with. So that is a problem, and we need to simulate it. The flip side of the slow speed is that we can use it for localization: we can measure the time of flight between two sensors. Here's a simple diagram of how this works: robot A transmits data at timestamp TSP, and the other robot receives it at TRP.
It takes some time to respond, transmitting at TSR, and the reply arrives back at TRR. So you can estimate the time of flight using an equation like the one shown above. This is an example of what's called two-way ranging; it's also used nowadays in UWB and other places, but it's been used in the maritime world for a very long time. So we needed to build something like this. Apart from that, acoustic models can be different for different systems. We might only want to simulate very simple direct-line-of-sight behavior, but there might be researchers who are interested in questions like: what if I have very rocky terrain and the acoustics echo, producing multipath reflections? So we developed a very simple direct-line-of-sight model, but we also developed the infrastructure that allows other models to be loaded separately, and I think this is going to land upstream in Gazebo by the end of the week. So I'd like to show you a demo. I'm running this as a recording in the slides because I've never had luck running live demos over Wi-Fi; Chrome will die if I run the simulator alongside it. Sorry to interrupt you, but I think we lost your video, so if you can restart your video feed, that'll be great. The slides are there, but just not the camera. Okay, give me a moment. Thanks. Interesting. Can you see me? Yeah, that works, thanks. All right, sorry about that. Coming back to acoustics: here we're actually using the real-life code that runs on this real robot, and we've loaded it up and hooked it to our simulator. There will be two robots here, so we're going to spawn two robots, this first one and then the second one. The second one is supposed to circle the first one, I think.
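Before the demo continues, the two-way ranging arithmetic from the timestamps TSP, TRP, TSR, and TRR can be made concrete with a small worked example (the 1500 m/s sound speed and the timestamps are assumed values): the time of flight is half of the round-trip time minus the responder's turnaround time.

```python
SPEED_OF_SOUND = 1500.0  # m/s in seawater (approximate)

def two_way_range(tsp, trp, tsr, trr, c=SPEED_OF_SOUND):
    """Estimate range from a two-way exchange:
    tsp: A transmits, trp: B receives,
    tsr: B replies,   trr: A receives the reply.
    The turnaround time (tsr - trp) is subtracted out, so the two
    clocks never need to be synchronized with each other."""
    tof = ((trr - tsp) - (tsr - trp)) / 2.0
    return tof * c

# Two vehicles 750 m apart: one-way flight time is 0.5 s,
# and B takes 0.25 s to turn the message around.
print(two_way_range(tsp=0.0, trp=0.5, tsr=0.75, trr=1.25))  # 750.0
```

Note that only differences of each robot's own timestamps appear, which is exactly why this scheme works between unsynchronized vehicles.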
So yeah, the second one starts circling the first one, and as the first one moves, you'll notice the second one adjusts its circling around it. This is a common behavior they use for studying the different layers of the ocean. Here you can see the two robots actually communicating with each other. What's really cool here, I mean, it's not very fancy graphics I have to show you, but what's really, really cool is that this is their code running against the simulator. The new Gazebo Sim is ready to support such features. We'll just let this play through; as I said, it's a recording, because if I run Gazebo together with Chrome, I'm pretty sure my audio will drop and things will crash and burn. This is running at 100 times real-time speed, just for your information. So, okay, if you've liked this talk, the code is available here at osrf/lrauv. If you want to get involved in maritime robotics, there's the new working group; I'll send out a link later on. There are other cool things I haven't shown you that I wasn't sure I'd have time for: we've recently added waves and wind, and waves are a very nice visual feature. There are ocean currents, depth elevation maps, and a lot more that I haven't covered in this talk because I've only got a limited time to speak. If you're interested in working in this field, a few things to note: the latest Gazebo can be installed, and you can join our communities. There's also, as I said, a new Maritime Working Group where the focus is really on underwater robotics and how we can use open source tools to further the underwater robotics field. There's a QR code you can scan to join, and you can join our Matrix chat. Finally, I'd just like to say thank you, and I'll show you another shark video with cooler graphics.
This was shown at ROSCon. It's using our test data, so it's not the actual vehicle code running things, but you can see an army of these robots. So I'll leave the floor open to questions now.