So we're right on time here. I guess it's our time to shine. Hi, everyone. Welcome to the Open Source Drone Summit hosted by the Dronecode Foundation. We hope you're enjoying your time at the Open Source Summit. We're super excited to be here today, and we have lots of great content prepared for you, so let's jump right into it. Give me a moment while I share my screen. Awesome. Introducing the PX4 community. Today I'm going to give you a quick overview of the PX4 Autopilot project and the amazing community of developers and companies around it. But before we continue, let me introduce myself. My name is Ramon Roche. I'm a software engineer, employed by the Linux Foundation as a program manager for Dronecode. I've been working in drones for almost eight years now. I started at a small startup called 3DR back in 2013. We used to sell the first PX4 kits and also DIY kits for drones, and we went on to release the very first smart drone in the industry, the 3DR Solo, using open source. In 2016, thanks to the relationships and experience I was lucky enough to gain at 3DR, I joined Dronecode as a program manager. You can find me on social media. Send me an invite, follow me, and let me know how I did today. All right, so why am I here? What is Dronecode? Dronecode is the vendor-neutral foundation for open source drone projects. We are a US-based nonprofit. We provide open source governance, infrastructure, and services to software and hardware projects. We're primarily funded through yearly memberships by organizations who support drone ecosystem standards, and Dronecode is part of the Linux Foundation. We're actively driving open standards and helping member companies create new opportunities within the ecosystem. If you look at the Linux Foundation and its collaborative projects, we're much like them.
We have an umbrella of open source projects within Dronecode, and I'm going to introduce all of them today. Just a quick overview of our membership: we have lots of engaged members, some of which include Auterion, NXP, Microsoft, 96Boards, 3DR, AirMap, Yuneec, UVify, Wingtra, et cetera. All right, so how did we get here? It's been a long time coming, so let me give you a quick overview of the open source story behind the project. It all started as an academic project at ETH Zurich. The story of our humble community can be traced all the way back to 2008, to a student lab at ETH Zurich. A group of students led by Dr. Lorenz Meier was trying to make drones fly autonomously using computer vision. In that lab, they created the first hardware and software prototypes and released them as open source back in 2008. Back then, there wasn't a lot of public information or resources on drones, so in just two years the project gained worldwide academic adoption. The project took off so tremendously that by 2011 they created the PX4 Autopilot and put all of their efforts into it. By 2013, the drone DIY boom was in full swing: a lot of companies got created and the drone industry matured as a whole. Just like my previous employer, 3D Robotics, lots of companies started looking into open source drones as an alternative in the drone industry, and some of those companies formed the Dronecode Foundation in 2014. By 2016, we managed to get broader adoption in the consumer and commercial markets; I'm going to give you a few examples of what that means in the next slides. By 2017, PX4 had gained computer vision and obstacle avoidance. In 2018, we gained MAVSDK, the SDK for MAVLink-based drones. In 2019, the project celebrated its first developer summit, which we hosted in Zurich, and we also gained, very importantly, ROS 2 support.
So that's full ROS support within PX4: you're able to control PX4 from within ROS. And in 2020, which is this year, we were lucky enough to work with our members in the Dronecode Foundation as part of a special interest group, which I'm going to speak about in a little more detail later. We developed some of the leading standards for flight controllers, and we have two other specs that we are leading right now. The community and the market adopters have been responding tremendously to the open source projects, so I'm going to give you a quick overview of the PX4 adopters. I'll go through this quickly because we don't have a lot of time and there are a lot of them, but just so you know the diversity of what PX4 can give you: this is a hexacopter by Yuneec. We have a vertical take-off and landing vehicle. We have another quadcopter built to cutting-edge standards. More vertical take-off and landing vehicles. Small reconnaissance drones, bigger drones. You can see the drone in the background, and I think a couple of people on the top left, just so you can gauge the size. We have way bigger drones. This is a quadcopter carrying a huge payload. We also have delivery drones with vertical take-off and landing. We have long-endurance mission drones that can last up to 70 minutes in the air. We have medical delivery drones. We really support all types of vehicles within the PX4 stack. All right, so what is this stack? The complete stack is the PX4 Autopilot and the surrounding open source technologies that got created within the project. It's not enough to just have the open source autopilot: when the project got created, it went hand-in-hand with the MAVLink protocol, QGroundControl (the ground control station), and the Pixhawk project. We later added one of the most advanced components, which is MAVSDK.
Right now I'm going to focus on the PX4 Autopilot real quick. It's an open source autopilot for unmanned vehicles with the very permissive BSD 3-Clause license. We support many types of drones, from multicopters to VTOLs like the ones I showed you in the previous slides. It's been around for more than a decade and has more than 600 contributors from across the globe, with a community of thousands of developers and enthusiasts. The autopilot is the core of the community; it's where everything starts and where everything ends. We like to make the analogy that the PX4 Autopilot is like the Linux kernel of the open source drone ecosystem. All right, let's keep going with the open source drone stack. Next up, very importantly, we have MAVLink. So what is MAVLink? Quickly: MAVLink is a trusted, lightweight messaging protocol. Our next presenter is going to speak a little more about MAVLink in detail, and how you use it and leverage it through MAVSDK. But it's really important for you to understand that MAVLink is the language drones speak. They speak MAVLink within the drone, from drone component to drone component; from a vehicle to a ground control station or mobile app, you speak MAVLink. That's how you control the vehicle, that's how you see the telemetry, and it's being adopted by leading industry vendors, not just by open source projects like PX4. Next up, we've got QGroundControl, which actually has quite an interesting story. MAVLink is a specification, and it needed a reference implementation. The idea behind QGroundControl was to create that reference implementation and a UI for drones. So the team set out to create a cross-platform UI engine utilizing Qt, and thanks to Qt we are cross-platform: we have compatibility with iOS, Android, Linux, and all the major OS vendors.
It's fully customizable, so vendors can take the shell of QGroundControl and theme it to their needs, and it's very widely adopted by commercial and enterprise vendors. What can you do with a ground control station? You can do mission planning for autonomous flights, and you can tune, set up, and configure every aspect of your drone through QGroundControl. Like I said, it's made specifically for MAVLink, so you're able to leverage that and get access to all of the telemetry, including live video out of a drone. It's a really important part of the stack. Next up, we've got the hardware. The software architecture and the software stack couldn't be complete without hardware to run on. The Pixhawk project, which actually launched all of this, is a set of hardware specifications and flight controllers, initially just flight controllers. This year we're introducing smart battery management specifications and payload specifications; we're going really in-depth into cutting-edge technology here. There are many examples of Pixhawk products right here, all the way from the initial versions to the cutting-edge Skynode from Dronecode member Auterion. It's really important to note that the standards here are being developed within the Dronecode Foundation as part of the Pixhawk Special Interest Group. And lastly, MAVSDK. I'm going to touch lightly on MAVSDK because our next speaker is going to give you a tutorial. MAVSDK is a set of libraries that provides a high-level API to MAVLink. What this means is that you're able to leverage the specification without needing to re-implement it across different language stacks: with a single MAVLink implementation, you get one backend and a set of platforms and languages supported.
The team was smart enough to pick a gRPC backend, and that allowed them to automatically generate frontends. They have Python, Java, and Swift support, and there's experimental JavaScript and Rust support as well. It's a consistent set of features with a very stable API. The stack could not be complete without a huge community supporting it, and I want to conclude my talk today by discussing the community, because it's a really important part of any open source project and we're no different from any other. The community of the PX4 ecosystem is divided between professional developers, drone system integrators, component manufacturers, drone hobbyists and enthusiasts, researchers and students, and open source maintainers. There's a huge community. In 2019, we met for the first time in Zurich; we were lucky enough to be able to host an event for more than 200 developers. It was our first time meeting as a community, a really exciting time for PX4, and everyone was very enthusiastic about it. The second year, which is 2020, we celebrated the event virtually. You can go to the PX4 Autopilot channel on YouTube and look up all of the sessions from 2019 and 2020. This year we had lots of views, with all of the community coming out to see us. Before I let you go, I want to talk about the community coordination efforts and the resources within the PX4 ecosystem. The primary method of community coordination is, of course, GitHub; who is not on GitHub these days? You can find the PX4 Autopilot firmware on GitHub right now, with more than 10,000 forks, I think, and lots of stars. The organizations you might want to go check out are the PX4 organization on GitHub, the MAVLink organization, and the Dronecode organization. The most important projects are the ones that are part of the stack.
That's the PX4 Autopilot, but that doesn't mean it's the only project you're going to find in that organization; you're also going to find other sub-components of the PX4 stack. The PX4 stack is huge: it has navigation, estimation, middleware. There's a lot of separation of concerns, and it has a very modular architecture. Thanks to that, we're really able to leverage the experience of developers and contributors from across the range. Contributors come from different areas: we've got roboticists, software engineers, hardware engineers, graphic designers, user interface designers; we have people all the way across the stack. We're a very broad community, and we're always very welcoming. We have a weekly developer call that happens on Wednesdays, I think at 6 p.m. Central European Time. There's a huge community on our forums; we have a Discourse forum at discuss.px4.io. I'm going to be distributing the slides from my presentation, and I'll make sure there are links to all of these resources in there. We have a huge Slack team with more than 6,000 registered users and around 500 active users day to day, so I encourage everyone to join. If you're looking to get started with drones, feel free to drop in and drop a line. If you're confused and don't know where to get started, we're really friendly; we're always onboarding new users. We have a huge contributor base, and we have a PX4 ambassador program whose members are helping us by volunteering some of their time and picking up some of the onboarding of new users, so I encourage you to look that up. We have a monthly newsletter with resources for Dronecode and PX4 developers. And of course we are all over social media: you can find PX4 on Twitter, Facebook, and LinkedIn, as well as the Dronecode Foundation on Twitter, Facebook, and LinkedIn.
We're really happy to be here at the Open Source Drone Summit today. I want to give a little time to the community for Q&A. I know this was really fast, and I wish we had more time with you today, but I want to answer some of your questions if you have them, so this is a great time to ask. I'll give you a few minutes to come up with questions before we move on to our next speaker. All right, how are we doing on time? It looks like we don't have questions today. No problem, I guess we can move on to our next speaker. Let me introduce Gonzalo; he's going to be speaking about controlling PX4 drones with MAVSDK. Thank you for your time today. I'm really glad to be here. Please enjoy the rest of the Open Source Drone Summit, and I'll see you in the last session, the panel where all the speakers are going to be. I'm looking forward to that. Thank you. It looks like we have one question after all, so I'm going to take one minute to answer it: does weather impact communication between drones? It can impact communication; it depends on the telemetry solution that you're using. All right, let's move on with our next speaker. Thank you for your questions. Gonzalo, are you ready? We can see your screen. Yeah. Hi, everyone. Thanks a lot, Ramon, for your awesome presentation. It's always good to see your presentations; you contribute a lot to the PX4 ecosystem and you explain things very well. So, I'm Gonzalo Atanasio. I'm from Portugal, and I'm an electrotechnical engineer with a master's in computing. I'm currently working at the Army, and I'm a drone enthusiast. Regarding that last part, I'm here to show you how to control PX4 drones with MAVSDK. But before jumping into the cool stuff, the live demo, I want to tell you about MAVSDK.
So basically, MAVSDK is a MAVLink library. MAVLink is a communication protocol, like Ramon said, and MAVSDK has APIs in different programming languages, in Python, Swift, Java, JavaScript, Go, and C#, and it can run on multiple platforms, like Linux, macOS, Windows, Android, and iOS. This library provides a simple API for managing one or more vehicles: programmatic access to vehicle information and telemetry, and control over missions, movement, and other operations. The library typically runs on a vehicle-mounted companion computer, on a ground station, or even on a mobile device. This is because heavy computing typically needs more CPU power: if you want to do avoidance, route planning, or computer vision, you want to get the information from the drone's sensors, do the heavy computation on a companion computer like a Raspberry Pi, and then send actions back to the drone as the result of that computation. If you find that MAVSDK is missing a plugin or something you want to implement, you can always write it in C++, and it will then replicate to all the languages; I'll get to that in about a minute. Right now, I want you to understand that MAVSDK uses MAVLink, but MAVLink can be used alone. Like Ramon said, QGroundControl uses MAVLink directly, and MAVLink is the language that drones speak, so you don't need MAVSDK to use MAVLink, but MAVSDK gives you a layer of abstraction: it's simpler to use than using MAVLink directly. For example, QGroundControl uses MAVLink directly, and like Ramon said, QGroundControl provides full flight control and vehicle setup for PX4 and has a very cool UI, which I will show you a bit of during the demo.
On the right, we have QGroundControl using MAVLink directly to talk to the drone, but here on the left, we can have MAVSDK, which uses MAVLink and gives you a layer of abstraction, because it's easier to code with MAVSDK than with MAVLink directly. I'm not telling you not to use MAVLink, but if you use MAVSDK, you can do a lot of stuff pretty quickly. As one example, you could implement a new ground station, a bit like QGroundControl, or you could write a Python script that does the computation I told you about before and then sends actions back to the drone. Now, a bit about language bindings. Like Ramon said, MAVSDK is available in a lot of languages, but if you want to use MAVSDK in C++ directly, you don't need anything else: you just import the library and you can get information from the drone and send actions to it. If you want to use it in any other language, like Java, Swift, Python, Go, C#, or JavaScript, you need to start a MAVSDK server. You'll see how easy it is during the demo, but for now just understand that the MAVSDK server is used to expose gRPC language bindings to the MAVSDK C++ methods. gRPC uses proto files to translate the C++ methods into language-specific methods. This is very good for maintainability, because if you want to add new functionality, you just implement it in C++ and update the proto file, and you get that functionality in all the languages. If you want to learn more about this, you can watch Jonas Vautherin's talk at the PX4 Developer Summit 2020, where he explains in a live demo how you can implement new functionality in MAVSDK C++ and replicate it to the other languages. It's very cool; I recommend you watch it. So now let's jump into the live demo. First of all, it's very difficult to do a live demo with a real drone, so we are going to use a simulation.
To run a simulation on your computer, you can do one of two things: you can install the toolchain natively on your PC, or you can run a Docker image already built by Jonas Vautherin, one of the MAVSDK maintainers. Thank you, Jonas. This Docker image has Ubuntu installed with all the dependencies and the simulators you need. By default, the Docker image runs Gazebo. So let's try it. This is going to be very fast because I already downloaded the Docker image. It runs, and you will not get any graphical user interface, because this is running Gazebo headless. Headless means you don't get a graphical user interface, but the simulation is running in the background and you get a shell into it: a PX4 shell. It's always good practice to test the commands before doing anything else, so let's do commander takeoff. The drone is taking off. Now let's do commander land, and the drone is going to land. You're not seeing anything because there's no graphical user interface, but in a moment I will build the code locally, where you can get a graphical user interface for the simulation. Like I said, to run the code and the simulation locally, you need to install the toolchain. For that, you can go to px4.io, where you have a button called Documentation. You go to the PX4 developer guide, because we are developers, then to Getting Started, Toolchain Installation. You select your operating system; I'm using Ubuntu 20.04, so I go to Ubuntu/Debian Linux. You just clone this repo and run the script, and this script, by magic, will install all the dependencies you need and these two simulators, Gazebo and jMAVSim. In the Docker image we used Gazebo, so now let's try jMAVSim.
To build it, you just go into the Firmware folder of the cloned repo and you do make px4_sitl. SITL means software in the loop, meaning we are using a simulated environment. Then we specify the simulator, jmavsim, so the full command is make px4_sitl jmavsim, and the code will build. The simulation starts, and you get a very cool graphical user interface. Cool, right? The drone in a nice green field. Now we want to use the PX4 shell to send some commands to the drone before doing anything else, so let's do commander takeoff. Okay, let's do it. The drone is armed, the drone is taking off; now let's land it, and the drone is landing. Pretty cool, right? Now you can see things happening in real time. Next, I just want to show you a bit of QGroundControl, because it is awesome. You can have QGroundControl on a Linux, Windows, or macOS computer, or, like Ramon said, even on an iOS or Android phone. When I'm flying my real drone, I always like to have QGroundControl on my mobile phone instead of bringing a computer to the field. Here I'm showing you QGroundControl on Linux, but you can run it on your phone; as long as you have telemetry configured on your drone and can connect it to your phone, it's okay. In the first menu you have the drone's position, and you can see the drone in a nice green field, and you can perform actions on the drone using QGroundControl. Let's try a takeoff. We can specify, say, 5.3 meters. We slide to confirm, takeoff is detected, it's going to take off, and as soon as it reaches the target altitude it's going to hold position. Now let's land it. We can see the information here on the right: the compass, the altitude, the ground speed, the flight time. Now I'm going to show you real fast what the menus of QGroundControl are. The first menu just has the QGroundControl settings.
The second menu has the PX4 settings and configuration. You can see a summary here. You can upload the latest firmware, select your airframe, calibrate your sensors, calibrate your radio controller, select flight modes, set power consumption, test the motors, and configure safety measures; this is very important. You can tune your drone, adjust some camera settings, and configure specific parameters. In the third tab, you can create missions and upload them to the drone. I'm going to do one very fast because it's an awesome feature. Let's take off the drone, add a waypoint here, add another waypoint here, and say return to launch. We can see that the mission is going to run at five meters altitude; it's going to be a pretty fast one. Upload required: let's upload it to the drone, done. Now let's start the mission from the main screen. The drone is going to take off until it reaches the desired altitude, in this case five meters, then go to the first waypoint, then the second waypoint, and then it's going to gain some altitude because of the safety measures, return to launch, and land on the same spot it took off from. So let's let it run. But this talk is about MAVSDK, right? Using code to control a drone: that's the cool stuff, because we are developers. So let's do it. First of all, I always like to start in a Python virtual environment, so let's do python -m venv venv. You don't need to do this; I like to do it, but you don't need to. To activate it, you do source venv/bin/activate, and you can see it's activated: here on the left, you've got the venv indicator. Now you can check your Python version by doing python -V. I've got 3.8.5. You should have 3.6 or later, because MAVSDK uses asyncio features that are only available on 3.6 or later.
Now we want to install MAVSDK, so we do pip install mavsdk, and it's done pretty fast, right? Like I said, MAVSDK uses another library called asyncio. This library lets you make asynchronous requests and wait for their responses, because when you're talking to a drone, you want to send an action but also await its response to make sure it executed correctly. To do that, you need to run an asyncio event loop, and if you want to write this in a REPL console, you need an asynchronous REPL console. For that, you install another library called aioconsole, and it's done. Now, instead of running python, I'm going to run apython, and you get a console running an asyncio event loop where we can await calls. First of all, let's write our first MAVSDK commands. Let's do import mavsdk. Now, like I said, we need to start the MAVSDK server, and for that we do drone = mavsdk.System(). Now we need to connect to the drone, so we await drone.connect(). And we connected to the drone; we can see at the top that the server is on localhost, which is true because we are running the MAVSDK server on localhost. Now let's do await drone.action.arm(), because we need to arm the drone before takeoff. Then let's do takeoff: await drone.action.takeoff(). The drone is taking off. Pretty cool, right? We just commanded our drone using Python code. Now let's land it: await drone.action.land(). And we landed the drone. So we wrote this code in a REPL terminal, but what if you want to write it as a Python script instead? It's a little bit different, so I'm going to show you how to do it. You don't have to use PyCharm; I'm using PyCharm, but you can use VS Code or Atom or your favorite IDE. I use PyCharm for Python because I find it very helpful, especially its awesome IntelliSense.
I'm going to create a new file and call it takeoff_and_land.py. I start by importing mavsdk, and here you have IntelliSense, so it's easier to write, and I import asyncio too. We need to implement an asynchronous function, so we do async def run(). Now we do the thing we did before: we start the MAVSDK server with drone = mavsdk.System(). Here IntelliSense shows us the input parameters of System: there are two, the MAVSDK server address and the port. The server address defaults to localhost and the port defaults to 50051, so let's keep it that way. Now we have to await drone.connect(), and we can see that it takes one parameter, system_address, which by default is udp://:14540; that's one of the ports where MAVLink is running on the drone, so let's also keep it that way. Now, because we are running a script, we need to make sure the drone is connected, so we do async for state in drone.core.connection_state() and check state.is_connected; if it is, we can print something like "Awesome, drone is connected" and break the loop. Now that the drone is connected, we want to perform the same actions we did before: we await drone.action.arm(), and next we await drone.action.takeoff(). As you can see, with IntelliSense it's much easier to write. Now we want to hold position for 10 seconds, because we're running a script and it runs very fast; if we do takeoff and land back-to-back, it will take off and land immediately and we'll see almost nothing happen to the drone. So we want to wait for 10 seconds, and to do that we use an asyncio function called sleep, which takes the number of seconds we want to sleep. We'll say 10, because it's Friday and we are tired.
Now let's await drone.action.land(), and we land the drone. But we've only declared an asynchronous function; we need to call it to run it. For that, we declare a main section in a very Python-ish way, get an event loop from asyncio, and say loop.run_until_complete() with the function we just implemented. Our script is done; now we just have to run it and see: python takeoff_and_land.py. "Awesome, drone is connected." It's going to take off, and the demo gods are with us. Ten seconds, and it's going to land. Very cool, right? This is very simple, but if you want to do more complex stuff, you don't have to start from scratch. You can always go to the MAVSDK-Python repo, maintained by Julian Oes and Jonas Vautherin, where you can see a folder called examples. You can check all the examples it has, like calibration, camera, geofence, gimbal, and even one called mission.py. This mission.py does almost the same thing we did using QGroundControl: it uploads a mission to the drone and the drone runs that mission. I have it here locally, so I can show you: python mission.py. It waits for the drone to connect, uploads the mission, arms, and starts the mission. I can show you the code very fast; it does the same things we did: we connect to the drone, we make sure the drone is connected, we declare some mission items and append them to a list, we upload the mission to the drone, we arm the drone, we start the mission, and we wait for the mission to end. So if you want to do complex stuff, you don't always need to start from scratch; check the examples first to see if anyone already did it, and start from there. Now let's get back to the presentation. I want to ask you: why MAVSDK? You just saw what I did. Did you find it difficult? It's not difficult; it's very easy.
Easy to install, easy to use. This was the thing that caught me in the beginning: I was trying MAVSDK and I thought, this is it, it's that easy. You can run a simulation and command the drone in a couple of minutes. You don't even need the whole toolchain installed: you just need to run the Docker image and have Python installed, and you can run it. So, as you can see, it's easy to install and easy to use. It has a very stable API, and there are a lot of production environments already using MAVSDK, so you can trust it. And as you can see, it's cross-platform and multi-language: if you don't like Python, you can use it in Java or Swift, and if you don't like Linux, you can run it on Windows or macOS. So you can try it on whatever operating system you're using, in the language you like the most. It's also very accessible: if you want to implement new functionality, you just write it in C++ and update the proto files, and the language bindings are auto-generated. Like I said before, if you want to learn about this, just watch Jonas Vautherin's talk from the Developer Summit 2020; it's very insightful. So just use the code. Now, a bit about documentation. I know Ramon already talked about this, so I'm just going to show you where to find it. You just have to go to px4.io and then Documentation, and here at the bottom you have the PX4 developer guide, which will show you how to get started with the PX4 code, how to contribute, how to build the code locally, run your first application, and stuff like that. If you want to learn how to use QGroundControl, everything is explained in the QGroundControl user guide. And if you want to know how to develop for QGroundControl using Qt, like Ramon told you about, you can go to the QGroundControl developer guide.
If you want to know about MAVLink, how the protocol is implemented, how the bytes are used, the messages, and so on, you can go to the MAVLink guide. And if you want to learn more about MAVSDK, you can go to the MAVSDK guide; it covers all the language implementations and points you to the repositories on GitHub so you can check them thoroughly. There's also the Dronecode Camera Manager documentation. And then there's the User Guide, which is where we are right now: here you can learn how to fly a drone, how to choose your airframe, your hardware, your flight modes; you can learn a lot from the user's point of view. So just go and check it, it's very cool. And last but not least, like Ramon said, we are a very awesome community. So if you have any doubts about any of this, just go to the Slack channel or the Discuss forum and ask; we're going to help you. If you find any bugs, you can always try to fix them and submit a pull request on GitHub, and the maintainers will review it and help you through it. Or if you find an issue you can't solve yourself, you can always open an issue on GitHub. Our community is awesome and we'd like to get more people involved, so if you want to get involved, just talk to us. Every pro was once a noob, so don't be afraid to ask questions; all questions are meaningful. Right now I'm also available for Q&A, so if you want, just pop your questions into the Q&A and I'll try to answer them. I hope you liked the presentation and the live demo.

So, first question: in the first example there was a wait of 10 seconds so the drone wouldn't land immediately after takeoff. Can you give an example of going to the next command only after the first is finished? For instance, instead of just takeoff: take off and climb to X meters altitude.
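(For reference while this question is answered: in MAVSDK-Python the Action plugin does let you request a specific climb altitude before takeoff. Below is a runnable stub of that pattern; set_takeoff_altitude and takeoff are the real MAVSDK-Python method names, but the stub class and the default value are stand-ins, not the real implementation.)

```python
import asyncio

class StubAction:
    """Mimics the relevant slice of MAVSDK-Python's Action plugin.
    The default altitude below is a stand-in for the PX4-side default,
    not a value confirmed by the talk."""
    def __init__(self):
        self._takeoff_alt_m = 2.5

    async def set_takeoff_altitude(self, alt_m: float):
        self._takeoff_alt_m = alt_m

    async def takeoff(self):
        # The real call issues the takeoff command; PX4 then climbs to
        # the configured altitude and holds until the next command.
        return self._takeoff_alt_m

async def run():
    action = StubAction()
    await action.set_takeoff_altitude(20.0)  # climb to 20 m instead of the default
    return await action.takeoff()

alt = asyncio.run(run())
print(alt)
```

With the real library, the same two awaits against a connected System object ask PX4 to climb to 20 meters on takeoff.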
Yeah, I think it's really important here to let the audience know that the complexity on the MAVLink spec side has been mostly taken away by MAVSDK. It hides some of the implementation details, and some of the interfaces on the Action plugin, like takeoff and landing, have default parameters. So you can set takeoff to 10 meters or 100 meters, or leave the default, which actually comes all the way down from PX4; PX4 determines how you take off and how you land. Like in one of the examples Gon showed you, the drone ramped up before it landed; that's a safety feature. Those are all configurable by parameters, and MAVSDK does magic in the backend to hide some of that for you, but you're able to expose it and control it again.

Yeah, true story. Thank you, Ramon, for the answer. It's true: you can specify the altitude you want, and you can adjust the safety measures. It's a default behavior because of the safety measures implemented, like Ramon said, and you can always override it. You can read all about it in the MAVSDK-Python documentation: the methods are explained, along with the parameters they take and their default values.

Next question: can we run the code in a virtual machine? Does it require a lot of memory and disk space? No, it does not. My laptop is pretty old, about six years old, and it's not a MacBook or anything like that. I have run this in a virtual machine with four gigs of RAM and a dual-core CPU. You probably won't get very complex worlds using Gazebo, but if you just use jMAVSim or simple worlds, you don't need a lot of RAM. So try it; try the Docker image first, and if you're OK with it, jump into installing the toolchain and running it directly on your machine.
That's my advice to you. Next question: very nice demo, thanks. I'm curious whether there is any manual on creating a drone from scratch with a Debian Linux system, what extension cards to use, how to set up the software, et cetera. I haven't had time to find out how to connect a DX6i to the base board. OK, I think I have an iA6B, so I don't have that specific hardware. If you're asking about hardware-in-the-loop, that is, running a simulation with hardware connected to your PC, there are a lot of tutorials for you to follow in the PX4 documentation. I've tried it: you connect your board, run the simulation on your PC, get the information from the real sensors on your drone while the simulator is running, and perform operations using QGroundControl or MAVSDK. I don't know if this is what you're asking about, so feel free to clarify and I can answer more precisely.

And let me extend on your answer and chime in, if you allow me. There exist a lot of DIY kits right now that let you assemble a drone quickly. A couple of Dronecode members offer kits; one of them is Holybro. They have a really good kit including everything you need: motors, telemetry, a flight controller, and the airframe. There's also one by a gold member, NXP. They have an awesome development challenge called HoverGames, and they also offer their flight controller, airframe, and all of the components needed to assemble your drone. It's really important to note that the PX4 autopilot runs on an RTOS called NuttX, which is an Apache project, so it runs on the flight controller unit. And then, if you want to add a companion computer, you can do that through a serial port.
And on the most recent versions of the Pixhawk standard, you can use a higher-speed connection back to your companion computer. So you can have that configuration; it depends on the topology you'd like to run. You can have your companion computer on top of your drone, or you can have it sitting on your desktop while you're piloting, with a telemetry radio alongside the DX6i you were mentioning.

Yeah, true story. And you were asking about real hardware; I didn't understand the question correctly, Ramon did. So you want a companion computer. Yes, you can have one, and you don't need to install the autopilot on it. You just need, for example, a Raspberry Pi: install Debian on it, then install ROS and use MAVROS, or install Python and use MAVSDK, and connect that companion computer directly to the Pixhawk, or keep it on your desk like Ramon said. So yes, you can do it. You don't need to install PX4 on it; PX4 runs on the autopilot, so you just install the tools you want to talk to your autopilot over MAVLink, OK? Let me know if that makes sense and whether I can explain it a little better.

Yeah, and I think 96Boards is also worth mentioning. 96Boards is also a member, and they're working right now on releasing a mezzanine board that implements the FMU spec, a fully Pixhawk-compatible mezzanine board by 96Boards. And I also want to throw in that Gumstix recently launched a Raspberry Pi version of the FMUv6 spec: a PX4 flight controller bundled with a Raspberry Pi 4. It's a really cool board, and I encourage you to go check it out. It's by Gumstix, based on the latest Raspberry Pi Compute Module. So I think there are no more questions, and I'm going to pass the word to Travis Bottalico.
So I hope you liked my presentation. I'll be here at the end for more questions if you have them. So thanks a lot again, Ramon. And sorry, sorry Travis, I think I got one more question here; I'm going to answer it really fast. Newbie question: do you only do flying drones, or also sailing and driving drones? Yeah, you can install PX4 on all kinds of robots, not only flying drones; there are aquatic drones and land drones too. So it's not exclusive to flying drones. Correct, we support underwater vehicles as well as rovers, and they're supported all the way down to simulation as well. Gon, thank you for your awesome presentation and your awesome demo. I love your fearless attitude towards demos; I think it's one of the things that runs within the MAVSDK community, and I want to keep encouraging it. Thank you for your time. If you have more questions for Gon, he'll be available, together with me and the rest of the speakers, on the panel hosted after Travis's talk. So next up we've got Travis from ModalAI, who is going to give us a quick 360 overview of how to get started with PX4. Go ahead, Travis.

Thank you. All right, are you able to see my screen? Let me know if not. Well, thanks for having me. My name is Travis Bottalico, and I'm an engineer at ModalAI. Going off one of those last questions about companion computers and hardware, I'm going to dive into that a little deeper right now. My main goal today is to describe the benefits we've experienced using the Dronecode hosted projects with respect to bringing products to market, and our products are in the autonomous systems area. A quick background on myself: my career has been around embedded systems, product development, and electronics manufacturing. Up until about a year and a half ago, I was focused mainly on solving problems in the industrial IoT space.
I worked for SKF, the bearings company, for a decade or more, so I'm kind of a newbie in the field of robotics and drones, and the use of the Dronecode hosted projects has been crucial to reducing the learning curve for me. It's been awesome. A little more specifically, I've been responsible for bringing up PX4 on our hardware, which I'll show you in a bit. As a silver member of Dronecode, ModalAI contributes hardware and software development to advance the growing open source PX4 ecosystem. We build drone perception and communication systems, the whole time leveraging the Dronecode hosted projects that Ramon and Gon have been talking about. One of our products is VOXL Flight, which I'll show; it's a combination of a PX4 flight controller and a companion computer, and it enables drones with autonomy. Our use cases are in the military, delivery, asset inspection; every day there's a new use case, which is pretty cool. ModalAI is 18 folks, and we're heavily skewed towards the engineering side of things.

So what I want to do is walk you through a quick product release cycle as a story, to show you the benefits we got through the Dronecode projects. I'm going to start with the hardware design stage. Here are a few of our products; a key component, on the left, is the flight controller subsystem, essentially what you saw flying around in Gon's demo. We've leaned heavily on Dronecode hosted projects throughout. When designing hardware, like any complex system, there are choices that can be made during the design phase to limit potential problems in the future. Dronecode has a hardware working group, and our team has been involved with it. The goal of that group is defining open standards to promote interoperability, and it works closely with the Pixhawk project.
The standards provide readily available hardware specs and guidelines for drone systems development, and we leveraged two of them. The first is the Pixhawk FMUv5X standard, which we used as a mechanism to lay out our system architecture: it defines the connections between the microcontroller and the different peripherals, and it helped drive component selection. The other is the Pixhawk connector standard. You can imagine, when you're exposing pins through the various connectors you see on those boards, there are lots of plugs; the standard ensures a high level of interoperability with the various vendors of hardware that plugs into those products. By adopting these standards, we reduced the amount of time needed to gather requirements and to design, and at the same time we took risk away. And as I'll show next, following these standards really helps with firmware development and bring-up.

So at this point, let's pretend we've designed our hardware using those standards. Then, during or after design and manufacturing, you start working on the firmware, right? In my case, this was the first time I was exposed to anything in the Dronecode projects: I got hardware handed to me and was told, let's bring the firmware up. So I was really a newbie; I had never used PX4. I want to give an example of the usability of the project: we had a vehicle flying within a few weeks of starting, and that's not my skill, that's the amount of resources and the ease of use provided by the PX4 code. This slide is a little busy, but I think it's well worth spending some time on. The PX4 firmware is very feature rich; I've tried to capture some of the high-level functional blocks you might need to develop on your own if you were building an autopilot on an embedded system.
If nothing else, the key takeaway is the limited amount of effort required to bring up a board when using PX4. The main body of work for us was what you could consider the board support package area. This is where you let the firmware know where the pins are located, basically the pinout of the microcontroller; if you do any kind of embedded development, that's your typical board.h. Within the PX4 code base there are multiple boards to reference. At the time, the FMUv5X was coming out, so we did a copy-paste of that template and then made some small tweaks, mainly because we weren't 100% in line with the standards due to our own constraints. And as mentioned before, PX4 runs on top of NuttX, the real-time operating system, which has your typical menuconfig; you can run that and set up the underlying OS as well. So it's very usable. The next area of work is the drivers. PX4 has a lot of drivers and is ready to support most common components. For example, we have three IMUs; two were supported out of the box, and for one we had to get a driver going. We had the hardware but no driver in place, so we sent our hardware out to PX4's main architect, Dan Agar, and he actually wrote the driver for us, which was very cool. We also implemented a new barometer driver and a power module driver, but in most cases the drivers were all there. The driver architecture is super clean: it operates on a publish/subscribe style message bus, so it's very decoupled and easy to get into and start working on. There are tons of references in the drivers area, for magnetometers, barometers, IMUs, heaters; it's awesome. A lot of the time you can copy, paste, and tweak if you need to add a new component. So we barely did any work in the drivers.
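The decoupled, publish/subscribe driver architecture he describes is PX4's uORB message bus. A toy Python model of the idea (the real bus is C++ with generated message types, queuing, and rate control) shows why drivers and their consumers never need to know about each other:

```python
from collections import defaultdict

class ToyBus:
    """Toy publish/subscribe bus in the spirit of PX4's uORB, not the real
    C++ API: drivers publish named topics; consumers subscribe without
    knowing which driver sits on the other side."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        # Fan the message out to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(msg)

bus = ToyBus()
readings = []
bus.subscribe("sensor_baro", readings.append)          # e.g. the estimator
bus.publish("sensor_baro", {"pressure_pa": 101325.0})  # e.g. a barometer driver
print(readings)
```

Swapping one barometer driver for another only changes who calls publish; subscribers are untouched, which is why adding a new sensor driver is mostly copy, paste, and tweak.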
After that, we had to work on the bootloader, which really was a minimal amount of work. To get the firmware update mechanism in place (we saw earlier that QGroundControl serves as a firmware update mechanism), we had to add our board to the bootloader, but that was under a few lines of code. The rest of the code was used out of the box. You can see on the bottom right of the slide my GitHub handle on PX4: only 14 commits and under 7,000 lines of code, and we got a board up and out to market. What we used out of the box is the actual flight controller logic; this is where the super smart people, like Gon in the previous talk, are doing a lot of work. We didn't touch that, we used it out of the box; that's a huge effort saved. You get a proven firmware update system: you can push your firmware changes through QGroundControl and users out in the community will automatically get them. You get a really robust logging system where you can get all the details from all the sensors, everything you need. And you get a proven message bus; the architecture is very nice, a decoupled, very clean system like I mentioned. Another huge thing, as you learned earlier, is the MAVLink protocol, which we use out of the box. This keeps you from having to roll your own and spend all the time discovering what you're missing as you build features in; you can just take it out of the box, and it covers most of the use cases you would need. Another cool thing is the ecosystem's CI setup. Like I mentioned, we shipped our hardware out to Dan Agar and he put it in the CI rack, so every time there's a pull request, the code base runs on multiple targets and you can find problems early. So I'll just say the firmware bring-up was a piece of cake: about 7,000 lines of code and we got the board going.
After that, we needed to move into the system validation part of product development, and again we leveraged the Dronecode hosted projects. When you're bringing up a new product, you can spend a lot of time building your own tools to validate your design. QGroundControl offers extensive visualization and configuration, and we used it extensively during driver validation. Through QGroundControl you can also get access to the NuttX shell, so you can get down to the terminal, which you actually saw Gon do. During development and validation that's a very cool thing: you can run commands like ps and top and see where resources are going. There's a huge parameter set available throughout the projects that lets you customize and tweak settings; on an IMU, for example, you have things like digital filters, and there are PID settings, with workflows to handle all of this tweaking and tuning. One thing I didn't realize for a while is that you can change PID settings mid-flight: you're flying a drone in our net downstairs, and you can adjust the feedback controller settings live, which gets pretty wild. And a very awesome thing is that the end users in this product space know these tools very well, so you can rely on the Dronecode hosted projects already being in their workflows; you really just need to ship hardware out and they'll be comfortable with it. I had never met Gon before these talks, but I could send him our hardware and he would be instantly capable of using it because of the ecosystem. That's very cool. So this is just a graphical overview, an example of how we use a bring-your-own-hardware approach, where the main components of the architecture for the standard use cases are brought to you by the Dronecode hosted projects. This is our VOXL m500.
It's a fully built autonomous drone for developers. The core pieces, as you can see, use Pixhawk-standard-compliant connectors, which lets us connect things like a Holybro GPS and mag to it. We have the PX4 firmware running essentially out of the box, with just our board support package in there. We have MAVSDK or MAVROS, which we can run in Docker on this guy. And throughout, we're using QGroundControl as a kind of front end for visualizing and keeping track of things, communicating over the MAVLink protocol. Leaning into why I feel it's awesome to have this ecosystem, here's an example: we shipped our hardware down to the Dronecode test team in Tijuana. Hopefully you can see this video. Because the ecosystem is such a feature-rich setup and the support is awesome, we can send hardware out and people know how to use the product. This is us getting field validation from a team we've never met in person; we just shipped the drone down. I think Ramon has it in his office right now. So I think that's a huge benefit. We saw this graphic from Ramon a little bit ago, and it matches the experience I've had with most of our customers, which is a pleasure: the onboarding process is already out there in the world, known and usable. What you get is a potential user base that's already familiar with the product, the workflows, and the experience, and that by itself has been a huge time saver; we've benefited from it a lot. So at this point, you saw the drone flying around; it's in a usable state. In our simplified product release cycle, we have the fundamental vehicle and ground control station working, and for most use cases you'd be set.
So: a limited amount of work, following the hardware standards and using the PX4 ecosystem and the Dronecode hosted projects like QGroundControl to validate, and now we have a functional product. At ModalAI, we're focusing on drone perception and communication systems, so I'll describe a few of the use cases we're working on now and go a little deeper into the flight controller and companion computer system. I'm a geek and there's a lot of geeky data here, so bear with me. This is our VOXL Flight at a high level. On the left side we have the PX4-based flight controller, using a typical ARM Cortex-M7 processor at 216 MHz, with a bunch of sensors and interfaces: three IMUs, five UARTs, two I2C buses, GPIO, and more. That's where all those connectors go, right? And what we do is marry that flight controller with a companion computer in one unit: our VOXL-based companion computer. It has a bit more computational power: a Snapdragon 821, a quad-core processor with an Adreno GPU on board. It lets us do things like object detection using TensorFlow Lite at 720p, 20 frames per second. We have two DSPs; one of them can actually run PX4, although we're not doing that at this point, but it's possible. So that gives us some horsepower on the companion computer side for vision. To help with perception, there are several camera interfaces on there as well: three MIPI CSI-2 interfaces, USB UVC support (basically a USB camera), HDMI input, and 4K 30 fps video capture with H.264/H.265 hardware acceleration. The companion computer itself adds two more IMUs and even more UART, I2C, and GPIO. It has built-in Wi-Fi, which you can use to connect to QGroundControl.
You can also slap mezzanine boards on top that add things like 4G LTE, and another one with Microhard for point-to-point communications. This guy is size, weight, power, and cost optimized: it comes in at under 24 grams and uses less than 10 watts of power. So it's a small form factor meant for drones, with a lot of computational power. Those are the geeky hardware specs. So how do the Dronecode hosted projects fit into the system? We're using PX4 out of the box; we've just added our board support layer. We communicate, again, with the MAVLink protocol over an internal UART bus at a 921.6 kbaud rate: the flight controller talks to the companion computer using MAVLink out of the box. With respect to the companion computer software, we're using a Yocto-built Linux, which I think this crowd should be familiar with, and we have an open source kernel available that you can build and modify as needed. A feature I like is that we have Docker on target, so you can run, say, Ubuntu or Alpine on it. Built into the image are ROS Indigo and OpenCV. On top of that, we have a suite of open source tools hosted on GitLab, where our production applications and examples are all available. They show how we support things like indoor navigation and obstacle avoidance; we use AprilTags for relocalization, and we'll have a demo of that shortly. There are examples of how to utilize the various camera interfaces and hardware acceleration methods, examples of how to run code on the CPU, the GPU, or the DSPs, and examples of how to use the modem add-ons. And then we have a collection of Docker images: MAVSDK C++, MAVSDK Python, MAVROS.
You can docker pull those from our Google Cloud registry, put them on target, and then you have MAVSDK on target, on the companion computer. The high-level point is that, in general, we're trying to make all the work well documented and open source, with best practices for how to use the hardware, and make it available for anybody to use. In that spirit of making things easy, we also want to provide people with hardware they can take out of the box and use. On the left side, we've taken VOXL Flight, the flight controller plus companion computer, and put it into what we call the Flight Deck, which gives you the perception up front: stereo cameras, a 45-degree downward-facing tracking camera with a fisheye lens, and a 4K camera for FPV or object detection, all in a vibration-isolated mount. You can put that onto your own drone or a rover (not a sub; I think it would get too wet). The idea is you take it, mount it on a drone or a robot, and get vision from it. The next step up is that we take a Holybro S500 frame, put our Flight Deck on top of it, build it up, test it, put it in a box, and ship it out: that's our VOXL m500. Before drones, I was doing embedded development on static products stuck to a pump in a factory. When you'd use, say, a Nordic nRF Bluetooth module, you'd buy a little development board, start working on it at your desk, build up the features and what the code would look like, and then build your own hardware and get your code running on that.
That's the same idea we're following here: give a pre-canned working solution that you can develop your own feature set on top of. So next, let's talk about some of those features. Here's a video showing our lab downstairs; hopefully it's showing up OK for you. What we have here is a GPS-denied environment: there's no way to get a location from a radio, so we're using perception to track where the vehicle is throughout flight, giving you an X, Y, Z in space. The feature is called visual inertial odometry. Odometry is like the odometer in a car, which tells you how far you've gone; VIO tells you where you've gone since you started, but not exactly where you are in space. Using fiducial markers like the AprilTag you see on that crate (it's almost like a QR code), the same perception system can see the tag and re-localize to exactly where that tag says the drone is. You can imagine a use case inside a shopping center, flying down the aisles: the drone can tell where it is in X, Y, Z while flying down the aisle. You get about 1% drift, so after 100 meters you might be one meter off, and you can use these AprilTags to re-localize and snap back. So you'd be able to fly a drone throughout an indoor area with a precise local position. The way we're doing that is with PX4 out of the box; we have MAVROS running inside Docker, acting as the flight manager or commander, telling the drone where to fly: go 10 meters forward, then 10 meters to the right. And the visual perception system is keeping track of: OK, where am I?
And then the AprilTags are basically saying: you think you're here, but I'm going to correct you a tiny bit every so often. So we've heavily used the PX4 firmware for all the actual flying, and MAVROS for communicating throughout. I'm going to show another demo here. This is another very common use case: collision prevention. Basically, you don't want to fly into something, right? And that something, in this video, is the whiteboard in the background. What's happening here is that our companion computer is using the stereo cameras kind of like human eyes: it can find depth from stereo just like your eyes and brain do. We calculate distances right in front of us and pass a MAVLink packet over to PX4 called OBSTACLE_DISTANCE, which is basically an array of distances ahead of us. The drone could be tipping forward, and we can still look level ahead and work out what's actually in front of us to keep the drone from hitting something. So ModalAI isn't doing the collision prevention; we're doing the obstacle detection, passing that data over to PX4, and PX4 handles the rest by limiting the sticks. The ecosystem gives you a bunch of parameters, so you can tweak things like how close to an object you're allowed to fly. Out of the box, you get that setup through PX4, MAVLink, and QGroundControl. Another use case is MAVLink over 4G LTE. As Gon mentioned, we have video streaming capabilities in QGroundControl, and there's actually H.264 video decoding in QGroundControl. So we can take 4K FPV video, encode it in H.264, send it over wireless to QGroundControl, and visualize the flight as it's happening.
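The obstacle-distance flow described a moment ago amounts to collapsing a stereo depth image into an array of per-direction distances and handing it to PX4. A simplified sketch of that reduction (the real OBSTACLE_DISTANCE MAVLink message carries a fixed 72-bin uint16 array in centimetres; the sector count and units here are illustrative):

```python
def depth_row_to_sectors(depths_m, n_sectors):
    """Reduce a row of per-pixel stereo depths to per-sector minimum
    distances, the shape of payload OBSTACLE_DISTANCE carries.
    Simplified: real messages use a fixed 72-bin uint16 layout in cm."""
    step = len(depths_m) // n_sectors
    return [min(depths_m[i * step:(i + 1) * step]) for i in range(n_sectors)]

# One depth row from the stereo pair, metres, left to right:
row = [5.0, 4.8, 3.2, 3.1, 2.0, 2.2, 6.0, 6.5]
sectors = depth_row_to_sectors(row, n_sectors=4)
print(sectors)  # [4.8, 3.1, 2.0, 6.0]
```

PX4's collision-prevention module then compares each sector distance against its configured minimum and limits stick input accordingly; the companion computer only ever produces the array.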
The other thing we're doing is that all the command and control features happen through MAVLink over that same wireless link. So what we have here is Eric in the office, setting up the mission and flying it, and about 15 miles away, at our flight test area, we have a pilot in command watching, because legally we can't do this unattended yet. But we're starting the mission remotely, flying it, and streaming the video back. We're also using other open source pieces: OpenVPN is running on target, and an OpenVPN server acts as the mechanism that lets us go over the internet and control a drone. If there were no legal ramifications, you could basically fly wherever you have a cellular signal. So those are three quick examples of how we're augmenting the already awesome ecosystem set up by Dronecode. I really have to say that I think my favorite part of all this is the community. Ramon and Gon already mentioned it, but it's really great. There are Slack channels, which I'm on daily; we mentioned there are something like 6,000 folks on there, and it's a group of people who really want to help and spread knowledge. I've learned a ton. I have a funny story, I think we have time. I had a bug in the PX4 firmware after we got our hardware going, and the main man for the NuttX integration into PX4, David Sidrane, only lives a couple of hours away from me, so I drove out to his house and he helped me fix the bug. That kind of shows how awesome the community is. And there are the weekly developer calls; I join those just to try to keep up.
Development is very fast and new features are being added constantly, so it's a great way to stay in sync with the team. There's always time at the end to ask questions, and you go from people like me, newbies asking what a magnetometer is doing in the system, to the actual pros who are creating the whole architecture. So you have this mix of people who are very capable and willing to help. That's been awesome. The main takeaway, I would say: my hope was to reach developers out there who don't know how to develop on drone systems, or aren't comfortable with it, or are afraid of the learning curve. I was there a year and a half ago. I started as an embedded guy not knowing anything about drones, and a year and a half later we've released products using the Dronecode-hosted projects. So I hope I've eased any worry about getting going, because it's a great setup to start with. If you're looking to develop hardware, the key takeaway is to reference the standards. It will really save you time, because there's a very good chance you could do no coding, or nearly no coding, if you follow the hardware standards to a T. If you're selecting IMUs or parts like that, you can see what's in the PX4 code base and then leverage the code that's already there, which is awesome and saves tons of time. When you're bringing up the firmware, there are references to use: if you need to see how to bring up a board support package, there's something to copy, paste, and start tweaking; for adding drivers, look for examples to use as templates; and leverage the weekly dev calls. For system validation, I'd say just look throughout the projects: there are so many tools to use. We use lots of them in our production line; there are tons of Python scripts for things like uploading firmware and MAVLink connections.
So there are all these little nuggets of treasure throughout the projects that you can use. Tune into the PX4 Slack channel and ask questions, and if you want to start to tweak things like we're doing, there's a huge playground of tools like MAVSDK, which lets you do lots of fun things like Gon showed you. So that's basically all I had to say. I want to say thank you for inviting me. You can scan this QR code and it takes you to some links where you can learn more about ModalAI if you're interested. But yeah, that's it. Let me know if there are any questions. Awesome, thank you, Travis. So, any questions from the viewers? While we wait for those questions, thanks for your time today. It was a really awesome presentation on how to go to market with PX4. It's really refreshing to get the perspective of a manufacturer that is integrating most of the open source projects and most of these standards. So thank you again for coming. If there are no questions, I think we can jump into the panel. Jinger, are you available? Maybe we can skip ahead, jump into the panel, and maybe the audience will come up with more questions while we're discussing. Yeah, absolutely. So hi everyone, my name is Jinger Zhang. I'm the community manager working at the Dronecode Foundation. Like Ramon was saying, at Dronecode we really help to build a sustainable ecosystem for open source drones. I'm really happy to be here today and to host this panel with the speakers. I think we have everybody here, and I think we're doing a drone show-and-tell right now; maybe I should pull up my drone as well. So first of all, I'd actually like to ask Patricia in the background to help me with a poll. We want to be pretty interactive with the audience. The poll is going to be about what your background is. Patricia, if you could help me operate the poll? Yeah, this is just to help us understand the demographics of the audience a bit better.
You have a couple of options: software engineer, embedded systems engineer, hardware engineer, product manager, or business development. We'll give it just one or two minutes to collect the poll answers. Awesome, thank you, Jinger. While we wait for your answers, just one more thing: ready-to-fly vehicles. This is the HoverGames kit by NXP, and this is the M500 VOXL drone that Travis was showing you, right here. I have a Holybro QAV250. Awesome. I like that one, it's a smaller one. Much more compact. And you have to check out the video on the ModalAI website where they build walls around the drones. That is pretty awesome. I would like to test that one there. Very good. Google ModalAI, find the website, and find that video. Okay, while Patricia is working on the poll, thank you for participating if you've filled it out. Let's get started with the panel. I actually have a bunch of questions already lined up. One really quick plug: we saw there were some questions regarding ROS and MAVROS. Ramon, would you like to talk a little bit about the upcoming event we have at ROS World? Because we actually have a full track of content there for a really in-depth discussion. Yeah, for sure. Thank you, Jinger, thanks for the reminder. So on November 12th, the ROS community is going to be celebrating their yearly event. This year it's named ROS World 2020, and they graciously invited the Dronecode Foundation, and we brought six great sessions with five speakers. We go all the way from getting started with PX4, using ROS 1 and ROS 2, to all the interfaces and all the toolkits that we offer as a community. Like Travis was mentioning, we have lots of different projects within our ecosystem, not just the main repositories, and all of those are going to be covered by our speakers during the ROS World track.
So yeah, go check that out if you're interested in using PX4 and ROS. We have a really tight relationship with the ROS community and a really great interface that we've been developing over the years, so I encourage everyone to go and look that up. If you go to our social media (Dronecode Foundation on LinkedIn, Twitter, Facebook), you'll be able to find the full ROS World 2020 schedule for the Dronecode track. Awesome. Thank you, Ramon. So nice. Okay. And maybe just a little bit more about what MAVROS is, a brief explanation, so the audience can also understand. Yeah, yeah, definitely. So we mentioned that PX4 has a couple of interfaces you can use to talk to it. One of them is MAVLink, which we've been discussing throughout the track today. MAVLink is our main interface, the de facto interface we like to use. But in the early days, when we tried to integrate with ROS, the way the developers did that was by translating from MAVLink to ROS. There's a utility called MAVROS that does that for you: it connects to PX4, or to any MAVLink-speaking drone, and translates the MAVLink messages into ROS messages. That's what we use for ROS 1. It's really built into the PX4 ecosystem; we're leveraging it every day. You can use the Gazebo simulator and leverage the whole ROS toolkit to work with PX4, out of the box; nothing needs to be modified, it's already there. And since 2019 we have ROS 2 support with Fast RTPS. PX4 has a middleware embedded in its architecture: thanks to uORB, which is the pub/sub middleware that PX4 uses, we have a very modular architecture, and thanks to that we're able to make it compatible with DDS and speak directly to ROS 2. So there's now an interface for more real-time connections. You would take that a step beyond MAVSDK when you need a more real-time connection.
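Since uORB comes up a lot, here's a toy pub/sub sketch in Python that mirrors the idea: named topics, latest-value semantics, and subscribers that poll for updates. It's purely illustrative; the real uORB is a C/C++ API inside PX4 with generated message types:

```python
# Toy pub/sub in the spirit of uORB: a publisher overwrites the latest
# sample on a named topic, and each subscriber tracks which "generation"
# of the data it has already seen. Illustrative only.

class Topic:
    def __init__(self):
        self.data = None
        self.generation = 0  # bumped on every publish

    def publish(self, msg):
        self.data = msg
        self.generation += 1

class Subscription:
    def __init__(self, topic):
        self.topic = topic
        self.seen = 0

    def updated(self):
        # True if the topic has newer data than this subscriber has copied
        return self.topic.generation > self.seen

    def copy(self):
        self.seen = self.topic.generation
        return self.topic.data

topics = {"vehicle_attitude": Topic()}
sub = Subscription(topics["vehicle_attitude"])
topics["vehicle_attitude"].publish({"roll": 0.01, "pitch": -0.02})
if sub.updated():
    latest = sub.copy()
```

The "latest value, poll for updates" shape is what makes the modules so decoupled: a consumer never blocks a producer, which is part of why the same topics can be bridged out to DDS and ROS 2.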
So when you want to define more specific flight modes, or you need tighter control loops over the autopilot, that's what you would be using. And yeah, we're going to be talking more in depth about those at ROS World 2020, November 12th. It's a free conference; make sure you register, and it'll be awesome to see you there as well. Great. Ramon, you actually mentioned one of the great safety features and awesome features of PX4, which is the different flight modes. I want to ask: in terms of flight modes, which ones do you usually use in your regular development? Travis, for ModalAI, in your testing? Oh yeah, so I personally do a lot of manual flight, and then, because I fly inside almost all the time, we use position hold, and we can use that with the perception to hold our vehicle steady. So the M500 development drone goes through flight test using that. One of the cool features we use is offboard mode. One of the products that runs as a Linux service is voxl-vision-px4. This is kind of a MAVLink proxy, which takes MAVLink from PX4 and shoves it out over the network via UDP, and vice versa. That same Linux service running on the drone listens for mode changes over MAVLink, and when we detect offboard mode, we use a little C program to send the MAVLink commands that make the drone fly in a figure eight. During the production test of the M500, we'll put it into that figure-eight mode and let it run for 20 minutes until the battery dies, and that lets us know that things are running well. Then for all of our outdoor flights we use mission mode and hold mode. So we touch a lot of them; very, very cool feature sets there. Great, yeah, great to hear that.
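The figure eight itself is just a parametric curve streamed as position setpoints. Here's a minimal sketch of the trajectory math; the actual ModalAI tester is a C program, and in PX4 offboard mode you would typically stream points like these as SET_POSITION_TARGET_LOCAL_NED messages at a steady rate (the period, radius, and altitude below are illustrative):

```python
import math

def figure_eight_setpoint(t, period_s=30.0, radius_m=5.0, alt_m=-2.0):
    """Position setpoint (NED frame) at time t along a lemniscate of Gerono:
    x = R*sin(w*t), y = R*sin(w*t)*cos(w*t). Negative z means "up" in NED."""
    w = 2 * math.pi / period_s
    x = radius_m * math.sin(w * t)
    y = radius_m * math.sin(w * t) * math.cos(w * t)
    return (x, y, alt_m)

# Stream at, say, 20 Hz: each tick, pack the tuple into a
# SET_POSITION_TARGET_LOCAL_NED message while offboard mode is active.
setpoint = figure_eight_setpoint(0.0)
```

Note that PX4 requires setpoints to keep arriving at a steady rate before and during offboard mode, which is exactly why it works well as a battery-length burn-in test: if the companion side ever stalls, the failsafe kicks in.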
And I think one of the other key elements a lot of the audience is probably also interested in, and that we'd love to touch on, is the safety aspect. Travis, you can talk about how much testing you do on your products before shipping, and Ramon, maybe you can also introduce to the audience how Dronecode runs our side of software plus hardware testing, and the flight testing aspect of it. We'll go with Travis first again. Go ahead, Travis. Perfect. So from the board level up, we interface with the NuttX shell. Using the built-in tools, you can run things like sensor status through the NuttX shell, and then over a serial port we get dumps of data. Every one of our flight controllers goes through production tests in which we validate the IMUs, the barometer, the flash memory, the SD card. Every peripheral, every single connector, we can hit with shell commands in the NuttX shell, get a response, and log that for every single flight controller. Again, that's using the built-in tools that are provided; we didn't have to develop anything. I think I wrote one little PX4 program, which shows up as a command-line program (I think it's called modalai-test), and I buzz out all the connectors with that. But otherwise, we're using the onboard command-line interface to test every single sensor. Then there's another concept called hardware-in-the-loop, which we utilize: we run PX4 on our actual hardware and send simulated sensor data into it. Because of the decoupled nature of PX4, it allows for a testable system, right? The output of running simulated sensors on the hardware looks the same to the outside world as a normal drone. So hardware-in-the-loop allows us to simulate our hardware, and once again, it's code I didn't have to write; it was already set up by the community.
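A production script around that serial dump might look something like the following sketch. The dump format and device names here are hypothetical; real NuttX shell output depends on the PX4 build and on which commands you run, so treat this as the shape of the check, not its literal content:

```python
# Sketch: the kind of pass/fail check a production-line script might run
# on text captured from the NuttX shell over serial. The "device: STATUS"
# format and the device names below are hypothetical.

REQUIRED = {"imu0", "imu1", "baro", "sdcard", "flash"}

def parse_dump(text):
    """Return {device: status} parsed from 'device: STATUS' lines."""
    status = {}
    for line in text.splitlines():
        if ":" in line:
            dev, _, state = line.partition(":")
            status[dev.strip().lower()] = state.strip()
    return status

def board_passes(text):
    """A board passes only if every required peripheral reports OK."""
    status = parse_dump(text)
    return all(status.get(dev) == "OK" for dev in REQUIRED)

dump = "imu0: OK\nimu1: OK\nbaro: OK\nsdcard: OK\nflash: OK\n"
```

Logging the parsed dictionary per serial number gives you a traceable record for every flight controller that leaves the line.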
So really, we're using the tools that people have created, and we're using them in our production line. Awesome, that's great to hear, Travis. I'd also like to mention, as part of what the Dronecode Foundation offers as services to the open source community, that we're helping out with the CI infrastructure. There's a test rack that one of the lead maintainers has in their garage: a huge rack with all the hardware that we support exposed. Every time we add new hardware that we maintain, we typically ask the manufacturer to ship us a board, like Travis was mentioning; that's all it takes, you ship us a board. We'll add it to the CI and make sure that every pull request gets tested; we add it to our Jenkins pipeline, so it's in continuous integration testing. And we have a test suite that actually leverages MAVSDK. So every time you send a PR, it goes through that test suite using MAVSDK, runs a couple of tests verifying some of the functionality, and runs in real time on hardware using hardware-in-the-loop. That's one of the safety nets we have to avoid introducing regressions. Of course, as with every big, mature open source project, it's impossible to catch them all, but we do our best, and we also have another layer of safety, which is the Dronecode flight test team. The flight test team actually goes out to the field and tests some of the biggest change sets that we think are going to be most impactful to the code base. They have vehicles all the way from quadcopters to vertical takeoff and landing vehicles. We make sure that the safety features and the feature set are actually doing what they're supposed to do and don't introduce any type of regression. But I think it's really important to mention that the tutorial Gon just showed you on how to use MAVSDK is essentially how you could write tests for PX4.
You write a MAVSDK script, test it against the simulator, make sure it does what you think it's going to do, and then submit the pull request and fill out the requirements for the pull request. GitHub Actions then uses another one of our tools that is really cool and that I don't think we've mentioned: Flight Review. We have cloud infrastructure where you upload your logs. PX4 has a log format called ULog; after each flight has started and you've armed your drone, a ULog file gets created with very high-rate, high-sensitivity data that allows you to map and plot every type of sensor within your drone. If you go to Flight Review right now (it's logs.px4.io), you'll see that there's a way to upload your logs and also browse through the community's published logs. You get the option to upload privately or publicly. If you choose to do it publicly, you share that data with the devs, which is tremendously helpful, because we host all of those logs on S3 and we're able to run machine learning on them and learn a lot about what the community is doing, what some of the edge cases are, and what some of the hard issues are. That's one of the biggest tools we have out there for the community. So yeah, those are some of the things we're actively doing from the Dronecode and PX4 maintainer side to guarantee that there are always stable releases and that we're not introducing regressions as we go. Like Travis and Gon mentioned, this is a very active community, so we get a lot of pull requests day to day, and as the community grows it's sometimes hard to keep up with them. So we're constantly introducing new tooling, and I encourage everyone to come to the PX4 developer call, Wednesdays at 6 p.m. Central European time, where you'll get the latest per component on PX4. We go through a quick agenda: okay, so what's new for system architecture?
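As a small illustration of what a ULog file looks like on disk, here's a header check based on the published ULog file-format description (a 7-byte magic, a version byte, then a little-endian uint64 start timestamp in microseconds). For real log parsing you would normally reach for the pyulog library; this is just a sketch of the first 16 bytes:

```python
import struct

# Sketch: validate the 16-byte ULog file header per the PX4 ULog
# file-format docs. Real parsing (definitions, data messages) is what
# the pyulog library does; this only checks the header.

ULOG_MAGIC = b"ULog\x01\x12\x35"  # 7-byte file magic

def read_ulog_header(buf):
    """Return (version, start_time_us) from the first 16 bytes,
    or raise ValueError if the buffer is not a ULog file."""
    if len(buf) < 16 or buf[:7] != ULOG_MAGIC:
        raise ValueError("not a ULog file")
    version = buf[7]
    (start_us,) = struct.unpack_from("<Q", buf, 8)  # little-endian uint64
    return version, start_us

# Build a synthetic header: version 1, start time 1234567 microseconds
header = ULOG_MAGIC + bytes([1]) + struct.pack("<Q", 1234567)
```

Everything after those 16 bytes is a stream of typed messages (format definitions, parameters, then the logged topic data), which is what makes the format both compact and self-describing.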
What's new for multicopter? What's new for estimation? What's new for control? What's new for MAVSDK? We go through all of those and explain why each change got introduced. Maybe it's a bug that we want to talk about, and that's where you get to know where we're introducing new things, where there's a shift in tooling, maybe where we're introducing new stuff again. You get to know about the release cycles and you get to meet the PX4 maintainers. So yeah, that's a great question, Jinger. Thank you. Okay, awesome. And I think we're launching our second poll, since we talked about that: is this your first time learning about the PX4 drone development stack? If you haven't heard of us before, we hope today's content was useful to you and that you'll come for the code and stay for the community. And of course, if you already know about PX4 and are part of the community, we hope today's content also gave you some more insight. Patricia, we did launch the poll, right? It's coming. All right, so while we wait for the next poll, I think there's a question in the Q&A section from an anonymous attendee: thank you, I really enjoyed the MAVSDK section. I did as well; Gon did a great job. They've got a Durandal flight controller they need to test: is there a recommended, reasonably cheap drone that will be capable of this? So if you're using a Durandal, I would recommend a kit like the one Gon showed. I forget the name of that kit. Gon, what's the name of the kit? QAV250. The QAV250 by Holybro, yeah. It comes unassembled, you have to assemble it, but it requires minimal time and effort on your side, and you're able to put your Durandal in there and fly your drone. It will be much less than a thousand; it'll be below 500.
While we're on the subject, maybe I can also show the audience how to navigate to some of these resources on PX4's website. So if you go to px4.io: one of the questions was whether PX4 runs on other things like rovers or sailing boats. If you go into Ecosystem, you actually get a list of all the different vehicles that run PX4, including blimps; you'll find some underwater vehicles as well. That shows you how versatile the open source project is. And I think there was another question regarding predefined boards that already run PX4, which you can easily find under compatible hardware in the autopilot list. Once you get into your PX4 development environment, you'll be able to see all these boards, and you can go and find their documentation as well. Meanwhile, another really good resource for getting started is under Community, under Projects. It takes you to our community hub, which is on Hackster, and has dozens of different tutorials and community projects based on PX4; a very good resource to get inspiration and things like that. So, to go back to the question and add to what Ramon was saying, what people can also do is go to Getting Started and see all the dev kits we have available in the ecosystem, ranging from what Gon has already mentioned, including the ModalAI dev kit and some other Holybro dev kits, and there are many others in our ecosystem. So... I'd like to make a special mention there. There are a lot of options for dev kits. We have the IFO-S and the Draco-R by UVify, and the Inspired Flight ones. There's also the NXP HoverGames kit. If you go to hovergames.com, there's actually an open challenge with Hackster.io where, if you get the kit and accept to participate in the challenge, you get a huge discount on the kit.
The kit is worth thousands of dollars, and you pay something like 200 or 300 bucks, get the whole kit, and also get the chance to participate in the challenge, which is about how drones can help solve problems in the pandemic. It's a really cool challenge; I encourage everyone to go check it out. Great. We hope these resources will help you get started with drones. And it was actually very interesting to hear Travis mention that he came in a year and a half ago as an embedded systems engineer, and how quickly joining the open source community helped him and his company accelerate their go-to-market with the product. So, Gon, maybe you could also talk a little bit about your experience, how you got started with PX4, because you also started not long ago, right? Yeah, right, I started a year and a half ago too. It's not my full-time job; I don't work with embedded stuff and drones on a daily basis. I'm a full-stack developer, I normally do backends for websites and things like that, but my thesis was in embedded systems, so I have some background in it. I got started through a colleague at work who worked with PX4 and MAVROS. His name is Nuno Marques. He was my mentor, and he introduced me to Jinger, and we started getting along, and Jinger started telling me to try this and try that. I started trying things and I liked it very much. Then I got a drone from Holybro and I built it, and that was the turning point: when I started seeing the drone fly, I tried to fly it and crashed it several times. You will crash it, you need to be sure of that. If you use manual modes, you're probably going to crash it, but if you use the autopilot's auto modes, it's very difficult to crash it unless you just send it into a wall. In an open field, it's very difficult to crash it in auto mode.
I always liked these kinds of things before drones: I always liked Arduinos and that kind of robotics tinkering, RGB LEDs, but I never went into drones before talking to Jinger and Nuno. When I got into it, I really liked it. Then I started trying high-level stuff like MAVSDK and MAVROS, and that really is something, because it gives you a great layer of abstraction. You don't need to know the firmware and all the algorithms implemented in the autopilot. I don't know them, and I've never contributed to the firmware directly. So if you want to build algorithms or computer vision or things like that, you don't really need to know the autopilot implementation; you can use MAVSDK or MAVROS, interact with the drone, and not worry about the low-level details. That's my perspective. But if you want to learn it, you always can; there are a lot of people who will help you learn it. So that's my experience. Now I'm trying to do cool stuff with the drone; I'm trying to build bridges instead of using the normal telemetry, things like that. So that's my story. Very cool, thank you. And the poll results just came in: it seems like 50% of the audience today are software engineers and about 35% are embedded systems engineers. So we hope Gon's tutorial really helped you understand getting started with open source drone development. You can start from the hardware side or from the software simulator side, and if you start with a simulator, you probably don't have to fear crashing a real drone right off the bat. It's one of those really awesome getting-started tools, and you'll be able to keep using it as you advance through the whole technical stack. And it seems we've launched another poll: are you working on a drone-related project or product right now? About 10% say yes, and about 90% say no, but are generally interested.
So hopefully today's content gives you an idea of where to get started and where the community is. We are a global community. Part of our ambassador program is actually the translation project. The translation project is also open source, hosted on an open source platform called Crowdin, to enable the translation of the ground station software and the user guide, so they can be delivered in local languages and reach different parts of the global ecosystem. Speaking of the global ecosystem, while we still have about four minutes left in the summit: Ramon, you mentioned the flagship annual developer summit. Would you like to talk a little bit more about that, and about how we hosted it virtually this year? Yeah, definitely. So the PX4 Developer Summit is an awesome event. We hosted the first one, like I mentioned in my talk, in 2019. To be completely transparent and open here (we're friends, right?), we were expecting somewhere around 40 people to join us. I mean, it was Zurich; we were asking people to take a flight to Switzerland. To our surprise, more than 200 registered for the event, and they showed up; we had maybe one or two who didn't come, the whole sign-up list came to the event. It was more or less a huge party around open source. We had sessions on getting started with PX4 and MAVSDK, and we introduced the new ROS interfaces. It was the first time we met as a community, to be honest, and it was really cool. Everyone was really helpful and cheerful. It was when I personally realized how big and impactful this project is. Some maintainers, open source maintainers especially, get into this monotonous maintenance role where you're just answering GitHub issues, PRs, forum posts, and all of that, and sometimes you don't really notice where the questions are coming from.
But we got people from all the continents, from dozens of countries, speaking a lot of languages. It's really important that you mentioned the translation project, because we have translations in Chinese, Korean, Spanish, Portuguese, and German, and the community is really worldwide. This year, we wanted to host the event in person again. Obviously we couldn't, because of the worldwide situation right now, but we hosted the event virtually, and to our surprise, more than 1,600 people registered. To date, we have more than 50,000 views on those videos, and we're really proud of that. We had more than 50 speakers this year, I think around 56. It was a huge event. I encourage everyone again to look it up on the PX4 Autopilot channel on YouTube. All the sessions are free. They're around 30 to 40 minutes long, very informative, very technical, and all the slides and everything are shared, so you can find links in the description of each video and follow along. Most of those sessions and videos were made following the documentation, so we post them back into the docs; you can find them on docs.px4.io, in the developer guide, and so on. The PX4 maintainers generally are really helpful and really try to push things forward. So whenever you go to next year's event, make sure you pay attention to the agenda, because those are the things that are going to be discussed in the community and in the industry over the next months. The things we were talking about in 2019 are the things being implemented right now in the market and industry. We're something like 12 months ahead in the development cycle here: we see technology in PX4 upstream that hasn't reached the market yet. So I really can't stress this enough.
The event to be at for PX4 and open source drones is the PX4 Developer Summit, and it's going to be hosted again next year. Hopefully we can do it in person; if not, we'll host it virtually again, and it's probably going to be around July next year. Thank you so much. We hope you got a lot of useful information today from this Open Source Drone Summit. Our time is up, and we have all the information here: you can visit the website, follow us on social media, join our community on our Slack. We'll be there, really easy to find, and whether you're working on a drone project or are just generally interested, we welcome you to our open source community. Thank you again for all your awesome presentations today, Travis, Gon, and Ramon. We will see you around. Thank you, Open Source Summit, it was awesome to be here. See you around. Bye. Bye. Thank you.